Universal Language Model Fine-tuning for Text Classification

We propose Universal Language Model Fine-tuning (ULMFiT), an effective transfer learning method that can be applied to any task in NLP.



A Neural Probabilistic Language Model

A goal of statistical language modeling is to learn the joint probability function of sequences of words in a language. This is intrinsically difficult because of the curse of dimensionality: a word sequence on which the model will be tested is likely to be different from all the word sequences seen during training.
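As a reference point for the joint probability the abstract refers to, the standard chain-rule factorization (a textbook identity, not text from the paper) over a word sequence $w_1, \dots, w_T$ is:

```latex
P(w_1, \dots, w_T) = \prod_{t=1}^{T} P(w_t \mid w_1, \dots, w_{t-1})
```

The curse of dimensionality enters because a table-based model of each conditional needs a parameter for every distinct history, a number that grows exponentially with the history length.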



Cross-lingual Language Model Pretraining

We show that cross-lingual language models can provide significant improvements on the perplexity of low-resource languages. We make our code and pretrained models publicly available.
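Perplexity, the metric this claim is stated in, has a standard definition (not specific to this paper): for a held-out sequence of $T$ words,

```latex
\mathrm{PPL} = \exp\!\left( -\frac{1}{T} \sum_{t=1}^{T} \log P(w_t \mid w_1, \dots, w_{t-1}) \right)
```

so lower perplexity means the model assigns higher probability to the held-out text.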



Variable-Length Sequence Language Model for Large Vocabulary

19 Oct 2006. One of the most successful language models used in speech recognition is the n-gram model, which assumes that the statistical dependencies are limited to the preceding n−1 words.
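Written out, the n-gram assumption truncates the chain-rule history (a standard formulation, not quoted from this paper):

```latex
P(w_t \mid w_1, \dots, w_{t-1}) \approx P(w_t \mid w_{t-n+1}, \dots, w_{t-1})
```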



When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models

Benjamin Muller, Antonios Anastasopoulos, et al. … adaptation experiments to get usable language models.



Enhancing lexical cohesion measure with confidence measures

30 Nov 2011. Enhancing lexical cohesion measure with confidence measures, semantic relations and language model interpolation for multimedia spoken content …



Release of largest trained open-science multilingual language model

12 Jul 2022. BLOOM is the largest multilingual language model to be trained 100% openly and transparently, unlike other AI models of its kind.



Recurrent Neural Network Based Language Model

A new recurrent neural network based language model (RNN LM) with applications to speech recognition is presented. Results indicate that it is possible to obtain around 50% reduction of perplexity by using mixture of several RNN LMs, compared to a state of the art backoff language model.
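To make the architecture concrete, here is a minimal Elman-style RNN language model forward pass with a toy perplexity computation. This is an illustrative sketch, not the paper's code; the layer sizes, parameter names, and toy sequence are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, hidden_size = 10, 16

# Parameters: input embedding U, recurrent weights W, output projection V.
U = rng.normal(0.0, 0.1, (hidden_size, vocab_size))
W = rng.normal(0.0, 0.1, (hidden_size, hidden_size))
V = rng.normal(0.0, 0.1, (vocab_size, hidden_size))

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def rnn_lm_step(word_id, h_prev):
    """One time step: consume a word id, return next-word distribution and new state."""
    x = np.zeros(vocab_size)
    x[word_id] = 1.0                    # one-hot encoding of the input word
    h = np.tanh(U @ x + W @ h_prev)     # recurrent hidden-state update
    return softmax(V @ h), h            # P(next word | history), new state

# Score a toy word-id sequence and report perplexity under the (untrained) model.
sequence = [1, 4, 2, 7]
h = np.zeros(hidden_size)
log_prob = 0.0
for prev, nxt in zip(sequence, sequence[1:]):
    p, h = rnn_lm_step(prev, h)
    log_prob += np.log(p[nxt])
print("perplexity:", np.exp(-log_prob / (len(sequence) - 1)))
```

An untrained model like this sits near the uniform-distribution perplexity (roughly the vocabulary size); training would fit U, W, and V to push it down.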



N-gram Language Models

Models that assign probabilities to sentences and sequences of words are called language models or LMs. The simplest of these is the n-gram. An n-gram is a sequence of n words: a 2-gram (which we'll call a bigram) is a two-word sequence of words.
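To ground the definition, a minimal sketch of the maximum-likelihood bigram estimate, using a toy corpus echoing the chapter's "I am Sam" running example (the helper name is ours):

```python
from collections import Counter

# Toy corpus; <s> and </s> mark sentence start and end.
corpus = [["<s>", "i", "am", "sam", "</s>"],
          ["<s>", "sam", "i", "am", "</s>"]]

# Count unigrams and bigrams across all sentences.
unigrams = Counter(w for sent in corpus for w in sent)
bigrams = Counter(pair for sent in corpus for pair in zip(sent, sent[1:]))

def bigram_prob(w_prev, w):
    """Maximum-likelihood estimate P(w | w_prev) = C(w_prev, w) / C(w_prev)."""
    return bigrams[(w_prev, w)] / unigrams[w_prev]

print(bigram_prob("<s>", "i"))   # 0.5: "i" starts 1 of the 2 sentences
print(bigram_prob("i", "am"))    # 1.0: "am" follows every occurrence of "i"
```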