Class phrase models for language modeling
Ranked in the top 10% of 2002 papers by citations.
Abstract
Previous attempts to automatically determine multi-word phrases as basic units for language modeling have succeeded in extending bigram models, improving the perplexity of the language model and/or the word accuracy of the speech decoder. However, none of these techniques has so far improved on the trigram model, except on the rather controlled ATIS task (McCandless & Glass, 1994). We therefore propose an algorithm that minimizes the perplexity of a bigram model directly. The new algorithm reduces the trigram perplexity and also improves word accuracy on the Verbmobil task. It is the natural counterpart of successful word-classification algorithms for language modeling that minimize the leaving-one-out bigram perplexity. We also give some details on the use of class-finding techniques and m-gram models, which can be crucial to successful application of this technique.