Language modeling using stochastic automata with variable length contexts
By Jianying Hu; William Turin; Michael K. Brown
- Publisher: Elsevier Science
- Year: 1997
- Language: English
- File size: 299 KB
- Volume: 11
- Category: Article
- ISSN: 0885-2308
Synopsis
It is well known that language models are effective for increasing the accuracy of speech and handwriting recognizers, but large language models are often required to achieve low model perplexity (or entropy) and still have adequate language coverage. We study three efficient methods for variable order stochastic language modeling in the context of the stochastic pattern recognition problem. Two of these methods are previous techniques from recent literature, and one is a new method based on a successful text compression technique. We give results of a comparative analysis, and demonstrate that the best performance is achieved by extending one of the previous techniques using elements from the newly developed method.
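To make the idea of variable order (variable-length context) stochastic language modeling concrete, the sketch below implements a minimal variable-order Markov model in Python: it counts n-grams for every context length up to a maximum, backs off to the longest observed context when predicting, and reports perplexity. This is an illustrative sketch of the general technique only, not the paper's specific algorithms; the class name, the additive smoothing constant `alpha`, and the backoff scheme are all assumptions for the example.

```python
from collections import defaultdict
import math


class VariableOrderLM:
    """Minimal variable-order Markov model (illustrative sketch).

    Counts symbol occurrences for every context of length 0..max_order,
    predicts by backing off to the longest context actually observed in
    training, and smooths counts additively (alpha is an assumed choice).
    """

    def __init__(self, max_order=3, alpha=0.1):
        self.max_order = max_order
        self.alpha = alpha
        # context tuple -> symbol -> count
        self.counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()

    def train(self, sequence):
        """Accumulate counts for all context lengths ending at each position."""
        for i, sym in enumerate(sequence):
            self.vocab.add(sym)
            for k in range(self.max_order + 1):
                if i - k < 0:
                    break
                ctx = tuple(sequence[i - k:i])
                self.counts[ctx][sym] += 1

    def prob(self, context, sym):
        """P(sym | context), backing off to the longest observed context."""
        ctx = tuple(context[-self.max_order:])
        while ctx and ctx not in self.counts:
            ctx = ctx[1:]  # drop the oldest symbol and try a shorter context
        c = self.counts[ctx]
        total = sum(c.values())
        v = len(self.vocab)
        return (c[sym] + self.alpha) / (total + self.alpha * v)

    def perplexity(self, sequence):
        """Per-symbol perplexity of the sequence under the model."""
        logp = sum(math.log(self.prob(sequence[:i], sym))
                   for i, sym in enumerate(sequence))
        return math.exp(-logp / len(sequence))
```

For example, after training on a strictly alternating sequence the model assigns high probability to `b` following `a`, and its perplexity on similar text falls well below the uniform baseline (2.0 for a two-symbol vocabulary).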