Whole-sentence exponential language models: a vehicle for linguistic-statistical integration
✍ Authors: Ronald Rosenfeld; Stanley F. Chen; Xiaojin Zhu
- Book ID
- 102566859
- Publisher
- Elsevier Science
- Year
- 2001
- Language
- English
- File size
- 178 KB
- Volume
- 15
- Category
- Article
- ISSN
- 0885-2308
✦ Synopsis
We introduce an exponential language model which models a whole sentence or utterance as a single unit. By avoiding the chain rule, the model treats each sentence as a "bag of features", where features are arbitrary computable properties of the sentence. The new model is computationally more efficient, and more naturally suited to modeling global sentential phenomena, than the conditional exponential (e.g. maximum entropy) models proposed to date. Using the model is straightforward. Training the model requires sampling from an exponential distribution. We describe the challenge of applying Markov chain Monte Carlo and other sampling techniques to natural language, and discuss smoothing and step-size selection. We then present a novel procedure for feature selection, which exploits discrepancies between the existing model and the training corpus. We demonstrate our ideas by constructing and analysing competitive models in the Switchboard and Broadcast News domains, incorporating lexical and syntactic information.
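To make the abstract's model form concrete, the sketch below scores sentences under a whole-sentence exponential model, p(s) ∝ p0(s)·exp(Σᵢ λᵢ fᵢ(s)). The specific feature functions, weights, and baseline model here are illustrative assumptions, not the features or parameters used in the paper; note that comparing or rescoring sentences never requires the global normalizer Z.

```python
import math

# Minimal sketch of a whole-sentence exponential model:
#   p(s) = (1/Z) * p0(s) * exp(sum_i lambda_i * f_i(s))
# The features and weights below are hypothetical examples;
# features may be ARBITRARY computable properties of the sentence.

def f_long_sentence(sentence: str) -> float:
    # Global sentential feature: fires if the sentence has > 5 words.
    return 1.0 if len(sentence.split()) > 5 else 0.0

def f_contains_uh(sentence: str) -> float:
    # Lexical feature: fires if the filler word "uh" appears.
    return 1.0 if "uh" in sentence.split() else 0.0

FEATURES = [f_long_sentence, f_contains_uh]
LAMBDAS = [0.3, -1.2]  # illustrative trained weights

def baseline_logprob(sentence: str) -> float:
    # Stand-in for log p0(s) from a baseline model (e.g. an n-gram
    # model): a crude per-word constant, for illustration only.
    return -2.0 * len(sentence.split())

def unnormalized_logscore(sentence: str) -> float:
    # log of p0(s) * exp(sum_i lambda_i * f_i(s)); the constant
    # log Z cancels when ranking hypotheses, so it is omitted.
    return baseline_logprob(sentence) + sum(
        lam * f(sentence) for lam, f in zip(LAMBDAS, FEATURES))

# Relative scores suffice for rescoring recognizer hypotheses:
print(unnormalized_logscore("the cat sat"))                # -6.0
print(unnormalized_logscore("uh the cat sat on the mat"))  # -14.9
```

Because each sentence is scored as a single unit rather than word-by-word via the chain rule, features like `f_long_sentence` that depend on the whole utterance are handled naturally; training the weights, as the abstract notes, is the hard part, since it requires sampling sentences from this unnormalized distribution.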