Computer Science @ Rochester
Monday, November 03, 2003
11:00 AM
CSB 209
Eugene Charniak
Brown U.
Syntax-Based Language Modeling for Machine Translation
Formally, a language model is a probability distribution over all strings in a language. In practice, language models are used to improve the output of speech-recognition systems and, more recently, language-translation programs. Language models can be very simple, as in the tried-and-true trigram model, where one estimates the probability of the next word as a function of just the two previous words. More recent research, however, has investigated language models based on statistical parsing algorithms. In this talk I describe some experiments in which such a syntax-based model has been added to an existing language-translation system. The resulting system is dramatically better at returning grammatical sentences rather than syntactic fruit salad: the percentage of translations that are both meaning-preserving and syntactically correct is up by 47%. I also stress the elegance of the resulting system, as the two components (the language and translation models) are tightly and naturally integrated. In some otherwise unwarranted speculation, I suggest the system as a possible model for language generation, and show how a single syntactic system can inform both parsing and generation.
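
To make the trigram idea concrete, the following is a minimal illustrative sketch (not material from the talk) of maximum-likelihood trigram estimation in Python; the function names, sentence padding, and toy corpus are assumptions for illustration only.

# Maximum-likelihood trigram estimate:
#   P(w_i | w_{i-2}, w_{i-1}) = count(w_{i-2}, w_{i-1}, w_i) / count(w_{i-2}, w_{i-1})
from collections import Counter

def train_trigram(sentences):
    """sentences: list of lists of word tokens (assumed pre-tokenized)."""
    tri_counts = Counter()
    bi_counts = Counter()
    for words in sentences:
        # Pad each sentence so the first real words also have two-word histories.
        padded = ["<s>", "<s>"] + words + ["</s>"]
        for i in range(2, len(padded)):
            tri_counts[(padded[i - 2], padded[i - 1], padded[i])] += 1
            bi_counts[(padded[i - 2], padded[i - 1])] += 1
    return tri_counts, bi_counts

def trigram_prob(tri_counts, bi_counts, w2, w1, w):
    """Unsmoothed MLE probability of word w given the two previous words."""
    denom = bi_counts[(w2, w1)]
    return tri_counts[(w2, w1, w)] / denom if denom else 0.0

# Toy usage: estimate P("dog" | "the", "big") from a two-sentence corpus.
corpus = [["the", "big", "dog", "barked"], ["the", "big", "dog", "slept"]]
tri, bi = train_trigram(corpus)
print(trigram_prob(tri, bi, "the", "big", "dog"))  # -> 1.0

A real system would of course add smoothing for unseen trigrams; the point here is only the simple two-word-history conditioning that the syntax-based models discussed in the talk go beyond.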