mix LM
Hongqin Liu
hliu at inzigo.com
Tue Sep 3 09:04:24 PDT 2002
Hi,
First I appreciate the quick response from Andreas, the guy with the
Long Quan sword.
His first suggestion reminds me of the mixture LM. I actually ran some
tests on the interpolation approach, including a class + word LM. I
always found that the perplexity (and WER) was a linear function of the
interpolation parameter (lambda), so the best results were always at the
endpoints, which makes the interpolation trivial. Did I miss something,
or is this the case for some domains?
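For reference, linear interpolation mixes the probabilities themselves, P_mix(w) = lambda * P1(w) + (1 - lambda) * P2(w), so the test-set log-likelihood is concave in lambda and perplexity need not be monotone between the endpoints. Here is a toy sketch with two made-up unigram distributions (all numbers invented for illustration, not from any real LM):

```python
import math

# Two hypothetical unigram "LMs" over a three-word vocabulary,
# standing in for e.g. a class LM and a word LM.
p1 = {"a": 0.7, "b": 0.2, "c": 0.1}
p2 = {"a": 0.1, "b": 0.3, "c": 0.6}

def mix_perplexity(test_words, lam):
    """Perplexity of the linearly interpolated model on a word sequence."""
    log_sum = 0.0
    for w in test_words:
        # Interpolate probabilities, then take the log (not the other way around).
        p = lam * p1[w] + (1.0 - lam) * p2[w]
        log_sum += math.log(p)
    return math.exp(-log_sum / len(test_words))

test = ["a", "b", "c", "b", "a", "c"]
# Sweep lambda: with these toy numbers the minimum perplexity falls
# strictly between the endpoints.
ppls = {lam: mix_perplexity(test, lam) for lam in (0.0, 0.25, 0.5, 0.75, 1.0)}
```

If perplexity really comes out linear in lambda, one thing worth checking is whether the probabilities are being interpolated in the log domain, which degenerates into a weighted geometric mean and makes log-perplexity linear in lambda.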
Best,
Hongqin