mix LM

Hongqin Liu hliu at inzigo.com
Tue Sep 3 10:12:22 PDT 2002


Andreas,

Sorry to bother you again. I was trying to use 'compute-best-mix' to
get the interpolation weight (lambda), but got:

fatal: division by zero attempted

The inputs were two ppl files, from the class-based and word-based
models respectively, both produced by ngram ->

./compute-best-mix /home/hliu/language_model/word/lm.word.3.ppl
/home/hliu/language_model/class/lm.class.3.ppl


Did I miss something in how this script (compute-best-mix) should be used?
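For reference, a hedged guess at the likely cause (not confirmed in this thread): compute-best-mix reads the per-word log probabilities that ngram prints only at -debug 2, so a ppl file produced without that flag gives it nothing to work with. A sketch of regenerating the inputs; the file names test.txt and classes.def are placeholders, not taken from the message:

```shell
# Rerun ngram with -debug 2 so each ppl file contains per-word
# "p( w | ... )" lines, which compute-best-mix parses.
# test.txt and classes.def are hypothetical placeholder names.
ngram -lm lm.word.3 -ppl test.txt -debug 2 > lm.word.3.ppl
ngram -lm lm.class.3 -classes classes.def -ppl test.txt -debug 2 > lm.class.3.ppl
compute-best-mix lm.word.3.ppl lm.class.3.ppl
```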

Best, HL.






Stolcke wrote:

> In message <3D74DD88.8E7C2BD8 at inzigo.com> you wrote:
>
> > His first suggestion reminds me of the mixture LM. Actually, I ran some
> > tests on the interpolation approach, including class + word LM. I
> > always found that the perplexity (and WER) is a linear function of the
> > interpolation parameter (lambda), so the best results are always at the
> > endpoints, which makes the interpolation trivial. Did I miss something,
> > or is this the case for some domains?
> >
>
> Hongqin,
>
> how did you find the best interpolation weight?  I hope you used the
> compute-best-mix script rather than trial and error.
> In my experience the perplexity is not a linear function of lambda,
> unless perhaps your class-based LM is very bad.  Rather, ppl should be
> a U-shaped function of lambda as it varies between 0 and 1.
>
> --Andreas
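Andreas's U-shape point can be checked with a toy numerical sketch (the probability values below are invented, not from any real model): the perplexity of the linear mixture lambda*P_word + (1-lambda)*P_class is convex in lambda, so its minimum falls strictly between the endpoints whenever each model assigns the higher probability to some of the words.

```python
import math

# Invented per-word probabilities from two hypothetical LMs:
# the word-based LM is better on some words, the class-based LM on others.
p_word = [0.20, 0.05, 0.30, 0.01, 0.10]
p_class = [0.05, 0.15, 0.10, 0.08, 0.04]

def ppl(lam):
    """Perplexity of the linear mixture lam*p_word + (1-lam)*p_class."""
    logprob = sum(math.log(lam * pw + (1 - lam) * pc)
                  for pw, pc in zip(p_word, p_class))
    return math.exp(-logprob / len(p_word))

# Sweep lambda over a grid; the curve dips in the middle (U-shaped).
curve = [(lam / 10, ppl(lam / 10)) for lam in range(11)]
best = min(curve, key=lambda x: x[1])
print(best)  # the minimum lies strictly inside (0, 1)
```

Only when one component model dominates the other on (nearly) every word does the minimum sit at an endpoint, which would make the curve look monotone rather than U-shaped.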




More information about the SRILM-User mailing list