[SRILM User List] Why modified Kneser-Ney much slower than Good-Turing using make-big-lm?

Meng Chen chenmengdx at gmail.com
Thu Aug 2 02:30:56 PDT 2012

Hi, I am training an LM using *make-batch-counts*, *merge-batch-counts*, and *
make-big-lm*. I compared the modified Kneser-Ney and Good-Turing smoothing
algorithms in *make-big-lm* and found that training is much slower with
modified Kneser-Ney. The debug output shows that it runs *make-kn-counts* and
a second *merge-batch-counts* pass, which take most of the time. Could these
two extra steps be folded into the initial *make-batch-counts* run, so that
this time is saved?
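For reference, the two pipelines being compared might look roughly like the sketch below. This is a hypothetical invocation: the file names, batch size, and `-order` value are placeholders, and the exact options should be checked against the SRILM `training-scripts` and `ngram-count` man pages.

```shell
# Shared counting phase: count n-grams in batches, then merge the counts.
# (corpus-file-list, counts-dir, and -order 3 are placeholders.)
make-batch-counts corpus-file-list 10 /bin/cat counts-dir -order 3
merge-batch-counts counts-dir

# Good-Turing smoothing: make-big-lm can estimate discounts directly
# from the merged counts.
make-big-lm -name gtlm -read merged-counts.gz -order 3 -lm gt.lm.gz

# Modified Kneser-Ney: make-big-lm additionally runs make-kn-counts
# (to derive the lower-order type counts that KN needs) followed by
# another merge-batch-counts pass, which is where the extra time goes.
make-big-lm -name knlm -read merged-counts.gz -order 3 -kndiscount -lm kn.lm.gz
```

The extra cost comes from the fact that Kneser-Ney lower-order distributions are estimated from type counts (how many distinct contexts a word follows) rather than the raw token counts that Good-Turing uses, so a separate counting-and-merging pass over the data is needed.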

