[SRILM User List] Pruning of KN-smoothed models

Andreas Stolcke stolcke at speech.sri.com
Tue Sep 28 14:38:36 PDT 2010


Ciprian Chelba and colleagues have a nice paper at Interspeech
showing how KN smoothing interacts badly with N-gram pruning,
especially when the pruning is severe.  The reason is that the
N-gram history marginal probabilities are poorly estimated by
the lower-order distributions produced by KN smoothing: KN
deliberately modifies the lower-order counts to serve as backoff
estimates, so they are not meaningful as standalone marginals.
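For background, the entropy-based pruning criterion weights each
history's contribution by its marginal probability; in rough
LaTeX notation (my paraphrase, not a formula from the paper):

	D(p \| p') = \sum_h P(h) \sum_w p(w|h) \log \frac{p(w|h)}{p'(w|h)}

where p' is the pruned model and P(h) is the history marginal.
By default P(h) is derived from the lower-order distributions of
the model being pruned, which is exactly where KN smoothing hurts.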

To remedy the problem pointed out in the paper, I added a way to
specify a separate model for computing the history marginals,
different from the model being pruned.  For example, when pruning a
KN-smoothed 4-gram model M you could specify a GT-smoothed 3-gram H
using

	ngram -lm M -prune ... -prune-history-lm H

(the history LM only needs to have an order one less than that
of the model being pruned, so it can be much smaller).
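A concrete end-to-end sketch (the file names and the pruning
threshold are just illustrative):

	# GT-smoothed 3-gram for the history marginals
	# (Good-Turing is the ngram-count default)
	ngram-count -order 3 -text corpus.txt -lm H
	# KN-smoothed 4-gram to be pruned
	ngram-count -order 4 -kndiscount -interpolate -text corpus.txt -lm M
	# prune M, computing history marginals from H
	ngram -order 4 -lm M -prune 1e-8 -prune-history-lm H \
		-write-lm M.pruned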

This also gives you the option of pruning an LM in a way that 
is targeted at a specific domain.  For example, if you
have a large LM M and want to create a smaller version that
works well in some domain for which you have a specialized LM
D, you would use

	ngram -lm M -prune ... -prune-history-lm D 

This makes sure you retain the N-grams that matter for your
target domain.  Of course, D should not be a KN-smoothed model,
for the reason given above!
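A sketch of that use (again with made-up file names and
threshold):

	# small GT-smoothed LM trained on in-domain text
	ngram-count -order 3 -text domain.txt -lm D
	# prune the large LM toward the domain
	ngram -order 4 -lm M -prune 1e-8 -prune-history-lm D \
		-write-lm M.domain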

The new option is implemented in the beta version on the download
server, and a test case is in $SRILM/lm/test/tests/ngram-prune-history-lm.

Comments welcome.

--Andreas


