[SRILM User List] perplexity-discounting

Md. Akmal Haidar akmalcuet00 at yahoo.com
Thu Oct 8 08:28:36 PDT 2009


Dear Srilm Users,

I have a question about perplexity computation.

I created an interpolated language model, which gives lower perplexity than the general baseline language model. I used Witten-Bell discounting for both.
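For reference, perplexity is 10 raised to the negative average log10 probability of the test words, and linear interpolation mixes two models' word probabilities with a weight lambda. The sketch below is a simplified illustration with made-up probabilities (it ignores SRILM's end-of-sentence accounting in the word count, and the numbers are not from any actual model):

```python
import math

def perplexity(logprobs):
    """Perplexity from per-word log10 probabilities:
    ppl = 10 ** (-(sum of log10 p) / N).
    Simplified relative to SRILM's ngram -ppl output, which also
    counts end-of-sentence tokens in N."""
    return 10 ** (-sum(logprobs) / len(logprobs))

def interpolate(p_a, p_b, lam):
    """Linear interpolation of two models' word probabilities:
    p(w) = lam * p_a(w) + (1 - lam) * p_b(w)."""
    return [lam * a + (1 - lam) * b for a, b in zip(p_a, p_b)]

# Hypothetical per-word probabilities assigned by a baseline model and a
# second model on a 4-word test set (illustrative numbers only).
p_baseline = [0.10, 0.20, 0.05, 0.10]
p_second   = [0.20, 0.10, 0.10, 0.05]

ppl_baseline = perplexity([math.log10(p) for p in p_baseline])
p_mixed = interpolate(p_baseline, p_second, lam=0.5)
ppl_mixed = perplexity([math.log10(p) for p in p_mixed])
print(ppl_baseline, ppl_mixed)  # here the mixture comes out lower
```

Whether the interpolated model actually beats the baseline depends entirely on the per-word probabilities the component models assign, and those estimates differ under Witten-Bell versus Kneser-Ney discounting, so the two comparisons need not come out the same way.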

But when I used Kneser-Ney discounting, the interpolated language model shows greater perplexity than the baseline one.

Can anybody tell me why?

Why doesn't Kneser-Ney discounting also give lower perplexity?

Thanks & Regards

Akmal



