Memory problem with ngram-count

Kevin Duh <duh at ee.washington.edu>
Fri Aug 13 14:52:33 PDT 2004


Hi,

I'm running into some memory limitations with ngram-count and am 
wondering if anyone has any suggestions.

I have a very large text file (more than 1 GB) as input to ngram-count. I 
split the text into smaller files and used the 'make-batch-counts' and 
'merge-batch-counts' commands to build a single large count file. Then I 
ran 'ngram-count -read myfile.counts -lm ...' to estimate a language model 
(rough commands below) and got the following error:

ngram-count: /SRILM/include/LHash.cc:127: void LHash<KeyT, 
DataT>::alloc(unsigned int) [with KeyT = VocabIndex, DataT = 
Trie<VocabIndex, unsigned int>]: Assertion `body != 0' failed.
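
For concreteness, the commands were roughly the following. The file names 
are illustrative, and make-batch-counts/merge-batch-counts were run with 
their default settings:

    # split the >1 GB corpus into pieces and list them in a file
    split -l 1000000 bigcorpus.txt piece.
    ls piece.* > file-list

    # count each batch of input files, writing count files under ./counts/
    make-batch-counts file-list

    # merge the per-batch count files into one large count file
    merge-batch-counts counts

    # estimate the LM from the merged counts ('myfile.counts' stands for
    # the merged file produced by merge-batch-counts); this is the step
    # that fails
    ngram-count -read myfile.counts -lm mymodel.lm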

Has anyone run into this, and is there a way around the memory limit?

Thanks in advance,
Kevin

-----------------------------
Kevin Duh
Graduate Research Assistant
Dept. of Electrical Engineering
University of Washington
http://ssli.ee.washington.edu/people/duh


