Memory issues
David Mas
David.Mas at limsi.fr
Wed Nov 27 06:45:17 PST 2002
Hi,
I'm a French PhD student, using the toolkit to compute n-gram and
class-based n-gram models on Hub4 and Hub5 data.
I recently tried to interpolate several models with ngram -mix-lm, which works
fine except for big models (trained on Hub4).
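To be concrete, the command I run looks roughly like this (the file names,
the order, and the 0.5 weight are just placeholders, not my actual setup):

    ngram -order 3 \
          -lm hub4.3gram.lm.gz \
          -mix-lm hub5.3gram.lm.gz \
          -lambda 0.5 \
          -write-lm mixed.3gram.lm.gz

where -lambda is the weight of the main model given with -lm, and further
models can be added with -mix-lm2 / -mix-lambda2 and so on.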
It seems to be a matter of memory, so I used the -memuse option to get an
idea of the memory load.
But this option does not seem to reflect the actual memory usage: it reports
about 900M, while top on the same machine shows around 2.5G in use.
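For example, with a command along these lines (again, placeholder file names):

    ngram -order 3 -lm hub4.3gram.lm.gz -mix-lm hub5.3gram.lm.gz -memuse

the -memuse statistics report roughly 900M, whereas watching the same ngram
process in top shows about 2.5G of resident memory.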
So my two questions are:
- is it normal that the -memuse option gives a wrong result?
- is it normal that the toolkit uses so much memory, or have I done
something wrong in the installation?
Any help is welcome.
David Mas
--
David Mas
LIMSI/CNRS, groupe TLP
Tel: 01 69 85 80 05
http://www.limsi.fr/Individu/mas/