I have a model that is around 18 GB. When I try to quantize it with fastText 0.9.2, I get the following error:
./fastText-0.9.2/fasttext quantize -output /mnt/models/model -input /mnt/data/train.txt -thread 92
terminate called after throwing an instance of 'std::length_error'
what(): vector::_M_default_append
Aborted (core dumped)
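For reference, the quantize example in the official fastText text-classification tutorial looks roughly like the command below (paths are placeholders); the extra options such as -qnorm, -retrain, and -cutoff come from that tutorial and are not flags I am currently passing:

# quantize example adapted from the fastText docs; my own run only passes -output, -input, -thread
./fasttext quantize -output model -input train.txt -qnorm -retrain -epoch 1 -cutoff 100000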
Any idea why this error occurs and how I can resolve it?