nlp - Pickle error while storing a Doc2Vec gensim model -


I am trying to save a gensim Doc2Vec model. The model is trained on 9M document vectors with a vocabulary of around 1M words, and I am getting a pickle error. "top" shows the program using around 13 GB of RAM. Since I will need to re-train the model on new documents when required, I think saving the parameters is necessary.

Traceback (most recent call last):
  File "doc_2_vec.py", line 61, in <module>
    model.save("/data/model_wl_videos/model", pickle_protocol=2)
  File "/home/meghana/.local/lib/python2.7/site-packages/gensim/models/word2vec.py", line 1406, in save
    super(Word2Vec, self).save(*args, **kwargs)
  File "/home/meghana.negi/.local/lib/python2.7/site-packages/gensim/utils.py", line 504, in save
    pickle_protocol=pickle_protocol)
  File "/home/meghana/.local/lib/python2.7/site-packages/gensim/utils.py", line 376, in _smart_save
    pickle(self, fname, protocol=pickle_protocol)
  File "/home/meghana/.local/lib/python2.7/site-packages/gensim/utils.py", line 930, in pickle
    _pickle.dump(obj, fout, protocol=protocol)
MemoryError
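
For reference, a minimal runnable sketch of the kind of save call involved, using a toy corpus and a hypothetical output path. The sep_limit argument comes from gensim's SaveLoad.save() signature (the same machinery as _smart_save in the traceback) and controls when large internal numpy arrays are written to separate .npy files instead of being pushed through a single pickle.dump(); whether that avoids this particular MemoryError at 9M documents is an assumption, not something verified here:

from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Tiny toy corpus so the snippet runs end to end; in the real case `model`
# is the 9M-document Doc2Vec model that fails to save.
docs = [TaggedDocument(words=["some", "words"], tags=["d0"]),
        TaggedDocument(words=["more", "words"], tags=["d1"])]
model = Doc2Vec(docs, min_count=1)

# gensim's save() (inherited from utils.SaveLoad) can store large numpy
# arrays as separate .npy files next to the main pickle; sep_limit is the
# size threshold in bytes above which an array is stored separately.
model.save(
    "/tmp/doc2vec_model",          # hypothetical path for this sketch
    sep_limit=10 * 1024 ** 2,      # arrays bigger than ~10 MB go to their own files
    pickle_protocol=2,
)

# Reloading can then memory-map the separately stored arrays read-only:
model = Doc2Vec.load("/tmp/doc2vec_model", mmap="r")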

