python 2.7 - Elasticsearch MemoryError on bulk insert -


I'm inserting 5000 records at once into Elasticsearch. The total size of these records is 33936 (which I got using sys.getsizeof()).

Elasticsearch version: 1.5.0, Python 2.7, Ubuntu.

I get the following error:

Traceback (most recent call last):
  File "run_indexing.py", line 67, in <module>
    index_policy_content(datatable, source, policyids)
  File "run_indexing.py", line 60, in index_policy_content
    bulk(elasticsearch_instance, actions)
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/helpers.py", line 148, in bulk
    for ok, item in streaming_bulk(client, actions, **kwargs):
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/helpers.py", line 107, in streaming_bulk
    resp = client.bulk(bulk_actions, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/client/utils.py", line 70, in _wrapped
    return func(*args, params=params, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/client/__init__.py", line 568, in bulk
    params=params, body=self._bulk_body(body))
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/transport.py", line 259, in perform_request
    body = body.encode('utf-8')
MemoryError

Please help me resolve this issue.

Thanks & regards, Afroze

If I had to guess, I'd say the MemoryError is happening within Python as it loads and serializes the data. Try cutting way down on the batch size until it works, then binary-search upward until it fails again. That should help you figure out a safe batch size to use.
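A minimal sketch of that batching approach: split the action list into smaller batches and issue one bulk call per batch. The helper names (chunked, index_in_batches) and the batch size of 500 are illustrative assumptions, not from your code; note that elasticsearch-py's helpers also accept a chunk_size argument that does similar chunking internally.

```python
# Sketch: batch a large action list before bulk-indexing.
# `chunked` and `index_in_batches` are hypothetical helper names.

def chunked(items, size):
    """Yield successive batches of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def index_in_batches(bulk_call, actions, batch_size=500):
    """Invoke `bulk_call` (e.g. a wrapper around
    elasticsearch.helpers.bulk) once per small batch instead of
    once for all 5000 actions, keeping peak memory use bounded."""
    for batch in chunked(actions, batch_size):
        bulk_call(batch)

# Demonstration with a stand-in for the real bulk call, so the
# batching logic can be seen without a live Elasticsearch node:
batch_sizes = []
index_in_batches(lambda batch: batch_sizes.append(len(batch)),
                 [{"_id": i} for i in range(5000)],
                 batch_size=500)
print(batch_sizes)  # ten batches of 500 actions each
```

To binary-search a safe size, start small (say 100), double until it fails, then narrow the range between the last success and first failure.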

(Other useful information you might want to include: the amount of memory on the server where you're running the Python process, the amount of memory on the Elasticsearch server node(s), and the amount of heap allocated to Java.)


