I am new to Python and Elasticsearch. I wrote a Python script that reads data from a very large JSON file and indexes some of its attributes into Elasticsearch:
import elasticsearch
import json

es = elasticsearch.Elasticsearch()  # use default of localhost, port 9200
with open('j.json') as f:
    n = 0
    for line in f:
        try:
            j_content = json.loads(line)
            event_type = j_content['6000000']
            device_id = j_content['6500048']
            raw_event_msg = j_content['6000012']
            event_id = j_content["0"]
            body = {
                '6000000': str(event_type),
                '6500048': str(device_id),
                '6000012': str(raw_event_msg),
                '6000014': str(event_id),
            }
            n = n + 1
            es.index(index='coredb', doc_type='json_data', body=body)
        except:
            pass
But it's too slow, even though I have plenty of free hardware resources. How can I improve the performance of this code with multithreading or the bulk API?
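One direction I have been considering (a sketch, not tested at scale): instead of one `es.index()` call per line, build a generator of bulk actions and feed it to `elasticsearch.helpers.parallel_bulk`, which batches documents and indexes the batches from multiple threads. The `thread_count=4` and `chunk_size=500` values below are just guesses to tune, and `generate_actions` is a helper name I made up:

```python
import json


def generate_actions(lines, index="coredb", doc_type="json_data"):
    """Yield one bulk action per JSON line, skipping malformed lines."""
    for line in lines:
        try:
            j = json.loads(line)
            yield {
                "_index": index,
                "_type": doc_type,
                "_source": {
                    "6000000": str(j["6000000"]),
                    "6500048": str(j["6500048"]),
                    "6000012": str(j["6000012"]),
                    "6000014": str(j["0"]),
                },
            }
        except (ValueError, KeyError):
            # skip lines that are not valid JSON or lack an expected key
            continue


def index_file(path):
    # Imported here so generate_actions stays usable without the package.
    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch()  # default localhost:9200
    with open(path) as f:
        # parallel_bulk returns a lazy generator; it must be consumed
        # for any indexing to happen.
        for ok, result in helpers.parallel_bulk(
                es, generate_actions(f), thread_count=4, chunk_size=500):
            if not ok:
                print(result)  # report documents that failed to index
```

Would calling `index_file('j.json')` like this be the right way to use `parallel_bulk`, or is plain `helpers.bulk` with a larger `chunk_size` usually enough?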