In our Elasticsearch index we store daily news documents and run aggregations over them. After two consecutive runs, Elasticsearch returns an out-of-memory exception. We have already increased the heap size, but is there any solution other than adding more RAM?
The mapping of the field used in the aggregations:
"detail_stop": {
  "type": "string",
  "store": true,
  "analyzer": "stop_analyzer"
}
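For context, this is a sketch of how the mapping above could be built and registered programmatically. The index and type names (`news` / `article`) are assumptions, not from our setup; the body would be PUT to `/news/_mapping/article` on a pre-5.x cluster where `"type": "string"` is still valid:

```python
import json

# Hypothetical index/type names; only the "detail_stop" properties
# block below matches our actual mapping.
mapping = {
    "article": {
        "properties": {
            "detail_stop": {
                "type": "string",          # analyzed string field
                "store": True,
                "analyzer": "stop_analyzer",
            }
        }
    }
}

# JSON body that would be sent with the PUT mapping request.
body = json.dumps(mapping)
print(body)
```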
The aggregation query:
{
  "from": 0,
  "size": 5000,
  "query": {
    "bool": {
      "must": [
        {
          "range": {
            "date": {
              "gte": "now-0d/d"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "words": {
      "terms": {
        "size": 5000,
        "field": "detail_stop",
        "min_doc_count": 3
      }
    }
  }
}
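For completeness, the same request body built programmatically (a sketch; the target index name is omitted since it was not part of the question). Note that the aggregation field must be a quoted JSON string, `"detail_stop"`:

```python
import json

# Build the search body: today's documents, then a terms aggregation
# over the analyzed detail_stop field.
query = {
    "from": 0,
    "size": 5000,
    "query": {
        "bool": {
            "must": [
                {"range": {"date": {"gte": "now-0d/d"}}}
            ]
        }
    },
    "aggs": {
        "words": {
            "terms": {
                "size": 5000,
                "field": "detail_stop",
                "min_doc_count": 3,
            }
        }
    },
}

# This body would be POSTed to /<index>/_search.
body = json.dumps(query)
print(body)
```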
Currently our Elasticsearch cluster has a single node (8 cores @ 2.5 GHz, 32 GB RAM) with ES_HEAP_SIZE = 16g, so Elasticsearch gets 16 GB of heap. How can we reduce memory usage and improve performance?