I'd like to update the logdate column for ALL records in a specific index. From what I have read so far, it seems that this is not possible. Am I correct?

Here's a sample of a document:

{
            "_index": "logstash-01-2015",
            "_type": "ufdb",
            "_id": "AU__EvrALg15uxY1Wxf9",
            "_score": 1,
            "_source": {
               "message": "2015-08-14 06:50:05 [31946] PASS  level2      10.249.10.70    level2     ads       http://ad.360yield.com/unpixel.... GET",
               "@version": "1",
               "@timestamp": "2015-09-24T11:17:57.389Z",
               "type": "ufdb",
               "file": "/usr/local/ufdbguard/logs/ufdbguardd.log",
               "host": "PROXY-DEV",
               "offset": "3983281700",
               "logdate": "2015-08-14T04:50:05.000Z",
               "status": "PASS",
               "group": "level2",
               "clientip": "10.249.10.70",
               "category": "ads",
               "url": "http://ad.360yield.com/unpixel....",
               "method": "GET",
               "tags": [
                  "_grokparsefailure"
               ]
            }
         }
  • What do you mean by "update"? Change the value to some other string? Convert it to a date object instead of a string? Copy the value to @timestamp? Or something else? Commented Sep 25, 2015 at 17:27
  • I'd like to change the logdate field to another date for ALL documents. In MySQL, I would have done something like this: UPDATE logstash SET logdate = "2015-09-20T04:50:05.000Z", but in Elasticsearch it seems that I can only update one document by supplying its _id. Commented Sep 27, 2015 at 6:43

2 Answers


You are correct, that is not possible.

There's been an open issue asking for Update by Query for a long time, and I'm not sure it's going to be implemented anytime soon, since it is very problematic for the underlying Lucene engine: it requires deleting all matched documents and reindexing them.

An Update by Query Plugin is available on GitHub, but it's experimental and I have never tried it.

UPDATE 2018-05-02

The original answer is quite old. Update By Query is now supported.
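
For example, a minimal sketch of updating logdate for every document in the question's index, assuming Elasticsearch 6.x, where the _update_by_query endpoint accepts a Painless script (on 5.x the script key is "inline" rather than "source"):

POST /logstash-01-2015/_update_by_query
{
    "script": {
        "lang": "painless",
        "source": "ctx._source.logdate = '2015-09-20T04:50:05.000Z'"
    }
}

Without a "query" clause, this touches all documents in the index; add one to restrict the update to a subset.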


You can use the partial update API.

To test it, I created a trivial index:

PUT /test_index

Then created a document:

PUT /test_index/doc/1
{
   "message": "2015-08-14 06:50:05 [31946] PASS  level2      10.249.10.70    level2     ads       http://ad.360yield.com/unpixel.... GET",
   "@version": "1",
   "@timestamp": "2015-09-24T11:17:57.389Z",
   "type": "ufdb",
   "file": "/usr/local/ufdbguard/logs/ufdbguardd.log",
   "host": "PROXY-DEV",
   "offset": "3983281700",
   "logdate": "2015-08-14T04:50:05.000Z",
   "status": "PASS",
   "group": "level2",
   "clientip": "10.249.10.70",
   "category": "ads",
   "url": "http://ad.360yield.com/unpixel....",
   "method": "GET",
   "tags": [
      "_grokparsefailure"
   ]
}

Now I can do a partial update on the document with:

POST /test_index/doc/1/_update
{
    "doc": {
        "logdate": "2015-09-25T12:20:00.000Z"
    }
}
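
As an alternative to the "doc" merge, the same endpoint also accepts a script. A sketch for the Elasticsearch 1.x syntax current at the time, assuming dynamic scripting is enabled (it is disabled by default):

POST /test_index/doc/1/_update
{
    "script": "ctx._source.logdate = '2015-09-25T12:20:00.000Z'"
}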

If I retrieve the document:

GET /test_index/doc/1

I will see that the logdate property has been updated:

{
   "_index": "test_index",
   "_type": "doc",
   "_id": "1",
   "_version": 2,
   "found": true,
   "_source": {
      "message": "2015-08-14 06:50:05 [31946] PASS  level2      10.249.10.70    level2     ads       http://ad.360yield.com/unpixel.... GET",
      "@version": "1",
      "@timestamp": "2015-09-24T11:17:57.389Z",
      "type": "ufdb",
      "file": "/usr/local/ufdbguard/logs/ufdbguardd.log",
      "host": "PROXY-DEV",
      "offset": "3983281700",
      "logdate": "2015-09-25T12:20:00.000Z",
      "status": "PASS",
      "group": "level2",
      "clientip": "10.249.10.70",
      "category": "ads",
      "url": "http://ad.360yield.com/unpixel....",
      "method": "GET",
      "tags": [
         "_grokparsefailure"
      ]
   }
}

Here is the code I used to test it:

http://sense.qbox.io/gist/236bf271df6d867f5f0c87eacab592e41d3095cf

1 Comment

Thanks for your answer. I think I have not explained myself correctly. I have checked the update API, but from what I can see, you have to supply the _id field in order to run the update. In my case, I'd like to run an update on all documents to set 'logdate' to the same value.
