
I have a log file that's an array of objects that looks something like this:

[
  { "cate1": "data1a", "cate2": "data2a" },
  { "cate1": "data1b", "cate2": "data2b" },
  { "cate1": "data1c", "cate2": "data2c" }
]

and I need each object in the array to become a separate document in Elasticsearch, with each "cate" key as a field. My current logstash.conf file is:

input {
  tcp {
    port => 5000
  }
}

## Add your filters / logstash plugins configuration here

filter {
  json {
    source => "message"
    target => "event"
  }
  mutate {
    gsub => ["message","\]",""]
    gsub => ["message","\[",""]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}

but it tags every entry except the first with "_jsonparsefailure", and the square brackets end up in the parsed output as well. How would I configure Logstash to handle this properly?

  • Please show us what the output looks like, as well as what your config/output were when you tried the json codec. – Commented Jul 21, 2016 at 18:09

1 Answer

Instead of using the json filter, you should look into using the json codec on your input. It seems to do exactly what you want:

This codec may be used to decode (via inputs) and encode (via outputs) full JSON messages. If the data being sent is a JSON array at its root multiple events will be created (one per element).
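To illustrate the documented behavior, here is a plain-Python sketch (not Logstash itself) of what "one event per element" means for your sample log: parsing the array at the root yields one record per object, each with its own cate fields.

```python
import json

# The sample log file from the question: a JSON array at its root.
log = """[
  { "cate1": "data1a", "cate2": "data2a" },
  { "cate1": "data1b", "cate2": "data2b" },
  { "cate1": "data1c", "cate2": "data2c" }
]"""

# Because the root is an array, each element becomes its own event,
# which is what the json codec does for you on the input.
events = json.loads(log)
for event in events:
    print(event["cate1"], event["cate2"])
```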

It would look something like this:

input {
  tcp {
    port => 5000
    codec => json
  }
}
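For completeness, a minimal end-to-end config under this approach might look like the following (the port and the elasticsearch host are carried over from your existing config; no filter block is needed, since the codec does the splitting):

```
input {
  tcp {
    port => 5000
    codec => json
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}
```

You can then test it by piping the log file into the TCP input, e.g. `nc localhost 5000 < logfile.json`, and each array element should arrive in Elasticsearch as a separate document with cate1 and cate2 as fields.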
