So, I'm trying to configure Logstash to fetch JSON data from a public API and insert it into Elasticsearch.
The data looks like this:
{
  "Meta Data": {
    "1. Information": "Daily Aggregation",
    "2. Name": "EXAMPLE",
    "3. Last Refreshed": "2018-04-06"
  },
  "Time Series": {
    "2018-04-06": {
      "1. Value 1": "20",
      "2. Value 2": "21",
      "3. Value 3": "20",
      "4. Value 4": "21",
      "5. Value 5": "47"
    },
    "2018-04-05": {
      "1. Value 1": "21",
      "2. Value 2": "21",
      "3. Value 3": "21",
      "4. Value 4": "21",
      "5. Value 5": "88"
    },
    "2018-04-04": {
      "1. Value 1": "20",
      "2. Value 2": "20",
      "3. Value 3": "20",
      "4. Value 4": "20",
      "5. Value 5": "58"
    },
    "2018-04-03": {
      "1. Value 1": "20",
      "2. Value 2": "21",
      "3. Value 3": "20",
      "4. Value 4": "21",
      "5. Value 5": "47"
    },
    ...
  }
}
I don't care about the metadata; I want each object inside "Time Series" to become a separate event sent to Elasticsearch. I just don't know how to do it.
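For example, I imagine the 2018-04-06 entry ending up as its own document, roughly like this (the date and name fields are ones I'd want to add from the object's key and the metadata):

{
  "date": "2018-04-06",
  "name": "EXAMPLE",
  "1. Value 1": "20",
  "2. Value 2": "21",
  "3. Value 3": "20",
  "4. Value 4": "21",
  "5. Value 5": "47"
}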
So far, I've only managed to get the input configuration working:
input {
  http_poller {
    urls => {
      test1 => "https://www.public-facing-api.com/query?function=TIME_SERIES_DAILY&name=EXAMPLE"
      #headers => {
      #  Accept => "application/json"
      #}
    }
    request_timeout => 60
    # Supports "cron", "every", "at" and "in" schedules by rufus scheduler
    schedule => { cron => "* * * * * * UTC" }
    codec => "json"
  }
}

filter {
  json {
    source => "message"
    target => "parsedMain"
  }
  json {
    source => "[parsedMain][Time Series]"
    target => "parsedContent"
  }
}

output {
  stdout { codec => rubydebug }
}
But this just prints the whole response as a single event.
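From what I can tell, since the input already uses codec => "json", the response is parsed before the filters run and there is no "message" field, so I suspect my json filters aren't doing anything. My current idea is a ruby filter that turns the "Time Series" hash into an array, followed by a split filter to break it into events. Something like this is what I have in mind, though it's untested and the field names are my guesses:

filter {
  ruby {
    code => '
      # Copy the name out of the metadata so it survives on every event
      event.set("name", event.get("[Meta Data][2. Name]"))

      # Turn the { "2018-04-06" => {...}, ... } hash into an array of
      # hashes, each carrying its own date, so the split filter can use it
      series = event.get("[Time Series]")
      if series
        rows = series.map { |date, values| values.merge("date" => date) }
        event.set("rows", rows)
      end

      # Drop the fields I no longer need
      event.remove("[Meta Data]")
      event.remove("[Time Series]")
    '
  }
  # Clone the event once per array element
  split { field => "rows" }
}

If that's roughly right, each resulting event should carry [rows][date], the five values, and name.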
I would also like to capture the date, which is the key of each nested object, and use it as the Elasticsearch @timestamp. I'd also like to set the document id to %{date}_%{name}.
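I'm guessing a date filter after the split, plus the elasticsearch output, would look roughly like this; the hosts and index values here are just placeholders I made up:

filter {
  # Parse the per-entry date into @timestamp
  date {
    match => ["[rows][date]", "yyyy-MM-dd"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder
    index => "timeseries"         # placeholder
    document_id => "%{[rows][date]}_%{name}"
  }
}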
Does anyone know how to do this?