
I am getting a MapperParsingException while trying to parse logs with Logstash and output them to Elasticsearch, with the error message: "field name cannot contain dot (.)".
This was working fine with the old Logstash version; I recently upgraded to Logstash 2.0.
My Logstash config file looks like:

input {
  kafka {
    topic_id => "topic1"
    zk_connect => "111.222.333.444:2181"
    type => "log_type"
    reset_beginning => true
  }
}

output {
#  stdout { codec => rubydebug }

  elasticsearch {
    hosts => "127.0.0.1:9200"
    flush_size => 200
    idle_flush_time => 1
    index => "index-name-%{+YYYY.MM.dd}"
  }
}

1 Answer

As explained at https://discuss.elastic.co/t/field-name-cannot-contain/33251/5, field names cannot contain dots. As suggested there, an easy fix is to replace all dots in field names with underscores using the following ruby filter:

filter {
  ruby {
    code => "
      # Rename every field whose name contains a dot, replacing all dots
      # with underscores (gsub handles names with more than one dot,
      # which the original sub-based snippet missed).
      event.to_hash.keys.each { |k| event[ k.gsub('.','_') ] = event.remove(k) if k.include?('.') }
    "
  }
}
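The renaming logic can be tried outside Logstash on a plain hash. A minimal Ruby sketch (the field names below are hypothetical; note that `sub` would replace only the first dot, so `gsub` is needed for names like `user.name.first`):

```ruby
# Hypothetical event fields, as a plain hash standing in for the Logstash event.
event = { "request.time" => 120, "user.name.first" => "ada", "status" => 200 }

# Rename each dotted key: delete the old key and re-insert its value
# under a name with every dot replaced by an underscore.
event.keys.each do |k|
  event[k.gsub('.', '_')] = event.delete(k) if k.include?('.')
end

puts event.inspect
# => {"status"=>200, "request_time"=>120, "user_name_first"=>"ada"}
```

Iterating over `event.keys` (a snapshot array) rather than the hash itself makes it safe to delete and insert keys during the loop.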

1 Comment

Sadly, instead of cleaning up their own code, they introduced a de_dot filter that does this. elastic.co/blog/introducing-the-de_dot-filter
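For reference, the de_dot filter mentioned above can replace the ruby snippet. A minimal config sketch, assuming the plugin is installed (by default it scans all fields; `separator` defaults to "_", shown here explicitly):

```
filter {
  de_dot {
    # Replace dots in field names with this separator.
    separator => "_"
  }
}
```

The plugin's documentation warns that it is comparatively expensive, so scoping it with its fields option to known dotted fields is advisable when those names are predictable.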
