
I've been searching and reading docs on how to import MySQL data into Elasticsearch, and the best solution I found was to create a Logstash config file and run it every minute. In the end, though, it wasn't efficient for my requirement: I would have to write a separate query for each table in Logstash, which isn't practical.

I have to import the whole MySQL database into Elasticsearch, including mappings, table relationships and data, every time a user handles my application (there are many users on the platform).

I've already read this link, but it isn't helping me. Any suggestions? Thanks.

1 Answer

  • Insert the storms data into MySQL
  • Import the data into Elasticsearch using Logstash
  • Create a Kibana dashboard

You can use Logstash and its jdbc input plugin to read from your DB and push JSON documents to Elasticsearch. https://www.elastic.co/guide/en/logstash/current/installing-logstash.html

https://qbox.io/blog/migrating-mysql-data-into-elasticsearch-using-logstash

Go through these and follow the instructions there.
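For reference, a minimal pipeline along these lines might look like the sketch below. The driver path, connection string, credentials, table name and "id" column are all placeholders you would replace with your own values:

    input {
      jdbc {
        # Path to the MySQL JDBC driver jar (placeholder path)
        jdbc_driver_library => "/path/to/mysql-connector-java-8.0.33.jar"
        jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
        # Placeholder host, port and database name
        jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
        jdbc_user => "es_user"                   # placeholder credentials
        jdbc_password => "changeme"
        schedule => "* * * * *"                  # run every minute, as in the question
        statement => "SELECT * FROM my_table"    # placeholder table
      }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "my_table"
        document_id => "%{id}"                   # assumes the table has an "id" primary key
      }
    }

Each run re-executes the SELECT on the schedule above; using the primary key as the document id keeps repeated imports idempotent instead of creating duplicates.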


2 Comments

Thank you @Wijayanga for this suggestion. I have already created a Logstash config file and migrated one table to Elasticsearch. My goal is to migrate the whole database (or many tables) in one execution, i.e. in the same Logstash file. Do you have any idea how to do this?
I think you can use multiple pipelines to accomplish this, though I'm not sure; check the configuration here: elastic.co/guide/en/logstash/current/… You cannot use multiple jdbc inputs in Logstash: discuss.elastic.co/t/…
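If you go the multiple-pipelines route, a pipelines.yml along these lines could drive one config file per table (the pipeline ids and paths below are made up for illustration):

    # pipelines.yml - one pipeline per table, each with its own jdbc input
    - pipeline.id: users_table
      path.config: "/etc/logstash/conf.d/users.conf"
    - pipeline.id: orders_table
      path.config: "/etc/logstash/conf.d/orders.conf"

Logstash then runs the pipelines independently, so each table can keep its own query and schedule.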
