
I am using the python-elasticsearch module, and I read in its documentation that you can log all the underlying HTTP requests as command-line curl commands:

elasticsearch.trace can be used to log requests to the server in the form of curl commands using pretty-printed json that can then be executed from command line. Because it is designed to be shared (for example to demonstrate an issue) it also just uses localhost:9200 as the address instead of the actual address of the host. If the trace logger has not been configured already it is set to propagate=False so it needs to be activated separately.

For python-elasticsearch module, how do you enable this curl logging?

I tried:

  • setting the global logger with logging.basicConfig(level=logging.DEBUG), but that didn't output the curl commands
  • getting the elasticsearch.trace logger, setting its level to logging.DEBUG, and then setting es_trace_logger.propagate = True, but neither of those worked
  • Try initializing the elasticsearch logger and set its level to DEBUG Commented Oct 12, 2018 at 16:34
  • @Jay so logger = logging.getLogger('elasticsearch'); logger.setLevel(logging.DEBUG)? have you tried this and know it works? Commented Oct 12, 2018 at 17:12
  • Hmm, the elasticsearch logger will only show you the GET/POST request and the corresponding response. The elasticsearch.trace logger will show the same info in a better manner as a curl request and in prettified json. Commented Oct 13, 2018 at 3:50

1 Answer


I think one crucial step which you might be missing is adding a handler to the elasticsearch.trace logger.

import logging

es_trace_logger = logging.getLogger('elasticsearch.trace')
es_trace_logger.setLevel(logging.DEBUG)

# Without a handler attached, the records have nowhere to go
handler = logging.StreamHandler()
es_trace_logger.addHandler(handler)

Here I have added a StreamHandler to the logger, so all the logs go to the console (StreamHandler writes to stderr by default). You can add a different handler as your use case requires, a FileHandler for example.
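For instance, here is a sketch that routes the trace output to a file instead. The file path and the manually emitted record are just illustrations; in real use the elasticsearch client writes the trace records itself whenever it makes a request:

```python
import logging
import os
import tempfile

# Example destination file (illustrative path)
log_path = os.path.join(tempfile.gettempdir(), "es_trace.log")

es_trace_logger = logging.getLogger("elasticsearch.trace")
es_trace_logger.setLevel(logging.DEBUG)

# FileHandler instead of StreamHandler: records go to the file
file_handler = logging.FileHandler(log_path)
file_handler.setLevel(logging.DEBUG)
es_trace_logger.addHandler(file_handler)

# Simulate a record like the ones the client emits
es_trace_logger.debug("curl -XGET 'http://localhost:9200/_search?pretty'")
file_handler.flush()

with open(log_path) as f:
    print(f.read().strip())
```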

Here is a sample debug log from this setup:

curl -XGET 'http://localhost:9200/my_index/_search?pretty' -d '{
  "size": 100
}'
#[200] (1.311s)
#{
#  "_shards": {
#    "failed": 0,
#    "successful": 6,
#    "total": 6
#  },
#  "hits": {
#    "hits": [
#      {
#        "_id": "FLKSD0SDFJJSDF7D518319DE5EEBB5d5b07044",

This logger logs the whole request and response for every call we make, so the output can be overwhelming, but it is very useful for debugging.

For the same request, the corresponding elasticsearch logger outputs something like this:

GET http://my_es_host:9200/my_index/_search [status:200 request:1.528s]
> {"size": 100}
< {"took":21,"timed_out":false,"_shards":{"total":6,"successful":6,"failed":0},"hits":{"total":112,"max_score":1.0,"hits":[{"_index":"my_index","_
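The elasticsearch logger needs the same treatment as the trace logger before it emits anything; a minimal sketch:

```python
import logging

# Same pattern as elasticsearch.trace: set the level and attach a handler
es_logger = logging.getLogger("elasticsearch")
es_logger.setLevel(logging.DEBUG)
es_logger.addHandler(logging.StreamHandler())
```

You can enable both loggers at once if you want the curl-style trace and the request/response summary side by side.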