I am passing data to Elasticsearch (ES) through a Python script. First, I secured ES with a self-signed certificate and everything worked as expected. Then I switched to a more widely trusted certificate (Let's Encrypt). Note that I can reach my ES cluster without any problems: the Let's Encrypt cert is trusted by my browser and by another application that talks to ES. But when I try to pass data from my Python script to ES with the new certificate, I get the following error:
urllib3.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",)
I would have expected this error with a self-signed cert, but not with Let's Encrypt. The only way I can avoid it is by setting verify=False, which is not a long-term solution.
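For context, the connection is set up roughly like this. This is only a minimal sketch using the official elasticsearch-py client; the host and certificate path are placeholders, not my real values:

```python
from elasticsearch import Elasticsearch

# Sketch of the client setup; host and CA path are placeholders.
es = Elasticsearch(
    ["https://my-IP:9200"],
    use_ssl=True,
    verify_certs=True,               # setting this to False avoids the error, but is no real fix
    ca_certs="/path/to/chain.pem",   # CA bundle / certificate chain to trust
)
es.index(index="test-index", body={"msg": "hello"})
```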
Before I received the error message mentioned above, I got the following error:
elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777))
I found a workaround for this by running pip install requests. However, afterwards I get the first error I mentioned (bad handshake). I know this means the certificate is not trusted. But how can that be, if it works for a self-signed cert but not for a Let's Encrypt cert that is trusted by a browser and another app? For example, when I open ES at https://my-IP:9200, my browser shows no warning, whereas it did with the self-signed cert.
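To narrow it down, I would expect a plain requests call against the same endpoint to show whether verification fails independently of the ES client (the URL is a placeholder for my actual endpoint):

```python
import certifi
import requests

# Plain HTTPS request against the cluster, using certifi's CA bundle;
# the URL is a placeholder for my actual endpoint.
resp = requests.get("https://my-IP:9200", verify=certifi.where())
print(resp.status_code, resp.text)
```

If that call fails the same way, the chain the server presents is presumably not enough for OpenSSL/certifi to verify, even though browsers accept it.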
Some additional info:
- python3
- urllib3 1.25.7
- certifi 2019.9.11
- Ubuntu 18.04
So basically everything is up to date. I also tried the suggested fix of downgrading certifi and/or urllib3 (one suggestion was to go below urllib3 1.25), but that doesn't work in my case.
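For what it's worth, here is how I would confirm which versions and which CA bundle the script is actually picking up, in case the pip installs landed in a different environment than the one running the script:

```python
import certifi
import requests
import urllib3

# Print the versions and CA bundle path seen by the interpreter running the script.
print("urllib3 :", urllib3.__version__)
print("certifi :", certifi.__version__)
print("requests:", requests.__version__)
print("CA bundle:", certifi.where())
```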
Any ideas?