I am new to Docker. I am working on a small program that uploads documents to an Amazon Elasticsearch Service domain I set up in AWS. I have tested my code to confirm I can connect to the Elasticsearch service (see the sample below), and I am now adding code to push documents from my local drive to the domain. How can I set up Docker (without incurring too much additional cost), package this code as an image, and deploy it to, say, an EC2 instance? I am still filling in the details, but the idea is to have an EC2 instance read and process the data and push it into the Elasticsearch service. Can someone suggest steps, or point out anything my approach is missing?
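For the Docker part, a minimal sketch might look like this, assuming the connection script below is saved as upload_docs.py and its dependencies (elasticsearch, requests-aws4auth) are listed in requirements.txt — both filenames are assumptions:

```dockerfile
# Small base image keeps the final image size (and pull time) down
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer across code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the upload script (assumed filename)
COPY upload_docs.py .

# Credentials are passed at run time, not baked into the image, e.g.:
#   docker run -e AWS_ACCESS_KEY_ID=... -e AWS_SECRET_ACCESS_KEY=... es-uploader
CMD ["python", "upload_docs.py"]
```

You would build with `docker build -t es-uploader .` and run the image on the EC2 instance. Docker itself adds no AWS cost; you pay only for the EC2 instance and data transfer, so a small instance type is enough for a job like this.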
import json
from elasticsearch import Elasticsearch, RequestsHttpConnection
from requests_aws4auth import AWS4Auth

my_region = 'us-east-1'
my_service = 'es'
my_eshost = 'search-jiudomain-bqfy4dd5xuljut33l6jdz7gkqi.us-east-1.es.amazonaws.com'

# Sign requests with AWS SigV4 (credentials redacted here)
aws_auth = AWS4Auth('AKIA***', '******', my_region, my_service)

es = Elasticsearch(
    hosts=[{'host': my_eshost, 'port': 443}],
    http_auth=aws_auth,
    use_ssl=True,
    verify_certs=True,
    connection_class=RequestsHttpConnection,
)

# Smoke test: print cluster info to confirm the connection works
print(json.dumps(es.info()))
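For the "push docs from my local drive" part, a common pattern with the same elasticsearch client is to build bulk actions from the local files and hand them to `helpers.bulk`. A sketch, where the directory path and index name are assumptions:

```python
import json
import os

def make_bulk_actions(doc_dir, index_name):
    """Yield one bulk-index action per .json file in doc_dir."""
    for name in sorted(os.listdir(doc_dir)):
        if not name.endswith('.json'):
            continue
        path = os.path.join(doc_dir, name)
        with open(path) as f:
            doc = json.load(f)
        yield {
            '_index': index_name,
            '_id': name,        # use the filename as the document id
            '_source': doc,
        }

# With the `es` client from above, the actions would be sent with:
#   from elasticsearch import helpers
#   helpers.bulk(es, make_bulk_actions('/data/docs', 'my-index'))
```

Generating actions lazily like this keeps memory flat even for large directories, since `helpers.bulk` consumes the generator in batches.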
Using the standard AWS_* environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY) would be better practice than hard-coding credentials in the code.
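As a sketch of that, the hard-coded pair can be replaced by a read of the standard variables that fails fast when they are missing (the variable names are the AWS SDK defaults; the AWS4Auth call stays the same as above):

```python
import os

def load_aws_credentials():
    """Read AWS credentials from the standard environment variables."""
    access_key = os.environ.get('AWS_ACCESS_KEY_ID')
    secret_key = os.environ.get('AWS_SECRET_ACCESS_KEY')
    if not access_key or not secret_key:
        raise RuntimeError('AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY not set')
    return access_key, secret_key

# Then, instead of the literals above:
#   access_key, secret_key = load_aws_credentials()
#   aws_auth = AWS4Auth(access_key, secret_key, my_region, my_service)
```

On EC2 specifically, attaching an IAM instance role and letting the SDK pick up temporary credentials is better still, since no long-lived keys need to be stored anywhere.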