
I am new to Docker. I am working on a small program to upload docs to an Elasticsearch domain I set up in AWS. I tested my code to make sure I can connect to the Elasticsearch service I have set up (see sample below). I am now adding code to push docs from my local drive to the server. How can I set up Docker (without incurring too much additional cost), package this code as an image, and deploy it onto, say, an EC2 server? I am still filling in the details, but the idea is to have an EC2 instance reading and processing the data and pushing it into the Elasticsearch service. Can someone suggest steps, or point out anything my approach is missing?

import json
from elasticsearch import Elasticsearch, RequestsHttpConnection
from requests_aws4auth import AWS4Auth

my_region = 'us-east-1'
my_service = 'es'
my_eshost = 'search-jiudomain-bqfy4dd5xuljut33l6jdz7gkqi.us-east-1.es.amazonaws.com'


aws_auth = AWS4Auth('AKIA***', '******', my_region, my_service)

es = Elasticsearch(hosts=[{'host': my_eshost, 'port': 443}],
                   http_auth=aws_auth, use_ssl=True, verify_certs=True,
                   connection_class=RequestsHttpConnection)

print(json.dumps(es.info()))
  • Please consider deleting and recreating the AWS credentials you used here (and shared publicly). Using the AWS_* environment variables would be better practice than hard-coding credentials in code. Commented Jul 1, 2020 at 11:35
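Following up on that comment, a minimal sketch of reading the credentials from the AWS_* environment variables instead of hard-coding them (the helper function name is my own, not from the question):

```python
import os

def load_aws_credentials():
    """Read AWS credentials from environment variables.

    AWS_SESSION_TOKEN is optional; it is only present when using
    temporary credentials (e.g. from an assumed role).
    """
    try:
        access_key = os.environ["AWS_ACCESS_KEY_ID"]
        secret_key = os.environ["AWS_SECRET_ACCESS_KEY"]
    except KeyError as missing:
        raise RuntimeError(f"Missing required environment variable: {missing}")
    return access_key, secret_key, os.environ.get("AWS_SESSION_TOKEN")

# These would then replace the literals in the question's code:
# access_key, secret_key, session_token = load_aws_credentials()
# aws_auth = AWS4Auth(access_key, secret_key, my_region, my_service,
#                     session_token=session_token)
```

On an EC2 instance a cleaner option still is to attach an IAM instance role, so no long-lived keys need to be stored anywhere.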

1 Answer


The question is broad, but here are a few details I can add.

There are managed options that need additional effort to set up, but the plus point with that approach is that AWS manages everything for you.

Now, looking at your query:

how can i set up docker, (without incurring too much additional cost)

If you are in the development phase and still learning Docker, then it is better to use docker-compose, or simply Docker, on your existing EC2 machine.

All you need:

  • Create a Docker image for your code above
  • Create a docker-compose file
  • Run the Docker image on the existing EC2 machine
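The steps above could look roughly like this. The file names (`upload_docs.py`, `requirements.txt`), the base image, and the service name are my assumptions, not from the question:

```dockerfile
# Dockerfile sketch for the uploader script
FROM python:3.8-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY upload_docs.py .
CMD ["python", "upload_docs.py"]
```

```yaml
# docker-compose.yml sketch; passes AWS credentials through from the
# host environment instead of baking them into the image
version: "3"
services:
  uploader:
    build: .
    environment:
      - AWS_ACCESS_KEY_ID
      - AWS_SECRET_ACCESS_KEY
      - AWS_SESSION_TOKEN
```

With these two files next to your script, `docker-compose up --build` on the EC2 machine builds the image and runs the container.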

2 Comments

Thanks for the detailed steps. To reiterate: set up Docker locally using the Python Docker image; a container registry is like a repo to store, and even manage the deployment of, these images. One follow-up question: how do I set up a trigger (this is just an example) such that, say, when I upload a file to S3, these images are deployed to EC2 and the code reads and processes those files? In short, do we create and store images until we need to run jobs, and at that point manually, or via some trigger, provision AWS resources and deploy these images onto the EC2 servers?
AWS does not provide a way to deploy to a self-managed container on EC2; on ECS you can do that with CodeDeploy. Also, you would store the Docker image in ECR, AWS's Docker registry, and you can configure CloudWatch Events to fire on image updates.
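For the "read docs from my local drive and push them to Elasticsearch" part of the question, a rough sketch of the reading side, kept to the standard library (the index name and folder layout are assumptions; the actual indexing call is shown commented out since it needs the `es` client from the question):

```python
import json
import os

def docs_from_folder(folder, index_name="docs"):
    """Yield one bulk-index action per .json file in `folder`.

    The dict shape matches what elasticsearch.helpers.bulk() expects;
    `index_name` is an assumed placeholder, not from the question.
    """
    for name in sorted(os.listdir(folder)):
        if name.endswith(".json"):
            with open(os.path.join(folder, name)) as f:
                yield {"_index": index_name, "_source": json.load(f)}

# With the `es` client from the question, the upload would be roughly:
#   from elasticsearch import helpers
#   helpers.bulk(es, docs_from_folder("/data/docs"))
```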
