
I have 4 Linux EC2 instances created from the same AMI that I use to process files in S3. I run the same Python script on each instance. It takes a directory of files in S3 to process and a number telling it which files it is supposed to process. Say mydir contains myfile1 ... myfile8.

On instance 0 I call: python process.py mydir 0. This causes it to process myfile1 and myfile5.

On instance 1 I call: python process.py mydir 1. This causes it to process myfile2 and myfile6.

And so on.

Inside the script I do: keys = keys[pid::4] where pid is the argument from the command line.
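For illustration, the listing-and-slicing step could look roughly like the sketch below (boto3, the bucket name, and the sorting are assumptions on my part; only the keys[pid::4] line is taken from the actual script):

import sys
import boto3

# Hypothetical bucket name; the real bucket is not shown in the question.
BUCKET = "my-bucket"

prefix = sys.argv[1]      # e.g. "mydir"
pid = int(sys.argv[2])    # which slice of the files this instance handles

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Collect every key under the prefix, then keep every 4th one starting at pid,
# so pid 0 gets files 1 and 5, pid 1 gets files 2 and 6, and so on.
keys = []
for page in paginator.paginate(Bucket=BUCKET, Prefix=prefix):
    keys.extend(obj["Key"] for obj in page.get("Contents", []))
keys = sorted(keys)[pid::4]

for key in keys:
    print("processing", key)  # placeholder for the actual per-file work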

I redistribute changes to my python script by syncing from S3.

Is there a simple way to automate this further? I would like to press one button, say dir=yourdir, and have each instance sync the code from S3 and run it.

1 Answer

You can try using Fabric.
The example below is taken from the Fabric documentation:

from fabric import Connection
result = Connection('web1.example.com').run('uname -s', hide=True)
msg = "Ran {0.command!r} on {0.connection.host}, got stdout:\n{0.stdout}"
print(msg.format(result))

# Output:
# Ran 'uname -s' on web1.example.com, got stdout:
# Linux
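
For the "one button" part of the question, the fan-out could look like the following sketch; the hostnames, the S3 bucket/path, and the remote script location are assumptions, and the nohup backgrounding is just one way to avoid blocking on long-running jobs:

from fabric import Connection

# Hypothetical instance hostnames; replace with the four EC2 instances.
HOSTS = ["ec2-host-0", "ec2-host-1", "ec2-host-2", "ec2-host-3"]

def run_all(s3_dir):
    for pid, host in enumerate(HOSTS):
        c = Connection(host)
        # Pull the latest copy of the script from S3 (bucket/path are assumptions).
        c.run("aws s3 sync s3://my-bucket/code/ ~/code/", hide=True)
        # Launch this instance's share of the work in the background so the
        # loop moves on to the next host instead of waiting for the job.
        c.run(f"cd ~/code && nohup python process.py {s3_dir} {pid} "
              f"> process_{pid}.log 2>&1 &", hide=True)

run_all("mydir")

Fabric also provides group helpers (e.g. ThreadingGroup) for running the same command on several hosts at once, but since each instance needs a different pid argument, a plain loop over Connection objects is simpler here.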
