
Code:

#!/usr/bin/env python

import boto.ec2

conn_ec2 = boto.ec2.connect_to_region('us-east-1') # access keys are environment vars

my_code = """#!/usr/bin/env python

import sys

sys.stdout = open('file', 'w')
print 'test'
"""
reservation = conn_ec2.run_instances(image_id='ami-a73264ce',
                                     key_name='backendkey',
                                     instance_type='t1.micro',
                                     security_groups=['backend'],
                                     instance_initiated_shutdown_behavior='terminate',
                                     user_data=my_code)

The instance is initiated with the proper settings (it's the public Ubuntu 12.04 64-bit image) and I can SSH into it normally. The user-data script seems to be loaded correctly: I can see it in /var/lib/cloud/instance/user-data.txt (and also in /var/lib/cloud/instance/scripts/part-001) and on the EC2 console.

But that's it; the script doesn't seem to be executed. Following this answer, I checked /var/log/cloud-init.log, but it doesn't seem to contain any error messages related to my script (though maybe I'm missing something; here is a gist with the contents of cloud-init.log).

What am I missing?

  • Is your file being created? Have you tried a full path like /tmp/file to check whether the file is being created? Commented Jan 16, 2014 at 0:02
  • Nope, file is not created. Not even with the full path (thanks for the reminder, btw). Commented Jan 16, 2014 at 1:13
  • Did you try running a bash script instead? Or you could try creating an instance from the AWS console (passing user data from the console) and see if there's a problem with your script. Commented Jan 16, 2014 at 1:25
  • Just did. It worked! I used my_code = '''#!/bin/sh mkdir /home/ubuntu/testfolder''', dropped the encoding, and when I SSH'd into the machine testfolder was there. I wonder why the Python script won't work though. I suppose I could save it to an S3 bucket, then call it from a bash script, but still, it bugs me that I can't have the Python script work directly. Commented Jan 16, 2014 at 1:39
  • Here's another oddity: on Ubuntu, `my_code = '''#!/bin/sh mkdir /home/ubuntu/testfolder'''` only works if it's split into two lines, with the shebang line by itself as the first line. But on Amazon Linux it works as a one-liner without any problems. Commented Jan 16, 2014 at 21:29

2 Answers


This is probably not relevant anymore, but still: I've just used boto with Ubuntu and user data, and although the documentation says the user data has to be base64-encoded, it only worked for me when I passed the user data as a regular (unencoded) string.

I read the content of the user-data script from a file (using fh.read()) and then pass it directly as the user_data parameter to run_instances.
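A plausible explanation for this (an assumption about boto 2's behavior, not something stated in the answer): run_instances base64-encodes user_data itself, so if you pre-encode it, cloud-init decodes once and is left with base64 text instead of a script. A local sketch of that double encoding:

```python
import base64

script = b"#!/usr/bin/env python\nprint('test')\n"

# If you pre-encode, a client that also base64-encodes user_data internally
# (as boto 2's run_instances is assumed to do here) wraps it a second time:
pre_encoded = base64.b64encode(script)
sent = base64.b64encode(pre_encoded)   # what would go over the wire

# cloud-init decodes exactly once and sees base64 text, not a shebang:
received = base64.b64decode(sent)
print(received == pre_encoded)         # True
print(received.startswith(b"#!"))      # False -> not recognized as a script

# Passing the plain string lets the client do the single encode itself:
print(base64.b64decode(base64.b64encode(script)) == script)  # True
```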


1 Comment

You are right; the same thing happened to me. When I tried a base64 string it didn't work, but when I passed a plain string it worked fine. Thanks, at last I know how to do it.

I think it's not working for you because user data can't use just any shebang like the "#!/usr/bin/env python" you used. On the help page http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/user-data.html there are two examples: the standard "#!/bin/bash", and another that looks artificial, "#cloud-config". Probably those are the only two shebangs available. The bash one works for me.
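If you want to keep the Python logic but ship a bash user-data script (the approach the comments above confirmed works), one workaround is a bash wrapper that writes the Python payload with a heredoc and runs it. A sketch, not from the answer; it assumes the image ships a python3 binary (the original question targeted Python 2):

```shell
#!/bin/bash
# Bash user-data wrapper: write the Python payload to disk, then run it.
# Assumes `python3` is on the PATH of the AMI.
cat > /tmp/payload.py <<'EOF'
import sys
sys.stdout = open('/tmp/file', 'w')
print('test')
EOF
python3 /tmp/payload.py
```

After boot, /tmp/file should contain "test", which makes it easy to verify that the user-data script actually ran.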

