
I want to connect to a remote server and execute a process on it using Python, and I want to be able to get the return code and stderr (if any) of that process. Has anyone ever done anything like this before? I have done it with ssh on the command line, but I want to do it from a Python script.

Cheers.


3 Answers


Use paramiko, an SSH module created for exactly this purpose, instead of subprocess. Here's an example:

from paramiko import SSHClient

client = SSHClient()
client.load_system_host_keys()               # pick up keys from ~/.ssh/known_hosts
client.connect("hostname", username="user")  # "hostname" and "user" are placeholders
stdin, stdout, stderr = client.exec_command("program")
print("stderr:", stderr.readlines())
print("stdout:", stdout.readlines())

UPDATE: This example originally used the ssh module, but that is now deprecated; paramiko is the up-to-date module that provides SSH functionality in Python.
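Since the question also asks for the return code: with paramiko you can read the command's exit status from the channel behind stdout. A minimal sketch, using the same placeholder hostname, username, and command as above:

from paramiko import SSHClient

client = SSHClient()
client.load_system_host_keys()
client.connect("hostname", username="user")  # placeholders, as above

stdin, stdout, stderr = client.exec_command("program")
exit_status = stdout.channel.recv_exit_status()  # blocks until "program" finishes

print("return code:", exit_status)
print("stderr:", stderr.read().decode())
client.close()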




Well, you can call ssh from Python...

import subprocess

# Run the remote command and wait for it; ret is the exit status
ret = subprocess.call(["ssh", "user@host", "program"])

# Or, to capture stderr as well:
prog = subprocess.Popen(["ssh", "user@host", "program"], stderr=subprocess.PIPE)
errdata = prog.communicate()[1]  # prog.returncode now holds the exit status
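If you want the return code and stderr from a single call, a minimal sketch using subprocess.run (Python 3.5+), with the same placeholder host and command as above:

import subprocess

# Capture the exit status and stderr in one call;
# "user@host" and "program" are placeholders, as above.
result = subprocess.run(["ssh", "user@host", "program"], stderr=subprocess.PIPE)
print("return code:", result.returncode)  # exit status of the remote program
print("stderr:", result.stderr.decode())  # remote stderr, relayed by ssh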

5 Comments

You'll need to make sure that you have key-based authentication set up with the remote host, with the proper keys in place. Otherwise, you'll be prompted for a password.
Nice, apparently I'm out of date - was going to recommend os.spawnlp, but I see the subprocess module was introduced to replace it.
Thanks dude. When you exec a command over ssh, the return code and stderr are those of the exec'd command, as long as ssh itself doesn't have any issues. I can use the Popen class and the communicate() method to tap into it from there.
Use the ssh module instead of calling subprocess. Even at the time this answer was posted, the ssh module's predecessor (paramiko) existed.
Better to use Fabric or paramiko, as detailed in the other answers. Fabric is a higher-level interface than paramiko.

If you want to wrap the nuts and bolts of the ssh calls, you could use Fabric. This library is geared toward deployment and server management, but it can also be useful for this kind of problem.
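For example, a sketch assuming the modern Fabric (2.x) API, with "user@host" and "program" as placeholders:

from fabric import Connection

# warn=True makes run() return a Result instead of raising on a nonzero exit;
# hide=True suppresses mirroring of the remote output to the local terminal.
result = Connection("user@host").run("program", warn=True, hide=True)
print("return code:", result.return_code)
print("stderr:", result.stderr)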

Also have a look at Celery. It implements a task queue for Python/Django over various brokers. It's probably overkill for this problem, but if you are going to call more functions on multiple machines it will save you a lot of headache managing your connections.

