After searching around, I defined a function to execute a command as if it were typed in a terminal:
import shlex
import subprocess
import sys

def execute_cmd(cmd):
    p = subprocess.Popen(shlex.split(cmd),
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    # b'' sentinel because readline() returns bytes on Python 3
    for line in iter(p.stdout.readline, b''):
        sys.stdout.write(line.decode(sys.stdout.encoding))
    error = p.stderr.read().decode()
    if error:
        raise Exception(error)
It works fine (output is realtime) when I run
execute_cmd('ping -c 5 www.google.com')
However, when I use execute_cmd to run a Python script, no output appears until the process has finished.
execute_cmd('python test.py')
The script, test.py:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import time
print('hello')
time.sleep(2)
print('hello, again')
How can I fix it? Thanks!
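My current guess is that the child Python process block-buffers its stdout when it is not attached to a terminal, so nothing reaches the pipe until the buffer fills or the process exits. A minimal sketch that forces the child to run unbuffered (PYTHONUNBUFFERED and the -u flag are standard CPython switches; execute_cmd_unbuffered is just a name I made up here):

```python
import os
import shlex
import subprocess
import sys

def execute_cmd_unbuffered(cmd):
    # Force any Python child to disable stdout buffering, so lines arrive
    # as they are printed instead of when the process exits.
    env = dict(os.environ, PYTHONUNBUFFERED='1')
    p = subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT, env=env)
    lines = []
    for line in iter(p.stdout.readline, b''):
        text = line.decode(sys.stdout.encoding or 'utf-8')
        sys.stdout.write(text)  # realtime echo
        lines.append(text)
    p.wait()
    return lines

# Equivalently: execute_cmd_unbuffered('python -u test.py')
```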
Sorry for not explaining why I "catch the stdout and then write it to stdout again". What I really want to do is capture the script's output into a logger, which then writes it to both the screen (StreamHandler) and a log file (FileHandler). I have built and tested the logger part; now I am working on the "execute" part. And simply omitting the stdout= parameter does not seem to work.
My pipeline:
- set up the logger;
- redirect STDOUT and STDERR to the logger;
- execute the scripts;
Because of step 2, if I omit the stdout= parameter, the scripts' output still goes to the real STDOUT and is never written to the log file.
Maybe I can set stdout= to the logger?
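As far as I know, stdout= must be a real file object or descriptor, so a logger cannot be passed directly. A sketch of the idea I am after instead (execute_to_logger is a hypothetical name; it reads the pipe line by line and forwards each line to the logger, whose handlers fan out to screen and file):

```python
import logging
import shlex
import subprocess

def execute_to_logger(cmd, logger):
    # stdout= needs a real file descriptor, so use PIPE and forward each
    # line to the logger; its StreamHandler/FileHandler do the rest.
    p = subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    for line in iter(p.stdout.readline, b''):
        logger.info(line.decode().rstrip('\n'))
    return p.wait()
```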