
After searching around, I defined a function to execute a command as if in a terminal:

import shlex
import subprocess
import sys

def execute_cmd(cmd):
    p = subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE)

    for line in iter(p.stdout.readline, b''):  # sentinel is b'' for Python 3
        sys.stdout.write(line.decode(sys.stdout.encoding))

    error = p.stderr.read().decode()
    if error:
        raise Exception(error)

It works fine (the output appears in real time) when I run:

execute_cmd('ping -c 5 www.google.com')

However, when I use execute_cmd to run a Python script, no output is printed until the process is done.

execute_cmd('python test.py')

Script: test.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import time

print('hello')
time.sleep(2)
print('hello, again')

How can I fix it? Thanks!


Sorry for not explaining why I "catch the stdout and then write it to stdout again". What I really want to do is capture the script's output into a logger, which then writes it both to the screen (StreamHandler) and to a log file (FileHandler). I have built and tested the logger part; now I'm working on the "execute" part. And simply omitting the stdout= parameter does not seem to work.

pipeline:

  1. setup logger;
  2. redirect STDOUT, STDERR to logger;
  3. execute scripts;

Because of step 2, if I omit the stdout= parameter, the script's output still goes to STDOUT but is not logged to the file.

Maybe I can point stdout= at the logger?
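The pipeline above could be sketched like this. This is a hypothetical runner, not the original logger code: the function name, handler setup, and log-file name are all illustrative.

```python
import logging
import shlex
import subprocess
import sys

# Illustrative logger setup: one handler for the screen, one for a file.
logger = logging.getLogger("runner")
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(sys.stdout))  # screen
logger.addHandler(logging.FileHandler("run.log"))     # log file

def execute_cmd_logged(cmd):
    """Run cmd and feed each stdout line into the logger."""
    p = subprocess.Popen(shlex.split(cmd),
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    for line in iter(p.stdout.readline, b''):
        # Each record goes to both handlers (screen and file).
        logger.info(line.decode().rstrip('\n'))
    error = p.stderr.read().decode()
    p.wait()
    if error:
        raise Exception(error)
```

Note that this still inherits the buffering problem described in the answer below it: the child only delivers lines as it flushes them.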

  • Why do you catch the stdout and then write it to stdout again? You can just ignore that parameter. Commented May 17, 2018 at 7:57
  • @Sraw Sorry for not explaining that. What I really want to do is capture the script's output into a logger, which then writes it to the screen (StreamHandler) and to a log file (FileHandler). I have built and tested the logger; now the 'execute' part. And ignoring the stdout= parameter will not work. Commented May 17, 2018 at 8:12

1 Answer


This is a common problem with the underlying output system, notably on Linux or other Unix-like systems. The io library is smart enough to flush output on each \n when it detects that output is directed to a terminal. But this automatic flush does not occur when output is redirected to a file or a pipe, probably for performance reasons. This is not really a problem when only the data matters, but it leads to weird behaviour when timing matters too.

Unfortunately, I know of no simple way to fix it from the caller program (*). The only possibility is to have the callee force a flush on each line/block, or to use unbuffered output:

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import time
import sys

print('hello')
sys.stdout.flush()
time.sleep(2)
print('hello, again')
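If modifying the script is not an option, CPython's -u flag (or the PYTHONUNBUFFERED environment variable) makes the interpreter's stdout and stderr unbuffered, so the caller could launch the child that way instead. For example (the one-liner here just stands in for test.py):

```shell
# -u disables output buffering in the child interpreter, so each
# print() reaches the pipe immediately instead of waiting for exit
python3 -u -c 'import time; print("hello"); time.sleep(1); print("hello, again")'
```

With the question's code, that would mean calling execute_cmd('python -u test.py').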

(*) The bullet-proof way would be to use a pseudo-terminal. The caller controls the master side and passes the client side to the callee. The library detects a terminal and automatically flushes on each line. But this is not portable outside the Unix world and is not really a simple way.
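For completeness, a minimal Unix-only sketch of that pseudo-terminal idea, assuming the pty module is available; the function name is hypothetical:

```python
import os
import pty
import shlex
import subprocess
import sys

def execute_cmd_pty(cmd):
    """Run cmd with its output attached to a pseudo-terminal.

    The child sees a terminal on its stdout, so the io layer
    line-buffers and each print() arrives immediately.
    """
    master, slave = pty.openpty()
    p = subprocess.Popen(shlex.split(cmd), stdout=slave, stderr=slave)
    os.close(slave)  # keep only the master end in the parent
    try:
        while True:
            data = os.read(master, 1024)
            if not data:
                break
            sys.stdout.write(data.decode(errors='replace'))
    except OSError:
        # On Linux, reading the master raises EIO once the child exits.
        pass
    finally:
        os.close(master)
    p.wait()
```

Note this merges the child's stdout and stderr into one stream, so the original code's separate stderr check would need rethinking.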


1 Comment

The reason behind the problem is more important than the solutions; thanks for showing me the rabbit hole.
