
Given a bash command line:

./getRawStream.jar | ./processRaw.py > product.csv 2> product.stderr

Is it possible for ./processRaw.py to find the return value of ./getRawStream.jar and abort the rest of the chain? I'm trying to avoid Python's subprocess module at all costs, because the output of ./getRawStream.jar can be on the order of tens or hundreds of gigabytes, and I want to cut down how many places the stream is held in memory before processRaw.py filters it down.

  • ./getRawStream.jar > fileToEvaluateByProcessRaw.stdout Commented Jan 7, 2011 at 23:16

2 Answers


That's not how pipes work. Pipes deal with input and output, not with the programs generating them.

That said, subprocess can use pipes too - if you pass stdout=subprocess.PIPE you get a stdout stream for the process that you can read incrementally. Of course, if you care about return codes, you have to wait for the first process to end anyway, which means there's not really much difference in buffering - the entire output of the first command will have been produced before you see a return code.
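For illustration, a rough sketch of that approach - this assumes the jar is launched via java -jar, and uses a trivial filter as a stand-in for whatever processRaw.py actually does:

import subprocess
import sys

# Start the producer; stdout=subprocess.PIPE exposes its output as a stream.
proc = subprocess.Popen(["java", "-jar", "getRawStream.jar"],
                        stdout=subprocess.PIPE)

with open("product.csv", "wb") as out:
    # Consume the stream incrementally so it never sits in memory whole.
    for line in proc.stdout:
        if line.strip():        # stand-in for processRaw.py's real filtering
            out.write(line)

proc.stdout.close()
# The return code is only available once the producer has exited -
# by which point the whole stream has already been produced and consumed.
if proc.wait() != 0:
    sys.exit("getRawStream.jar failed with exit status %d" % proc.returncode)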


1 Comment

I was hoping the $? magic bash variable might be available beyond just the shell. Oh well, at least I know that path's a dead end.

Something like this may help:

./getRawStream.jar > stage1.stdout
if [ $? -eq 0 ]   # proceed only if getRawStream.jar exited successfully
then
    ./processRaw.py < stage1.stdout > product.csv 2> product.stderr
fi
rm -f stage1.stdout
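Note that this trades memory for disk: the entire raw stream lands in stage1.stdout before processRaw.py ever runs, which may be significant at the tens-of-gigabytes sizes mentioned in the question.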

