
I have a very simple command that I would like to execute in parallel rather than sequentially.

for i in ../data/*; do ./run.sh "$i"; done

run.sh processes the input files from the ../data directory, and I would like to run all of these jobs at the same time using a shell script rather than a Python program or something like that. Is there a way to do this using GNU Parallel?

  • parallel ./run.sh ::: ../data/* – Commented Aug 14, 2014 at 13:18

3 Answers


You can try this:

shopt -s nullglob
FILES=(../data/*)
[[ ${#FILES[@]} -gt 0 ]] && printf '%s\0' "${FILES[@]}" | parallel -0 --jobs 2 ./run.sh


I have not used GNU Parallel, but you can use & to run your script in the background. Add a wait afterwards (optional) if you want to wait for all the scripts to finish.

for i in ../data/*; do ./run.sh "$i" & done
# The wait below is optional
wait
echo "All scripts executed"

1 Comment

Aaaaaaand, KABOOM! Forgot about the &. I would still like to see this done using Parallel.

You can try this:

find ../data -maxdepth 1 -name '[^.]*' -print0 | parallel -0 --jobs 2 ./run.sh

The -name argument of the find command is needed because your example used the shell glob ../data/*, which skips files whose names start with a dot, so we must ignore those here as well.
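If GNU Parallel is not installed, the same null-delimited pipeline can be sketched with xargs -P (a GNU/BSD extension). The inline sh -c worker below is a hypothetical stand-in for ./run.sh, and the directory is a throwaway one for the demo:

```shell
# Same null-delimited pipeline, with xargs -P standing in for GNU Parallel.
demo=$(mktemp -d)
touch "$demo/one" "$demo/two" "$demo/.hidden"

# -print0 / -0 keep filenames with spaces or newlines intact; -P 2 caps
# concurrency at two workers; -n 1 hands each worker a single file. The
# sh -c body is a hypothetical stand-in for ./run.sh.
find "$demo" -maxdepth 1 -type f -name '[^.]*' -print0 \
  | xargs -0 -n 1 -P 2 sh -c 'touch "$1.done"' _
```

As in the find answer, the -name '[^.]*' test skips dotfiles, matching the behavior of the original ../data/* glob.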

