
I am running a bash script like this:

for i in {0..3250000..50000}
  do
    wget "http://xxx/select?q=*:*&row_size=50000&start=$i" -O $i.csv
  done

Every time I send a request, I have to wait for it to finish and write its output to a file before the loop continues. I want to do this asynchronously instead: send a request and keep looping without waiting for the response, then handle each response when it arrives.

How can I do that?


2 Answers


You can use xargs:

printf '%s\0' {0..3250000..50000} |
    xargs -0 -I {} -n 1 -P 20 \
    wget 'http://xxx/select?q=*:*&row_size=50000&start={}' -O {}.csv

The -0 selects the NUL character as the delimiter, -I {} replaces {} with the argument, -n 1 hands a single argument to wget, and -P 20 runs up to 20 requests in parallel.

Alternatively, you can append an & to the command inside the loop to run it in the background, then use wait for the processes to finish.
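
A minimal sketch of that approach, assuming the same URL and range as in the question (the batch size of 20 and the placement of wait are just one way to cap the number of concurrent downloads):

# Launch each wget in the background; pause every 20 jobs so no
# more than 20 downloads run at once (mirrors the -P 20 above).
max_jobs=20
for i in {0..3250000..50000}; do
    wget "http://xxx/select?q=*:*&row_size=50000&start=$i" -O "$i.csv" &
    # Once the limit is reached, wait for the current batch to finish.
    if (( $(jobs -r -p | wc -l) >= max_jobs )); then
        wait
    fi
done
# Wait for any downloads still running after the loop ends.
wait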



You can execute it with the following; & runs the command in the background:

<cmd> &

But your loop is large, and all of those jobs will be running in the background while control returns to the script, so to avoid execution issues you should add some check for whether the background operations have exited or not.
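
One way to do that check, reusing the URL and range from the question (the PID bookkeeping and status messages are only illustrative, and the associative array needs bash 4+), is to record each background PID and then wait on it to collect its exit status:

# Start every request in the background and remember its PID,
# keyed by the start offset so failures can be reported per file.
declare -A pids
for i in {0..3250000..50000}; do
    wget "http://xxx/select?q=*:*&row_size=50000&start=$i" -O "$i.csv" &
    pids[$i]=$!
done

# Control returns here immediately; wait on each PID, which blocks
# until that job has exited and reports its exit status.
for i in "${!pids[@]}"; do
    if wait "${pids[$i]}"; then
        echo "start=$i finished OK"
    else
        echo "start=$i failed with status $?" >&2
    fi
done

Note that this starts all the requests at once; combine it with the batching shown in the other answer if the server cannot handle that many parallel connections.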
