What is the best way to execute 5 curl requests in parallel from a bash script? I can't run them in serial for performance reasons.
4 Answers
33
Use '&' after a command to background a process, and 'wait' to wait for them to finish. Use '()' around the commands if you need to create a sub-shell.
#!/bin/bash
# Each curl is sent to the background with &; the && echo fires once that download finishes.
curl -s -o foo http://example.com/file1 && echo "done1" &
curl -s -o bar http://example.com/file2 && echo "done2" &
curl -s -o baz http://example.com/file3 && echo "done3" &
# Block until every background job has exited.
wait
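If you also need to know whether each transfer succeeded, one approach (a sketch, assuming bash, with placeholder URLs and output names) is to group each command in a sub-shell, record its PID with $!, and wait on each PID individually, since wait PID returns that job's exit status:
#!/bin/bash
urls=(http://example.com/file1 http://example.com/file2 http://example.com/file3)
pids=()
for i in "${!urls[@]}"; do
    # The ( ) sub-shell groups the download and its log line into one background job.
    ( curl -s -o "out$i" "${urls[$i]}" && echo "done$i" ) &
    pids+=($!)  # $! is the PID of the job just started
done
for pid in "${pids[@]}"; do
    wait "$pid" || echo "download with pid $pid failed" >&2
done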
Anton Cohen
10
xargs has a "-P" parameter to run processes in parallel. For example:
wget -nv http://en.wikipedia.org/wiki/Linux -O- | egrep -o "http://[^[:space:]]*\.jpg" | xargs -P 10 -r -n 1 wget -nv
Reference: http://www.commandlinefu.com/commands/view/3269/parallel-file-downloading-with-wget
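Since the question is about curl, the same pipeline presumably works with curl swapped in for wget (a sketch reusing the page URL and regex from the example above; -O saves each file under its remote name):
curl -s http://en.wikipedia.org/wiki/Linux | egrep -o "http://[^[:space:]]*\.jpg" | xargs -P 10 -r -n 1 curl -s -O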
Fan___
0
Here's a curl example with xargs:
$ cat URLS.txt | xargs -P 10 -n 1 curl
The above example curls each of the URLs in parallel, 10 at a time. The -n 1 is there so that xargs passes only one argument (one URL from URLS.txt) to each curl invocation.
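For example, given a hypothetical URLS.txt with one URL per line, adding -O tells curl to save each response under its remote file name instead of writing everything to stdout:
$ cat URLS.txt
http://example.com/file1
http://example.com/file2
http://example.com/file3
$ xargs -P 10 -n 1 curl -s -O < URLS.txt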
What each of the xargs parameters do:
$ man xargs
-P maxprocs
Parallel mode: run at most maxprocs invocations of utility at once.
-n number
Set the maximum number of arguments taken from standard input for
each invocation of utility. An invocation of utility will use less
than number standard input arguments if the number of bytes
accumulated (see the -s option) exceeds the specified size or there
are fewer than number arguments remaining for the last invocation of
utility. The current default value for number is 5000.
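To see what -n does on its own, here is a quick experiment (any xargs should behave this way):
$ printf '%s\n' a b c d e | xargs -n 2 echo
a b
c d
e
With five arguments and -n 2, xargs runs echo three times: twice with two arguments and once with the final one.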