Unix – Pipe output of cat to cURL to download a list of files

curl, unix

I have a list of URLs in a file called urls.txt. Each line contains one URL. I want to download all of the files at once using cURL. I can't seem to get the right one-liner down.

I tried:

$ cat urls.txt | xargs -0 curl -O

But that only gives me the last file in the list.

Best Solution

This works for me:

$ xargs -n 1 curl -O < urls.txt

I'm on FreeBSD; your xargs may work differently. (The -0 flag in your attempt tells xargs to expect NUL-separated input, which a newline-delimited urls.txt doesn't provide, while -n 1 runs a separate curl per URL so each one gets its own -O.)
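
If your xargs supports the -P flag (GNU and the BSDs do), you can also run the downloads in parallel; the 4 here is an arbitrary degree of parallelism:

$ xargs -n 1 -P 4 curl -O < urls.txt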

Note that either way you spawn a separate curl process for every URL, which you may view as unnecessarily heavy. If you'd like to save some of that overhead, the following may work in bash:

$ mapfile -t urls < urls.txt
$ curl ${urls[@]/#/-O }

This saves your URL list to an array, then expands the array with options to curl to cause the targets to be downloaded. The curl command can take multiple URLs and fetch all of them, recycling the existing connection (HTTP/1.1), but it needs the -O option before each one in order to download and save each target. Note that characters within some URLs may need to be escaped to avoid interacting with your shell.
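
Also note that the unquoted expansion above depends on word splitting, so it can misbehave if a URL contains whitespace or glob characters (?, *, [). A more defensive sketch, assuming the same urls array, builds the argument list explicitly (args is just an arbitrary name):

$ args=()
$ for u in "${urls[@]}"; do args+=(-O "$u"); done
$ curl "${args[@]}"

Quoting "${args[@]}" keeps each option and each URL intact as a single word regardless of its contents.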

Or if you are using a POSIX shell rather than bash:

$ curl $(printf ' -O %s' $(cat urls.txt))

This relies on printf's behaviour of repeating the format pattern to exhaust the list of data arguments; not all stand-alone printfs will do this.
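
You can see that format reuse directly with placeholder arguments (the trailing echo just restores the newline that printf omits):

$ printf ' -O %s' one two three; echo
 -O one -O two -O three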

Note that this non-xargs method may also bump up against system limits for very large lists of URLs, since it passes every URL on a single command line. Research ARG_MAX and MAX_ARG_STRLEN if this is a concern.
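
On most systems you can query the limit with getconf; the value is in bytes and the exact figure varies by platform:

$ getconf ARG_MAX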