Download

wget -c http://...

-t 0: number of retries (0 = infinite)
-c: resume an interrupted download

How do you download a list of URLs using more than one process (say, wget) at a time?

First, create a file with URLs – one URL per line. Let’s call the file url.txt.

Then we need to create N wget processes, each downloading one URL at a time. Thanks to xargs, this is trivial:

cat url.txt | xargs -n 1 -P 10 wget

-n 1 makes xargs run the command (wget) with only one argument at a time
-P 10 creates 10 parallel processes
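A quick way to see the fan-out in action without touching the network: the sketch below uses echo as a stand-in for wget, and a throwaway url.txt filled with made-up example URLs.

```shell
# Create a sample URL list (hypothetical URLs, one per line)
printf '%s\n' http://example.com/a http://example.com/b http://example.com/c > url.txt

# -n 1: one argument per command invocation; -P 3: up to 3 processes in parallel.
# echo stands in for wget so the fan-out is visible without downloading anything.
cat url.txt | xargs -n 1 -P 3 echo fetched
```

Each URL appears on its own "fetched ..." line; since the processes run concurrently, the order of the lines may vary between runs.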

Download from youtube

yt-dlp -x --split-chapters https://...
youtube-dl -ciw --playlist-items 60-515 --extract-audio --audio-format mp3 --restrict-filenames https://...

Others