====Download====

  wget -t0 -c http://...

-t0: number of retry attempts (0 = infinite)
-c: resume an interrupted download

====How to download a list of URLs using more than one process (say wget) at a time?====

First, create a file with URLs – one URL per line. Let's call the file url.txt.

Then we need to create N wget processes, each downloading one URL at a time. Thanks to xargs it is trivial:
  cat url.txt | xargs -n 1 -P 10 wget
-n 1 passes one URL per wget invocation; -P 10 runs up to 10 processes in parallel.
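To see how xargs fans the work out without fetching anything, you can substitute a harmless command such as echo for wget. This is a minimal sketch; the URLs and file name are made up for illustration:

```shell
# Create a sample URL list, one URL per line (hypothetical URLs)
printf 'http://example.com/a\nhttp://example.com/b\nhttp://example.com/c\n' > url.txt

# -n 1: pass one URL per invocation; -P 10: run up to 10 processes in parallel.
# echo stands in for wget, so each "download" just prints its URL.
cat url.txt | xargs -n 1 -P 10 echo
```

Each URL is printed on its own line; because the processes run in parallel, the order of the lines may vary between runs.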
====Download from youtube====
  youtube-dl -ciw --playlist-items 60-515 --extract-audio --audio-format mp3 --restrict-filenames https://...

-c: resume partially downloaded files
-i: ignore download errors and continue with the playlist
-w: do not overwrite existing files
==== Others ====