I am planning to download a large number of files (each about 20-250 MB, a few hundred GB in total) from an FTP server from which I get roughly 250-350 KB/s, so I expect the transfer to take about 5 days per 100 GB. The problem is that the download is likely to be interrupted for several reasons, including maintenance that may reboot the machine I'm downloading with.
Given a text file listing all the file paths I'm going to download (one path per line), what is the easiest way to do this reliably, i.e. without having to start completely over after a failure or having to figure out which files are only partially downloaded? Re-downloading a handful of individual files is not a problem. Ideally I'm looking for a command that I can simply re-run after a failure.
The Linux distribution is Rocks 6.1 (Emerald Boa).
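To make concrete what I mean by "reliably re-runnable", here is a rough sketch of the behaviour I'm after, written with Python's `ftplib`: on each run it skips files that are already complete and resumes partial ones. The hostname `ftp.example.org`, anonymous login, and the list file name `files.txt` are placeholders, not my actual setup, and this is only a sketch of the logic, not something I've battle-tested.

```python
#!/usr/bin/env python3
"""Sketch of a resumable FTP download loop over a list of remote paths.

Placeholders (not from my actual setup): ftp.example.org, anonymous
login, and files.txt containing one remote path per line.
"""
import os
from ftplib import FTP

HOST = "ftp.example.org"   # placeholder hostname
FILE_LIST = "files.txt"    # one remote path per line

def download(ftp, remote_path):
    local_path = remote_path.lstrip("/")
    os.makedirs(os.path.dirname(local_path) or ".", exist_ok=True)

    ftp.voidcmd("TYPE I")              # binary mode, needed for SIZE/REST
    try:
        remote_size = ftp.size(remote_path)
    except Exception:
        remote_size = None             # server may not support SIZE

    local_size = os.path.getsize(local_path) if os.path.exists(local_path) else 0

    if remote_size is not None and local_size == remote_size:
        return                         # already complete, skip
    if remote_size is not None and local_size > remote_size:
        local_size = 0                 # bad partial file, start over

    mode = "ab" if local_size else "wb"
    with open(local_path, mode) as f:
        # rest=local_size resumes the transfer at the current offset
        ftp.retrbinary("RETR " + remote_path, f.write, rest=local_size or None)

def main():
    with open(FILE_LIST) as fl:
        paths = [line.strip() for line in fl if line.strip()]

    with FTP(HOST) as ftp:
        ftp.login()                    # anonymous; pass user/passwd otherwise
        for path in paths:
            print("fetching", path)
            download(ftp, path)

if __name__ == "__main__":
    main()
```

If there is an existing tool that already does exactly this from a URL/path list (so I don't have to maintain a script like the above), that would be even better.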