A lot of these directories take quite some time to download, and I have to keep stopping and restarting the downloads. I know there are wget options for resuming downloads, combined with no-clobber and timestamping, but I've run into cases where cancelling a download leaves a partial file behind, and wget doesn't redownload it on the next run because the file already exists. The alternative would be to redownload everything, but that would obviously take a lot of time.
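For context, the flag combination I've been playing with looks something like this (just a sketch; example.com stands in for the real host):

```bash
# -c/--continue resumes a partial file instead of skipping it, which is
# the trap -nc/--no-clobber falls into after a cancelled download.
# -c only resumes cleanly if the server honors Range requests, and note
# that wget refuses to combine -N (timestamping) with -nc (no-clobber).
# http://example.com/dir/ is a placeholder URL.
wget -r -np -nd -c http://example.com/dir/
```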
Anyway, I'm working on a script that compares file sizes to detect files that didn't finish downloading. I've got it working when I pass in specific file URLs, but I want it to run on every file when I wget an entire directory. The comparison needs to happen before wget actually fetches each file (the script can do the fetching itself), so I need a way to get a list of all the files in a directory and loop the script over each one.
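This is roughly the shape I have in mind, as a rough sketch: pull the directory index, extract the links, compare each local size against the remote Content-Length, and only fetch on a mismatch. example.com is a placeholder, the href extraction is naive (assumes a plain Apache/nginx-style index with no URL-encoded filenames), and it relies on GNU grep and stat:

```bash
#!/usr/bin/env bash
BASE_URL="http://example.com/dir/"   # placeholder

# Grab the index page and pull out the linked filenames,
# skipping the parent-directory link.
wget -q -O - "$BASE_URL" \
  | grep -oP 'href="\K[^"]+' \
  | grep -v '^\.\./\?$' \
  | while IFS= read -r f; do
      url="$BASE_URL$f"

      # Remote size: spider (no-download) request, then read
      # Content-Length from the response headers on stderr.
      remote=$(wget --spider --server-response "$url" 2>&1 \
        | awk 'tolower($1)=="content-length:" {sub(/\r$/,"",$2); print $2; exit}')

      # Local size, or 0 if the file doesn't exist yet.
      local_size=$(stat -c%s "$f" 2>/dev/null || echo 0)

      if [ "$local_size" != "$remote" ]; then
          echo "size mismatch for $f ($local_size vs $remote), fetching"
          wget -c "$url"
      fi
    done
```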
Thoughts?