There is an online HTTP directory that I have access to, and I have tried to download all of its sub-directories and files with wget. The problem is that when wget downloads a sub-directory, it fetches that directory's index.html file, which lists the files inside, but not the files themselves. Is there a way to download the sub-directories and files with no depth limit, as if the remote directory were an ordinary folder I could copy to my computer?
3 Answers
81
Solution
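A single wget invocation along these lines does the job. This is a sketch, not necessarily the exact command from the original answer: the host and path are placeholders, and --cut-dirs=3 assumes three leading path components (aaa, bbb, ccc) that you don't want mirrored locally.

    # Sketch: host/path and the --cut-dirs count are placeholders
    # for your actual directory URL.
    wget -r -np -nH -l inf --cut-dirs=3 -R "index.html*" http://example.com/aaa/bbb/ccc/ddd/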
Explanation:
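Flag by flag, relative to the placeholder URL above:

- -r : recurse into linked sub-directories
- -np (--no-parent) : never ascend above the starting directory
- -nH (--no-host-directories) : don't create a top-level folder named after the host
- -l inf : lift wget's default recursion depth limit of 5, giving the unlimited depth the question asks for
- --cut-dirs=3 : strip the first three path components (aaa, bbb, ccc) so files land directly under ddd/
- -R "index.html*" : discard the auto-generated index listings after wget has scanned them for links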
13
I was able to get this to work thanks to this post, using VisualWGet (https://sites.google.com/site/visualwget/a-download-manager-gui-based-on-wget-for-windows). It worked great for me. The important part seems to be checking the --recursive flag (see image). I also found that the --no-parent flag is important; otherwise it will try to download everything. The CLI equivalent is sketched below.