There is an online HTTP directory that I have access to. I have tried to download all sub-directories and files via wget. The problem is that when wget downloads a sub-directory, it downloads the index.html file that lists the files in that directory without downloading the files themselves. Is there a way to download all the sub-directories and files, with no depth limit, as if the directory I want to download were just a folder I could copy to my computer?
3 Answers
81
Solution
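Something along these lines should do it; the URL is a placeholder for your directory, and all of the flags are standard GNU wget options:

    wget -r -np -nH -l inf -R "index.html*" http://example.com/path/to/dir/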
Explanation:
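-r : download recursively, following the links in each directory listing into the sub-directories
-l inf : no recursion depth limit (wget's default for -r is 5 levels)
-np : never ascend to the parent directory, so nothing outside the target tree is fetched
-nH : don't create an extra top-level folder named after the host
-R "index.html*" : discard the auto-generated index.html listings once wget has used them for crawling, so only the actual files remain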
13
I was able to get this to work using VisualWGet (https://sites.google.com/site/visualwget/a-download-manager-gui-based-on-wget-for-windows), a GUI front-end for wget. It worked great for me. The important part seems to be to check the -recursive flag. I also found that the -no-parent flag is important; otherwise it will try to download everything.
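For anyone who prefers to skip the GUI, those two checkboxes correspond to plain wget's --recursive and --no-parent options (the URL here is a placeholder):

    wget --recursive --no-parent http://example.com/path/to/dir/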