Software for downloading complete websites?
One of the websites hosts plenty of historical newspapers (an archive). I would like to be able to download the entire site so I can read the papers in offline mode.
Is there free software that lets me do this? Thanks. |
What's the site? It may or may not be possible, depending on what the site uses.
|
This isn't the easiest way, but I've used wget, which is a traditional command-line utility for this purpose. There's no GUI, but there are graphical front-ends for it out there if you prefer.
http://freecode.com/projects/wget You have to figure out the options, like level of recursion etc., but I was able to download an entire archive of radio plays from a website. Here's a sample command line I came up with:
Code:
wget -nd -t 0 -T 15 "http://movies.apple.com/movies/fox/avatar/avatar2009aug0820a-tsr_h1080p.mov" -U "QuickTime/7.6 (qtver=7.6;os=Windows NT 6.0Service Pack 1)" --header "Host: movies.apple.com"
Note that this particular example only fetches a single file (while identifying itself as QuickTime); to grab everything linked from a page you'd need wget's recursive options as well. |
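For mirroring an entire site for offline reading, rather than fetching a single file, a recursive invocation along these lines is more typical. This is only a sketch: the URL is a placeholder, and the wait/rate limits are there so you don't hammer the server (see the next reply about that).

```shell
# Mirror a site for offline browsing. The URL below is a placeholder;
# substitute the archive you want to read.
#
# -m            mirror: recursive download with infinite depth and timestamping
# -k            convert links in saved pages so they work when browsed offline
# -p            also fetch images/CSS/etc. needed to display each page
# -np           no-parent: don't ascend above the starting directory
# --wait=2      pause 2 seconds between requests, to be polite to the server
# --limit-rate  cap the download bandwidth for the same reason
wget -m -k -p -np --wait=2 --limit-rate=200k "http://example.com/archive/"
```

You can add `-l 3` (or similar) to cap the recursion depth if the full mirror is bigger than you expected.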
When you use a tool like that, you're taxing that site's resources. You might find yourself on the other side of a firewall, depending on the site. Webmasters don't appreciate site scraping, whether it comes from a legitimate user or not, and many monitor for such activity.
|
Thanks to all for the answers.
|
Site design, images and content © 2002-2024 The Digital FAQ, www.digitalFAQ.com