in reply to Making Html Pages offline available

I use my webreaper script to download entire websites that I want to view offline. It's pure Perl. It doesn't have a feature to stop at a certain depth, though, and it doesn't do any link rewriting for those sites that like to use absolute URLs everywhere.
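
In case anyone wants a starting point, here's a rough sketch of that sort of thing in pure Perl. It's not webreaper, just a minimal recursive fetcher built on LWP::UserAgent and HTML::LinkExtor; the depth limit, the same-host check, and all the names here are illustrative, and it doesn't attempt any link rewriting either:

#!/usr/bin/perl
use strict;
use warnings;

use File::Basename qw(dirname);
use File::Path     qw(make_path);
use HTML::LinkExtor;
use LWP::UserAgent;
use URI;

# Illustrative sketch: start URL and depth come from the command line,
# and only pages on the starting host are followed.
my $start     = shift @ARGV or die "usage: $0 URL [depth]\n";
my $max_depth = shift(@ARGV) // 2;

my $ua   = LWP::UserAgent->new( agent => 'offline-mirror/0.1' );
my $host = URI->new($start)->host;
my %seen;

fetch( URI->new($start), 0 );

sub fetch {
    my( $uri, $depth ) = @_;
    return if $depth > $max_depth or $seen{$uri}++;

    my $response = $ua->get($uri);
    return unless $response->is_success;

    save( $uri, $response->content );   # raw bytes, so images survive

    # Only HTML pages get parsed for more links to follow.
    return unless $response->content_type eq 'text/html';

    my $extor = HTML::LinkExtor->new( undef, $uri );  # base URI resolves
    $extor->parse( $response->decoded_content );      # relative links

    for my $link ( $extor->links ) {
        my( $tag, %attrs ) = @$link;
        next unless $tag eq 'a' and $attrs{href};

        my $next = URI->new( $attrs{href} );
        # Skip mailto:, javascript:, and anything off the starting host.
        next unless $next->scheme =~ /\Ahttps?\z/ and $next->host eq $host;
        $next->fragment(undef);          # treat page#section as page

        fetch( $next, $depth + 1 );
    }
}

sub save {
    my( $uri, $content ) = @_;
    ( my $path = $uri->host . $uri->path ) =~ s{/\z}{/index.html};
    $path .= '/index.html' unless $uri->path;   # bare http://host

    make_path( dirname($path) );
    open my $fh, '>:raw', $path or return;
    print { $fh } $content;
    close $fh;
}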

As mentioned earlier, wget can handle this too. Its -r option recursively downloads a site, and -l sets the maximum depth (the default is 5). Here's the example from its documentation:

wget -r -l2 -P/tmp ftp://wuarchive.wustl.edu/

Neither of those supports JavaScript, though.

--
brian d foy <brian@stonehenge.com>
Subscribe to The Perl Review