in reply to Best way to recursively grab a website
If you don't mind one system call, you could go with wget, an excellent tool for downloading an entire website. Command-line options let you restrict downloads to a single site, limit recursion depth, and so on. All in all, a very valuable tool. It can be found at http://www.gnu.org/software/wget/wget.html.
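For example, a typical recursive-grab invocation might look like the following. This is only a sketch; the hostname is a placeholder, and you'd tune the depth and wait interval to the site you're mirroring:

```shell
# Mirror a site up to 3 levels deep, staying on one host.
# example.com is a placeholder -- substitute the real site.
wget --recursive \
     --level=3 \
     --no-parent \
     --domains=example.com \
     --page-requisites \
     --convert-links \
     --wait=1 \
     http://example.com/
```

`--no-parent` keeps wget from climbing above the starting directory, `--page-requisites` pulls in images and stylesheets, and `--convert-links` rewrites links for local browsing. From Perl, you could run the whole thing via a single `system()` call.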
Did I mention it's free software (a GNU project to be precise)?
Hope this helps, -gjb-