If you don't mind one system call, you could go with wget, an excellent tool for downloading an entire website. Command-line options let you restrict downloads to a single site, a certain depth, and so on. All in all, a very valuable tool. It can be found at http://www.gnu.org/software/wget/wget.html.
Did I mention it's free software (a GNU project to be precise)?
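For instance, a recursive grab might look something like this (a sketch using standard GNU wget options; the URL and depth are placeholders you'd adjust for your own site):

```shell
# Mirror a site recursively, staying on one host and limiting depth.
# --recursive      follow links and download pages
# --level=2        descend at most 2 links deep
# --no-parent      never ascend above the starting directory
# --convert-links  rewrite links for local browsing
wget --recursive --level=2 --no-parent --convert-links \
     http://example.com/
```

From Perl, you could wrap that in a single `system` call, which is the "one system call" mentioned above.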
Hope this helps, -gjb-
In reply to Re: Best way to recursively grab a website by gjb
in thread Best way to recursively grab a website by ghenry