I'm sure this is one of those things that has already been coded and is out there somewhere. I'm looking for something that given a URL will spider the site and store a copy of the site locally.
I have a pretty good idea of how to go about writing one. I'm just lazy. I figure if no one can find anything out there, I can always take linklint and make it save each file after it checks it (is that a good idea?).
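For the record, here's roughly what I have in mind, as a minimal sketch (all names are hypothetical; the fetch and save callables are injected, so any HTTP client and any storage scheme could back them):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collect href/src attribute values from a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)


def mirror(start_url, fetch, save):
    """Breadth-first crawl of one host, saving each fetched page.

    fetch(url) returns the page's HTML as a string;
    save(url, html) stores a local copy however you like.
    Only links on the same host as start_url are followed.
    Returns the set of URLs visited.
    """
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = [start_url]
    while queue:
        url = queue.pop(0)
        html = fetch(url)
        save(url, html)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            # Resolve relative links against the current page.
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen
```

In real use fetch would wrap an HTTP GET and save would write files under a directory mirroring the URL paths; this skips the details (robots.txt, throttling, non-HTML assets) that a finished tool would need.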
Thanks in advance.