in reply to Capturing web pages and making them static

This is not exactly trivial. In addition to having to fetch the pages, you need to parse them, find all the links to additional resources, download these resources, and change the links to point to the local copies.
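
To give a feel for what's involved, here is a rough sketch of the fetch-and-extract steps in Perl, using LWP and HTML::LinkExtor from CPAN. The URL is made up, the local filenames are chosen naively, and the final link-rewriting step is only indicated by a comment; a real tool has to handle much more (name collisions, URLs buried in CSS, recursion, and so on).

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTML::LinkExtor;
    use URI;

    # Hypothetical starting page; substitute the page you want to capture
    my $start = 'http://www.example.com/index.html';

    my $ua = LWP::UserAgent->new;
    my $response = $ua->get($start);
    die "Can't fetch $start: ", $response->status_line
        unless $response->is_success;

    my $html = $response->decoded_content;

    # Collect the resources the page depends on (images, stylesheets, scripts)
    my @resources;
    my $extor = HTML::LinkExtor->new(sub {
        my ($tag, %attrs) = @_;
        push @resources, $attrs{src} if $attrs{src};               # img, script
        push @resources, $attrs{href} if $tag eq 'link' && $attrs{href};  # stylesheets
    });
    $extor->parse($html);

    for my $link (@resources) {
        # Resolve relative URLs against the page's own base URL
        my $abs = URI->new_abs($link, $response->base);
        (my $file = $abs->path) =~ s{.*/}{};    # crude local filename: last path segment
        next unless length $file;
        print "Saving $abs as $file\n";
        $ua->mirror($abs, $file);               # download the resource to disk
        # A real tool would now rewrite $link in $html to point at $file,
        # then save the modified $html locally as well.
    }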

You probably don't want to write this yourself. Of the tools available on most any Unix box, wget is capable of doing this for you. If you want something written in Perl, try w3mir.
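
For the wget route, an invocation along these lines should do it (these are standard GNU wget flags, but check the man page on your system; example.com is a placeholder):

    wget --mirror --convert-links --page-requisites --no-parent http://www.example.com/

--page-requisites pulls in the images and stylesheets a page needs, and --convert-links rewrites the links so the mirrored copy works locally.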

Makeshifts last the longest.
