Consider the useful Album script, which recursively scans a directory and builds thumbnails and HTML pages. You run it once, move the contents to your web server, and when you need to update, repeat.
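Rolling your own is not hard either. Here is a minimal sketch in the same spirit (not the real Album code): it walks a directory with File::Find and writes an index page of linked thumbnails. It assumes Image::Magick is installed; any thumbnailer will do.

    #!/usr/bin/perl
    # minimal album sketch - NOT the Album script itself
    use strict;
    use warnings;
    use File::Find;
    use Image::Magick;    # assumed installed - swap in your favorite thumbnailer

    my $dir = shift || '.';
    my @images;

    # collect every JPEG under $dir
    find(sub { push @images, $File::Find::name if /\.jpe?g$/i }, $dir);

    open my $fh, '>', "$dir/index.html" or die "can't write index: $!";
    print $fh "<html><body>\n";

    for my $img (@images) {
        next if $img =~ /_thumb\./;                    # skip thumbs from a previous run
        (my $thumb = $img) =~ s/(\.jpe?g)$/_thumb$1/i;

        # shrink a copy to at most 100x100 for the thumbnail
        my $magick = Image::Magick->new;
        $magick->Read($img);
        $magick->Resize(geometry => '100x100');
        $magick->Write($thumb);

        # links are relative to index.html - see the note on absolute links below
        (my $rel   = $img)   =~ s{^\Q$dir\E/}{};
        (my $thref = $thumb) =~ s{^\Q$dir\E/}{};
        print $fh qq(<a href="$rel"><img src="$thref"></a>\n);
    }

    print $fh "</body></html>\n";
    close $fh;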
Conversely, I use a crawler for the DBIx::XHTML_Table site. I run mod_perl locally on my machine, and when I make updates, I run a mirror on unlocalhost.com that contacts my local machine and downloads the HTML files.
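The crawler itself can be as simple as wget --mirror, or a short LWP script. Here's a rough sketch of the idea with LWP::UserAgent and HTML::LinkExtor - the 192.168.1.5 address is just a placeholder for wherever your local server lives:

    #!/usr/bin/perl
    # toy mirror: fetch a site into the current directory
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTML::LinkExtor;
    use URI;
    use File::Basename;
    use File::Path;

    my $start = URI->new('http://192.168.1.5/');   # placeholder for your local box
    my $ua    = LWP::UserAgent->new;
    my %seen;
    my @queue = ($start->as_string);

    while (my $url = shift @queue) {
        next if $seen{$url}++;

        my $res = $ua->get($url);
        next unless $res->is_success;

        # mirror the URL's path structure on disk
        (my $file = URI->new($url)->path) =~ s{/$}{/index.html};
        $file = ".$file";
        mkpath(dirname($file));
        open my $fh, '>', $file or next;
        print $fh $res->decoded_content;
        close $fh;

        # queue any links that stay on the same host
        my $extor = HTML::LinkExtor->new(undef, $url);
        $extor->parse($res->decoded_content);
        for my $link ($extor->links) {
            my ($tag, %attr) = @$link;
            next unless $tag eq 'a' and defined $attr{href};
            my $u = URI->new($attr{href});
            next unless $u->scheme and $u->scheme eq 'http';
            $u->fragment(undef);                       # ignore #anchors
            push @queue, $u->as_string if $u->host eq $start->host;
        }
    }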
If you install a server on your client's PC, you shouldn't have to worry about changing IP numbers as long as all of the links are relative, not absolute - for example, don't use links like "http://somedomain.com/index.html" in your HTML pages; use "/index.html" instead.
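If the pages you inherit already have absolute links baked in, a one-time filter can strip the host off. A quick-and-dirty sketch - somedomain.com stands in for whatever host is hard-coded, and a real parser (HTML::Parser) beats a regex if the markup is messy:

    #!/usr/bin/perl -i.bak
    # usage: fixlinks.pl *.html  (edits in place, keeps .bak copies)
    use strict;
    use warnings;

    while (<>) {
        # href="http://somedomain.com/foo.html" becomes href="/foo.html"
        s{(href|src)\s*=\s*"http://somedomain\.com(/[^"]*)"}{$1="$2"}gi;
        print;
    }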
jeffa
L-LL-L--L-LL-L--L-LL-L-- -R--R-RR-R--R-RR-R--R-RR B--B--B--B--B--B--B--B-- H---H---H---H---H---H--- (the triplet paradiddle with high-hat)