Hi Malk!
If you click a link on the web site, it takes you to another page, and some of those links are failing. The whole idea is to use Perl to download the whole site (I don't know what would be best, everything in one file or in separate files), find out which links are failing, try to fix them, and then reproduce the web site. Does that make sense?
Thank you very much!!
PS: I do not speak English very well. Could you tell me what you mean by .sig?
Note that there are about a million "link checkers" available for free on the 'Net, and I'd wager half of them are written in Perl. Unfortunately, I can't think of any off-hand, but a few well-chosen net searches should turn something up. These don't involve downloading the entire site, though; they just test the validity of the links and generate a report. If that's all you need, you'd probably be better off with a pre-existing solution. Maybe somebody else here has a pointer to a few of these...?
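For what it's worth, the core of such a link checker is small. Here is a minimal sketch of the approach, assuming the CPAN modules LWP::UserAgent, HTML::LinkExtor, and URI are installed; the starting URL is a placeholder you'd replace with your own site. It fetches one page, resolves every link it finds to an absolute URI, and reports which ones respond successfully:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;

# Placeholder URL -- substitute the page you want to check.
my $start = 'http://example.com/';

my $ua   = LWP::UserAgent->new( timeout => 10 );
my $page = $ua->get($start);
die "Cannot fetch $start: ", $page->status_line
    unless $page->is_success;

# Collect every link (href, src, ...) found in the page,
# resolved against the page's URL so relative links work too.
my @links;
my $extor = HTML::LinkExtor->new(
    sub {
        my ( $tag, %attrs ) = @_;
        push @links,
            map { URI->new_abs( $_, $start )->as_string } values %attrs;
    }
);
$extor->parse( $page->decoded_content );

# HEAD each link and report the broken ones.
for my $url (@links) {
    my $res = $ua->head($url);
    printf "%s  %s\n", $res->is_success ? 'OK ' : 'BAD', $url;
}
```

To check a whole site rather than one page, you'd wrap this in a queue of pages still to visit, skipping URLs you've already seen and only following links on your own host; but for a simple "which links are dead" report, one pass like this is often enough.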