You could probably create one yourself using LWP::Simple and HTML::LinkExtor, but there are probably already a number of these scripts in existence. I think I saw one called WGET.
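Just to illustrate the approach: here's a rough, untested sketch that fetches a single page with LWP::Simple and pulls its links out with HTML::LinkExtor. The URL is only a placeholder.

#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple qw(get);
use HTML::LinkExtor;
use URI;

# Placeholder URL -- substitute the site you want to examine
my $url = 'http://www.example.com/';

my $html = get($url);
die "Couldn't fetch $url\n" unless defined $html;

# Collect the href/src attributes of every link-bearing tag
my @links;
my $parser = HTML::LinkExtor->new(
    sub {
        my ($tag, %attr) = @_;
        push @links, values %attr;
    }
);
$parser->parse($html);

# Resolve relative links against the base URL and print them
for my $link (@links) {
    print URI->new_abs($link, $url), "\n";
}

From there you'd loop over the extracted links, fetching and saving each page that lives on the same host, which is essentially what a site downloader does.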
--
<http://www.dave.org.uk>
"Perl makes the fun jobs fun
and the boring jobs bearable" - me
| [reply] |
Hi Dave,
I am very new to Perl. Could you tell me where I can find the WGET script?
Thank you very much!!
| [reply] |
Heya Anonymous..
I'm not sure what others make of your query, but I'm a little stumped as to exactly what you're asking.
What, exactly, are you referring to by 'downloading whole sites into HTML format'?
What format is the site in at the moment, if not HTML?
A little more info would help us give a valid answer, methinks.
Cheers,
Malk
*I lost my .sig. Do you have a spare?*
| [reply] |
Hi Malk!
If you click a link on the web site, it takes you to another page. Some of these links are failing. The whole idea is to use Perl to download the whole site (I don't know what would be best, everything in one file or in separate files), find out what is failing, try to fix it up, and then reproduce the web site again. Does that make sense?
Thank you very much!!
PS: I do not speak English very well. Could you tell me what you mean by .sig?
| [reply] |
Note that there are about a million "link checkers" out there for free on the 'Net, and I'd wager half of them are written in Perl. Unfortunately, I can't think of any off-hand, but a few simply-crafted net searches should turn something up. These don't involve downloading the entire site, though; they just test the validity of the links and generate a report. If that's all you need, perhaps you'd be better off with a pre-existing solution. Maybe somebody else here has a pointer to a few of these...?
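For a rough idea of what such a link checker does (this is only a sketch, not any particular existing tool, and the URL is a placeholder), something like this reports which links on a page respond with an error:

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;

# Placeholder starting page -- replace with the site you want to check
my $base = 'http://www.example.com/';

my $ua = LWP::UserAgent->new(timeout => 15);

# Fetch the page and gather its links
my $response = $ua->get($base);
die "Can't fetch $base: ", $response->status_line, "\n"
    unless $response->is_success;

my @links;
my $parser = HTML::LinkExtor->new(
    sub {
        my ($tag, %attr) = @_;
        push @links, values %attr;
    }
);
$parser->parse($response->decoded_content);

# Try each link and report the ones that fail
for my $link (@links) {
    my $abs = URI->new_abs($link, $base);
    next unless $abs->scheme =~ /^https?$/;    # skip mailto:, ftp:, etc.
    my $check = $ua->head($abs);
    printf "%-6s %s\n",
        ($check->is_success ? 'OK' : 'BROKEN'), $abs;
}

A real checker would also recurse through the site's own pages rather than stopping at the first one, but the report-the-failures part looks much like this.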
| [reply] |