MatheusAA has asked for the wisdom of the Perl Monks concerning the following question:
I think it's a memory issue, because the crawler stops just after a line like this:
$fileContent = ` cat $DIR_INPUT/$FILE `;
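For reference, the backticks call can at least be checked for failure via $? ; this is only a minimal sketch, using the same $DIR_INPUT and $FILE as above:

$fileContent = `cat $DIR_INPUT/$FILE`;
if ($? == -1) {
    die "cat could not be run: $!\n";                # command failed to start
}
elsif ($? != 0) {
    die "cat exited with status ", $? >> 8, "\n";    # nonzero exit code
}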
I'd really, really appreciate it if anyone could help.
Thanks in advance.
UPDATE:
MidLifeXis, it stops right after that line. If I try to print the variable $fileContent after the cat command, I get an incomplete HTML page source.
here's some more code:
$DIR_INPUT="./CAPTURE"; while($i <= $npages){ $FILE = $i . $IN_FILE; &wget($URL . $PAG . $i, $FILE, '', ''); #sub that call the wg +et &capture($FILE); } ... sub capture{ my $FILE = $_[0]; $fileContent = ` cat $DIR_INPUT/$FILE `; print $fileContent; # Here's where it stops ... }
As I said before, after some iterations of the while loop the program suddenly stops, and what gets printed to the screen is an incomplete page source.
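One quick check (just a sketch, using the same variables as above) would be to compare the size of the file on disk with how much of it ends up in $fileContent, to see whether the truncation happens in the wget step or in the read:

my $size = -s "$DIR_INPUT/$FILE";   # size of the captured file on disk
my $got  = length($fileContent);    # number of bytes the backticks returned
warn "file is $size bytes, captured $got bytes\n";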
Replies are listed 'Best First'.
Re: Perl Script stops in middle of execution, no error messages
by Perlbotics (Archbishop) on Nov 11, 2008 at 17:53 UTC
    by MatheusAA (Initiate) on Nov 12, 2008 at 18:02 UTC

Re: Perl Script stops in middle of execution, no error messages
by MidLifeXis (Monsignor) on Nov 11, 2008 at 17:12 UTC

Re: Perl Script stops in middle of execution, no error messages
by MidLifeXis (Monsignor) on Nov 11, 2008 at 17:52 UTC

Re: Perl Script stops in middle of execution, no error messages
by swampyankee (Parson) on Nov 11, 2008 at 17:34 UTC