jthomas has asked for the wisdom of the Perl Monks concerning the following question:
Hi experts, have you ever run into an "Out of memory" problem while using HTML::TableExtract? I'm working with somewhat large HTML files, but I still didn't expect this to happen. Would you be able to suggest some workarounds? I'm calling this subroutine from inside another for loop.
```perl
sub zParseHTMLFiles ($$) {
    my ( $lrefFileList, $lrefColNames ) = @_;
    my @ldata;

    foreach my $lFile (@$lrefFileList) {
        chomp($lFile);

        # A fresh extractor per file, keyed on the expected column headers.
        my $lTableExtract =
            HTML::TableExtract->new( headers => [@$lrefColNames] );
        $lTableExtract->parse_file($lFile);

        foreach my $ls ( $lTableExtract->tables ) {

            # Note: rows() must be called on the table object ($ls), not on
            # the extractor itself, or the first table's rows get collected
            # once per table found, multiplying memory use.
            foreach my $lrow ( $ls->rows ) {
                chomp( $lrow->[-1] );    # trim trailing newline on last cell
                push( @ldata, $lrow );
            }
        }
    }

    return \@ldata;
}
```
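If the rows only need to be written somewhere (rather than held for later processing), one workaround is to stream each row to the caller instead of accumulating everything in one big `@ldata` array, so that at most one file's extracted tables are live at a time. Below is a minimal sketch of that idea; the subroutine name `zParseHTMLFilesStream` and the callback interface are illustrative, not part of HTML::TableExtract. It uses only documented calls: `new( headers => ... )`, `parse_file`, `tables`, and the per-table `rows` method.

```perl
use strict;
use warnings;
use HTML::TableExtract;

# Hypothetical streaming variant: hand each row to a caller-supplied
# callback as soon as it is extracted. The extractor object goes out of
# scope at the end of each loop iteration, so its parse data can be
# freed between files instead of piling up for the whole file list.
sub zParseHTMLFilesStream {
    my ( $lrefFileList, $lrefColNames, $lRowCallback ) = @_;

    foreach my $lFile (@$lrefFileList) {
        chomp($lFile);
        my $lTableExtract =
            HTML::TableExtract->new( headers => [@$lrefColNames] );
        $lTableExtract->parse_file($lFile);

        foreach my $ls ( $lTableExtract->tables ) {
            foreach my $lrow ( $ls->rows ) {
                chomp( $lrow->[-1] ) if defined $lrow->[-1];
                $lRowCallback->($lrow);    # e.g. print to a file handle
            }
        }
    }
}
```

A usage example that writes rows straight to disk instead of keeping them in memory (file name and header list are placeholders):

```perl
open my $out, '>', 'rows.csv' or die "rows.csv: $!";
zParseHTMLFilesStream( \@files, \@headers,
    sub { my ($row) = @_; print {$out} join( ',', map { $_ // '' } @$row ), "\n" } );
close $out;
```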
Replies are listed 'Best First'.
Re: Out of memory while using HTML::TableExtract
by GrandFather (Saint) on Jan 06, 2011 at 08:16 UTC
Re: Out of memory while using HTML::TableExtract
by Anonymous Monk on Jan 06, 2011 at 08:17 UTC
Re: Out of memory while using HTML::TableExtract
by djerius (Beadle) on Jan 06, 2011 at 20:12 UTC