Hello monks, I'm using XML::Twig on Windows 7 (Strawberry Perl v5.16.3) to parse some pretty large XML files (e.g. ~800MB), and I'm getting the old "out of memory!" error. The script works fine on smaller files. Probably a dumb question, but am I just SOL? Is there any way to work through the file in manageable chunks instead of slurping the whole thing? I'm using a remote server that I don't own, so adding more memory is not an option.
(also tried XML::Simple)
Here's my code in case it helps, though it's not doing anything complicated or interesting:
    use strict;
    use warnings;
    use XML::Twig;

    my $file = shift;

    # set up the XML parser:
    my $twig = XML::Twig->new(
        comments      => 'keep',
        twig_handlers => { row => \&row_processing },
        pretty_print  => 'indented',
    );

    print "parsing $file...\n";
    $twig->parsefile($file);
    $twig->purge;

    # handler is called once per <row> element
    sub row_processing {
        my ($twig, $row) = @_;
        print "a row\n";
    }
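From skimming the XML::Twig docs, it sounds like using twig_roots and calling purge inside the handler is supposed to keep memory bounded, by only ever building one <row> in memory at a time rather than the whole tree. Here's a sketch of what I think that would look like (element name row taken from my file above); is this the right idea?

    use strict;
    use warnings;
    use XML::Twig;

    my $file = shift;

    # twig_roots: only build <row> subtrees; everything outside
    # them is skipped rather than held in memory.
    my $twig = XML::Twig->new(
        twig_roots => { row => \&row_processing },
    );
    $twig->parsefile($file);

    sub row_processing {
        my ($twig, $row) = @_;
        print "a row\n";
        # discard everything parsed so far, including this row,
        # so memory use stays flat no matter how big the file is
        $twig->purge;
    }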
thanks -- Scott