slugger415 has asked for the wisdom of the Perl Monks concerning the following question:
Hello monks, I'm using XML::Twig on Windows 7 (Strawberry Perl v5.16.3) to parse some pretty large XML files (e.g. ~800MB) and getting the old "out of memory!" error message. The script works fine on smaller files. Probably a dumb question, but am I just SOL? Is there any way to work through the file in manageable chunks? I'm using a remote server that I don't own, so adding more memory is not an option.
(also tried XML::Simple)
Here's my code if it helps, but it's not really doing anything complicated or interesting.
    use strict;
    use XML::Twig;

    my $file = shift;

    # set up the XML parser:
    my $twig = XML::Twig->new(
        comments      => 'keep',
        twig_handlers => { row => \&row_processing },
        pretty_print  => 'indented',
    );

    print "parsing $file...\n";
    $twig->parsefile($file);
    $twig->purge;

    sub row_processing {
        my ( $twig, $rows ) = @_;
        print "a row\n";
    }
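For what it's worth, XML::Twig's documented approach to huge files is to do the work inside the handler and then call purge there, so each parsed chunk is released before the next one is built; calling purge only once after parsefile, as above, frees memory too late. A minimal sketch of that pattern (the inline document and the $count counter are just for demonstration):

```perl
use strict;
use warnings;
use XML::Twig;

my $count = 0;

# Handle each <row> as soon as it is fully parsed, then free it:
my $twig = XML::Twig->new(
    twig_handlers => {
        row => sub {
            my ( $t, $row ) = @_;
            $count++;
            # ... process $row here ...
            $t->purge;    # release everything parsed so far
        },
    },
);

# Demo on an inline document; for a real file use $twig->parsefile($file).
$twig->parse('<table><row>a</row><row>b</row><row>c</row></table>');
print "$count rows\n";
```

With purge in the handler, memory use stays roughly proportional to one row rather than the whole document. (If you're rewriting the file rather than just reading it, flush is the equivalent that prints the processed part before discarding it.)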
thanks -- Scott
Replies are listed 'Best First'.

Re: out of memory! parsing large XML file ( xml_pp )
  by Anonymous Monk on Feb 13, 2014 at 04:54 UTC
    by slugger415 (Monk) on Feb 13, 2014 at 18:12 UTC

Re: out of memory! parsing large XML file
  by choroba (Cardinal) on Feb 13, 2014 at 10:40 UTC

Re: out of memory! parsing large XML file
  by pajout (Curate) on Feb 13, 2014 at 10:52 UTC