Dear monks,
I am parsing a big XML file (~500 MB) with the following script. I run out of memory even though I set the option huge => 1.
#!/usr/bin/env perl
use strict;
use warnings;
use XML::LibXML;

print "Importing...\n";

my $file = 'my.xml';
my $dom  = XML::LibXML->load_xml(location => $file, huge => 1);

foreach my $termEntry ($dom->findnodes('/martif/text/body/termEntry')) {
    foreach my $lang_set ($termEntry->findnodes('langSet')) {
        my $language = $lang_set->getAttribute('xml:lang');
        foreach my $term_grp ($lang_set->findnodes('./tig')) {
            my $term = $term_grp->findvalue('./term');
            print "$language: $term\n";
        }
    }
}
print "Done!\n";
exit;
The script works perfectly with smaller files. Any suggestions?
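One direction worth trying is a pull parser instead of building the whole DOM: XML::LibXML::Reader can walk the file and deep-copy only one termEntry subtree at a time, so memory stays proportional to a single entry rather than the whole 500 MB document. A minimal sketch, assuming the same martif/termEntry/langSet/tig structure as the script above (it parses a tiny inline sample here so it runs standalone; for the real file pass location => 'my.xml' instead of string):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use XML::LibXML::Reader;

# Tiny inline sample with the assumed structure; for the real file
# use XML::LibXML::Reader->new(location => 'my.xml') instead.
my $sample = <<'XML';
<martif><text><body>
  <termEntry>
    <langSet xml:lang="en"><tig><term>memory</term></tig></langSet>
    <langSet xml:lang="de"><tig><term>Speicher</term></tig></langSet>
  </termEntry>
</body></text></martif>
XML

my $reader = XML::LibXML::Reader->new(string => $sample);

my @lines;
# Advance to each termEntry, deep-copy just that subtree, process it,
# then move past it so only one entry is ever held in memory.
while ($reader->nextElement('termEntry')) {
    my $entry = $reader->copyCurrentNode(1);    # 1 = deep copy
    for my $lang_set ($entry->findnodes('langSet')) {
        my $language = $lang_set->getAttribute('xml:lang');
        for my $term_grp ($lang_set->findnodes('./tig')) {
            my $term = $term_grp->findvalue('./term');
            push @lines, "$language: $term";
        }
    }
    $reader->next;    # skip past the subtree we just copied
}
print "$_\n" for @lines;
```

The per-entry XPath code is the same as in the original script; only the outer loop changes from DOM traversal to streaming.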
In reply to XML::LibXML out of memory by Anonymous Monk