Two mostly equivalent data sources, generated like this:

use constant RECS => 1000000;

{
    open my $fh, ">/tmp/bla.xml" or die;
    select $fh;
    print "<addresses>\n";
    for (1..RECS) {
        print <<EOF;
  <address>
    <name>John Smith</name>
    <city>London</city>
  </address>
EOF
    }
    print "</addresses>\n";
}

{
    require Storable;
    my @addresses;
    for (1..RECS) {
        push @addresses, { name => "John Smith", city => "London" };
    }
    Storable::nstore(\@addresses, "/tmp/bla.st");
}

Now the two benchmarks (I am using tcsh's time command here, showing user time, system time, elapsed time, and maximum memory):
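For completeness, the read side of Storable is a single retrieve() call. Here is a minimal, self-contained round trip using only core modules; the record count is reduced and a temporary file is used instead of /tmp/bla.st, just to keep the sketch quick and side-effect free:

```perl
use strict;
use warnings;
use Storable qw(nstore retrieve);
use File::Temp qw(tempfile);

# Build a small address list (1000 records rather than 1_000_000,
# purely to keep the example fast).
my @addresses = map { { name => "John Smith", city => "London" } } 1 .. 1000;

# nstore writes in network byte order, so the file is portable
# across machines, like the /tmp/bla.st file in the benchmark.
my ($fh, $file) = tempfile();
close $fh;
nstore(\@addresses, $file);

# retrieve() deserializes the whole structure in one call.
my $restored = retrieve($file);
printf "%d records, first city: %s\n",
    scalar @$restored, $restored->[0]{city};
unlink $file;
```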
$ ( set time = ( 0 "%U+%S %E %MK" ) ; time perl -MStorable -e 'retrieve "/tmp/bla.st"' )
1.980+0.384 0:02.41 193974K
$ ( set time = ( 0 "%U+%S %E %MK" ) ; time perl -MXML::LibXML -e 'XML::LibXML->new->parse_file("/tmp/bla.xml")->documentElement' )
6.037+1.876 0:08.15 643952K

So naive (DOM-style) parsing of XML is much worse in both memory allocation and CPU time than loading the same data from a Storable file. I guess that most other fast serializers, like YAML::Syck or JSON::XS, would give similar results.
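Since the thread is about memory-efficient parsing: the DOM cost measured above can be avoided by streaming the file instead of building a tree. The following is only a sketch, assuming the fixed record-per-line layout written by the generator above; it is not a general XML parser (a robust version would use a pull or SAX parser such as XML::LibXML::Reader), but it illustrates the idea of holding just one record in memory at a time:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Write a small file in the same shape as /tmp/bla.xml above
# (100 records instead of 1_000_000).
my ($fh, $file) = tempfile();
print $fh "<addresses>\n";
print $fh "<address>\n<name>John Smith</name>\n<city>London</city>\n</address>\n"
    for 1 .. 100;
print $fh "</addresses>\n";
close $fh;

# Stream line by line: constant memory, one record in flight at a time.
# NOT a real XML parser -- it relies on the known layout of this file.
open my $in, "<", $file or die $!;
my ($count, %rec) = (0);
while (my $line = <$in>) {
    $rec{$1} = $2 if $line =~ m{<(name|city)>([^<]*)</\1>};
    if ($line =~ m{</address>}) {
        $count++;          # a complete record is available in %rec here
        %rec = ();         # discard it before reading the next one
    }
}
close $in;
unlink $file;
print "$count records streamed\n";
```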
In reply to Re^3: Memory Efficient XML Parser
by eserte
in thread Memory Efficient XML Parser
by perlgoon