I was just surprised that you could use XML::DOM at all on files of that size. And it looks like you can't, actually: a 1 GB XML file would take at least 8 GB of memory with XML::DOM, so it would be interesting to know how you did it. What I meant was that if you had managed it by throwing large amounts of memory at the problem, then XML::LibXML would have been an option.
With XML::Twig you can very easily extract the k/v pairs:
use XML::Twig;

my ( @keys, @values );

my $t= XML::Twig->new(
    twig_roots => {
        SigData => sub {
            push @keys,   $_->field('Key');
            push @values, $_->field('Value');
            $_->purge;                       # free the element once we have the pair
        },
    },
)->parsefile("my_big_fat_xml_file.xml");
Of course the @keys and @values arrays are going to be huge too, so you might still want to add a few GB of RAM to your machine, but at least the XML tree itself never holds more than one SigData element at a time.
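If you don't actually need all the pairs in memory at once, a variation on the handler above could stream them straight to disk as it goes (a sketch only; the tab-separated output file name is an assumption):

use XML::Twig;

open my $out, '>', 'pairs.tsv' or die "cannot write pairs.tsv: $!";

XML::Twig->new(
    twig_roots => {
        SigData => sub {
            my ( $t, $sig ) = @_;
            # write the pair out immediately instead of accumulating it
            print {$out} $sig->field('Key'), "\t", $sig->field('Value'), "\n";
            $t->purge;                       # release the parsed element right away
        },
    },
)->parsefile('my_big_fat_xml_file.xml');

close $out;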
Other possible options are XML::Rules (I expect jenda to show up and give you an example as soon as he wakes up), and maybe the new XML::Reader, which seems quite appropriate. XML::LibXML's pull mode might also work, but I have never used it, so I can't comment on it.
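For reference, the pull mode mentioned above is XML::LibXML::Reader. A minimal, untested sketch of the same extraction (element names assumed to match the Twig example above) might look like this:

use XML::LibXML::Reader;

my ( @keys, @values );
my $reader = XML::LibXML::Reader->new( location => 'my_big_fat_xml_file.xml' )
    or die "cannot open my_big_fat_xml_file.xml";

# move from one SigData element to the next without building the whole tree
while ( $reader->nextElement('SigData') ) {
    my $sig = $reader->copyCurrentNode(1);   # deep copy of this element only
    push @keys,   $sig->findvalue('Key');
    push @values, $sig->findvalue('Value');
    $reader->next;                           # skip past the subtree we just copied
}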
In reply to Re^3: XML processing taking too much time
by mirod
in thread XML processing taking too much time
by koti688