msalerno has asked for the wisdom of the Perl Monks concerning the following question:

I need to parse quite a few rrd xports ranging in size from 100K to 100MB; there will be up to 15 forks running concurrently, each reading XML from a different URL. I have been doing lots of testing with different methods and I want to try XML::LibXML::Reader, which is where my problem begins.

My first test was the XML::LibXML push parser, using an LWP callback to feed parse_chunk. This was lightning fast, but memory intensive. There's always the possibility of 15 forks each reading 100MB worth of data...
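For reference, that push-parser approach can be sketched roughly like this (the URL is a placeholder; the module calls are the standard XML::LibXML / LWP::UserAgent API):

```perl
use strict;
use warnings;
use XML::LibXML;
use LWP::UserAgent;

# Placeholder URL; substitute your rrd xport endpoint.
my $url = 'http://user:pass@host/rrd_updates?start=0';

my $parser = XML::LibXML->new;
my $ua     = LWP::UserAgent->new;

# Feed each chunk of the response body straight into the push parser.
my $res = $ua->get( $url, ':content_cb' => sub {
    my ($chunk) = @_;
    $parser->parse_chunk($chunk);
} );
die $res->status_line unless $res->is_success;

# Signal end-of-document; returns the completed XML::LibXML::Document.
my $doc = $parser->parse_chunk( '', 1 );
```

Note that this still builds the full DOM in memory, which is exactly where the memory pressure comes from.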

My next test was with XML::Twig. Using twig_roots I'm able to avoid the memory issues, but the time to process the XML triples.
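A twig_roots setup along those lines might look like the following sketch (the 'row' element name is a guess at the rrd xport format; the URL is a placeholder):

```perl
use strict;
use warnings;
use XML::Twig;

my $twig = XML::Twig->new(
    twig_roots => {
        # Only build twigs for the elements we care about.
        'row' => sub {
            my ( $t, $row ) = @_;
            # ... process $row here ...
            $t->purge;    # discard what has been parsed so far
        },
    },
);

# parseurl() fetches via LWP, so user:pass URLs and HTTP auth work.
$twig->parseurl('http://user:pass@host/rrd_updates?start=0');
```

The purge() call inside the handler is what keeps memory flat, at the cost of the extra XML::Twig machinery.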

I want to try out XML::LibXML::Reader, but the libXML modules use their own internal URL-fetching functions rather than LWP. The URLs require HTTP auth, and libXML simply ignores the username:password part of the URL and fails.

I could try adding a loop in an LWP callback that runs whenever $reader->read != 1, but the code would probably get ugly quickly.

my $url = 'http://'.$username.':'.$password.'@'.$host.'/rrd_updates?start='.$starttime;

Hopefully I'm just missing something. Does anyone know how to get libXML to fetch the full URL, credentials included, natively?

Thanks

Re: XML Parsing with libXML
by ikegami (Patriarch) on Feb 08, 2011 at 22:41 UTC

    If you wanted to use LWP, you could feed Reader via a pipe.

    my $reader = XML::LibXML::Reader->new( IO => $file_handle, ... );

    LWP would have to run in a separate thread (Coro or threads) or process. It might be fastest to fork and pipe from wget or curl instead of LWP.
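    A minimal sketch of that fork-and-pipe idea, using curl to handle the HTTP auth (host, credentials, and element handling are placeholders):

    ```perl
    use strict;
    use warnings;
    use XML::LibXML::Reader;

    my ( $user, $pass, $host, $starttime ) = ( 'user', 'pass', 'host', 0 );

    # curl does the authenticated fetch; Perl just reads its stdout.
    open my $fh, '-|', 'curl', '-s', '-u', "$user:$pass",
        "http://$host/rrd_updates?start=$starttime"
        or die "can't fork curl: $!";

    my $reader = XML::LibXML::Reader->new( IO => $fh );
    while ( $reader->read ) {
        # ... handle nodes as they stream past ...
    }
    close $fh;
    ```

    Since Reader pulls from the filehandle as it goes, memory stays flat regardless of document size.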