in reply to Re: Memory Efficient XML Parser
in thread Memory Efficient XML Parser

Thank you for all your replies.
Unfortunately I'm not 100% positive that it is an XML issue, but based on my benchmarks I'm fairly sure it is. The server has 1.25 GB of RAM and experiences relatively large loads throughout the day (~5 requests per second).
The main script on the server receives an XML packet from another server, parses it using XML::Records, and uses the parsed records to build up a MySQL query. This query is then executed and the script ends. The MySQL database is on another server entirely (obviously with its own set of dedicated resources).
Here is a snippet showing pretty much everything the script does:
    for my $subpkg (@$pkg) {
        $sql .= "$delim($subpkg->{field1},$subpkg->{field2},$subpkg->{field3})";
        $delim = ",";
    }
    $query = $db->do($sql);
I first thought it might be an issue with string concatenation when building up the query. However, my testing showed that concatenating strings in Perl is just as efficient as joining an array.
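For anyone curious, a comparison along these lines can be done with the core Benchmark module; the snippet below is only a sketch with made-up rows and field names, not my actual test:

    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    # Made-up records standing in for the parsed XML.
    my @rows = map { { field1 => $_, field2 => "x$_", field3 => "y$_" } } 1 .. 10_000;

    cmpthese(-2, {
        concat => sub {
            my ($sql, $delim) = ('', '');
            for my $r (@rows) {
                $sql .= "$delim($r->{field1},$r->{field2},$r->{field3})";
                $delim = ',';
            }
        },
        join => sub {
            my $sql = join ',',
                map { "($_->{field1},$_->{field2},$_->{field3})" } @rows;
        },
    });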
The only thing I notice consistently is that when I send a 1MB XML packet to the script it uses relatively little memory (~4,000K), but when I send it a 30MB XML packet it uses several times that amount (sometimes as high as 20,000K).
I know that the server will most likely need a memory upgrade, but I want to make sure the script is running as efficiently as possible. I'm currently looking into using XML::Twig.
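If I do switch, my understanding is that a streaming approach along these lines would let the script handle one record at a time and discard it afterwards (a sketch only; <record>, the field names and the table name are placeholders for whatever the real packet and schema use):

    use strict;
    use warnings;
    use XML::Twig;

    my $xml_packet = do { local $/; <STDIN> };    # however the packet actually arrives
    my ($sql, $delim) = ('INSERT INTO mytable VALUES ', '');

    XML::Twig->new(
        twig_handlers => {
            # 'record' and the field names below are placeholders
            record => sub {
                my ($t, $elt) = @_;
                my @f = map { $elt->first_child_text($_) } qw(field1 field2 field3);
                $sql .= "$delim(" . join(',', @f) . ")";
                $delim = ',';
                $t->purge;    # free everything parsed so far
            },
        },
    )->parse($xml_packet);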

Replies are listed 'Best First'.
Re^3: Memory Efficient XML Parser
by Jenda (Abbot) on Dec 13, 2007 at 12:21 UTC

    Wait a second, could you show us a bit more of the code? It looks as if you are first extracting all the data from the XML, building one huge string, and then shoving the whole string into the database. I'm not surprised the script and the database need a lot of memory and CPU time to cope with that!

    It would be much better to parse one row, insert it into the database using a prepare()d statement handle, forget its data, parse the next one ... and if you want to optimize it and don't mind that it's a tiny little bit more complex, open the database connection with AutoCommit => 0 and commit only after every 1000 rows (you may need to do some benchmarking to find the right number here).
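    In code, the idea is roughly this (a sketch only; the DSN, credentials, table and column names are placeholders, and it assumes the same @$pkg structure as in your snippet above):

        use DBI;

        my ($user, $pass) = ('user', 'secret');    # placeholders
        my $db = DBI->connect('dbi:mysql:mydb;host=dbhost', $user, $pass,
                              { RaiseError => 1, AutoCommit => 0 });

        my $sth = $db->prepare(
            'INSERT INTO mytable (field1, field2, field3) VALUES (?, ?, ?)');

        my $count = 0;
        for my $subpkg (@$pkg) {    # better still: one record at a time straight from the parser
            $sth->execute(@{$subpkg}{qw(field1 field2 field3)});
            $db->commit unless ++$count % 1000;    # commit in batches of 1000
        }
        $db->commit;                # commit the remainder
        $db->disconnect;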

      You're actually exactly right, however the database server is not having any issues executing this large query. In fact it is one of the fastest queries that runs on my server with the optimization that is done. I found that it is less efficient to loop through hundreds of queries one by one, as opposed to running one large combined query. I'm now parsing the XML with a simple regexp, and it uses less than half the memory it used with the XML::Records module. Thank you for everyone's help, but it appears going back to the basics has solved my memory issues for the time being.
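      For what it's worth, the regexp approach is along these lines (a sketch with made-up tag, table and column names, assuming $xml holds the raw packet and $db is the existing handle; as noted below, it only holds up as long as the packet format never changes):

          # Fragile by design: breaks if the XML gains attributes, CDATA,
          # entities, different whitespace, etc.
          my (@placeholders, @values);
          while (
              $xml =~ m{<record>\s*
                         <field1>([^<]*)</field1>\s*
                         <field2>([^<]*)</field2>\s*
                         <field3>([^<]*)</field3>\s*
                        </record>}gx
              )
          {
              push @placeholders, '(?,?,?)';
              push @values, $1, $2, $3;
          }
          $db->do(
              'INSERT INTO mytable (field1, field2, field3) VALUES '
                  . join(',', @placeholders),
              undef, @values,
          ) if @values;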

        Could you show us the code? I seriously doubt shoving a huge SQL string that has to be all parsed and compiled by the server is quicker than preparing a statement and then sending the server just the values for each row. Especially if you only commit reasonably sized batches.

        Parsing XML with regexps is something that might look like it works now, but unless you have strict control over whatever produces the XML you may run into serious problems.

        Show us the code!