Thank you for all your replies.
Unfortunately I'm not 100% positive that it is an XML issue, but based on my benchmarks I'm fairly sure it is. The server has 1.25 GB of RAM and experiences a relatively heavy load throughout the day (~5 requests per second).
The main script on the server receives an XML packet from another server. The script parses the XML using XML::Records, uses the parsed records to build up a MySQL query, executes that query, and then exits. The MySQL server is on another machine entirely (obviously with its own set of dedicated resources).
Here is a snippet of pretty much all the script is doing:
for my $subpkg (@$pkg)
{
    $sql .= "$delim($subpkg->{field1},$subpkg->{field2},$subpkg->{field3})";
    $delim = ",";
}
$query = $db->do($sql);
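For what it's worth, one variant I'm considering is building a single multi-row INSERT with placeholders instead of interpolating the values into the SQL string. Here is a rough sketch — the table name, column names, and sample data are made up, not the real schema:

```perl
use strict;
use warnings;

# Stand-in for the records parsed out of the XML (field names taken
# from the snippet above; the data itself is invented).
my $pkg = [
    { field1 => 1, field2 => 'a', field3 => 'x' },
    { field1 => 2, field2 => 'b', field3 => 'y' },
];

# Build one multi-row INSERT with ? placeholders and a flat list of
# bind values, rather than concatenating the values into the SQL.
my (@tuples, @binds);
for my $subpkg (@$pkg) {
    push @tuples, '(?,?,?)';
    push @binds,  @{$subpkg}{qw(field1 field2 field3)};
}
my $sql = 'INSERT INTO mytable (field1,field2,field3) VALUES '
        . join(',', @tuples);

# With a live DBI handle this would be run as:
#   $db->do($sql, undef, @binds);
print "$sql\n";
```

Besides sidestepping quoting problems, this keeps the SQL string small no matter how many records there are, since the values travel as bind parameters.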
I first thought it might be an issue with string concatenation when building up the query, but my testing showed that concatenating strings in Perl is just as efficient as joining an array.
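My test looked roughly like this (Benchmark is a core module; the data is synthetic, not the real packets):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Synthetic value tuples standing in for the parsed records.
my @parts = map { "(val$_,val$_,val$_)" } 1 .. 5_000;

# Compare building the SQL by repeated .= against a single join.
cmpthese(-1, {
    concat => sub {
        my ($sql, $delim) = ('', '');
        for my $p (@parts) { $sql .= "$delim$p"; $delim = ','; }
        return $sql;
    },
    join_arr => sub {
        return join ',', @parts;
    },
});
```

Both approaches produce the identical string, and on my runs the throughput was close enough that concatenation clearly isn't the bottleneck.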
The only thing I notice consistently is that when I send the script a 1MB XML packet, it uses little memory (~4,000K), but when I send it a 30MB packet, memory use climbs several-fold (sometimes as high as 20,000K).
I know that the server will most likely need a memory upgrade, but first I want to make sure the script is running as efficiently as possible. I'm currently looking into using XML::Twig.
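The appeal of XML::Twig is that it can process the packet in a streaming fashion and discard each record once it's handled, instead of holding the whole tree in memory. A minimal sketch of what I have in mind — the `<record>`/`<field1>` element names are assumptions, not the actual wire format:

```perl
use strict;
use warnings;
use XML::Twig;

# Small stand-in for the real 30MB packet.
my $xml = <<'XML';
<packet>
  <record><field1>1</field1><field2>a</field2><field3>x</field3></record>
  <record><field1>2</field1><field2>b</field2><field3>y</field3></record>
</packet>
XML

my @rows;
my $twig = XML::Twig->new(
    twig_handlers => {
        record => sub {
            my ($t, $rec) = @_;
            push @rows, [ map { $rec->first_child_text($_) }
                              qw(field1 field2 field3) ];
            $t->purge;    # release everything parsed so far
        },
    },
);
$twig->parse($xml);
```

The `purge` call after each record is what should keep memory flat regardless of packet size; @rows here would feed straight into the query-building loop.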