Good day good monks.
At the end of October I was kindly helped (using the result from XML::XQL::solve) to get a program working. That program has run perfectly every 2 hours since.
I had a second need, basically doing the same job as the first, but retrieving the XML from a different site.
I retrieve and decompress the XML without incident; however, when this new version of the program tries to process the XML, it crashes.
The line that crashes the program is this...
$usernodeset = $xpu->find("/users/user[teamid=$CONFIG{teamid}]");
... which is identical to the line in the working script.
When the script gets to that line, the program's screen I/O stops but the task is still running. Perl.exe uses 40-50% of the CPU, and I can see the pagefile usage rising steadily. After 5-6 minutes, I get an Application Error message box which says...
The instruction at "0x28089a3d" referenced memory at "0x00000004". The memory could not be "written".
... and on clicking OK, the console prompt returns.
I have visually checked the XML and it seems perfectly valid, and it is in the same format as that obtained from the other site. What is different is the size: the working program processes a ~9MB XML buffer, while the failing one processes a ~32MB buffer.
I notice the pagefile usage initially rises in a large jump, but then in slow increments until the task reaches about 2GB, which is perhaps significant: 2GB is the default user address-space limit for a 32-bit process on Windows, so the parser may simply be running out of memory building its in-memory tree.
The documentation does not mention any size limit on the XML to be searched.
Have I hit a limit or is this message indicative of something that I am missing?
*** UPDATE ***
Following the suggestions below, I have succeeded in getting both a Rules version and a Twig version to work. I have settled on the Twig version, as it seems to run a little faster.
The Twig parser also crashes in the same way if I do not call purge in the twig_handler routine, so there is an undocumented limit there as well.
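For any monk who finds this later, here is a minimal sketch of the Twig-with-purge approach I settled on. The element and file names (`users.xml`, `name`) and the config value are illustrative assumptions based on my XPath expression above, not my actual production code:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use XML::Twig;

my %CONFIG = ( teamid => 12345 );    # illustrative value

# A handler fires once per <user> element, so the whole ~32MB
# document never has to be held in memory as a single tree.
my $twig = XML::Twig->new(
    twig_handlers => {
        'users/user' => sub {
            my ( $t, $user ) = @_;
            if ( $user->first_child_text('teamid') eq $CONFIG{teamid} ) {
                # ... process the matching user here ...
                print $user->first_child_text('name'), "\n";
            }
            # Discard everything parsed so far; without this call the
            # tree keeps growing until perl.exe exhausts its address space.
            $t->purge;
        },
    },
);
$twig->parsefile('users.xml');       # assumed filename
```

The key is that purge (or purge_up_to) releases the already-processed part of the tree inside the handler, keeping memory use roughly constant regardless of document size.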