1. I will, as soon as possible, run Memtest86 to verify the hardware. I agree this is worthwhile, but unlikely to be the cause.
2. I expect my Perl is not the most up-to-date (not far behind, but possibly somewhat crusty); I can certainly investigate this. The same goes for the module: I downloaded it last autumn, but I do not know how frequently it is updated. Again, I can investigate this.
3. Whilst I may be able to try this on other OSes, that will not solve my problem: the target OS is XP, and that is a constraint.
If this is a size issue, I could possibly chop the XML buffer into a number of smaller buffers by traditional means, then call the processing routine repeatedly, each invocation processing, say, 1000 records.
I create the object with an xml => abc parameter. Is there a formal destructor or reset method for this object, or do I need to create a new object for each inquiry? The documentation on CPAN does not show a destructor.
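For what it is worth, XML::XPath's documented API has no explicit destructor or reset method; the usual Perl idiom is to let the object fall out of scope, at which point reference counting reclaims its parse tree, and to build a fresh object per chunk. A minimal sketch of the chunk-and-reprocess idea under that assumption (the <batch>/<record> tag names and the hard-coded chunks here are illustrative only, standing in for whatever traditional string splitting suits the real data):

```perl
use strict;
use warnings;
use XML::XPath;

# Process one chunk at a time; a fresh XML::XPath object is built per
# chunk and reclaimed when it goes out of scope.
sub process_chunk {
    my ($chunk_xml) = @_;
    my $xp    = XML::XPath->new( xml => $chunk_xml );  # same xml => ... constructor
    my $count = 0;
    for my $node ( $xp->findnodes('//record') ) {
        $count++;    # ... real per-record work goes here ...
    }
    return $count;
    # $xp is destroyed here; its memory is freed by refcounting
}

# Illustrative stand-in for chunks carved out of the big buffer.
my @chunks = (
    '<batch><record id="1"/><record id="2"/></batch>',
    '<batch><record id="3"/></batch>',
);
my $total = 0;
$total += process_chunk($_) for @chunks;
print "$total records\n";
```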
| [reply] |
Well, another option is simply to use a different module; I recommend XML::Twig as another possibility. It deals with large files very well because you can flush (clear from memory) the parts of the document you no longer need. This keeps the in-memory size of the document down as well.
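A minimal sketch of that pattern, assuming the data lives in <record> elements (adjust the handler key to the real tag name); the handler fires as each element finishes parsing, so the whole document never has to sit in memory at once:

```perl
use strict;
use warnings;
use XML::Twig;

my $count = 0;
my $twig  = XML::Twig->new(
    twig_handlers => {
        # Called each time a <record> element has been fully parsed.
        record => sub {
            my ( $t, $record ) = @_;
            $count++;     # ... real per-record work goes here ...
            $t->purge;    # discard everything parsed so far from memory
        },
    },
);
$twig->parse('<batch><record id="1"/><record id="2"/></batch>');
print "$count records\n";
```

Note that $t->purge frees the processed portion without printing it, which suits read-only processing; $t->flush does the same but writes the processed portion out first, which is what you want when transforming a document rather than just reading it.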
| [reply] |
I am investigating Twig. Thanks for the steer.
| [reply] |
Just to update: memtest86 showed the RAM to be fine. The XML::XPath I have is the latest available. My Perl core was slightly out of date (5.8.7); I have updated to 5.8.8 this evening. The error persists.
| [reply] |