Hello Monks, I have a CGI script that uses XML::Twig to parse <row> elements in an XML file. Through a lot of testing I've discovered that I get a "500 Internal Server Error" in the browser (Firefox) if the number of rows is too big (more than 4689, to be exact). I know it's not a problem with row #4690 itself, because I've tested different files and they all crash at the same number of rows.
I checked the HTTP server's error log and saw this:
Premature end of script headers: my-script.pl

I'm using both the CGI and XML::Twig Perl modules. Is it running out of memory?
Is there a way to capture the error (other than what's in the server log) when Perl fails? The "500" error doesn't tell me much.
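One idea I had for capturing the error (completely untested, and the file name below is just a placeholder) is to send the header before parsing and wrap the parse in an eval, so that if the script dies mid-parse the message at least ends up in the page instead of only a bare 500. It wouldn't help if the process gets killed outright, e.g. for running out of memory, but it should catch an ordinary die:

use strict;
use warnings;
use CGI;
use XML::Twig;

my $q = CGI->new;

# Send the header right away, so a later die can't leave the response
# with no headers at all ("Premature end of script headers").
print $q->header;

my $twig = XML::Twig->new( twig_handlers => { row => \&row_processing } );

# Trap a die from the parse and show it in the browser.
eval { $twig->parsefile('rows.xml'); 1 }
    or print $q->p( 'Parse failed: ', $q->escapeHTML( $@ || 'unknown error' ) );

sub row_processing {
    my ( $twig, $row ) = @_;
    # ... per-row work would go here ...
}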
Anyway, here's the little bit of code I currently have; it parses the file and prints HTML when it's done:
use CGI;
use CGI::Pretty;
use CGI::Carp qw( fatalsToBrowser );
use XML::Twig;
use strict;

# set up the parser:
my $twig = XML::Twig->new(
    twig_handlers => { row => \&row_processing },
);
$twig->parsefile("$outputDir/my-zos-shorter.xml");

print $q->header,
      $q->start_html( -title => $TITLE, -style => { 'src' => $stylesheet } ),
      "<a name='top'></a>\n",
      $q->h2('Here are your results!'),
      $q->end_html;
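If it does turn out to be a memory problem, I was also wondering whether freeing each row after it's handled would make a difference; the XML::Twig docs talk about purging the twig inside the handler for big documents, roughly like this (sketch only, my real handler does more):

sub row_processing {
    my ( $twig, $row ) = @_;
    # ... extract whatever is needed from $row ...

    # Free the parts of the document that have already been processed,
    # so the whole tree is never held in memory at once.
    $twig->purge;
}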
BTW, as a test I also had the row_processing handler print every row it finds to a file as it goes along, and it makes it all the way through to the last one, so the parse itself completes before the script barfs. I'm clueless as to what's going on.
Any suggestions on how to troubleshoot this or about what the problem is would be most welcome.
thanks, Scott