I'm writing a web spider using LWP::RobotUA, and running it on WinXP (ActivePerl). When I retrieve certain web pages, the anchor parsing (which uses HTML::TokeParser) just dies: in the middle of printing a progress report if I have tracing turned on, or later if not. I have installed handlers for all the signals, and the (single, common) handler is never invoked, though when I raise a few signals manually it works fine. The failure seems to be 100% repeatable so far, stopping at exactly the same character of printout each time.
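A stripped-down sketch of the kind of loop I mean (not my actual code; the agent name, URL, signal list, and handler are placeholders for illustration):

    use strict;
    use warnings;
    use LWP::RobotUA;
    use HTML::TokeParser;

    # One common handler for the signals of interest (Win32 only delivers a few).
    sub sig_handler { print STDERR "Caught SIG$_[0]\n"; exit 1 }
    $SIG{$_} = \&sig_handler for qw(INT TERM BREAK);

    my $ua = LWP::RobotUA->new('my-spider/0.1', 'me@example.com');
    $ua->delay(1/60);                               # politeness delay, in minutes

    my $url  = 'http://example.com/page.html';      # placeholder URL
    my $resp = $ua->get($url);
    die "fetch failed: ", $resp->status_line, "\n" unless $resp->is_success;

    my $html = $resp->decoded_content;
    my $p    = HTML::TokeParser->new(\$html)
        or die "can't create parser\n";

    # Walk the anchors, printing a progress report as we go.
    while (my $tag = $p->get_tag('a')) {
        my $href = $tag->[1]{href} || '-';
        my $text = $p->get_trimmed_text('/a');
        print "anchor: $href ($text)\n";
    }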
I suspect a memory allocation failure is at the root of this, but I have no idea how to confirm that suspicion. The failing pages (5 so far, completely unrelated to each other) are all fairly long, over 80k; however, other much longer (200k+) pages parse quite successfully.
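The best I can think of to check whether size really correlates with the failures is logging something like this right before the parse ($url and $html as in the sketch above; Devel::Size is an assumption, it's not in the core distribution):

    use Devel::Size qw(total_size);   # assumes Devel::Size is installed

    # Log how big the fetched document actually is before handing it to the parser.
    printf STDERR "about to parse %s: %d bytes of HTML (%d bytes in the scalar)\n",
        $url, length($html), total_size(\$html);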
Is there any known way to trap a memory allocation failure?
Is there any known way to trap any other normally silent failure? (and what might fall into this category?)
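The kind of trap I'd be happy to be pointed at looks roughly like this (just a sketch; parse_anchors() is a placeholder for my parsing routine, and I don't know whether an out-of-memory condition would even reach these hooks):

    # Unbuffer the progress printout, so it shows exactly where things stop.
    $| = 1;

    # Log anything that would otherwise warn or die quietly.
    $SIG{__WARN__} = sub { print STDERR "WARN: $_[0]" };
    $SIG{__DIE__}  = sub { print STDERR "DIE:  $_[0]" };

    # Wrap the suspect parse so an exception can't take the whole script down.
    my $ok = eval { parse_anchors($html); 1 };   # parse_anchors() is a placeholder
    print STDERR "parse aborted: $@" unless $ok;

From the perlvar docs it looks like $^M (the emergency memory pool) only helps if perl was compiled with -DPERL_EMERGENCY_SBRK, and I don't know whether that applies to ActivePerl.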
Thanks for any ideas or pointers. I'm a noob to Perl, but have 40+ years of programming experience, so I probably have some incorrect assumptions that are blinding me.
Dick Martin Shorter
In reply to Why would a Perl script stop with no informatives? by Anonymous Monk