Hello,
I am using the Parse::MediaWikiDump package to process static XML dumps of three Wikipedia languages. For each page in the English dump, the script tries to fetch the corresponding pages in the other two languages. The script runs fine and successfully retrieves the required pages, but after a while I get a segmentation fault and the process does not complete. The Perl debugger points to the line:
while(defined($frPage = $frPages->next)) {
which causes the segmentation fault.
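For reference, the loop is set up roughly like this. This is only a simplified sketch of the relevant part; the dump file name and the loop body are placeholders, not the actual code:

    use strict;
    use warnings;
    use Parse::MediaWikiDump;

    # placeholder dump file name
    my $frPages = Parse::MediaWikiDump::Pages->new('frwiki-pages-articles.xml');

    my $frPage;
    while (defined($frPage = $frPages->next)) {
        # title() returns the page title; text() returns a reference to the wikitext
        my $title = $frPage->title;
        my $text  = ${ $frPage->text };
        # ... look up the page corresponding to the current English page here ...
    }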
I have read a lot about possible causes of segmentation faults (memory limits, a bug in the package, etc.), but there does not seem to be a reliable way to pin down the actual cause.
It is very strange that this statement works thousands of times and then suddenly causes a segmentation fault. I am afraid it is either a memory problem or a bug in the CPAN Parse::MediaWikiDump package.
Any comments or tips would be appreciated.
Thanks,
best regards,
Motaz