mksaad has asked for the wisdom of the Perl Monks concerning the following question:
Hello,
I am using the Parse::MediaWikiDump package to process static XML dumps of three language editions of Wikipedia. For each page in English, the script tries to fetch the corresponding pages in the other two languages. The script runs fine and successfully retrieves the required pages, but after a while I get a segmentation fault and the process does not complete. The Perl debugger points to the line
while (defined($frPage = $frPages->next)) {
as the cause of the segmentation fault.
I have read a lot about possible causes of segmentation faults (a memory limit, a bug in the package, etc.), but there is no specific way to pin down the reason. It is very strange that this statement works thousands of times and then causes a segmentation fault. I am afraid it is either a memory problem or a bug in the CPAN Parse::MediaWikiDump package.
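For context, the iteration pattern in question looks roughly like the sketch below. The file name and processing code are my assumptions, not the original script. Two things worth checking in a loop like this: declare the page variable with `my` inside the `while` condition so each page object can be freed before the next one is parsed, and avoid keeping references to page text across iterations, since accumulated references can exhaust memory inside the underlying XS parser.

```perl
use strict;
use warnings;
use Parse::MediaWikiDump;

# Hypothetical dump path -- substitute your own file.
my $fr_dump = 'frwiki-pages-articles.xml';

# Build the page iterator once for the whole dump.
my $frPages = Parse::MediaWikiDump::Pages->new($fr_dump);

# Declaring $frPage with 'my' here scopes it to one iteration,
# so nothing accumulates between pages.
while (defined(my $frPage = $frPages->next)) {
    my $title = $frPage->title;
    my $text  = $frPage->text;    # reference to the wikitext
    # ... process the page here, without saving $frPage or $text
    #     in any structure that outlives this iteration ...
}
```

If the fault persists even with tight scoping, it may be a bug in the module itself; Parse::MediaWikiDump's documentation points users to its successor, MediaWiki::DumpFile (whose Compat layer is intended as a drop-in replacement), which may be worth trying.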
Any comments or tips will be appreciated.
Thanks,
best regards,
Motaz
Replies are listed 'Best First'.
Re: Parse::MediaWikiDump segmentation fault
by Khen1950fx (Canon) on Jan 03, 2012 at 12:45 UTC
    by mksaad (Initiate) on Jan 03, 2012 at 14:56 UTC
Re: Parse::MediaWikiDump segmentation fault
by grondilu (Friar) on Jan 03, 2012 at 12:15 UTC
    by mksaad (Initiate) on Jan 03, 2012 at 14:52 UTC