As others here have said (hippo), you need to clarify whether it happens with the same file every time.
You say:
The script seems to be working okay when XMLList.txt contains, say 10-15 lines. Anything more and perl script seems to abruptly stop parsing half-way and the shell script (which calls the perl script) returns a "27000838 Segmentation fault(coredump)" error.
But maybe there is a bad file half-way down the list which is never reached when you process only the first 10-15 files.
So, modify your code to print the filename each time, along with a running count of files processed, to make tracking easier, like this:
my $fcount = 1;
while (my $xmlfile = <$list>) {
    chomp $xmlfile;
    print STDERR "$0 : about to process file # $fcount = '$xmlfile'\n";
    $twig->parsefile($xmlfile);
    print STDERR "$0 : file '$xmlfile' processed OK.\n";
    $fcount++;
}
Note: it is not necessarily either a bug or a bad file. It could be both a bug and a bad file.
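If a bad file does turn out to be the culprit, you can also trap Perl-level parse errors so that one malformed document does not abort the whole run. A minimal sketch along the same lines, assuming the same $twig and $list handles as above (note that eval can only catch a Perl-level die from the parser, not a genuine segfault in the underlying C code, which will still kill the process):

my $fcount = 1;
while (my $xmlfile = <$list>) {
    chomp $xmlfile;
    # eval traps a die() from the parser on malformed XML,
    # but NOT a real segfault in the underlying C library
    eval { $twig->parsefile($xmlfile) };
    if ($@) {
        warn "$0 : file # $fcount '$xmlfile' failed to parse: $@";
    } else {
        print STDERR "$0 : file # $fcount '$xmlfile' processed OK.\n";
    }
    $fcount++;
}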