Since Perl doesn't complain about a lack of indentation, we can cross that one off the list. Did it give you an actual message such as "Out of memory!", or did it just silently fail?
How big is 2.txt?
Totally unrelated to the core dump, but a good idea nevertheless: since you're not checking your opens for failure, put use autodie; at the top of your script, near your use strict; line. That way you'll know right away if a file fails to open.
On the topic of memory, you can cut your consumption considerably by not keeping two copies of the input file, one in an array and one in a scalar. Something like this:
my $d = do { local $/ = undef; <$read>; };
This eliminates @e entirely. If the input file is big, that saves a lot of memory. If it's really huge, though, even that savings may not be enough.
Dave
In reply to Re: segmentation fault (core dumped!)
by davido
in thread segmentation fault (core dumped!)
by Anonymous Monk