in reply to segmentation fault (core dumped!)
Since Perl doesn't complain about a lack of indentation in code, we can cross that one off the list. Did it give you an actual message such as "Out of memory!", or did it just silently fail?
How big is 2.txt?
Totally unrelated to a core dump, but a good idea nevertheless: since you're not checking whether your opens succeed, you should put use autodie; at the top of your script, near your use strict; line. That way you'll know when a file fails to open.
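A minimal sketch of what autodie buys you (the filename here is made up for illustration): a failed open throws an exception instead of quietly returning false, so the failure can't slip past unnoticed.

```perl
use strict;
use warnings;
use autodie;    # open, close, etc. now die with a useful message on failure

# Wrapping in eval {} just so we can demonstrate catching the error;
# normally you'd let the script die and read the message.
eval { open my $fh, '<', 'no_such_file.txt' };
print "open failed: $@" if $@;
```

Without autodie, the same open would simply return false and the script would carry on with an unopened handle unless you checked the return value yourself.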
On the topic of memory, you can cut your consumption considerably if you don't store both an array and a scalar each with their own copy of the input file. Something like this:
my $d = do { local $/ = undef; <$read>; };
This eliminates @e entirely. If the input file is big, you save a lot of memory; if it's really big, those savings may be what keeps the script from blowing up.
Dave
Re^2: segmentation fault (core dumped!)
by Anonymous Monk on Jul 03, 2012 at 05:35 UTC
by davido (Cardinal) on Jul 03, 2012 at 05:49 UTC
by Anonymous Monk on Jul 03, 2012 at 06:36 UTC
by Anonymous Monk on Jul 03, 2012 at 06:19 UTC
by davido (Cardinal) on Jul 03, 2012 at 08:24 UTC
by marto (Cardinal) on Jul 03, 2012 at 08:41 UTC