When you slurp files, you need memory to hold them, and performance degrades with how much memory you use and how much copying you do.
It makes sense that @complete = <INPUT> is faster: it keeps slurping into (and overwriting) the same array, so it uses less memory than the first example.
When you push @complete, @partial, you're copying every line from @partial into @complete, thus using more memory and doing more work than the second example.
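To make that copy-twice cost concrete, here is a minimal sketch (the demo files and their contents are made up for illustration) that slurps each file into @partial and then copies it into @complete:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Two throwaway demo files (made-up contents) standing in for @list
my @list;
for my $text ("a\nb\n", "c\n") {
    my ($fh, $name) = tempfile(UNLINK => 1);
    print $fh $text;
    close $fh;
    push @list, $name;
}

my @complete;
for my $file (@list) {
    open my $in, '<', $file or die "Can't open $file: $!";
    my @partial = <$in>;       # whole file held in @partial first...
    push @complete, @partial;  # ...then every line copied into @complete
    close $in;
}
print scalar(@complete), "\n";  # 3 lines in total
```

While the push runs, every line is held in @partial and again in @complete, which is the extra memory and work described above.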
Try this instead:
push(@complete, $_) while <INPUT>;
It pushes each line read from INPUT onto the array, one at a time. This is better than push(@complete, <INPUT>), which first reads the whole file into an intermediate list.
We can get fancy with:
my @complete = do { local (*ARGV, $/); @ARGV = @list; <> };
Note that localizing $/ (leaving it undef) turns on slurp mode, so @complete ends up with one string per file rather than one line per element.
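A minimal sketch (again with made-up demo files) showing what the do-block actually returns:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Two throwaway demo files (made-up contents) standing in for @list
my @list;
for my $text ("a\nb\n", "c\n") {
    my ($fh, $name) = tempfile(UNLINK => 1);
    print $fh $text;
    close $fh;
    push @list, $name;
}

# Localizing $/ to undef turns on slurp mode, so <> yields
# one whole-file string per element instead of one line.
my @complete = do { local (*ARGV, $/); @ARGV = @list; <> };
print scalar(@complete), "\n";  # 2 elements: one per file
```

If you need lines rather than whole files, stick with one of the line-by-line approaches above.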
Or better, take advantage of someone else's good work: File::Slurp. Its read_file returns a file's lines in list context:
push(@complete, read_file($file));
Other comments:
- Be strict: use warnings; use strict;
- Use 3-argument open if you can. Die with the error $!.
- Close the input filehandle when you're done. Use a lexical input filehandle if you can.
- No need to double-quote "$file".
So, something like:
use warnings;
use strict;

my @list = ...
my @complete;
for my $file (@list)
{
    open(my $input, "<", $file) or die "Can't open $file: $!";
    push(@complete, $_) while <$input>;
    close($input);
}