in reply to Re^2: Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!
in thread Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!

I just realised I completely ignored one of your questions.

Wouldn't your code mean the lines are stripped of the line feeds they originally had?

Yes, as I coded it the newlines would be removed, which effectively gives you a free chomp @test;. I don't see this as a problem, as it would cost very little to re-add them when writing the lines out again.
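For instance (a minimal sketch; the output file name out.txt is an assumption, and @test holds the chomped lines), re-adding the newline is just part of the print:

# Hypothetical output file; @test contains the chomped lines.
open OUT, '>', 'out.txt' or die "out.txt : $!";
print OUT "$_\n" for @test;   ## put the newline back on output
close OUT;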

However, if you want them left in place, then you could use the following split instead.

#! perl -slw
use strict;

my $file = 'test.txt';

open DF, '<:raw', $file or die "$file : $!";

## Slurp the whole file in one read (by setting $/ to a reference to the
## file's size), then split after each newline so every line keeps its "\n".
my @test = split /(?<=\n)/, do{ local $/ = \ -s( $file ); <DF> };

close DF;

All that said, if you are only appending to the end of the file, why read the file at all? Have you heard of opening a file for append?
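For reference, a minimal sketch of opening for append (the file name and the record text are placeholders): with the '>>' mode every print lands at the end of the file, so there is no need to read it first.

#! perl -slw
use strict;

my $file = 'test.txt';

## '>>' opens the file for appending; writes always go to the end.
open DF, '>>', $file or die "$file : $!";
print DF 'a new record appended without slurping the file';
close DF;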

