in reply to Re: Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!
in thread Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!
Now that worked fine, and I still only get 600 processes reported!

    open DF, "test.txt";
    binmode DF;
    my @test = <DF>;
    close DF;
    my $rec;
    foreach (@test) { $rec .= $_; }
    open(DF, '>test1.txt') || die("Failed to open test1.txt: $!\n");
    binmode DF;
    print DF $rec;
    close DF;
So this is a major improvement on the write side, but the read is still (at least on my XP system) the cause of most of the I/O, because it's reading in 4K chunks!

    open DF, "test.txt";
    my @test = <DF>;
    close DF;
    # Print the first 3 elements to prove the file was read as an array
    print "Line 1=$test[0]<BR>Line 2=$test[1]<BR>Line 3=$test[2]<BR>";
    # Can we get around this? Waste of processing & memory!
    my $rec;
    foreach (@test) { $rec .= $_; }
    open(DF, '>test.txt') || die("Failed to open test.txt: $!\n");
    binmode DF;
    print DF $rec;
    close DF;
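One way around the per-line array and the foreach concatenation loop is Perl's slurp mode: locally undefining the input record separator $/ makes the readline operator return the whole file as a single scalar. A minimal sketch (assuming the same test.txt/test1.txt filenames as above):

```perl
# Slurp the whole file into one scalar by locally undefining
# the input record separator, so no per-line array is built.
my $rec;
{
    local $/;                    # undef $/ => readline reads to EOF
    open(DF, 'test.txt')  || die("Failed to open test.txt: $!\n");
    binmode DF;
    $rec = <DF>;                 # entire file as a single string
    close DF;
}
open(DF, '>test1.txt') || die("Failed to open test1.txt: $!\n");
binmode DF;
print DF $rec;                   # one buffered write of the full record
close DF;
```

Note that the underlying read(2) calls are still made in whatever buffer size PerlIO uses; what this avoids is the line-splitting and the extra copy of the data in @test.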