... isn't writing chunks then concatenating them simply going to defer the memory consumption problem to the program doing the concatenation?
Nope. Concatenating files takes very little application memory unless you're trying to optimize disk head movement. Even then, the OS generally does a better job. Think
while ( <IN> ) {
    print OUT $_;
}
but in slightly larger chunks.
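For a concrete example, a concatenation pass along those lines might look like the following, using lexical filehandles and a fixed-size buffer so the working set never exceeds one buffer; the file names are placeholders, not anything from the OP's setup:

#!/usr/bin/perl
use strict;
use warnings;

## Stitch a set of chunk files together using a fixed-size buffer, so
## memory stays constant no matter how large the output grows.
## 'chunk_*.dat' and 'combined.dat' are made-up names for illustration.
my @chunks = sort glob 'chunk_*.dat';

open my $out, '>', 'combined.dat' or die "combined.dat: $!";
binmode $out;

for my $chunk ( @chunks ) {
    open my $in, '<', $chunk or die "$chunk: $!";
    binmode $in;
    while ( read $in, my $buf, 64 * 1024 ) {   # 64KB at a time
        print {$out} $buf;
    }
    close $in;
}

close $out or die "close: $!";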
Effectively, that is exactly what the OP's test script is doing, except that the data is being generated internally, 300 bytes at a time, rather than being read from an external source.
There is no reason that I am aware of why either version of the OP's testcase should be consuming large amounts of memory. It certainly doesn't on my system (1.5MB max.), which led me to conclude that the OS must be set up to buffer output files in RAM until they are closed.
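For illustration, the sort of loop I am picturing (the record layout and counts are my guesses, not the OP's actual code) writes a few hundred MB while the Perl process itself stays at a couple of MB:

#!/usr/bin/perl
use strict;
use warnings;

## Guessed shape of the testcase: build a 300-byte record in memory,
## write it out, and let it be discarded on the next pass. Nothing in
## the Perl code retains the data, so any apparent memory growth would
## have to come from the OS caching the output file.
open my $out, '>', 'big_output.dat' or die "big_output.dat: $!";

for my $i ( 1 .. 1_000_000 ) {             # ~300MB written in total
    my $record = sprintf "%-299d\n", $i;   # exactly 300 bytes per record
    print {$out} $record;
}

close $out or die "close: $!";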
So if it is the OS that is responsible for the memory consumption, then it will equally affect output files written by a separate Perl process or a system utility, unless the OS discriminates between system utilities and other programs? Or maybe this is a build option?
Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
"Memory, processor, disk in that order on the hardware side. Algorithm, algoritm, algorithm on the code side." - tachyon
my $str .= stuff;   # e.g. building the whole output up in one ever-growing scalar
or it might be something deeper. If the full code isn't available for us to diagnose, and memory use is acceptable if batch sizes are reduced, I say reduce the batch sizes and concatenate.
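To make that contrast concrete, here is the pattern I have in mind; stuff() is just a stand-in for whatever routine generates the data, not anything from the OP's code:

#!/usr/bin/perl
use strict;
use warnings;

sub stuff { 'x' x 300 }   # hypothetical data generator, ~300 bytes per call

## Grows without bound: every batch is appended to one scalar, so the
## process's memory scales with the total amount of data produced.
my $str = '';
$str .= stuff() for 1 .. 100_000;

## Stays flat: write each batch out as it is produced, keep the batches
## small, and concatenate the per-batch files afterwards.
open my $out, '>', 'batch_001.dat' or die "batch_001.dat: $!";
print {$out} stuff() for 1 .. 100_000;
close $out or die "close: $!";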