2 seconds to read it; 1/2 second to process it; 4 seconds to write it; and only 510MB memory used in the process!
That's efficient!
Not really, from a memory standpoint. You could do much better with a standard loop that reads into a small buffer, translates it, and writes it out to the output file.
(Not to mention that you seem to have really fast disks (SSDs?). I haven't met an HDD yet that could read faster than 150 MB/s or write faster than 100 MB/s.)
open my $in,  '<', 'input.txt'  or die "input.txt: $!";
open my $out, '>', 'output.txt' or die "output.txt: $!";
my $buf;
while (read $in, $buf, 4096) {
    $buf =~ tr/\t/ /;    # per-byte translation, so 4096-byte block boundaries are safe
    print $out $buf;
}
close $_ for $in, $out;
But this risks slowing the loop to around 10 MB/s, because of the seek characteristics of rotating media and because OS read-ahead and flushing algorithms never quite deliver that good a performance [1]. Still a helluva lot better than the OS swapping you out because it can't fit the 500 MB into memory.
[1] I have never seen an OS successfully overlap the reading and the writing (doing them serially = sub-optimal) for cat largefile > otherfile
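For what it's worth, you don't even need to write the loop by hand: Perl's -p switch gives you the same streaming, one record at a time, so memory stays tiny regardless of file size. A minimal sketch (input.txt and output.txt are placeholder names, assuming the input actually contains newlines so records stay small):

perl -pe 'tr/\t/ /' input.txt > output.txt

# quick demonstration with a throwaway two-line file:
printf 'a\tb\nc\td\n' > input.txt
perl -pe 'tr/\t/ /' input.txt > output.txt
cat output.txt    # tabs now spaces

The -p loop reads a line, runs the body, and prints the (modified) line, so the working set is one line, not the whole file.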
In reply to Re^2: Windows 7 Remove Tabs Out of Memory
by Anonymous Monk
in thread Windows 7 Remove Tabs Out of Memory
by tallums