in reply to Splitting Large File into many small ones.
Reading line by line is not a good idea. Instead, use read() to pull in a large chunk, then do one readline (<>) so that each small file ends on a line boundary. (I tested the demo with a big log file; the performance is very good.)
```perl
use strict;
use warnings;

open(FH, "<", "message_log01") or die "Could not open source file: $!";
my $i = 0;
while (1) {
    my $chunk;
    print "process part $i\n";
    open(OUT, ">", "part$i.log") or die "Could not open destination file: $!";
    $i++;
    if (!eof(FH)) {
        read(FH, $chunk, 1000000);   # bulk-read a ~1 MB chunk
        print OUT $chunk;
    }
    if (!eof(FH)) {
        $chunk = <FH>;               # one readline so the part ends at a line boundary
        print OUT $chunk;
    }
    close(OUT);
    last if eof(FH);
}
```
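For reuse, the same chunk-then-readline technique can be wrapped in a sub. This is a minimal sketch, not the original poster's code: the sub name `split_file`, its parameters, and the `part$i.log` naming scheme are my own choices; it uses lexical filehandles and three-arg open, but the splitting logic is identical.

```perl
use strict;
use warnings;

# Split $src into pieces of roughly $chunk_size bytes, extending each
# piece to the next newline so no line is cut in half.
# Returns the number of part files written.
sub split_file {
    my ($src, $prefix, $chunk_size) = @_;
    open my $in, '<', $src or die "Could not open source file '$src': $!";
    my $i = 0;
    until (eof $in) {
        open my $out, '>', "$prefix$i.log"
            or die "Could not open destination file: $!";
        $i++;
        my $chunk;
        read $in, $chunk, $chunk_size;             # bulk read
        print {$out} $chunk;
        print {$out} scalar <$in> unless eof $in;  # finish the current line
        close $out;
    }
    return $i;
}
```

A call like `split_file('message_log01', 'part', 1_000_000)` reproduces the behavior of the script above.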
Re: Re: Splitting Large File into many small ones.
by disciple (Pilgrim) on Dec 09, 2003 at 04:31 UTC