in reply to Chunk large data/log files into more manageable pieces
In such circumstances, I'd probably use split(1) (i.e. the Unix tool, not the Perl function).
split -l 10000 big_file to split it into 10000-line files.
split -b 10000 big_file to split it into 10000-byte files.
It would probably be faster too. (But ++ all the same, thinking of Windows, where split(1) isn't a given.)
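
For example (a minimal sketch, assuming GNU coreutils split; the chunk_ prefix is just an illustration, and if you omit it the pieces default to xaa, xab, ...):

    # 10000 lines per piece: chunk_aa, chunk_ab, ...
    split -l 10000 big_file chunk_

    # 10000 bytes per piece (note: may cut in the middle of a line)
    split -b 10000 big_file chunk_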
_____________________________________________
Come to YAPC::Europe 2003 in Paris, 23-25 July 2003.
Re: Re: Chunk large data/log files into more manageable pieces (split(1))
by EdwardG (Vicar) on Jun 20, 2003 at 06:43 UTC