in reply to Chunking very large files
Reading in 1 GB chunks is probably OK on your machine, although there is little performance benefit to reading more than 64 KB at a time.
Simplify your code so that everything is handled in one case, perhaps like the pseudo-code below.
Just keep asking for the same size chunk each time; if the file has less than that left, you will only get what remains. There is no need for a special case to handle the last chunk. If you want to distinguish the undef (read error) case from the 0 bytes (end of file) case, do that after the while() loop.
    foreach my $file (@files) {
        open my $in_fh, '<:raw', $file or die "Can't open $file: $!";
        my $part = 1;
        my $buffer;
        my $sizeRead;
        while ( $sizeRead = read $in_fh, $buffer, $chunk ) {
            # open the output file for the current part number
            open my $out_fh, '>:raw', "$file.part$part"
                or die "Can't open $file.part$part: $!";
            $part++;    # increment the part number for next time, if there is one
            # write the data; $buffer holds exactly $sizeRead bytes
            print $out_fh $buffer;
            close $out_fh;
        }
        # undef (as opposed to 0) means the last read didn't "work"
        die "Read error on $file: $!" unless defined $sizeRead;
        close $in_fh;
    }
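For completeness, here is a minimal sketch of the setup the loop above assumes. The @files list, the $chunk size, and the .partN output naming are not from the original post and are purely illustrative:

    use strict;
    use warnings;

    # usage (hypothetical): perl split.pl bigfile.log other.log
    my @files = @ARGV;        # files to split, passed on the command line
    my $chunk = 64 * 1024;    # 64 KB per read; larger buffers buy little (see above)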