in reply to Chunking very large files

read $in_fh, my $buffer, -s or warn "Read zero bytes from $_: $!";

That line seems a bit suspicious. Isn't that going to require a buffer the size of the whole file, which will then only be partly filled before you hit EOF?

Why not make that for loop a while loop, and let it run until it's done? Just keep reading $size-sized chunks until you hit EOF and read less than a full buffer's worth of data. No need to treat the last read differently.

Also, wouldn't it make more sense to break the files up into fixed-size chunks, rather than a fixed number of chunks each? The reason to chunk them in the first place is so they fit in memory or on a USB stick or something, right?
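
To make that concrete, here's a minimal sketch of the fixed-size-chunk approach; the 100 MB chunk size, the ".chunkNNN" output naming, and the variable names are just placeholders for illustration:

#!/usr/bin/perl
use strict;
use warnings;

my $in_file    = shift or die "Usage: $0 <file>\n";
my $chunk_size = 100 * 1024 * 1024;   # e.g. 100 MB per chunk

open my $in_fh, '<:raw', $in_file or die "Can't open $in_file: $!\n";

my $chunk_num = 0;
while (1) {
    # read() returns the number of bytes actually read, 0 at EOF, undef on error
    my $got = read $in_fh, my $buffer, $chunk_size;
    die "Error reading $in_file: $!\n" unless defined $got;
    last if $got == 0;                 # nothing left, so no more chunks to write

    my $out_file = sprintf "%s.chunk%03d", $in_file, $chunk_num++;
    open my $out_fh, '>:raw', $out_file or die "Can't open $out_file: $!\n";
    print {$out_fh} $buffer or die "Error writing $out_file: $!\n";
    close $out_fh           or die "Error closing $out_file: $!\n";

    last if $got < $chunk_size;        # short read: that was the final chunk
}

close $in_fh;

Every chunk except possibly the last is exactly $chunk_size bytes, so you never need the total file size or the number of chunks up front.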

Re^2: Chunking very large files
by hiptoss (Novice) on Nov 09, 2011 at 19:36 UTC
    You're right that the final version will have a fixed chunk size. This is simply an exercise to figure out whether I can do it or not, and then I'll change the code accordingly.

    I am not very experienced with buffers, and the code I pasted is from an earlier answer on perlmonks to a very similar question that I asked, only they were dealing with smaller files. I'll see if I can figure out how to adjust the loop and use fixed chunk sizes.

    Thanks for your advice.

      Should be as simple as:

      my $sizeRead = $chunkSize;
      while ($sizeRead == $chunkSize) {
          $sizeRead = read $in_fh, $buffer, $chunkSize;
          die "Error reading: $!\n" unless defined $sizeRead;
          ...
      }
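
      Just to sketch what might go in the elided part, the loop body could write each chunk straight out to a numbered file. This assumes $in_fh and $chunkSize are already set up as in the original script; the $chunkNum counter and the "part%03d" naming are made up for illustration:

      my $chunkNum = 0;
      my $sizeRead = $chunkSize;
      while ($sizeRead == $chunkSize) {
          $sizeRead = read $in_fh, my $buffer, $chunkSize;
          die "Error reading: $!\n" unless defined $sizeRead;
          last unless $sizeRead;   # EOF fell exactly on a chunk boundary

          # read() fills $buffer with only the bytes it actually got, so the
          # shorter final chunk gets written out correctly with no special case
          open my $out_fh, '>:raw', sprintf('part%03d', $chunkNum++)
              or die "Error opening chunk file: $!\n";
          print {$out_fh} $buffer or die "Error writing: $!\n";
          close $out_fh           or die "Error closing: $!\n";
      }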

        What about the end of the file, where $sizeRead will be smaller than $chunkSize?