That line seems a bit suspicious. Isn't that going to require a buffer the size of the whole file, which will then only be partly filled before you hit EOF?
Why not make that for loop a while loop and let it run until it's done? Just keep reading $size-sized chunks until you hit EOF and get back less than a full buffer's worth of data. There's no need to treat the last read differently.
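Something like this is all it takes (a rough sketch; $in and write_chunk are placeholder names, not from your code):

    my $buffer;
    while ( read( $in, $buffer, $size ) ) {
        # read() shrinks $buffer to the bytes actually read, so the
        # final short chunk needs no special handling
        write_chunk($buffer);
    }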
Also, wouldn't it make more sense to break the files up into fixed-size chunks, rather than a fixed number of chunks each? The reason to chunk them in the first place is so they fit in memory or on a USB stick or something, right?
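If you went that way, a self-contained splitter might look something like this (an untested sketch; the file name, the 64 MB chunk size, and the part-file naming scheme are all made up for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $file = 'bigfile.dat';        # hypothetical input file
    my $size = 64 * 1024 * 1024;     # 64 MB per chunk, for example

    open my $in, '<:raw', $file or die "Can't open $file: $!";

    my $buffer;
    my $n = 0;
    while ( read( $in, $buffer, $size ) ) {
        # write each chunk to a numbered part file; the last one
        # just comes out smaller than $size
        my $part = sprintf '%s.%04d', $file, $n++;
        open my $out, '>:raw', $part or die "Can't create $part: $!";
        print {$out} $buffer or die "Write to $part failed: $!";
        close $out           or die "Close of $part failed: $!";
    }
    close $in;

Since read() returns 0 at EOF, the loop ends on its own, and no part is ever bigger than $size no matter how large the input is.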