in reply to Chunking very large files
That line seems a bit suspicious. Isn't that going to require a buffer the size of the whole file, which will then only be partly filled before you hit EOF?
Why not make that for loop a while loop, and allow it to continue until it is done? Just keep reading $size-sized chunks until you hit EOF and read less than a full buffer's worth of data. No need to treat the last read differently.
Also, wouldn't it make more sense to break the files up into fixed-size chunks, rather than a fixed number of chunks each? The reason to chunk them in the first place is so they fit in memory or on a USB stick or something, right?
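A minimal sketch of that while-loop approach (the file name, chunk size, and chunk-naming scheme here are all made up for illustration, not taken from your code):

```perl
use strict;
use warnings;

my $file = 'bigfile.dat';        # hypothetical input file
my $size = 100 * 1024 * 1024;    # fixed chunk size; 100 MB here, pick to taste

open my $in, '<:raw', $file or die "Can't open $file: $!";

my $n = 0;
while ( read $in, my $buffer, $size ) {
    # read() returns the number of bytes actually read: the final chunk
    # simply comes back shorter than $size, and 0 at EOF ends the loop.
    my $chunk = sprintf '%s.%03d', $file, $n++;
    open my $out, '>:raw', $chunk or die "Can't open $chunk: $!";
    print {$out} $buffer;
    close $out or die "Can't write $chunk: $!";
}
close $in;
```

Since read() already tells you how many bytes you got, the short last chunk falls out of the loop naturally, and the buffer never grows past $size.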
Replies are listed 'Best First'.

Re^2: Chunking very large files by hiptoss (Novice) on Nov 09, 2011 at 19:36 UTC
  by SuicideJunkie (Vicar) on Nov 09, 2011 at 20:07 UTC
  by hiptoss (Novice) on Nov 09, 2011 at 20:13 UTC
  by SuicideJunkie (Vicar) on Nov 09, 2011 at 20:19 UTC