    my $chunk = 1073741824; #1gb
    [...]
    #- read an output file worth of data
    $sizeRead = read $in_fh, $buffer, $chunk;
I think that is highly suspect. Reading 1 GB in a single call forces Perl to allocate a 1 GB buffer in memory. You really should not be reading that much at once; instead, read 4K..128K chunks in a loop and write each chunk to the destination file immediately.
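A minimal sketch of that loop (the filenames 'in.dat' and 'out.dat' are placeholders, and the 64K buffer size is just one reasonable choice in the 4K..128K range):

    use strict;
    use warnings;

    # Placeholder filenames; open both handles in raw mode so no
    # layer mangles the bytes.
    open my $in_fh,  '<:raw', 'in.dat'  or die "open in.dat: $!";
    open my $out_fh, '>:raw', 'out.dat' or die "open out.dat: $!";

    my $chunk = 64 * 1024;    # 64K per read
    my $buffer;
    while (1) {
        my $read = read $in_fh, $buffer, $chunk;
        die "read failed: $!" unless defined $read;   # undef means error
        last if $read == 0;                           # 0 means EOF
        print {$out_fh} $buffer or die "write failed: $!";
    }

    close $out_fh or die "close out.dat: $!";

This way memory use stays bounded by the buffer size no matter how large the input file is.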