in reply to File::Copy on Large-ish Files over Network
I too have had problems with 'File::Copy'. In my case, though, the trouble was not the size of the files but the load on the system. I used 'File::Copy' on a mail server and never seemed to have a problem while the server was handling fewer than 8 emails per second, but once it went over 10 emails per second I would get mismatched "qf..." and "df..." files. So we were losing mail!
My solution was to use Perl's own copy/move for files under 2GB and shell out to the system 'move/copy' for files over 2GB. In my environment, calling 'system' or 'qx' was actually faster for files larger than 2GB.
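A minimal sketch of that size-based dispatch might look like the following (the 2GB threshold and the plain 'cp' fallback are assumptions for illustration; substitute whatever system command and cutoff suit your environment):

```perl
use strict;
use warnings;
use File::Copy qw(copy);

# Hypothetical cutoff: use File::Copy below 2GB, the system copy above it.
my $THRESHOLD = 2 * 1024**3;

sub copy_file {
    my ($src, $dst) = @_;
    if (-s $src < $THRESHOLD) {
        # Pure-Perl copy for smaller files.
        copy($src, $dst) or die "File::Copy failed: $!";
    } else {
        # Shell out for big files; always check the exit status.
        system('cp', $src, $dst) == 0
            or die "system cp failed: $?";
    }
    return -s $dst;    # bytes actually written
}
```

Note that passing 'system' a list (rather than one string) avoids shell quoting problems with odd filenames.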
For your situation, have you looked at 'rsync'? The advantage is that if the sizes don't match you can simply rerun 'rsync' until the copy is correct. Without having dug into its internals, I can say 'rsync' seems much faster than 'move/copy/ftp' over a network. I have used 'rsync' on Windows, but I don't know whether you have access to it in your environment.
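The rerun-until-correct idea can be sketched as a small shell loop (the paths, the retry delay, and the 'rsync -av' flags are placeholders, not a recommendation for your setup):

```shell
#!/bin/sh
# Hypothetical retry loop: rsync exits nonzero on an incomplete transfer,
# so just rerun it until it succeeds. Resumed runs only send the delta.
SRC="./bigfile.dat"            # placeholder source
DST="/mnt/backup/bigfile.dat"  # placeholder destination (local or remote)

until rsync -av "$SRC" "$DST"; do
    echo "rsync failed; retrying in 5s..." >&2
    sleep 5
done
```

Because rsync verifies each transferred file with a checksum, a loop like this converges to a correct copy instead of silently leaving a truncated one.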
Good Luck
"Well done is better than well said." - Benjamin Franklin