in reply to Copying a large file (6Gigs) across the network and deleting it at source location

Sorry for stating this so flatly, but you should really run a checksum on the original and compare it to your copy before unlink()ing the original. Have a look at Digest::MD5, which is very popular. You could also call the *NIX command 'cksum' (which has probably been ported to Windows, or at least has an equivalent) and get very similar results.
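A minimal sketch of that verify-then-delete flow using Digest::MD5 (the filenames here are placeholders; addfile() streams the file, so even a 6 GB file isn't slurped into memory):

```perl
use strict;
use warnings;
use Digest::MD5;
use File::Copy qw(copy);

# Compute the MD5 digest of a file by streaming it, not slurping it.
sub md5_of_file {
    my ($path) = @_;
    open my $fh, '<:raw', $path or die "Can't open $path: $!";
    my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
    close $fh;
    return $digest;
}

# Demo with a small scratch file standing in for the real 6 GB file.
my ( $src, $dst ) = ( 'demo_src.dat', 'demo_copy.dat' );
open my $out, '>:raw', $src or die "Can't create $src: $!";
print $out "some payload\n";
close $out;

copy( $src, $dst ) or die "Copy failed: $!";

# Only remove the source once the digests agree.
if ( md5_of_file($src) eq md5_of_file($dst) ) {
    unlink $src or die "Couldn't delete file: $src\n";
}
unlink $dst;    # demo cleanup
```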

Also, when handling errors, use 'die' instead of 'print' so Perl returns the right exit status: unlink($_) or die "Couldn't delete file: $_\n"; If you add an 'or die' to the copy operation as well, the script will only attempt to delete the original if the copy succeeded.
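The chain of 'or die' checks looks like this (paths are hypothetical; File::Copy's copy() returns false on failure, so a failed copy dies before the unlink is ever reached):

```perl
use strict;
use warnings;
use File::Copy qw(copy);

# Hypothetical scratch paths for illustration.
my ( $src, $dst ) = ( 'src.tmp', 'dst.tmp' );
open my $fh, '>', $src or die "Can't create $src: $!";
print $fh "data\n";
close $fh;

# If the copy fails, die here with a non-zero exit status --
# execution never reaches the unlink, so the original survives.
copy( $src, $dst ) or die "Copy failed: $!";
unlink $src or die "Couldn't delete file: $src\n";
```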

--
Allolex


Re: Re: Copying a large file (6Gigs) across the network and deleting it at source location
by iburrell (Chaplain) on Feb 05, 2004 at 02:41 UTC
    Doing a checksum will effectively double the transfer time, because the file has to be read back over the network from the remote location. And a network filesystem copy is fairly reliable anyway, since the OS does some error recovery.

      Well, yes. That's a good point. But no one said the checksum has to be done from the remote system. ;)

      --
      Allolex