in reply to Fastest way to copy a file from 1 direcotry to another directory

I'd use File::Copy but I doubt anything's going to be as fast as cp (or Windows equivalent) unless your files are tiny.
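For reference, a minimal File::Copy sketch (the temp-directory setup and file names are just for illustration):

```perl
use strict;
use warnings;
use File::Copy qw(copy);
use File::Temp qw(tempdir);

# Illustrative paths only: copy one file between two locations.
my $dir = tempdir(CLEANUP => 1);
my $src = "$dir/source.txt";
my $dst = "$dir/copy.txt";

open my $fh, '>', $src or die "open: $!";
print $fh "hello\n";
close $fh;

# copy() returns true on success, false (with $! set) on failure.
copy($src, $dst) or die "Copy failed: $!";
print -e $dst ? "copied\n" : "missing\n";
```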

Replies are listed 'Best First'.
Re^2: Fastest way to copy a file from 1 direcotry to another directory
by Anonymous Monk on Jul 06, 2010 at 19:23 UTC
    I just ran a series of experiments. In this case, I'm running on an old linux machine, and copying from a local disk to a nas-mounted directory.

    Using File::Copy       : 129 seconds
    system("cp blah blah") : 171 seconds
    Now I tried a different approach: @files is a list of the 250 files:

        my $cmd = "cd $olddirectory; tar -cvf - " . join(' ', @files)
                . " | gzip -c -1 | (cd $newdirectory; gzip -c -d | tar -xvf -)";
        system($cmd);

    Using gzip -9 : 64 seconds
    Using gzip -1 : 57 seconds
    Without gzip  : 52 seconds
    All times are for copying 250 small files totalling 1084k. I also found that times scaled proportionally with the number of files.
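    The tar-pipe trick above can be sketched self-contained like this (directory names and file contents are made up here, and the gzip stages are dropped since the uncompressed pipe was fastest in these tests):

    ```perl
    use strict;
    use warnings;
    use File::Temp qw(tempdir);

    # Stand-ins for $olddirectory and $newdirectory in the post above.
    my $old = tempdir(CLEANUP => 1);
    my $new = tempdir(CLEANUP => 1);

    # Create a couple of small files to copy.
    for my $name (qw(a.txt b.txt)) {
        open my $fh, '>', "$old/$name" or die "open: $!";
        print $fh "data\n";
        close $fh;
    }
    my @files = qw(a.txt b.txt);

    # One tar writes the file list to stdout; a second tar, running in
    # the destination directory, unpacks it from stdin.
    my $cmd = "cd $old; tar -cf - " . join(' ', @files)
            . " | (cd $new; tar -xf -)";
    system($cmd) == 0 or die "tar pipe failed: $?";

    print((-e "$new/a.txt" && -e "$new/b.txt") ? "ok\n" : "fail\n");
    ```

    Note that interpolating directory names straight into a shell command, as above and in the original, will break on paths containing spaces or shell metacharacters.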

    For what it's worth, rsync was copying at a rate of approximately 3 files per second, so the Perl/tar method is much faster. (I tried Perl in the first place because this directory contains 1.5 million files; rsync was taking more than a day just to collect file info...)

    In this case, it's apparently more important to minimize CPU usage than network usage (otherwise gzip -9 would have been faster).

    As always, your mileage may vary...