in reply to Curious Observation with File::Copy;

Another method worth benchmarking is tunneling tar through ssh.

This will pull the contents of the specified remote directory into the current directory.

    ssh user@foo "cd /dir/to/copy ; tar cf - ." | tar xvfBp -

I used to run multiple instances of this to get maximum throughput and it works really well. I can't remember exact numbers, but I copied around 60+ GB with 10 instances in pretty short order. It performed substantially better than scp. I'm not exactly sure why, but I figured it had something to do with the fact that I was copying mailboxes: the 60+ GB was made up of hundreds of thousands of individual files averaging a couple of KB each.
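You can try the same pipeline locally by dropping the ssh hop, which makes it easy to experiment with before pointing it at a remote box. A minimal sketch (the directories and file names here are made up for illustration; over the network you'd wrap the left-hand side in ssh exactly as shown above):

```shell
#!/bin/sh
# Local equivalent of the ssh/tar pipeline: the first tar writes an
# archive to stdout, the second unpacks it from stdin. Directory and
# file names are illustrative only.
set -e
src=$(mktemp -d)
dst=$(mktemp -d)

# Create a couple of small files to copy.
echo "hello" > "$src/a.txt"
mkdir -p "$src/sub"
echo "world" > "$src/sub/b.txt"

# The pipe: pack "." relative to $src, unpack (preserving permissions
# with -p) into $dst.
( cd "$src" && tar cf - . ) | ( cd "$dst" && tar xpf - )
```

Running several of these pipelines in parallel over disjoint subdirectories is all the "multiple instances" trick above amounts to.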

~~
naChoZ

Replies are listed 'Best First'.
Re: Re: Curious Observation with File::Copy;
by hossman (Prior) on Jun 27, 2003 at 22:20 UTC

    There's really not a lot of difference between that and using scp -r

      False. If it were, I would've just used scp; scp was significantly slower. Besides, tar is quite a bit more flexible.

      ~~
      naChoZ