nimdokk has asked for the wisdom of the Perl Monks concerning the following question:
I had done some tests a while back benchmarking how long File::Copy takes to copy large files compared with our third-party software. File::Copy usually came out on top, which is part of the reason we are looking at trying this with Perl.
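For context, this is roughly the kind of timing harness I used; the file paths and iteration count here are made up for illustration:

    use strict;
    use warnings;
    use Benchmark qw(timethis);
    use File::Copy qw(copy);

    # Hypothetical paths -- substitute a real large file to test with.
    my ($src, $dst) = ('/data/bigfile.dat', '/tmp/bigfile.dat');

    # Time repeated copies of the same large file.
    timethis(10, sub {
        copy($src, $dst) or die "copy failed: $!\n";
        unlink $dst;    # remove the copy so each pass starts clean
    });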
My question is this: has anyone set up something like this before, and are there other ways to do it? What I have set up is one script that takes a parameter indicating which sub-directory structure needs to be copied. It then loops through the subdirectories, copying all the files sequentially to the new location and creating the new directories as it goes (a sketch follows below). I looked at File::Xcopy, but I'm not sure it would help us much, and since I've never used it, I don't know whether its copies are optimized the way they seem to be with File::Copy.
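To make the approach concrete, here is a minimal sketch of that loop; the source and destination roots are hypothetical:

    use strict;
    use warnings;
    use File::Find;
    use File::Copy qw(copy);
    use File::Path qw(mkpath);
    use File::Spec;

    # One parameter: which sub-directory tree to copy.
    my $subdir = shift @ARGV or die "Usage: $0 <subdir>\n";

    # Hypothetical roots -- substitute the real locations.
    my $src = File::Spec->catdir('/data/source', $subdir);
    my $dst = File::Spec->catdir('/data/dest',   $subdir);

    # File::Find visits directories before their contents, so each
    # target directory exists before we copy files into it.
    find(sub {
        my $rel    = File::Spec->abs2rel($File::Find::name, $src);
        my $target = File::Spec->catfile($dst, $rel);
        if (-d $File::Find::name) {
            mkpath($target) unless -d $target;   # create dirs as we go
        }
        else {
            copy($File::Find::name, $target)
                or warn "copy failed for $File::Find::name: $!\n";
        }
    }, $src);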
Thoughts would be appreciated.
Thanks.
Replies are listed 'Best First'.

Re: Data Copying
by dragonchild (Archbishop) on Jan 18, 2005 at 14:59 UTC

Re: Data Copying
by Anonymous Monk on Jan 18, 2005 at 15:39 UTC

Re: Data Copying
by TedPride (Priest) on Jan 18, 2005 at 15:25 UTC