Murcia has asked for the wisdom of the Perl Monks concerning the following question:
Problem:
I want to download thousands of files from a server. The files are spread across several directories.
At the moment I do it with a system call to wget, downloading all the files in each directory one after another.
That takes time!

    my @dir = qw( lmo lin lmf bsu ssu sst );   # and more dir names
    foreach my $dir (@dir) {
        my $link = "ftp://ftp.<Path_to_dir>.jp/$dir/*";
        system("wget -nH -nd --timestamping $link");
    }

Should I fork? Does that make sense, and if so, how? Internet bandwidth is not a problem!
Thanks Murcia
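A minimal sketch of one way the loop could be parallelized, assuming the CPAN module Parallel::ForkManager is available; the limit of 4 concurrent processes is an arbitrary choice, and the host uses the same placeholder path as above:

    use strict;
    use warnings;
    use Parallel::ForkManager;

    my @dir = qw( lmo lin lmf bsu ssu sst );   # and more dir names
    my $pm  = Parallel::ForkManager->new(4);   # run at most 4 wget processes at once (assumed limit)

    foreach my $dir (@dir) {
        $pm->start and next;                   # parent: move on to the next dir
        my $link = "ftp://ftp.<Path_to_dir>.jp/$dir/*";
        system("wget -nH -nd --timestamping $link");
        $pm->finish;                           # child exits once its wget returns
    }
    $pm->wait_all_children;                    # block until every download has finished

Each child handles one directory and exits when its wget returns; the manager caps how many children run at once, so the machine is not flooded with processes while still overlapping the transfers.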
Replies are listed 'Best First'.
Re: file upload
by halley (Prior) on Sep 15, 2005 at 14:32 UTC

fork me? fork queue!
by LanceDeeply (Chaplain) on Sep 15, 2005 at 17:03 UTC

Re: file upload
by newroz (Monk) on Sep 15, 2005 at 15:03 UTC