Problem:
I want to download thousands of files from a server. The files are spread across several directories. Right now I do it with a system call to wget, downloading all the files in each directory one after another, and that takes time!

    my @dir = qw(lmo lin lmf bsu ssu sst);   # and more dir names
    foreach my $dir (@dir) {
        my $link = "ftp://ftp.<Path_to_dir>.jp/$dir/*";
        system("wget -nH -nd --timestamping $link");
    }
Should I fork the downloads? Does that make sense, and if so, how? Internet bandwidth is not a problem!
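If forking is the way to go, something like this sketch is what I have in mind. It is untested, it assumes the CPAN module Parallel::ForkManager is installed, and the limit of 4 concurrent downloads is just a guess on my part:

    use strict;
    use warnings;
    use Parallel::ForkManager;

    my @dir = qw(lmo lin lmf bsu ssu sst);   # and more dir names

    # run at most 4 wget processes at once (arbitrary limit, tune as needed)
    my $pm = Parallel::ForkManager->new(4);

    foreach my $dir (@dir) {
        $pm->start and next;                 # parent: spawn a child, go to next dir
        my $link = "ftp://ftp.<Path_to_dir>.jp/$dir/*";
        system("wget -nH -nd --timestamping $link");
        $pm->finish;                         # child exits after its download
    }
    $pm->wait_all_children;                  # wait until every download is done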
Thanks, Murcia