DeepakS has asked for the wisdom of the Perl Monks concerning the following question:

Hi Perl Monks, I am a newbie to Perl. I have a Perl script (using LWP::UserAgent) that holds a list of providers. For each provider, the script logs into their FTP server, picks up the data files (about 20 KB each), and downloads all of them. This happens one provider at a time, and it takes a very long time when a provider has a huge number of files.

I would like some ideas for decreasing the execution time, e.g. by running parallel threads or downloading multiple files at once. Please guide me as to which modules and functions to use.

Replies are listed 'Best First'.
Re: Optimizing FTP Downloads
by BrowserUk (Patriarch) on Aug 18, 2011 at 07:23 UTC
    I would like some ideas for decreasing the execution time, e.g. by running parallel threads or downloading multiple files at once. Please guide me as to which modules and functions to use.

    You've picked out "parallel threads" as a possibility to solve your task. Now you ask "what modules and functions". Have you tried typing 'threads' into CPAN?


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
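Re^2: Optimizing FTP Downloads

    To make the threads suggestion concrete, here is a minimal sketch of a worker-pool using threads, Thread::Queue and Net::FTP. The host names, credentials and directory below are made-up placeholders; the undef-sentinel shutdown is used so it also works with older Thread::Queue releases that lack end().

    #!/usr/bin/perl
    # Sketch only: N worker threads pull providers off a shared queue and
    # download every file in that provider's directory over FTP.
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;
    use Net::FTP;

    # Placeholder provider list -- replace with your real hosts/credentials.
    my @providers = (
        { host => 'ftp.example-a.com', user => 'anonymous', pass => 'me@example.com', dir => '/data' },
        { host => 'ftp.example-b.com', user => 'anonymous', pass => 'me@example.com', dir => '/data' },
    );

    my $n_workers = 4;
    my $queue     = Thread::Queue->new;
    $queue->enqueue(@providers);
    $queue->enqueue((undef) x $n_workers);   # one undef sentinel stops each worker

    my @workers = map {
        threads->create(sub {
            while (defined(my $p = $queue->dequeue)) {
                my $ftp = Net::FTP->new($p->{host}, Timeout => 30)
                    or warn "connect $p->{host} failed: $@" and next;
                $ftp->login($p->{user}, $p->{pass})
                    or warn "login to $p->{host} failed" and next;
                $ftp->binary;
                $ftp->cwd($p->{dir});
                # Prefix local names with the host so parallel workers
                # writing into the same directory cannot clobber each other.
                $ftp->get($_, "$p->{host}-$_") for $ftp->ls;
                $ftp->quit;
            }
        });
    } 1 .. $n_workers;

    $_->join for @workers;

    Note that the unit of parallelism here is a provider, not a file; if one provider dominates, you would instead enqueue individual (provider, filename) pairs.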
Re: Optimizing FTP Downloads
by cjb (Friar) on Aug 18, 2011 at 07:51 UTC

    You might have some luck with Parallel::ForkManager; the example code provided in its documentation does pretty much what you're wanting to do.
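
    Roughly, the pattern looks like this -- a sketch only, with placeholder hosts/credentials, forking one child per provider with a cap on concurrency:

    #!/usr/bin/perl
    # Sketch only: one forked child per provider via Parallel::ForkManager.
    use strict;
    use warnings;
    use Net::FTP;
    use Parallel::ForkManager;

    # Placeholder provider list -- replace with your real hosts/credentials.
    my @providers = (
        { host => 'ftp.example-a.com', user => 'anonymous', pass => 'me@example.com', dir => '/data' },
        { host => 'ftp.example-b.com', user => 'anonymous', pass => 'me@example.com', dir => '/data' },
    );

    my $pm = Parallel::ForkManager->new(5);   # at most 5 children at once

    for my $p (@providers) {
        $pm->start and next;                  # parent: move on to next provider

        # --- child process from here on ---
        my $ftp = Net::FTP->new($p->{host}, Timeout => 30)
            or die "connect $p->{host} failed: $@";
        $ftp->login($p->{user}, $p->{pass}) or die "login to $p->{host} failed";
        $ftp->binary;
        $ftp->cwd($p->{dir});
        $ftp->get($_, "$p->{host}-$_") for $ftp->ls;
        $ftp->quit;

        $pm->finish;                          # child exits here
    }
    $pm->wait_all_children;

    Since each child is a separate process, there is no shared state to worry about, which tends to make this simpler than threads for fire-and-forget downloads.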