ashokpj has asked for the wisdom of the Perl Monks concerning the following question:

I have a scenario where I need to query 25 remote databases and process the results. To run the queries in parallel, I planned to use POE (POE::Component::EasyDBI) to fetch the data, but each query returns millions of records and the script halts with an "Out of memory" error. Even without POE, connecting to a single remote server with $dbh->{'mysql_use_result'} = 1, I still run out of memory. I could add a range condition to the query and run it several times, but the queries rely on various composite indexes, so that takes a lot of time overall. Please suggest a good solution.
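
For reference, a minimal sketch of what row-at-a-time fetching with mysql_use_result looks like. The host, credentials, table, and file names below are placeholders; the key point is that mysql_use_result only stops the client library from buffering the whole result set, so the script must also fetch and write one row at a time rather than accumulate rows in Perl:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Placeholder DSN and credentials -- substitute your own.
    my $dbh = DBI->connect(
        'DBI:mysql:database=mydb;host=remote.example.com',
        'user', 'password',
        { RaiseError => 1 },
    );

    my $sth = $dbh->prepare('SELECT id, payload FROM big_table');

    # mysql_use_result streams rows from the server instead of
    # buffering the entire result set client-side (the default).
    $sth->{mysql_use_result} = 1;
    $sth->execute;

    open my $out, '>', 'results.tsv' or die "open results.tsv: $!";

    # Fetch one row at a time and write it straight to disk, so
    # memory use stays flat however many rows the query returns.
    while ( my $row = $sth->fetchrow_arrayref ) {
        print {$out} join( "\t", @$row ), "\n";
    }

    close $out;
    $dbh->disconnect;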

Re: Remote database and process
by BrowserUk (Patriarch) on Jun 07, 2010 at 06:19 UTC

    What would you be doing with the millions of records once you received them?

    E.g. processing them immediately, or storing them to local disk for further processing?


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      Hi, I am planning to write the results to a local file for further processing.

        Then I would suggest you look at the mysql command and its --quick option.

        If you combine that with input/output redirection (mysql < script.sql > results), the memory usage of the local process will be minimal, and the results will be retrieved and written to file far more quickly than you will ever achieve with Perl & DBI. (Sorry if that offends anyone, but it is true!)


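        For illustration, a rough sketch of driving that across all 25 servers at once from Perl, forking one mysql --quick client per host. The host names, query, and output file names are placeholders, and credentials are assumed to come from ~/.my.cnf:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Placeholder host list -- substitute your 25 real servers.
        my @hosts = map { "db$_.example.com" } 1 .. 25;

        for my $host (@hosts) {
            my $pid = fork;
            die "fork failed: $!" unless defined $pid;

            if ( $pid == 0 ) {    # child process
                # Send the mysql client's output straight to a per-host file.
                open STDOUT, '>', "$host.tsv" or die "open $host.tsv: $!";

                # --quick fetches and prints rows one at a time instead of
                # buffering the whole result set; --batch gives plain
                # tab-separated output.
                exec 'mysql', '--quick', '--batch',
                    '-h', $host, 'mydb',
                    '-e', 'SELECT id, payload FROM big_table';
                die "exec mysql failed: $!";
            }
        }

        # Reap one child per host.
        wait for @hosts;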