in reply to Getting/handling big files w/ perl

My guess is that you should throw silicon at it. The bottleneck, whatever it is, is probably hardware, not software: download speed, the fact that you are using wget (and thus HTTP encoding/decoding), or the speed of the storage subsystem. Parallelizing the work across CPU cores, even though there is more than one, probably will not improve the situation. Measure to prove otherwise before proceeding.
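A minimal sketch of what "measure first" could look like, timing the network leg and the network-plus-disk leg separately (the URL and filename below are placeholders, not taken from the original thread):

    use strict;
    use warnings;
    use Time::HiRes qw(time);

    my $url = 'http://example.com/bigfile.dat';    # placeholder URL

    # Network leg only: download but discard the data.
    my $t0 = time;
    system 'wget', '-q', '-O', '/dev/null', $url;
    my $net = time - $t0;

    # Full path: download and write to the SSD.
    $t0 = time;
    system 'wget', '-q', '-O', 'bigfile.dat', $url;
    my $full = time - $t0;

    printf "network only: %.1fs   network+disk: %.1fs\n", $net, $full;

If the two numbers come out close, the disk is not the problem, and throwing parallel CPU work at it is unlikely to help either.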

Re^2: Getting/handling big files w/ perl
by karlgoethebier (Abbot) on Nov 16, 2014 at 19:27 UTC
    "...throw silicon at it..."

    He already does: "8-core 64-Gb MacPro w/ 2 Tb of SSD".

    Regards, Karl

    P.S.: I wish for this nice gear for Christmas ;-)

    «The Crux of the Biscuit is the Apostrophe»

Re^2: Getting/handling big files w/ perl
by BrowserUk (Patriarch) on Nov 16, 2014 at 20:45 UTC
    My guess is that you should throw silicon at it.

    So intuitive you are. Not!

    He has the hardware. What he's asking is how he can make good use of it.


    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.