in reply to Fast Processing

30-35 csv files ... each file is very big containing around 8-9 k lines.... taking time around 25-30 minutes

35 * 9000 = 315,000 lines. That is not big. The session below shows perl processing 40 million lines in about 11 seconds:

C:\test>wc -l bigfile
40000000 bigfile

[11:29:51.26] C:\test>perl -nlE"}{say $." bigfile
40000000

[11:30:02.41] C:\test>

So, the cause of your slow processing is not Perl, nor the size of those files, but whatever you are doing in your script. Throwing parallel processes or threads at it should be your last resort, not your first.

Your first resort should be to correct whatever is wrong with your code that is causing it to be so slow. The quickest way to do that would be to post the code and let us help you with it.


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
RIP an inspiration; A true Folk's Guy