You need to profile your code to see where it's spending the most time. Once you've done that, the solution will probably be obvious to you, but if it isn't, people here will be happy to help provided that you supply all the necessary information.
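If you haven't profiled Perl before, Devel::NYTProf (installable from CPAN; Devel::DProf is the older alternative that shipped with perl for years) makes this straightforward. A minimal sketch, assuming your script is called crunch.pl (a hypothetical name):

    # Run the script under the profiler; this writes ./nytprof.out
    perl -d:NYTProf crunch.pl

    # Turn ./nytprof.out into a browsable HTML report
    nytprofhtml

The report breaks time down per subroutine and per line, which is usually enough to spot where the time is really going.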
However, in my experience, when crunching text files in perl the bottleneck is reading and writing the disk, even when the perl code is really crufty. In that case the solution is to buy faster disks with bigger caches, and to arrange them so that they can be read in parallel without saturating the various buses. Your sysadmin will be happy to help with this. Note that you can read and write disks in parallel without having to parallelise your code; I leave figuring out how as an exercise for your sysadmin and operating system vendor.
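One quick way to test whether you really are I/O-bound on your own data (a sketch, not something from the original post): time a read-only pass over one of your files and compare it with the time your real processing takes. If the two are close, the disk, not your perl, is the limit. Time::HiRes ships with perl; the usage message is just for illustration.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Time::HiRes qw(time);

    # Read-only pass: every line is read and thrown away, so this
    # measures the disk (and OS cache), not your processing code.
    my $file = shift or die "usage: $0 FILE\n";

    my $t0 = time;
    open my $fh, '<', $file or die "can't open $file: $!";
    1 while <$fh>;    # read each line, do nothing with it
    close $fh;
    printf "read-only pass over %s: %.2f seconds\n", $file, time - $t0;

Run it a second time to see the effect of the OS cache; the first, cold run is the one that tells you about the disk.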
In reply to Re: Simultaneously process the multiple files at the same time by DrHyde
in thread Simultaneously process the multiple files at the same time by senthil_v