I regularly do analysis by reading many large text files into
hashes and then running the actual analysis on those hashes.
Every time I run the scripts, I have to read the text files
sequentially before executing the other subroutines.
Is there a way to fork out my "text reading subroutines" so
that all the text files are read in parallel processes, and
at the end use all the loaded hashes in my master script
to do my analysis?
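Since the question doesn't name a language, here is a minimal sketch of the pattern in Python, using `multiprocessing.Pool`. The key constraint is that forked children don't share memory with the parent, so each worker parses one file into its own dict and the results are serialized back to the master, which merges them before the analysis step. The `parse_file` logic (one "key value" pair per line) is a placeholder assumption; substitute your own parser.

```python
from multiprocessing import Pool

def parse_file(path):
    # Hypothetical parser: assumes one "key value" pair per line.
    data = {}
    with open(path) as fh:
        for line in fh:
            key, _, value = line.partition(" ")
            data[key] = value.strip()
    return data

def load_all(paths):
    # Each file is parsed in its own worker process; the resulting
    # dicts are sent back to the master, which merges them.
    with Pool() as pool:
        merged = {}
        for d in pool.map(parse_file, paths):
            merged.update(d)
    return merged
```

The master script then calls `load_all([...])` with the list of file paths and runs the analysis on the merged dict. In Perl, `Parallel::ForkManager` with its data-retrieval callback offers the same fork-then-collect structure.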