Hi ... My requirement is to parse two huge (GB-sized) input files and join them into one on a common field. Because the files are so large, the parsing alone takes a very long time, and I still have the comparison and the output redirection left to do.
The two input files are on a Linux server. I have tried hashes, sorting, and the Tie::File module, but all of them ended in "Out of Memory". I also cannot use a database for this requirement.
Is there a way I can speed up the process? Please help; I have been struggling with this for the past week.
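To make the question concrete, here is a minimal sketch of the kind of memory-bounded sort-merge join I am hoping for (the file names, the tab-separated layout, and the key-in-first-column assumption are placeholders for my real data):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder inputs: tab-separated, join key in the first column.
my ($file_a, $file_b) = ('a.txt', 'b.txt');

# GNU sort does an external merge sort, spilling to temporary files
# on disk, so it handles GB-sized inputs without exhausting RAM
# (unlike sorting the whole file inside Perl).
for my $f ($file_a, $file_b) {
    system('sort', '-t', "\t", '-k1,1', '-o', "$f.sorted", $f) == 0
        or die "sort of $f failed: $?";
}

open my $fh_a, '<', "$file_a.sorted" or die $!;
open my $fh_b, '<', "$file_b.sorted" or die $!;
open my $out,  '>', 'joined.txt'     or die $!;

my $line_a = <$fh_a>;
my $line_b = <$fh_b>;

# Merge join: only one line from each file is in memory at a time.
# Advance whichever side has the smaller key; emit a combined record
# on a match. Assumes keys are unique within each file; runs of
# duplicate keys would need extra buffering.
while (defined $line_a and defined $line_b) {
    chomp(my $rec_a = $line_a);
    chomp(my $rec_b = $line_b);
    my ($key_a, $rest_a) = split /\t/, $rec_a, 2;
    my ($key_b, $rest_b) = split /\t/, $rec_b, 2;

    if    ($key_a lt $key_b) { $line_a = <$fh_a>; }
    elsif ($key_a gt $key_b) { $line_b = <$fh_b>; }
    else {
        print {$out} join("\t", $key_a, $rest_a // '', $rest_b // ''), "\n";
        $line_a = <$fh_a>;
        $line_b = <$fh_b>;
    }
}
```

If no per-record massaging is needed in Perl, I believe the standalone `join` command on the two pre-sorted files would do the same merge. Is this the right direction, or is there something faster?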