I have a Perl script which first fetches one field and its corresponding value from a 1 GB input file (around 150,000 lines) and stores them in a hash. After processing this file, it takes another 16 GB input file (around 30 million lines), reads it line by line, compares each line against the hash keys, and prints the corresponding hash value to the output file along with a few other required fields, creating a 2 GB output file. Now I have been asked to add 4 new fields to the report, which requires 4 more hashes in the script, but when I added them and tested the script it caused a severe memory issue on the server. Current memory details on my Linux server:
# free
             total       used       free     shared    buffers     cached
Mem:       8158176    8124068      34108          0      13248    7928420
-/+ buffers/cache:     182400    7975776
Swap:      2104472     103976    2000496
I need to know how I can resolve this memory issue and, if needed, how much more RAM I should add to the server. I know the information I have provided may be vague, so please let me know what else you need (shall I paste the complete script here?). Please help.
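For reference, here is a minimal sketch of the structure described above. The file names, delimiter, and field positions are placeholder assumptions, not taken from the actual script:

#!/usr/bin/perl
use strict;
use warnings;

my %lookup;   # key field => value field from the smaller input file

# Pass 1: load the ~1 GB / ~150,000-line file into a hash.
open my $small_fh, '<', 'small_input.txt' or die "open small_input.txt: $!";
while ( my $line = <$small_fh> ) {
    chomp $line;
    my ( $key, $value ) = ( split /\|/, $line )[ 0, 1 ];   # assumed layout
    $lookup{$key} = $value;
}
close $small_fh;

# Pass 2: stream the ~16 GB / ~30-million-line file, look up each key,
# and write the matching value plus the other required columns.
open my $big_fh, '<', 'big_input.txt'  or die "open big_input.txt: $!";
open my $out_fh, '>', 'report_out.txt' or die "open report_out.txt: $!";
while ( my $line = <$big_fh> ) {
    chomp $line;
    my ( $key, @rest ) = split /\|/, $line;
    next unless exists $lookup{$key};
    print {$out_fh} join( '|', $key, $lookup{$key}, @rest ), "\n";
}
close $big_fh;
close $out_fh;

The new requirement would add four more hashes of the same kind alongside %lookup, which is where the memory problem appears.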