in reply to Iterating through HUGE FILES

I had a very similar problem working with a huge text file myself once. It was about 50 gigs, and I ran into what felt like a limit compiled into perl or my system libraries. I tried all sorts of workarounds to keep the processing incremental.

In the end, my solution was an external shell script that fed the lines to perl one at a time. It was slower than running the loop in perl would have been, but it worked. Also, check whether the problem goes away if you pipe the input in through cat or similar: piping makes it impossible to seek within the input, and I believe perl treats a pipe differently from a regular file.
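For what it's worth, the shape of that workaround was roughly this. This is a sketch, not my original script: the perl one-liner is just a stand-in for whatever per-line processing you need, and huge_file.txt is a made-up filename.

```shell
#!/bin/sh
# Let the shell, not perl, own the read loop over the big file.
# "huge_file.txt" is a hypothetical name for your input.
while IFS= read -r line; do
    # hand exactly one line to perl, so perl never sees the whole file;
    # this stand-in just prints each line's length
    printf '%s\n' "$line" | perl -ne 'chomp; print length($_), "\n";'
done < huge_file.txt
```

Spawning a perl process per line is exactly why it was slow; batching lines would help, but the point is that the read loop lives outside perl.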

Sorry for all the hand-waving, but when you are hitting bugs that shouldn't occur, you have to be willing to try solutions that shouldn't work.