In reply to "merge multiple files giving out of memory error"

I would add some code that reports on the console how many records have been read before the program "hangs". Below I print $rec_count whenever it is evenly divisible by 1,000. Pick a number appropriate for your data...
    my $rec_count = 0;
    while (<>) {
        $rec_count++;
        print STDERR "rec=$rec_count\n" if ($rec_count % 1000 == 0);
        ....
    }
From this debug output, you can estimate how much of the entire data set got read before the program "hung" while creating the %seen hash. Right now we know nothing about that.

Your first while loop creates a HoA, a Hash of Arrays. In general, this will require a lot more memory than a simple hash_key => "string value" mapping. If memory is indeed the limit, then instead of a HoA, do more processing up front and put up with the associated hassle of modifying the "string value". A sketch of that idea follows.
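Here is a minimal sketch of the string-value approach (not your actual code). It assumes the first loop currently does something like push @{ $seen{$key} }, $value; the comma split and the "|" separator are only placeholders for whatever your real record layout and delimiter are.

    my %seen;
    while (<>) {
        chomp;
        my ($key, $value) = split /,/, $_, 2;    # hypothetical record layout

        # Append to one scalar per key instead of keeping an array ref.
        # A single string per key avoids the per-array overhead of a HoA.
        if (exists $seen{$key}) {
            $seen{$key} .= "|$value";
        }
        else {
            $seen{$key} = $value;
        }
    }
    # Later, recover the individual values with:
    #   my @values = split /\|/, $seen{$key};

The trade-off is exactly the "hassle" mentioned above: you have to split and re-join the string whenever you need the individual values, but each hash entry is one scalar instead of an array reference plus its elements.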

The first question and objective is to get your required data into a structure that "fits" into memory. If that's not possible, then there are other solutions.

Update: This means getting your first "while" loop not to hang. The second loop does some things, like "sort keys", that can take a lot of memory and which your program doesn't absolutely have to do (there are other ways to accomplish the same thing). See the sketch below.
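For example, one such alternative, assuming the output doesn't actually need to be in sorted order (or that you sort it afterwards with the external sort utility): "sort keys %seen" builds a complete, sorted list of every key in memory before the loop even starts, whereas each() walks the hash one pair at a time.

    # Walk the hash one pair at a time instead of materializing a sorted key list.
    # $value is whatever was stored per key (e.g. the joined string from the
    # sketch above).
    while (my ($key, $value) = each %seen) {
        print "$key,$value\n";
    }

If sorted output is required, printing unsorted and piping the result through the system's sort command keeps that memory cost outside the Perl process.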