in reply to sorting large files

Well, one kludgey way to do this would be to read in x lines at a time, sort them, and write them out to a new file, where x is the maximum number of lines Perl can comfortably hold in memory at once. Then you run a series of line-by-line merges on the resulting files until they're all merged back into one file, at which point you can unlink everything except the now-sorted file.

Assuming 500 MB of data and a reasonable amount of memory, you ought to be able to sort 50-100 MB chunks at a time, which leaves at most ten chunk files and therefore at most nine pairwise merges (combining n files two at a time takes n-1 merges).
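
A rough sketch of what that might look like (the file names, chunk size, and sub names here are just illustrative assumptions, not a tested implementation):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Assumed chunk size: however many lines fit comfortably in memory.
my $chunk_lines = 1_000_000;

# Phase 1: read $chunk_lines at a time, sort in memory, write a temp file.
sub split_and_sort {
    my ($infile) = @_;
    open my $in, '<', $infile or die "open $infile: $!";
    my @tempfiles;
    my @chunk;
    my $n = 0;
    my $flush = sub {
        return unless @chunk;
        my $name = "chunk" . $n++ . ".tmp";
        open my $out, '>', $name or die "open $name: $!";
        print $out sort @chunk;      # plain lexical sort of whole lines
        close $out or die "close $name: $!";
        push @tempfiles, $name;
        @chunk = ();
    };
    while (my $line = <$in>) {
        push @chunk, $line;
        $flush->() if @chunk >= $chunk_lines;
    }
    $flush->();                      # don't forget the final partial chunk
    close $in;
    return @tempfiles;
}

# Phase 2: line-by-line merge of two sorted files into one.
sub merge_pair {
    my ($left, $right, $outfile) = @_;
    open my $fa,  '<', $left    or die "open $left: $!";
    open my $fb,  '<', $right   or die "open $right: $!";
    open my $out, '>', $outfile or die "open $outfile: $!";
    my $x = <$fa>;
    my $y = <$fb>;
    while (defined $x && defined $y) {
        if ($x le $y) { print $out $x; $x = <$fa>; }
        else          { print $out $y; $y = <$fb>; }
    }
    # drain whichever file still has lines left
    while (defined $x) { print $out $x; $x = <$fa>; }
    while (defined $y) { print $out $y; $y = <$fb>; }
    close $_ for $fa, $fb, $out;
}

# Merge pairs until one file remains, unlinking temps as you go.
sub merge_all {
    my @files = @_;
    my $pass = 0;
    while (@files > 1) {
        my ($left, $right) = splice @files, 0, 2;
        my $merged = "merge" . $pass++ . ".tmp";
        merge_pair($left, $right, $merged);
        unlink $left, $right;        # done with these two
        push @files, $merged;
    }
    return $files[0];                # the now-sorted file
}
```

The comparison in merge_pair is a plain string `le`, so you'd swap in whatever comparison your in-memory sort uses if you're sorting numerically or by a key field.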