Original poster here.
All is well. Used a combination of shell and perl to get the job done. Here's a rundown of the final solution:
- Open a new file for writing on another filesystem. Go through each input file and rewrite its lines to the new file, using printf and int(rand(1_000_000)) to prefix each line with a random number.
- Use the *nix sort command to sort this file into a new one on a different filesystem, with the -T flag pointing sort's temporary files at yet another filesystem. This took 45 minutes, but worked perfectly.
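Something like this (paths are placeholders; tmpdir stands in for the third filesystem that -T pointed at in the actual run):

```shell
# Hypothetical decorated input: "key<TAB>payload" lines, keys not yet in order.
printf '42\tb\n07\ta\n99\tc\n' > decorated.txt

# -T redirects sort's spill files so temp I/O doesn't hit the input/output disks.
# -n sorts on the leading numeric key; with fixed-width keys, plain sort works too.
mkdir -p tmpdir
sort -n -T tmpdir -o sorted.txt decorated.txt
```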
(I tried combining the sort and split steps by having perl parse the large file and write to separate files, but it ran out of memory.)
- Split the large sorted file up using "split -l 250000 sortedfile myfiles_"
- Have a perl script open each chunk, strip the random number from the start of each line, and write the result to the final files.
This worked fine - no memory problems, little I/O impact, and finished in a reasonable amount of time (< 4 hours). Thanks everyone!