I usually use a script from the Perl Cookbook (first edition) when I need to randomize the lines in a file. It works great. The script loads the entire file into memory as an array, then uses a Fisher-Yates shuffle to randomize the array elements before printing them out.
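For reference, the recipe is essentially this (my paraphrase from memory, not the Cookbook's exact code):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my @lines = <>;                 # slurp the whole file into memory

    # Fisher-Yates shuffle: walk backwards, swapping each
    # element with a randomly chosen earlier (or same) slot.
    for (my $i = @lines; --$i; ) {
        my $j = int rand($i + 1);
        @lines[$i, $j] = @lines[$j, $i];
    }

    print @lines;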
But this time, my file is much larger and I'm wondering if there is a better method to use. The file has around 3.7 million lines and is 227 megabytes in size. My machine has 512MB of memory, and I was afraid this would bring it to its knees. Sure enough, when I tried, the script quickly died with an "out of memory" error. Is there a better way to randomize the lines in a file, one that works for files that would challenge the memory capacity of a machine? I can do the coding; I'm just wondering what approaches people would suggest.
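One idea I've been toying with, just to frame the question, is to shuffle byte offsets instead of the lines themselves, so only one integer per line has to stay in memory. A rough, untested sketch of what I mean (the usage and variable names are mine):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $file = shift or die "usage: $0 file\n";
    open my $fh, '<', $file or die "can't open $file: $!\n";

    # Pass 1: remember where each line starts, not the line itself.
    my @offsets = (0);
    while (<$fh>) {
        push @offsets, tell $fh;    # start of the *next* line
    }
    pop @offsets;                   # last entry is EOF, not a line

    # Fisher-Yates shuffle of the offsets.
    for (my $i = @offsets; --$i; ) {
        my $j = int rand($i + 1);
        @offsets[$i, $j] = @offsets[$j, $i];
    }

    # Pass 2: seek to each offset in shuffled order, print that line.
    for my $off (@offsets) {
        seek $fh, $off, 0 or die "seek failed: $!\n";
        print scalar <$fh>;
    }

Even this keeps 3.7 million scalars around, which isn't free in Perl; a tighter version could pack the offsets into one string with pack 'N*'. But I don't know if this is the best direction, hence the question.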