There's a sort method whose name escapes me (it's sufficiently uncommon; I believe it's called an external merge sort), where you break the records into several small files, sort each one separately, then combine the files 'slowly', sorting as you merge, until everything is sorted. You can then write it all back out as one large file, but the key is that you never handle all 500 megs during any single sort.
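Here's a minimal sketch of that approach in Perl. The filenames, the 100_000-line chunk size, and the plain lexical sort are all placeholder assumptions; it treats each newline-terminated line as one record:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Temp qw(tempfile);

    my $chunk_lines = 100_000;    # how many records fit comfortably in memory
    my @runs;                     # handles of the sorted temp files

    # Phase 1: split the big file into small sorted runs on disk.
    open my $in, '<', 'big.dat' or die "big.dat: $!";
    while (!eof $in) {
        my @buf;
        while (@buf < $chunk_lines and defined(my $line = <$in>)) {
            push @buf, $line;
        }
        my ($fh) = tempfile(UNLINK => 1);
        print $fh sort @buf;      # only one small chunk is in memory here
        seek $fh, 0, 0;           # rewind so the merge can read it back
        push @runs, $fh;
    }
    close $in;

    # Phase 2: merge the runs, always emitting the smallest head line.
    open my $out, '>', 'sorted.dat' or die "sorted.dat: $!";
    my @head = map { scalar readline $_ } @runs;
    while (grep { defined } @head) {
        my ($min) = sort { $head[$a] cmp $head[$b] }
                    grep { defined $head[$_] } 0 .. $#head;
        print $out $head[$min];
        $head[$min] = readline $runs[$min];   # refill from that run
    }
    close $out;

The merge phase holds only one line per run in memory, which is the whole point.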
Another possible option is to read through the file once and extract, for each item you want to sort, the key to sort on and the position of the item in the file (in bytes). Put these all into a hash, then sort the hash as appropriate. Then reopen the large file and, using the positions, copy each record in sorted order into a second file. The only drawback here is that you need another 500 megs of free space, since you HAVE to write to a duplicate file; sorting in place would screw up the position info.
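A sketch of this second approach, under the same assumptions as above (line records, 'big.dat'/'sorted.dat' as placeholder names, and the first whitespace-separated field as the sort key):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Pass 1: record each line's sort key and its byte offset in the file.
    # The key extraction here is an assumption; adjust it for the real format.
    my %key_at;                   # byte offset => sort key
    open my $in, '<', 'big.dat' or die "big.dat: $!";
    while (1) {
        my $pos  = tell $in;
        my $line = <$in>;
        last unless defined $line;
        my ($key) = split ' ', $line;
        $key_at{$pos} = $key;
    }

    # Sort the offsets by their keys; only keys and offsets live in memory.
    my @order = sort { $key_at{$a} cmp $key_at{$b} } keys %key_at;

    # Pass 2: seek to each record in sorted order and copy it to a NEW
    # file -- writing in place would clobber records not yet read.
    open my $out, '>', 'sorted.dat' or die "sorted.dat: $!";
    for my $pos (@order) {
        seek $in, $pos, 0 or die "seek: $!";
        my $line = <$in>;
        print $out $line;
    }
    close $_ for $in, $out;

Note that the hash holds only keys and offsets, so memory use scales with the number of records rather than the full 500 megs.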
Now, both approaches assume these are flat files (that is, each record occupies a contiguous run of bytes). If you have something (though I can't imagine what) where the data for one item is scattered throughout the file, neither of these will work.