In other words, instead of sorting after populating the file with all of the hash keys, do the sorting one element at a time as each hash key is added to the file.

From what I understand, a huge hash structure already exists, and foreach (keys %hash) builds a list of all the hash keys, which essentially doubles the amount of memory required. My question is how to spew all of the keys into a file without building an intermediate structure that contains all of them. I suspect there is a way to do that. If so, the sort part belongs to another process that will release its memory when done. The Perl hash value assignments of 1,2,3,4 will cause %hash to grow, but only as much as needed, and presumably by less than twice the storage required for the keys.
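A minimal sketch of the idea, assuming the goal is just to stream keys to a file: each() walks the hash's internal iterator one entry at a time, so it never flattens the key list the way keys %hash in list context does. The file name keys.txt and the sample data are placeholders for illustration.

```perl
use strict;
use warnings;

# Stand-in for the "huge hash" in the thread.
my %hash = map { $_ => 1 } qw(delta alpha charlie bravo);

# Stream the keys out one at a time. each() advances the hash's
# internal iterator, so no full list of keys is built in memory,
# unlike "for my $k (keys %hash)".
open my $fh, '>', 'keys.txt' or die "open: $!";
while ( my ($key) = each %hash ) {
    print {$fh} "$key\n";
}
close $fh or die "close: $!";

# The sort can then belong to a separate process whose memory is
# released when it exits, e.g. the system sort(1):
#   system('sort', '-o', 'keys.sorted', 'keys.txt');
```

Note that you must not add or delete keys while iterating with each(), or the iterator's behavior is undefined; reading and overwriting values of existing keys is fine.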
In reply to Re^5: In-place sort with order assignment
by Marshall
in thread In-place sort with order assignment
by BrowserUk