"The list with the smaller nodes will probably have as many as 1,000,000 nodes"

Even if you can keep a list that size in memory in Perl, you may want to reconsider whether you should.
For large amounts of data it usually makes sense to store the data temporarily on disk (or other media) and to process it in chunks. That approach draws far less on system resources and yields a more scalable solution.
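As a rough sketch of what chunked processing can look like (the file name, the chunk size and the process_chunk routine are all made up for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $file       = 'nodes.txt';  # hypothetical input file, one node per line
    my $chunk_size = 10_000;       # lines per chunk; tune to your memory budget

    open my $fh, '<', $file or die "Cannot open $file: $!";

    my @chunk;
    while (my $line = <$fh>) {
        push @chunk, $line;
        if (@chunk >= $chunk_size) {
            process_chunk(\@chunk);   # your per-node work goes here
            @chunk = ();              # release the memory before reading on
        }
    }
    process_chunk(\@chunk) if @chunk; # don't forget the final partial chunk
    close $fh;

    sub process_chunk {
        my ($nodes) = @_;
        # ... operate on the at most $chunk_size lines in @$nodes ...
    }

Only one chunk is ever resident, so memory use stays flat no matter how large the input grows.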
For a classic example of this type of solution, see the well-known mergesort algorithm: in its external form it can sort a virtually unlimited amount of data, because it is not limited by available system memory.
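To make that concrete, here is a minimal two-phase external mergesort sketch (the input and output file names and the run size are arbitrary, and the linear scan for the smallest head line is written for clarity rather than speed; a serious k-way merge would use a heap):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Temp qw(tempfile);

    # Phase 1: split the input into sorted "runs" that each fit in memory.
    sub make_runs {
        my ($in_file, $run_size) = @_;
        open my $in, '<', $in_file or die "Cannot open $in_file: $!";
        my @run_files;
        while (1) {
            my @buf;
            while (@buf < $run_size and defined(my $line = <$in>)) {
                push @buf, $line;
            }
            last unless @buf;
            my ($tmp_fh, $tmp_name) = tempfile();
            print {$tmp_fh} sort @buf;     # in-memory sort of one run
            close $tmp_fh;
            push @run_files, $tmp_name;
        }
        close $in;
        return @run_files;
    }

    # Phase 2: merge the sorted runs, holding only one line per run in memory.
    sub merge_runs {
        my ($out_file, @run_files) = @_;
        my @fhs   = map { open my $fh, '<', $_ or die $!; $fh } @run_files;
        my @heads = map { scalar <$_> } @fhs;
        open my $out, '>', $out_file or die "Cannot open $out_file: $!";
        while (grep { defined } @heads) {
            # pick the run whose current line sorts first
            my ($min) = sort { $heads[$a] cmp $heads[$b] }
                        grep { defined $heads[$_] } 0 .. $#fhs;
            print {$out} $heads[$min];
            $heads[$min] = readline $fhs[$min];   # advance that run
        }
        close $out;
    }

    my @runs = make_runs('nodes.txt', 100_000);
    merge_runs('nodes.sorted.txt', @runs);

At no point does the program hold more than one run (plus one line per open run) in memory, which is exactly why the technique scales beyond available RAM.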
In reply to Re: perl memory use question by varian
in thread perl memory use question by exodist