in reply to huge memory usage = software exception

I have had some experience with large files, and at the time tilly provided the golden tip: BerkeleyDB. It's available on both win32 and *nix. You can read all about my quest over here. Even for data that fits in memory, Berkeley's BTree beats perl's sort by far for large numbers of items (beats it in terms of both memory and CPU ;-).
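To make that concrete, here is a minimal sketch of the BTree approach using the DB_File module (the filename `sorted.db` and the sample data are my own; this assumes Berkeley DB and DB_File are installed). The trick is that a hash tied to a DB_BTREE keeps its keys on disk in sorted order, so iterating over the tied hash is effectively an external sort with bounded memory:

```perl
use strict;
use warnings;
use Fcntl;      # for O_RDWR, O_CREAT
use DB_File;    # for $DB_BTREE

# Tie a hash to an on-disk BTree. Data lives on disk, not in RAM.
my %tree;
my $db = tie %tree, 'DB_File', 'sorted.db', O_RDWR|O_CREAT, 0666, $DB_BTREE
    or die "Cannot open sorted.db: $!";

# Insert items in arbitrary order; the BTree keeps them sorted.
$tree{ sprintf "%08d", $_ } = "value $_" for (5, 3, 9, 1);

# keys() on a DB_BTREE-tied hash returns keys in sorted order.
for my $key (keys %tree) {
    print "$key => $tree{$key}\n";
}

undef $db;
untie %tree;
unlink 'sorted.db';
```

Unlike perl's in-memory sort, this never needs the whole data set in RAM at once, which is exactly what matters once you approach the sizes discussed here.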

Having said that, my experience on Linux is that perl nicely dies with an 'out of memory' the moment I exceed my RAM + swap. But that may be different with ActiveState.

Another issue is how your data is organized. Maybe you have an array of chunks of 1 MB each; in that case your per-element memory overhead is small. However, if you have an array of 2000 million tiny (2-bit) items, you need at least 80 GB of memory, because the per-element overhead dwarfs the data itself. Array overhead may be large: in my case, an array of 10 million 2-byte items took me over 400 MB of memory. Methinks you get the picture by now.
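The arithmetic behind those numbers can be sketched as follows. The ~40 bytes per element is an assumed ballpark for a perl array of small scalars (SV header plus array slot), chosen because it matches the 10 million items / 400 MB observation above:

```perl
use strict;
use warnings;

# Assumed per-element overhead for a perl array of small scalars
# (ballpark figure, consistent with 10M items -> ~400 MB).
my $overhead_bytes = 40;

# 10 million 2-byte items: the overhead, not the data, dominates.
my $small_items = 10_000_000;
printf "10M items: ~%d MB\n", $small_items * $overhead_bytes / 1e6;   # ~400 MB

# 2000 million tiny items: same overhead per element.
my $large_items = 2_000_000_000;
printf "2000M items: ~%d GB\n", $large_items * $overhead_bytes / 1e9; # ~80 GB
```

So the element size barely matters for many small items; the per-element bookkeeping is what eats your memory, which is another argument for keeping bulk data in a few large chunks or on disk.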

Hope this helps a bit,

Jeroen
"We are not alone"(FZ)
