in reply to Re: two hashes occupying the same space
in thread two hashes occupying the same space

Simple suggestion: Implement your proposal, then compare the time taken to perform 80 million read/modify/write cycles across 300,000 k/v pairs against the time taken to do the same with a plain Perl hash.
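A minimal harness for that kind of comparison, using the core Benchmark module, might look like the sketch below. The sizes are scaled way down from 300,000 pairs / 80 million cycles, and the second benchmark slot is left as a placeholder for whatever the proposed structure turns out to be:

```perl
use strict;
use warnings;
use Benchmark qw(timethese);

# Scaled-down stand-in for 300_000 pairs / 80 million cycles.
my $pairs  = 10_000;
my $cycles = 100_000;

my %hash = map { "key$_" => $_ } 1 .. $pairs;

# One read/modify/write cycle against a random key.
sub cycle {
    my ($h) = @_;
    my $k = "key" . ( 1 + int rand $pairs );
    $h->{$k} = $h->{$k} + 1;    # read, modify, write
}

timethese( 1, {
    'plain hash' => sub { cycle( \%hash ) for 1 .. $cycles },
    # Add a second entry here that drives the proposed
    # space-sharing structure, and compare the wallclock numbers.
} );
```

Scale $pairs and $cycles back up once the comparison code is in place; the relative numbers at small sizes can be misleading once the data no longer fits in cache (or in RAM).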


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
"Memory, processor, disk in that order on the hardware side. Algorithm, algorithm, algorithm on the code side." - tachyon

Re^3: two hashes occupying the same space
by davido (Cardinal) on Oct 28, 2004 at 06:00 UTC

    I don't need to test to know that the database solution will be slower than the hash solution as long as the hash fits in conventional memory, and I think you know that too. But at some point (300_000 k/v pairs, 500_000, or maybe a million) the hash is going to cease to be a plausible option due to memory constraints. The fact that the OP is asking about getting tandem lookup hashes to occupy the same memory space is a pretty good indication that (s)he is running into memory problems.

    One avenue that folks often take when they find that they just can't hold the whole dataset in memory is to turn to databases. Them's the breaks. Tough decisions will have to be made. Either live with the inefficiency of not having the tandem lookup hash capability, or live with the inefficiency of the overhead of a database.
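The usual halfway house on that road is a tied DBM hash: same hash syntax, but the pairs live on disk rather than in process memory. A minimal sketch using SDBM_File (which ships with perl; DB_File or BerkeleyDB are the usual choices for datasets of any real size, and the filename here is just an example):

```perl
use strict;
use warnings;
use SDBM_File;
use Fcntl qw(O_RDWR O_CREAT);

# Tie a hash to an on-disk DBM file: lookups and stores keep the
# familiar hash syntax, but the data is paged from disk, not RAM.
my %lookup;
tie %lookup, 'SDBM_File', 'lookup_db', O_RDWR | O_CREAT, 0666
    or die "Cannot tie lookup_db: $!";

$lookup{apple} = 'fruit';     # write goes through to disk
print "$lookup{apple}\n";     # read comes back from disk

untie %lookup;
```

Every access pays disk (or at best OS cache) latency instead of a memory lookup, which is exactly the inefficiency trade-off described above.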

    If holding the dataset in memory once is ok, but twice is impossible, then you can't have a tandem pair of lookup hashes.

    If memory isn't an issue, then the original question is pointless. But there was a point stated in the question: "What I'm trying to do is reduce the memory requirement." In retrospect, I wish I hadn't offered the database suggestion. Not because it isn't a reasonable compromise (it is), but because the OP, being a 'Team Sybase Member', undoubtedly already knows his options with regard to databases.


    Dave