in reply to Re^4: Reducing memory footprint when doing a lookup of millions of coordinates
in thread Reducing memory footprint when doing a lookup of millions of coordinates

"Does this help?"

No.

But from those figures I have to agree with moritz that splitting the dataset into 24 files and loading each set individually makes perfect sense. It would hardly affect your performance at all, but would reduce your memory consumption to the size of the largest set.

I.e. roughly 2235512/7356374 ≈ 30%, and 30% of 1.2GB ≈ 350MB.
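A minimal sketch of that approach, assuming plain-text records of the form "key position value" and per-key output files (the actual record layout and file names here are illustrative, not your real format):

```perl
use strict;
use warnings;

# Split one big input file into one file per key (e.g. per chromosome),
# so each subset can later be loaded on its own.
sub split_by_key {
    my ( $infile, $dir ) = @_;
    open my $in, '<', $infile or die "open $infile: $!";
    my %out;
    while (<$in>) {
        my ($key) = split ' ', $_, 2;
        unless ( $out{$key} ) {
            open $out{$key}, '>', "$dir/$key.dat" or die "open $dir/$key.dat: $!";
        }
        print { $out{$key} } $_;
    }
    close $_ for values %out;
    return sort keys %out;
}

# Load just one subset into a hash keyed by position; peak memory is now
# bounded by the largest subset rather than by the whole dataset.
sub load_set {
    my ($file) = @_;
    my %lookup;
    open my $in, '<', $file or die "open $file: $!";
    while (<$in>) {
        chomp;
        my ( $key, $pos, $val ) = split ' ';
        $lookup{$pos} = $val;
    }
    return \%lookup;
}
```

Process the 24 sets one at a time, letting each hash go out of scope (or be cleared) before loading the next, and the resident set stays at roughly that 350MB.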


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.