If you do a few billion lookups, each taking 0.01 seconds, you're talking about a year (3 billion lookups at 0.01 s each is 3 x 10^7 seconds, very nearly a year). That's usually not acceptable.
Splitting the problem into a series of subproblems that fit in RAM is a huge performance win, despite the added complexity.
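The kind of split being advocated here can be sketched as a hash-partition join: cut both the data and the lookup keys into buckets by a hash of the key, then handle one bucket at a time entirely in memory. Below is a minimal illustration, not code from this thread; the filenames data.txt and lookups.txt, the tab-separated key/value format, and the 64-bucket count are all assumptions, and Digest::MD5 is used only as a convenient core hashing function.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Digest::MD5 qw(md5_hex);

    # Tune so that one partition of the data file fits comfortably in RAM.
    my $buckets = 64;

    # Pass 1: hash-partition both files into per-bucket temporary files,
    # so a given key always lands in the same bucket number.
    for my $file ('data.txt', 'lookups.txt') {
        my @out;
        for my $i (0 .. $buckets - 1) {
            open $out[$i], '>', "$file.$i" or die "open $file.$i: $!";
        }
        open my $in, '<', $file or die "open $file: $!";
        while (my $line = <$in>) {
            my ($key) = split /\t/, $line;   # first field is the key
            chomp $key;
            my $b = hex(substr(md5_hex($key), 0, 8)) % $buckets;
            print { $out[$b] } $line;
        }
        close $in;
        close $_ for @out;
    }

    # Pass 2: one bucket at a time, load that slice of the data into an
    # in-memory hash and answer its lookups at RAM speed.
    for my $i (0 .. $buckets - 1) {
        my %data;
        open my $d, '<', "data.txt.$i" or die "open data.txt.$i: $!";
        while (my $line = <$d>) {
            chomp $line;
            my ($key, $value) = split /\t/, $line, 2;
            $data{$key} = $value;
        }
        open my $l, '<', "lookups.txt.$i" or die "open lookups.txt.$i: $!";
        while (my $key = <$l>) {
            chomp $key;
            print "$key\t", defined $data{$key} ? $data{$key} : 'MISSING', "\n";
        }
    }

Note that the answers come out grouped by bucket rather than in the original lookup order; if order matters, carry a sequence number through the partition files and sort at the end.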