We have a system here that uses very large data structures. For some projects, a certain hash can reach 100_000 elements, and the script becomes notably slower at that size. Is there any way to speed this up? I was thinking of using tie with a suitable Tie:: module that is faster than Perl's default hash implementation. I guess it should be possible to be much faster if you limit yourself to storing literal values only, not references. Any recommendations?
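To show the kind of thing I mean, here is a minimal sketch of the tie mechanism. It uses the core Tie::StdHash (from Tie::Hash) purely as a placeholder backend; a real speedup would need a Tie:: module with a faster implementation, since Tie::StdHash just wraps Perl's default hash (and tied access adds method-call overhead on top):

```perl
use strict;
use warnings;
use Tie::Hash;    # core module; provides Tie::StdHash, a reference implementation

# Placeholder backend: swap 'Tie::StdHash' for whatever faster Tie:: module
# is recommended. All hash operations below go through the tied interface.
tie my %data, 'Tie::StdHash';

# Fill the hash with literal values only (no references), as described above.
$data{"key_$_"} = $_ for 1 .. 100_000;

print scalar(keys %data), "\n";    # prints 100000
```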
In reply to Efficient giant hashes by crenz