in reply to Optimization, side-effects and wrong code
As an alternative solution, you could make a linked list out of your hash. Each element gets a _next => $next link, so you can run multiple traversals and none of them interfere with each other. That may not be usable for the current project, but in the future, if you need to search through the large list multiple times, you can build the linked list out of a regular hash, traverse it at will, and reset your 'pointers' to the beginning whenever needed. Of course, if this is a set that you are doing adds and removes on, you also have to do the associated work of keeping the list consistent. There is quite possibly a module that ties a hash to a linked list to give you both hash lookups and fast ordered traversal. Note that resetting each still has global consequences if you are using each somewhere else on the same hash.
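Here is a minimal sketch of that idea, assuming the values are hashrefs so each element can carry a _next link alongside its payload (the field names value and _next, and the example data, are just illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A regular hash whose values are hashrefs, so each element can hold
# a payload plus a _next link to the following key.
my %hash = (
    apple  => { value => 1 },
    banana => { value => 2 },
    cherry => { value => 3 },
);

# Thread the elements together into a linked list and remember the head.
my $head;
{
    my $prev;
    for my $key ( keys %hash ) {
        if ( defined $prev ) { $hash{$prev}{_next} = $key }
        else                 { $head = $key }
        $prev = $key;
    }
    $hash{$prev}{_next} = undef if defined $prev;    # end of the list
}

# Two independent traversals, each with its own cursor. Neither one
# touches the shared each() iterator, so they cannot interfere.
my $cursor_a = $head;
my $cursor_b = $head;

while ( defined $cursor_a ) {
    print "A: $cursor_a => $hash{$cursor_a}{value}\n";
    $cursor_a = $hash{$cursor_a}{_next};
}

# $cursor_b is untouched by the loop above; resetting any traversal is
# just pointing its cursor back at $head.
$cursor_b = $head;
```

If you do go the module route, something like Tie::IxHash on CPAN (which ties a hash so it preserves insertion order) may already cover part of what you want.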