Is it essential that elements of @sets be deleted, or would it suffice to keep the array a constant size, and "undef" elements when appropriate?
Yes, deleting them is essential. In use, new sets are added to @sets on the fly; the mechanism is essentially a queue. New sets ("work items") are continuously being added, asynchronously, as the resources (represented by the booleans) become available. If I didn't remove completed work items (sets for which all required resources have become available), the queue would grow endlessly.
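To make the trade-off concrete, here is a minimal sketch of that array-based queue in Perl. The names (add_work_item, resource_available) are hypothetical, and each work item is assumed to be a hash of the resource names it still needs; the point is that every arriving resource forces a scan of the whole of @sets.

    use strict;
    use warnings;

    # Array-based queue: each element is a set of the resource
    # names that work item is still waiting on.
    my @sets;

    # Called as new work items arrive (asynchronously, in use).
    sub add_work_item {
        my @required = @_;
        push @sets, { map { $_ => 1 } @required };
    }

    # Called when a resource (boolean) becomes available. Every set
    # must be examined; completed sets are spliced out, so the loop
    # runs in reverse (splicing then doesn't shift earlier indexes).
    sub resource_available {
        my ($resource) = @_;
        for my $i ( reverse 0 .. $#sets ) {
            delete $sets[$i]{$resource};
            splice @sets, $i, 1 unless %{ $sets[$i] };    # set complete
        }
    }

The cost per resource is proportional to the total number of queued sets, which is exactly what the hash index discussed below avoids.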
But your notion about using hashes is a good one. You've used a hash to index, against each boolean, the sets that use that boolean, and you're undef-ing rather than splicing to avoid screwing up that indexing. But if I also use a hash for holding the sets, I can delete a set from %sets without screwing up the index at all. And that greatly reduces the number of sets that need to be examined for each boolean (see the sketch below).
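A sketch of that combination, under the same assumptions as above: %sets holds each set keyed by a simple integer id, and a second hash, %index (a hypothetical name), maps each boolean to the ids of the sets that still need it. Only the indexed sets are touched when a resource arrives, and deleting a completed set from %sets leaves every other index entry intact.

    use strict;
    use warnings;

    my %sets;       # set id => { resource name => 1, ... } still required
    my %index;      # resource name => { set id => 1, ... } sets using it
    my $next_id = 0;

    sub add_work_item {
        my @required = @_;
        my $id = $next_id++;
        $sets{$id} = { map { $_ => 1 } @required };
        $index{$_}{$id} = 1 for @required;
        return $id;
    }

    # Only the sets indexed against this boolean are examined; a
    # completed set is simply deleted from %sets, with no splicing
    # and no disturbance to the indexing of any other boolean.
    sub resource_available {
        my ($resource) = @_;
        for my $id ( keys %{ delete( $index{$resource} ) // {} } ) {
            delete $sets{$id}{$resource};
            delete $sets{$id} unless %{ $sets{$id} };    # set complete
        }
    }

One assumption baked into this sketch is that each boolean becomes available exactly once, so its %index entry is consumed when it arrives; if a resource can arrive repeatedly, the index entry would need to be pruned per set instead of deleted wholesale.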
Thanks. I'll have a go at implementing that tomorrow.
In reply to Re^2: Better algorithm or data structure? by BrowserUk
in thread Better algorithm or data structure? by BrowserUk