I tend to find that keeping track of several parallel "simple" data structures is more complex and error-prone than using a single, "complex" one.
I concur. Parallel data structures are hard to maintain (a maintenance coder won't know every place that needs changing) and are usually confusing. So long as there is a consistent and logical structure to the data, its complexity is usually not an issue. Although I have to admit that I did write Data::BFDump to dump complex and self-referential data structures in a more intuitive way, since when viewed with Data::Dumper even a fairly simple data structure can end up looking like a plate of spaghetti.
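To make the contrast concrete, here is a small Perl sketch (my own illustrative data, not from the thread): three parallel arrays that must be updated in lockstep versus one array of hashes where each record is edited in a single place.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Parallel "simple" structures: three arrays that must stay in sync.
my @names  = ('alice', 'bob');
my @uids   = (1001, 1002);
my @shells = ('/bin/bash', '/bin/sh');
# Removing a user means splicing all three arrays at the same index --
# forget one and the data silently skews out of alignment.

# A single "complex" structure: one array of hashes, one place to edit.
my @users = (
    { name => 'alice', uid => 1001, shell => '/bin/bash' },
    { name => 'bob',   uid => 1002, shell => '/bin/sh'   },
);
splice @users, 0, 1;    # remove alice: one edit, nothing to desync

print $users[0]{name}, "\n";    # prints "bob"
```

The hash keys document the record layout right in the code, which is exactly the "consistent and logical structure" that keeps the complexity manageable.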
Yves / DeMerphq
---
Writing a good benchmark isn't as easy as it might look.
In reply to Re: (FoxUni) Re: Complex Data Structures
by demerphq
in thread Complex Data Structures
by random