http://qs1969.pair.com?node_id=175413


in reply to Complex Data Structures

One of the in-house modules I work with (and wrote) has arrayrefs and hashrefs nested about six deep in places. It's daunting when you look at it all at once, but if you deal with it as an aggregate of dependent parts, rather than as a monolithic whole, it's quite manageable.
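
For example, something along these lines (the host/service layout here is invented purely for illustration): grab a reference to the sub-structure you care about and work with that smaller piece, instead of spelling out the full path every time.

    use strict;
    use warnings;

    # A nested configuration: host -> services -> per-service settings.
    my %config = (
        web01 => {
            services => {
                httpd => { port => 80, checks => [ 'ping', 'http_get' ] },
                sshd  => { port => 22, checks => [ 'ping' ] },
            },
        },
    );

    # Take a reference to one part and deal with it on its own terms,
    # rather than writing $config{web01}{services}{httpd}{...} everywhere.
    my $httpd = $config{web01}{services}{httpd};
    push @{ $httpd->{checks} }, 'http_head';
    print "httpd on port $httpd->{port} has ",
          scalar @{ $httpd->{checks} }, " checks\n";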

I tend to find that keeping track of several parallel "simple" data structures is more complex and error-prone than using a single, "complex" one. Inevitably, I modify one of the related structures without modifying the others in sympathy, and everything falls apart. (Maybe I'm misunderstanding your point?)
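
To make that concrete (the user records below are made up for the example): with parallel arrays, every change has to be repeated once per array, and a missed update silently skews the indexes; with one nested hash, each record's fields travel together.

    use strict;
    use warnings;

    # Parallel "simple" structures: three arrays that must stay in step.
    # Forget to update one of them and the indexes quietly drift apart.
    my @names  = ( 'alice', 'bob' );
    my @uids   = ( 1001, 1002 );
    my @shells = ( '/bin/bash', '/bin/zsh' );

    # A single "complex" structure keeps each user's fields together,
    # so adding a user is one update instead of three.
    my %users = (
        alice => { uid => 1001, shell => '/bin/bash' },
        bob   => { uid => 1002, shell => '/bin/zsh' },
    );
    $users{carol} = { uid => 1003, shell => '/bin/sh' };

    for my $name ( sort keys %users ) {
        printf "%-6s uid=%d shell=%s\n",
            $name, $users{$name}{uid}, $users{$name}{shell};
    }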

--
The hell with paco, vote for Erudil!
:wq

Replies are listed 'Best First'.
Re: (FoxUni) Re: Complex Data Structures
by demerphq (Chancellor) on Jun 18, 2002 at 16:53 UTC
    but if you deal with it as an aggregate of dependent parts, rather than as a monolithic whole, it's quite manageable.

    I tend to find that keeping track of several parallel "simple" data structures is more complex and error-prone than using a single, "complex" one.

    I concur. Parallel data structures are hard to maintain (a maintenance coder won't know everywhere to change) and are usually confusing. So long as there is a consistent and logical structure to the data, its complexity is usually not an issue. Although I have to admit that I did write Data::BFDump to dump complex and self-referential data structures in a more intuitive way, since when viewed with Data::Dumper even a fairly simple data structure can end up looking like a plate of spaghetti.
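
    As a small illustration of the spaghetti effect (the structure below is made up, and only the Data::Dumper side is shown): even a two-line self-referential structure comes out as a tangle of $VAR1 back-references.

        use strict;
        use warnings;
        use Data::Dumper;

        # A tiny self-referential structure: the node lists itself
        # among its own children.
        my $node = { name => 'root', children => [] };
        push @{ $node->{children} }, $node;

        # Data::Dumper renders the cycle as a $VAR1 back-reference,
        # which quickly gets hard to follow as the structure grows.
        print Dumper($node);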

    Yves / DeMerphq
    ---
    Writing a good benchmark isn't as easy as it might look.