I’ve got a tiered architecture based on CGI::Application. Each application has a slew of modules it may use to produce its result. There is a heavy reliance on a database backend (maintaining state, reporting, etc.) and, obviously, on CGI.
The architecture is essentially two tiers: an "interface" tier of CGI::Application modules that handle requests, and a backend tier of worker modules they call.
I’m wondering about the best way to implement this style of application. Should each module be essentially standalone? By that I mean: should each backend module create its own DB handle (if required), and its own instances of other objects, even ones that may be used more than once by other modules called from the "interface" tier? (A sketch of that style follows.)
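To make that concrete, the standalone style might look like this. MyApp::Report, the reports table, and the DSN are all made-up names:

    package MyApp::Report;
    use strict;
    use warnings;
    use DBI;

    # Self-sufficient style: the module builds (and tears down) its own
    # database handle every time it is asked for data.
    sub monthly_totals {
        my ($class, $month) = @_;
        my $dbh = DBI->connect(
            'dbi:Pg:dbname=myapp', 'user', 'pass',
            { RaiseError => 1, AutoCommit => 1 },
        ) or die DBI->errstr;
        my $rows = $dbh->selectall_arrayref(
            'SELECT item, total FROM reports WHERE month = ?',
            { Slice => {} },
            $month,
        );
        $dbh->disconnect;
        return $rows;
    }

    1;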
Or
To conserve memory, CPU cycles, and other overhead (à la DBI instances), should I create the objects once in the upper tier and pass them down to the bottom tier (again, sketched below)?
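Here is the same hypothetical backend in the handle-passing style, where the interface tier owns the connection and injects it:

    package MyApp::Report;
    use strict;
    use warnings;

    # Injected style: the caller supplies an already-connected $dbh, so
    # several backend calls on one page can share a single connection.
    sub monthly_totals {
        my ($class, $dbh, $month) = @_;
        return $dbh->selectall_arrayref(
            'SELECT item, total FROM reports WHERE month = ?',
            { Slice => {} },
            $month,
        );
    }

    1;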
I guess I’m looking at a trade-off here: performance vs. maintainability.
I’m kind of leaning toward pre-creating objects and passing them around, simply because of the performance aspect. For example, I have an application that opens and closes four separate DBI connections to produce a single page. Creating one DB handle and handing it around will obviously be quite a bit faster than connecting four separate times.
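A middle ground I’m considering: a lazy handle accessor on the CGI::Application subclass, so backend modules take $dbh as a parameter but runmodes never juggle the connection explicitly. This is just a sketch with made-up names (MyApp, show_page, the DSN); CPAN's CGI::Application::Plugin::DBH packages much the same idea:

    package MyApp;
    use strict;
    use warnings;
    use base 'CGI::Application';
    use DBI;
    use MyApp::Report;

    sub setup {
        my $self = shift;
        $self->start_mode('show');
        $self->run_modes(show => 'show_page');
    }

    # Lazy accessor: connect once per request, on first use; every runmode
    # and backend call that receives $self->dbh shares the same handle.
    sub dbh {
        my $self = shift;
        $self->{__dbh} ||= DBI->connect(
            'dbi:Pg:dbname=myapp', 'user', 'pass',
            { RaiseError => 1, AutoCommit => 1 },
        );
        return $self->{__dbh};
    }

    sub show_page {
        my $self = shift;
        # Several queries on the page, but only one connection is opened.
        my $totals = MyApp::Report->monthly_totals($self->dbh, '2003-06');
        return join '', map { "$_->{item}: $_->{total}<br>" } @$totals;
    }

    1;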
The application runs on Apache, but not mod_perl (yet). From a purely database perspective, once we get to mod_perl the answer is simply to use DBI in every module that needs it, since persistent connection caching (Apache::DBI) makes repeated connects essentially free. However, for other objects (CGI and the rest), would it be better to pre-create them and pass them about, or to make each backend module completely self-sufficient?
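For what it’s worth, a minimal sketch of that mod_perl setup, assuming Apache::DBI and the same made-up DSN (the startup.pl path and credentials are placeholders):

    # startup.pl -- pulled in via "PerlRequire /path/to/startup.pl" in httpd.conf.
    # Apache::DBI must be loaded before DBI so it can intercept connects;
    # after that, identical DBI->connect calls anywhere in the code return
    # a cached handle within each Apache child process.
    use Apache::DBI;
    use DBI;

    # Optional: open the connection when each child starts, instead of on
    # the first request it serves.
    Apache::DBI->connect_on_init(
        'dbi:Pg:dbname=myapp', 'user', 'pass',
        { RaiseError => 1, AutoCommit => 1 },
    );

    1;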