jerrygarciuh has asked for the wisdom of the Perl Monks concerning the following question:

Well met monks,
I have been pondering whether I should break my latest app up into smaller scripts. I am using Trimbach's Cycles::Calendar to display calendar info from a MySQL db. I have the following modules imported into the namespace as well (did I say that correctly?):
#!/usr/local/bin/perl -w
use strict;
use lib qw(/home/yatayata/www/local_mods);
use DBI;
use Cycles::Calendar;
use POSIX qw(strftime);
use CGI qw/:standard *table start_ul/;  # now you can start_table/end_table and start_ul/end_ul
use CGI::Carp qw/fatalsToBrowser/;
use CGI::Pretty qw/:html3/;
use Mail::Sendmail;

I am wondering if this process is slowing down the application; not that it is all that slow, but it isn't live yet. It is 773 lines long and has all the admin tools, mail tools, account management tools, and calendar tools in one bundle. What is the thinking on apps like this? Should I separate, say, admin from account management from display and input? What are the relative merits of one large app versus multiple utility scripts?
TIA
jg
_____________________________________________________
Think a race on a horse on a ball with a fish! TG

Replies are listed 'Best First'.
Re: One Big Script vs Several Specialized Scripts
by gav^ (Curate) on Apr 11, 2002 at 01:36 UTC
    If it's a CGI script and it starts getting big, then you could look at SelfLoader/AutoLoader to avoid compiling subs you don't use, and at require instead of use to avoid loading modules you don't need.
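
    For instance, a minimal sketch of the require-instead-of-use idea, assuming a hypothetical $action CGI parameter that decides which branch runs:

        # Instead of loading Mail::Sendmail unconditionally at compile time,
        # pull it in only for the one request that actually sends mail.
        if ($action eq 'send_mail') {       # hypothetical dispatch value
            require Mail::Sendmail;         # compiled only when this branch runs
            Mail::Sendmail::sendmail(       # fully qualified, since nothing is exported
                To      => $to,
                From    => $from,
                Subject => $subject,
                Message => $body,
            ) or die $Mail::Sendmail::error;
        }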

    Also you might want to consider using a templating tool to remove the HTML from your script, not only reducing the code but also improving maintenance. If there are any subs that can be reused elsewhere, consider putting them in a separate module.
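
    As a rough illustration of the templating idea with HTML::Template (the template file name and data are made up for this example):

        use CGI qw(header);
        use HTML::Template;

        # calendar.tmpl is a hypothetical template holding all of the HTML;
        # the script only hands it data.
        my $month_name = 'April';
        my @event_rows = ( { day => 11, title => 'Meeting' } );  # hashrefs for a <TMPL_LOOP>

        my $tmpl = HTML::Template->new(filename => 'calendar.tmpl');
        $tmpl->param(
            month  => $month_name,
            events => \@event_rows,
        );
        print header(), $tmpl->output;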

    Of course, if this is running under mod_perl this isn't an issue...

    gav^

Re: One Big Script vs Several Specialized Scripts
by belg4mit (Prior) on Apr 11, 2002 at 02:58 UTC
    My personal take is WWUD? (What Would Un*X Do?). And the answer is quite simple: many small tools. Less to manage at once. Less to go wrong. It allows for unseen combinations, uses, and development.

    --
    perl -pe "s/\b;([mnst])/'\1/mg"

Re: One Big Script vs Several Specialized Scripts
by digiryde (Pilgrim) on Apr 11, 2002 at 02:21 UTC

    In every project I take on, I reduce the "pieces" of the project down to individual lumps, for several reasons. One is to find overlapping functionality and stop duplication of effort. Another is to make development and debugging easier, since I am now debugging small pieces of code. The next big reason is that I can then dynamically load (in most cases) only the pieces I need for that individual run (or CGI hit).

    I find that all of these things improve development time, quality, and my teammates' (if I have any) ability to contribute to what I am focused on. In the long term, it promotes reusability, scalability, and maintainability (and puts me in a better position with my management).

    The calendar-based applications I have written in the past always had many functions, though only a few were used on each call. I was able to write much faster-running code (I did not have the ability to use mod_perl or anything similar in most cases), and we were able to quickly extend core components to the "new" requests that inevitably come in after a neat new tool is put online.

    In a nutshell, many small pieces (with very few exceptions in my experience) almost always work better than large chunks of code.

    There is a simple way to load an unknown object (module) at run time, rather than having a bunch of use this; use that; statements up front in my code. For me, using a dynamic system has cut down on overall loading, memory, time, etc.
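
    One common way to do that is a dispatch table keyed on the CGI action, with require deferred to run time. The module names and the handle() entry point below are hypothetical, just to show the mechanics:

        use strict;
        use CGI qw(param);

        # Map each action to the module that implements it; the module is
        # loaded only when that action is actually requested.
        my %handler_for = (
            calendar => 'MyApp::Calendar',
            admin    => 'MyApp::Admin',
            mail     => 'MyApp::Mail',
        );

        my $action = param('action') || 'calendar';
        my $module = $handler_for{$action} or die "Unknown action '$action'";

        (my $file = "$module.pm") =~ s{::}{/}g;   # Foo::Bar -> Foo/Bar.pm
        require $file;                            # compiled for this request only
        $module->handle();                        # assumes each module provides handle()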

Re: One Big Script vs Several Specialized Scripts
by Fletch (Bishop) on Apr 11, 2002 at 01:52 UTC

    Of course there's always mod_perl and Apache::Registry. With Apache::RegistryLoader you could precompile your script when Apache starts, and there'd be no startup hit when requests come in.
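
    For reference, the preloading would go in a mod_perl startup file along these lines (the URI and file path are placeholders, not from the original post):

        # startup.pl, pulled in from httpd.conf via: PerlRequire /path/to/startup.pl
        use Apache::RegistryLoader ();

        my $rl = Apache::RegistryLoader->new;

        # Compile the script once at server startup instead of on the first hit.
        $rl->handler('/cgi-bin/calendar.pl', '/home/yatayata/www/cgi-bin/calendar.pl');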

    But as a general suggestion, yes, you'd probably be better off splitting functionality into separate chunks. Place common code (configuration, authentication handling, et al.) into modules, and then split processing of the different requests into separate programs.
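
    A rough sketch of the "common code in a module" part (the package name, sub, and connection details are all invented for illustration):

        # MyApp/Common.pm -- shared by each of the smaller CGI programs
        package MyApp::Common;
        use strict;
        use DBI;

        require Exporter;
        our @ISA       = qw(Exporter);
        our @EXPORT_OK = qw(get_dbh);

        # One place to change connection details for every script.
        sub get_dbh {
            return DBI->connect('dbi:mysql:calendar', 'user', 'password',
                                { RaiseError => 1 });
        }

        1;

    Each small program then starts with use MyApp::Common qw(get_dbh); instead of repeating the connection code.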

Re: One Big Script vs Several Specialized Scripts
by webadept (Pilgrim) on Apr 11, 2002 at 01:45 UTC
    Everything gav^ said is on the money. For me it's always a good idea to break code down into the smallest meaningful functions I can, and to break the script up into usable modules wherever possible.

    Notice I said "meaningful" functions. Each time a program needs to call a function you lose some speed, so keep that in mind; but the trade-off is often reusable code, so on your next project you are starting with a better toolkit and can get things done faster.

    Well.. that's my two bits, hope it helps.

    webadept

Re: One Big Script vs Several Specialized Scripts
by jonknee (Monk) on Apr 12, 2002 at 02:06 UTC
    The main reason why I break up CGIs is that when I'm hacking one section, the whole site doesn't go down. If I miss a semicolon in one minor part of the script, the whole thing might go down... but if that part is in its own file, only it will go down when I invoke it.