alanraetz has asked for the wisdom of the Perl Monks concerning the following question:

I've been looking at perl/CGI applications, and many boast "all this functionality in a single script!" as if separate modules were a major detriment to CGI response time. Does anyone have any hard data comparing CGI response time between apps that use many modules versus a single script?

My CGI app is 13 separate modules (and growing). I could easily munge it together into a single file, but setting up the timing analysis of it would be non-trivial (for me, at least). Does anyone out there have some real-world experience with this issue? Or some code or modules that would function as a timing test bed for CGI scripts?

thanks,

-alan

Re: CGI speed: one script versus many modules
by derby (Abbot) on Jan 08, 2002 at 17:31 UTC
    alan,

    This is a pretty good question. For timing scripts, I would recommend the Benchmark module (especially the code snippet shown in the docs for its new method). Benchmarking the importing of modules is a little trickier, since you're probably using "use", which is a compile-time directive. You could revert to loading your modules the old way ('require' and 'import') for timing.
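    Something like this would do it (an untested sketch; My::Module is a stand-in for one of your own modules). Note that require caches loaded files in %INC, so you have to delete that entry each time to force a fresh load:

        #!/usr/local/bin/perl
        use strict;
        use Benchmark;

        # Time a runtime load; each iteration pays the full read-and-compile
        # cost, roughly what a CGI hit pays for a "use" at startup.
        timethis( 100, sub {
            delete $INC{'My/Module.pm'};   # defeat require's caching
            require My::Module;
            My::Module->import;
        } );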

    That being said, there are two issues concerning modules versus "all-in-one" scripts (AIOS). AIOS are easier for end users to manage, especially given the disdain some hosting companies (or sysadmin teams) show toward upgrading the perl lib. Modules are easier to maintain from a developer's perspective. I prefer modules, but some are getting pretty close to "dll hell" status and do not take advantage of the AutoLoader.

    In theory, modules would be slightly slower than AIOSes if every piece of code in the module is exercised. The savings range from a very small amount of load time for small modules to a very significant amount for large (or many) modules. The key here is to import only the code you need. That can be accomplished by very wise module layout (and use), or by wise use of the AutoLoader, as sketched below. You want to avoid loading and compiling code that will never be used by your script.
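    For example (a rough sketch; My::Fat is a made-up module, and the auto/ split files have to be generated by AutoSplit, which MakeMaker normally runs for you at install time):

        package My::Fat;
        use strict;
        use AutoLoader 'AUTOLOAD';

        # Subs up here are compiled every time the module loads.
        sub always_needed { return "cheap" }

        1;
        __END__

        # Subs below __END__ are split into auto/My/Fat/*.al files and are
        # not compiled until the first time they are called, so scripts
        # that never call them never pay for them.
        sub rarely_needed {
            return "expensive";
        }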

    So my $.02 would be to continue with what you're comfortable with, especially if you have to maintain it. Also, if your modules start to grow, use the AutoLoader to speed up your start time (and save on memory).

    -derby

Re: CGI speed: one script versus many modules
by Aristotle (Chancellor) on Jan 08, 2002 at 19:58 UTC

    As derby said. The key is the breakeven point between time spent loading code versus compiling it.

    If you only need a fraction of your code to process any specific request, then you can save compilation time by swapping code out into modules. If on the other hand you need (nearly) all of your code to serve any specific request, then you win on load time by keeping the code in a single file.
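    A quick sketch of the first case (the module names here are made up): load each code path at runtime with require, so a request pays compile time only for the branch it actually takes.

        #!/usr/local/bin/perl
        use strict;
        use CGI;

        my $q = CGI->new;
        my $action = $q->param('action') || 'view';

        # Compile only the code this particular request needs.
        if ( $action eq 'edit' ) {
            require My::App::Edit;
            My::App::Edit::handle($q);
        }
        else {
            require My::App::View;
            My::App::View::handle($q);
        }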

    Very large CGI applications are quite likely to gain rather than lose performance from a multi-module approach. Message board scripts, for example, are commonly spread across many files.

Re: CGI speed: one script versus many modules
by mce (Curate) on Jan 08, 2002 at 19:15 UTC
    Hi Alan,
    Check out FastCGI if you worry about module loading times and CGI performance.
    ---------------------------
    Dr. Mark Ceulemans
    Senior Consultant
    IT Masters, Belgium
      Yes, a persistent environment is the best way to handle this. FastCGI, mod_perl, SpeedyCGI, PerlEx, etc. Then you can forget about performance tradeoffs (which are pretty small for things like this in most cases) and just build it the way that is easiest for you.
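      With the CGI::Fast module, for instance, the script compiles once and then loops over incoming requests (a minimal sketch):

          #!/usr/local/bin/perl
          use strict;
          use CGI::Fast;

          # Everything above the loop -- including all your "use"d modules --
          # is compiled once per process; the loop body runs once per request.
          while ( my $q = CGI::Fast->new ) {
              print $q->header, "Hello from a persistent perl process\n";
          }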
Re: CGI speed: one script versus many modules
by alanraetz (Novice) on Jan 09, 2002 at 00:04 UTC

    So I lay in bed last night after posting this, and I guess putting my question out there made me think more about doing timing analysis. I browse through the Perl Cookbook often; wasn't there a module for hi-res timing? And I know there's a web user agent module...

    So I got up, browsed through the cookbook, and yes, what I thought would be non-trivial was actually kind of trivial:

    #!/usr/local/bin/perl
    use strict;
    use LWP::Simple;
    use Time::HiRes;

    my $totalTime = 0;

    # Fetch the page 20 times and time each round trip.
    for my $i ( 0 .. 19 ) {
        my $before  = Time::HiRes::gettimeofday();
        my $content = LWP::Simple::get('http://chicodigital.com/cgi-bin/temp/webtool.cgi');
        print "failed get\n" unless defined $content;
        my $elapsed = Time::HiRes::gettimeofday() - $before;
        print "get $i in $elapsed seconds.\n";
        $totalTime += $elapsed;
    }

    $totalTime /= 20;
    print "Average response time for 20 requests was: $totalTime seconds.\n";

    Hey, it works! Well, it's more a measure of internet speed, but averaged over enough requests it at least reflects the speed difference as perceived by the user. So I'll keep you posted on my findings here.

    Thank you perlmonks, for making me think.

    -alan

Re: CGI speed: one script versus many modules
by Anonymous Monk on Jan 09, 2002 at 08:37 UTC

    So here's what I did: to take the internet access timing out of the equation, I set up a local Apache server on my Windows box using indigoperl (indigostar.com), which sets up mod_perl by default (although I tried this both with and without mod_perl).

    I then combined my modules into one big script and ran the above timing test against my local server, comparing the original script (with 13 modules) to the combined single script. (The single script, by the way, was 4,200 lines and 114 KB.)

    The results: the single script's average response time was 0.86 seconds, while the module-based script's was 0.97 seconds. Strangely, enabling mod_perl made little difference; response times varied only about 1-2% from the non-mod_perl numbers.

    So that's about a 10-15% difference in speed, which was more significant than I thought it would be, but still probably not worth the trouble, at least not right now...

    alan