in reply to Why can code be so slow?
Other monks have already addressed the conventional code-profiling part of this, so I'm going to chip in on an often-overlooked part of performance tuning for CGI scripts: even with ideal algorithms, non-persistent CGI is slow.
I'm prefacing this with benchmarks I ran a while ago of various CGI processing packages, using Apache 2 on a Linux box with an Athlon XP 2100+ processor and Perl 5.8.8, benchmarked with http_load doing 10 parallel fetches for 30 seconds. The script was a very simple one that decoded one CGI parameter and just printed its value back to the web browser. The 'null' scripts cheated and just printed a value without bothering to actually read the parameter.
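For reference, the CGI.pm variant of the benchmarked script was essentially of this shape (a reconstruction, not the exact code I benchmarked; the parameter name 'value' is illustrative):

```perl
#!/usr/bin/perl
# Minimal CGI script: decode one parameter and echo it back.
use strict;
use CGI;

my $query = CGI->new;                   # parse the request
my $value = $query->param('value');     # decode one CGI parameter
print $query->header('text/plain');     # Content-Type header
print defined $value ? $value : '';     # echo it back to the browser
```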
CGI.pm (3.05) via standard CGI - 16 fetches per second
CGI::Simple (0.075) via standard CGI - 20 fetches per second
CGI::Deurl (1.08) via standard CGI - 36 fetches per second
CGI::Thin (0.52) via standard CGI - 38 fetches per second
CGI::Lite (2.02) via standard CGI - 52 fetches per second
CGI::Minimal (1.16, :preload) via standard CGI - 52 fetches per second
CGI::Minimal (1.16) via standard CGI - 66 fetches per second
cgi-lib.pl (2.18) via standard CGI - 71 fetches per second
null Perl script via standard CGI - 103 fetches per second
null C program via standard CGI - 174 fetches per second
CGI::Simple (0.075) via mod_perl - 381 fetches per second
CGI.pm (3.05) via mod_perl - 386 fetches per second
CGI::Minimal (1.16) via mod_perl - 417 fetches per second
null Perl script via mod_perl - 500 fetches per second
A 'null' Perl script that includes no external packages (roughly the same kind of script as yours) managed 103 fetches/second. Using CGI.pm dropped the speed to only 16 fetches/second, mostly due to the overhead of compiling its large codebase on every request.
CGI.pm, by itself, is around 237 Kbytes of code, and it pulls in Carp (8 Kbytes in Perl 5.8.8). Carp then pulls in Exporter (15 Kbytes), Exporter pulls in Exporter::Heavy (6 Kbytes), and Exporter::Heavy pulls in strict (3 Kbytes). If you do a 'use warnings;', that pulls in another 16 Kbytes. If you do 'use CGI::Carp;', that tacks on another 16 Kbytes.
So before your script does anything, you will very likely have loaded an additional 300 Kbytes of code just for having written
use strict;
use warnings;
use CGI;
use CGI::Carp;
So you would have limited the maximum possible speed of your script as a standard (non-mod_perl) CGI to only about 24 fetches per second (adjusting my numbers for the fact that your system is about 50% faster than mine, judging from the 'null script' speeds). If your own code uses more modules than I've listed, it will be even slower. You mentioned using a 'template library': Template Toolkit pulls in hundreds of Kbytes with just 'use Template;'. That alone would cut your speed in half again, and it can pull in as much as a megabyte of code depending on the features you use, which would drop your speed to under 5 fetches per second.
Vanilla CGI (a non-persistent environment) is simply slow for scripts of any significant complexity, because recompiling them and their supporting libraries on every request takes too much time.
When performance is on the line, I would strongly recommend using a persistent execution environment (mod_perl or FastCGI, for example) if you can.
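As an illustration of the persistent style, here is roughly what the same parameter-echoing script looks like under FastCGI using CGI::Fast (a sketch; the point is that the modules are compiled once and the process then serves many requests):

```perl
#!/usr/bin/perl
# FastCGI version: modules are compiled once, then the process
# stays resident and loops over incoming requests.
use strict;
use CGI::Fast;

while (my $query = CGI::Fast->new) {    # blocks until the next request
    my $value = $query->param('value');
    print $query->header('text/plain');
    print defined $value ? $value : '';
}
```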
Replies are listed 'Best First'.

Re^2: Why can code be so slow?
by chromatic (Archbishop) on May 01, 2007 at 18:40 UTC
by snowhare (Friar) on May 01, 2007 at 19:23 UTC
by chromatic (Archbishop) on May 01, 2007 at 20:44 UTC
by snowhare (Friar) on May 02, 2007 at 01:46 UTC
by freakingwildchild (Scribe) on May 03, 2007 at 09:33 UTC
by freakingwildchild (Scribe) on May 01, 2007 at 21:11 UTC
by jbert (Priest) on May 02, 2007 at 16:19 UTC
by BrowserUk (Patriarch) on May 01, 2007 at 18:50 UTC

Re^2: Why can code be so slow?
by BrowserUk (Patriarch) on May 01, 2007 at 14:02 UTC

Re^2: Why can code be so slow?
by Anonymous Monk on May 05, 2007 at 14:53 UTC