in reply to Re^3: Why can code be so slow?
in thread Why can code be so slow?

True - and irrelevant...

It's extremely relevant if Perl doesn't actually compile all of that code. If your point was that Perl has to find and read all of those blocks from disk, that's fine. I didn't get that impression from your notes, however.

If you can suggest a faster way to use CGI.pm in a script than that, I would be fascinated to know what it is.

Make sure CGI.pm is in the first directory in @INC.

To do this benchmark properly, make sure all of the modules you want to load are in the same directory, preferably the first directory in @INC. If you really want to compare the weight of one module against another, you have to remove all other sources of variation, and disk I/O can be a very large one, especially when compilation and execution time are minimal, as in this case.
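A minimal sketch of pinning the search path, using a hypothetical directory: use lib prepends to @INC, so the module is found on the first probe instead of after a stat() of every earlier directory.

#!/usr/bin/perl
use strict;
use warnings;

# Prepend the directory holding the modules under test to @INC.
# The path here is hypothetical; substitute your own.
use lib '/var/www/perl-lib';

use CGI;

# Confirm which copy of CGI.pm actually won the @INC race.
print "CGI.pm loaded from: $INC{'CGI.pm'}\n";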

Re^5: Why can code be so slow?
by snowhare (Friar) on May 02, 2007 at 01:46 UTC

    Fine. For your edification.

    speedtest-cgi-minimal.pl

    #!/usr/bin/perl
    use CGI::Minimal;
    my $value = CGI::Minimal->new->param('a');
    print "Status: 200 OK\015\012Content-Type: text/plain\015\012\015\012a = $value\n";

    speedtest-cgi-pm.pl

    #!/usr/bin/perl
    use CGI;
    my $value = CGI->new->param('a');
    print "Status: 200 OK\015\012Content-Type: text/plain\015\012\015\012a = $value\n";

    I ran the test both with them hardlinked from the first @INC directory and then again with them only in their normal locations. The test was a 60-second http_load run with 30 parallel fetches against each script, on a 3.06 GHz P4 with hyperthreading enabled (I used a second fast machine to execute http_load for the requests). The version of Apache was 2.2.2 and the version of Perl was 5.8.8.
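    For reference, an http_load invocation along these lines matches the stated parameters (the urls.txt name is hypothetical; the file holds the URL of the script under test):

    http_load -parallel 30 -seconds 60 urls.txt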

                            in first @INC directory    normal
    CGI.pm (3.25)           21.5/second                21.5/second
    CGI::Minimal (1.26)     73.8/second                73.6/second

    The difference is below the system noise floor. Any other criticisms of my methodology?

    My 'irrelevant' point was that it doesn't really matter that CGI.pm would be even slower than it is now if it didn't do its clever little trick of storing subroutine bodies in a hash of strings. It still measures as much slower than any of the alternative CGI parameter processing modules, and that difference is due simply to its raw size.
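    For readers unfamiliar with the trick, here is a minimal sketch of the deferred-compilation idea, with hypothetical names (CGI.pm's real implementation differs in detail): subroutine bodies are stored as strings and only compiled, via eval, the first time they are called.

    #!/usr/bin/perl
    use strict;
    use warnings;

    package Lazy;

    # Subroutine bodies kept as strings; nothing is compiled yet.
    my %SUBS = (
        greet => q{
            sub Lazy::greet {
                my ($name) = @_;
                return "Hello, $name\n";
            }
        },
    );

    our $AUTOLOAD;
    sub AUTOLOAD {
        (my $name = $AUTOLOAD) =~ s/.*:://;
        die "Undefined subroutine $AUTOLOAD" unless exists $SUBS{$name};
        eval $SUBS{$name};         # compile the body on first use
        die $@ if $@;
        no strict 'refs';
        goto &{"Lazy::$name"};     # re-dispatch with @_ intact
    }

    package main;
    print Lazy::greet('world');   # triggers AUTOLOAD, then runs greet()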

      So basically, if I load 5 modules of 100k each, I would be throwing away 500k every single time I start this CGI, whereas mod_perl would keep that 500k in memory, shared between processes?

      Is mod_perl the only possible way to keep a script's execution size and footprint as low as possible? What about scripts not running on Apache that use those 5 modules of 100k each?
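      An illustrative sketch of the sharing being asked about, assuming a standard mod_perl setup (the startup.pl name is conventional, not required): modules loaded in the Apache parent process before it forks are shared copy-on-write by all children, so the 500k is paid once rather than on every request.

      # startup.pl -- loaded once by the Apache parent, e.g. via
      # "PerlRequire /path/to/startup.pl" in httpd.conf (path hypothetical).
      use strict;
      use warnings;

      # Preload heavy modules in the parent; forked children share
      # the compiled code copy-on-write instead of recompiling it.
      use CGI ();
      CGI->compile(':all');   # precompile CGI.pm's lazily-stored subs as well

      1;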

      For all these years I was thinking Perl would allocate memory only for the routines which are actively used, not for the unused ones...