in reply to Re^2: Why can code be so slow?
in thread Why can code be so slow?
You can try to measure the effect of startup time by something like the following:
    #!/usr/bin/perl
    use warnings;
    use strict;

    use Time::HiRes;

    my $hr_log;

    BEGIN {
        open($hr_log, ">", "/tmp/hrlog") or die "can't open hi-res log";

        sub hr_logger {
            # gettimeofday() returns seconds and *micro*seconds;
            # zero-pad the fraction so e.g. 5678 prints as .005678
            my ($secs, $usecs) = Time::HiRes::gettimeofday();
            printf $hr_log "%d.%06d: %s\n", $secs, $usecs, join("|", @_);
        }

        hr_logger("BEGIN");
    }

    # Obviously, you'll have all your modules here
    use CGI;

    # Put this just before you actually do any work in your code,
    # i.e. after all your 'use' lines
    hr_logger("Ready to run");

    # This bit would be your app
    print "Now get on with running the app\n";

    # And put this before you exit, for completeness
    hr_logger("finished");

    exit 0;

which on my system produces the output:
    1178122211.615440: BEGIN
    1178122211.646999: Ready to run
    1178122211.647111: finished

In my case above, the 'use CGI' time dominates, because I'm doing nothing in the app.
If you do the same, but sprinkle a few more calls to hr_logger in your code, then you should be able to work out what is slow. From what you've posted so far, it sounds as though you are parsing a lot of XML.
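For example, you might drop something like this into the script around the step you suspect (just a sketch; parse_config() and myapp.xml are stand-ins for whatever your XML-parsing code actually does):

    # Hypothetical: bracket the suspected slow step with log calls
    hr_logger("about to parse XML");
    my $config = parse_config("myapp.xml");   # stand-in for your real parsing code
    hr_logger("XML parsed");

The gap between those two timestamps in /tmp/hrlog tells you how much that step costs per run.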
If you need to do that parsing on every request, then mod_perl won't speed you up much (and you'll need to look into optimising your XML handling or replacing it with something else). If you only need to parse once at startup, then mod_perl (or FastCGI) would help you.
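If it really is a parse-once situation, the usual trick is to move the parse into a module so it happens when the module is loaded rather than inside the request handling. This is only a sketch: My::Config and the config path are made up, and XML::Simple is just for illustration; substitute whatever parser you actually use.

    package My::Config;                 # hypothetical module name
    use strict;
    use warnings;
    use XML::Simple;                    # illustration only; use your real parser

    # Runs once, when the module is first loaded. Under plain CGI that is
    # still once per request, but under mod_perl or FastCGI the parsed
    # data persists between requests.
    my $config = XMLin("/path/to/config.xml");   # made-up path

    sub config { return $config }       # request code calls My::Config::config()

    1;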
It's not really a case of getting this optimised and then getting another speed boost by moving to mod_perl. What mod_perl saves you is *startup* costs; profiling and optimising your existing code saves you *per-request* costs. The two are pretty independent, and you want to know which one is hurting you, because you don't want to waste time optimising the areas that aren't.
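For the per-request side, a profiler will tell you more than hand-sprinkled timers. A rough sketch with the stock Devel::DProf (Devel::NYTProf is another option if you have it installed):

    # Run the script once under the profiler, then summarise tmon.out
    perl -d:DProf yourscript.pl
    dprofpp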