Hero Zzyzzx has asked for the wisdom of the Perl Monks concerning the following question:

So I've written a set of scripts that creates an indexed mySQL-driven website with some basic document management features. I developed it with use strict; and it runs beautifully. I guess that's the first step to getting it to run correctly with mod_perl.

The thing I'm worried about is that in some areas, code is executed based on the ABSENCE of a variable. For instance, if deleteflag=yes is not "posted" to the script, the script doesn't execute a loop; it skips ahead.

As a precautionary measure, I've undef'd the hash that stores the "posted" variables before the variables are parsed in my script. Is this enough to make sure the hash will be empty and won't cause problems down the line, given that my script tests for the absence of variables?
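
A bare-bones sketch of the pattern I'm describing (%params and parse_form stand in for the actual names in my script):

    my %params;                # declared fresh each time the script body runs
    %params = parse_form();    # re-parse the "posted" variables for this request

    # Code guarded by the ABSENCE of a variable is only safe if %params
    # can't carry a stale 'deleteflag' over from a previous request.
    if (defined $params{deleteflag} and $params{deleteflag} eq 'yes') {
        # ... deletion loop ...
    }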

Re: mod_perl- am I safe?
by Masem (Monsignor) on Mar 23, 2001 at 19:47 UTC
    One way to check, particularly since you are working with what appear to be destructive SQL statements (INSERTs, UPDATEs, and DELETEs), is to first clone the scripts into a new virtual server that uses mod_perl and replace those SQL calls with print statements, or use a different database, or something along those lines (one way to stub things out is sketched below); test the mod_perl 'happiness' in a virtual server before bringing everything on-line under mod_perl.
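
    For instance, one hypothetical way to neutralize the destructive statements while testing, assuming your scripts funnel their SQL through a helper (do_sql and $DEBUG are made-up names):

        our $DEBUG = 1;    # flip off once the mod_perl setup is trusted

        sub do_sql {
            my ($dbh, $sql, @bind) = @_;
            if ($DEBUG and $sql =~ /^\s*(?:INSERT|UPDATE|DELETE)\b/i) {
                warn "WOULD EXECUTE: $sql [@bind]\n";    # print instead of executing
                return 1;
            }
            return $dbh->do($sql, undef, @bind);
        }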

    I'm assuming that by "posted" variables you mean the ones you're getting from CGI.pm. I've found that for sites with multiple scripts and a few support modules under mod_perl, you'll want to create the CGI object in the scripts as opposed to the support modules; even though the script stays resident in memory, the entire body is effectively wrapped into a block, such that a call like my $cgi = new CGI; only exists for that once-through and no more. If you try to move the CGI creation into the support modules and export the $cgi value, you'll run into problems, since that $cgi is not recreated on each HTTP access (see the sketch below).
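
    A rough sketch of the distinction (the support module name is hypothetical):

        # In the Apache::Registry script itself: the whole body is re-run on
        # every request, so the object is fresh each time.
        use strict;
        use CGI;
        my $cgi = CGI->new;
        my $deleteflag = $cgi->param('deleteflag');

        # By contrast, code like this in a support module runs only once,
        # when the module is first loaded into the Apache child:
        #
        #     package My::Support;       # hypothetical support module
        #     our $cgi = CGI->new;       # created once, never recreated
        #
        # so an exported $cgi would still hold the first request's parameters.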

    But again, absolutely test your mod_perl code with a database that can be thrown away, or with appropriate debugging statements in place of the SQL statements, before you bring the site up live. It sounds like you should be OK by just mod_perl'ing it, but you never know.


    Dr. Michael K. Neylon - mneylon-pm@masemware.com || "You've left the lens cap of your mind on again, Pinky" - The Brain

      Thanks! My next step was to set up the virtual server, copy the database, and use ab (ApacheBench) to slam the server with requests and see what happens. Not very elegant, I suppose, but it should work well enough for testing.
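
      Something along these lines, with the URL standing in for one of my script paths:

          ab -n 1000 -c 50 http://localhost/perl-bin/index.pl

      That fires 1000 requests, keeping 50 in flight at a time.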

      I just wanted to know beforehand if I might be free and clear with mod_perl. I am using Apache::Registry as my mod_perl handler.

      Here's the appropriate code from httpd.conf. I assume "UseStrict 1" helps protect variables? Sorry for what are probably basic questions; I'm figuring this stuff out gradually.

      <Location /perl-bin>
          PerlSendHeader On
          SetHandler perl-script
          PerlHandler Apache::Registry
          PerlSetVar UseStrict 1
          Options +ExecCGI
      </Location>

        Don't have the docs handy, but OTTOMH, 'UseStrict' simply implies that use strict; would be at the top of every perl script that mod_perl handles, whether it's explicitly there or not. It doesn't protect any variables in any way. Of course, I'll assume you already have use strict; at the top of every perl file you've written... don't you? :)

        Don't try to drop in the entire project in one shot; start with non-DB-destructive files like search routines or lookups, make sure those work, and apply any bug fixes you need across all project files. While it's a lot easier to build a site for mod_perl from scratch in an environment that already uses mod_perl, existing scripts that follow good perl programming practices should be able to drop in without any major gotchas, though there will always be a few.


        Dr. Michael K. Neylon - mneylon-pm@masemware.com || "You've left the lens cap of your mind on again, Pinky" - The Brain
Re: mod_perl- am I safe?
by arturo (Vicar) on Mar 23, 2001 at 19:50 UTC

    If you're running under use strict, then I presume that when you declare the hash, you're using my %hash to do so, which should mean that you're free and clear. If you're running under Apache::Registry, your whole script is treated as a subroutine, so each invocation of your script is like a fresh invocation of a sub, and the variable should be a new copy.
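
    Conceptually, something like this happens (a simplified sketch; the real package name is derived from the script's URI):

        package Apache::ROOT::perl_2dbin::myscript_2epl;    # name is illustrative

        sub handler {
            # --- your entire script body is compiled in here ---
            my %posted;    # a fresh lexical on every call, i.e. on every request
            # ...
        }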

    Philosophy can be made out of anything. Or less -- Jerry A. Fodor

Re: mod_perl- am I safe?
by pileswasp (Monk) on Mar 23, 2001 at 20:56 UTC
    It would probably be wise to always initialize your variables to a value (even if it's undef):

    my ($foo, $bar, $baz) = (undef, '', 0);
    my $deleteflag = $cgi->param('deleteflag');
    and so on. That way, you can guarantee that the values you're going to get were generated by this call to the script.
Re: mod_perl- am I safe? UPDATE!! COOL GEEK BENCHMARKING STUFF HERE!!
by Hero Zzyzzx (Curate) on Apr 05, 2001 at 01:29 UTC

    Here's an update on my experience with moving some web apps into mod_perl:

    use strict-ing everything pretty much worked for me; I don't seem to have any strange variable problems. Oh, and I saw a nearly eight-fold (7.98x) increase in performance. Oof!

    The requests I tested do a couple of select queries and one update to a "pageaccess" table, and output a page and graphic of about 5k. Admittedly small, but that's how I keep pages anyway.

    I did these tests over the loopback interface, so no bandwidth restrictions apply here. Look at the "Requests per second" value. Here's the skinny, from ApacheBench:

    Not using mod_perl


    Concurrency Level:      50
    Time taken for tests:   87.313 seconds
    Complete requests:      1000
    Failed requests:        0
    Total transferred:      5232000 bytes
    HTML transferred:       5085000 bytes
    Requests per second:    11.45
    Transfer rate:          59.92 kb/s received


    Same test, using mod_perl


    Concurrency Level:      50
    Time taken for tests:   10.943 seconds
    Complete requests:      1000
    Failed requests:        0
    Total transferred:      5232000 bytes
    HTML transferred:       5085000 bytes
    Requests per second:    91.38
    Transfer rate:          478.11 kb/s received


    Same box, static page


    Concurrency Level:      50
    Time taken for tests:   0.754 seconds
    Complete requests:      1000
    Failed requests:        0
    Total transferred:      4510800 bytes
    HTML transferred:       4243680 bytes
    Requests per second:    1326.26
    Transfer rate:          5982.49 kb/s received

    As expected, dynamic pages are far slower than static. But the acceleration afforded by mod_perl has brought the per-second page rate for dynamic pages above the limit imposed by my bandwidth (a 1.5 megabit T1, which works out to around 55 pages a second in my testing). That means it'll be hard to choke my server with mod_perl-accelerated dynamic pages; my bandwidth will be used up long before I overtax my disks and processors.

    Server stats, for those interested:

    • IBM Netfinity 4000 (I think that's the model #)
    • dual PIII 550 MHz procs
    • 512 meg ECC RAM
    • 4 gig SCSI for OS and user files
    • 4 gig SCSI for swap and extra space
    • 2x4 gig SCSI mirrored RAID for web files
    • RH 6.2 with patches
    • Using NuSphere for Apache, mySQL and mod_perl

    My next project is setting up persistent mySQL connections to squeeze even more speed out of this puppy.

      One more update:
      Upgrading the same box, with the same hardware, to Red Hat 7.1 with the 2.4 kernel saw a significant speed boost again, partly due to 2.4's better SMP support, methinks.

      Hitting the same dynamic page on the same site saw page generation rise to 140 pages/second. Damn!!

      This may not all be attributable to the kernel upgrade, of course; Apache and mod_perl were upgraded too during the OS upgrade. Some of the boost could also come from the persistent DB connections afforded by Apache::DBI, which I've finally puzzled out (sketched below).
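
      For anyone wanting to try it, the minimal recipe looks something like this (the DSN and credentials are placeholders). The key is that Apache::DBI must be loaded before DBI, e.g. from a startup.pl pulled in with PerlRequire:

          use Apache::DBI;    # must come before any 'use DBI'
          use DBI;

          # Scripts keep calling DBI->connect as usual; Apache::DBI transparently
          # caches the handle and reuses it within each Apache child process.
          my $dbh = DBI->connect(
              'DBI:mysql:database=mydb;host=localhost',    # placeholder DSN
              'user', 'password',
              { RaiseError => 1 },
          );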

      Thought some of you might be interested.