MashMashy has asked for the wisdom of the Perl Monks concerning the following question:


I apologize if this is a silly question. I can't seem to get a straight answer to it, and you guys are always so helpful in putting people on the right track.

Here is the situation.

1. User goes to http://mysite.com/main.pl.
1a. mysite.com is an apache server, completely regular.
2. main.pl is a perl script that loads a ton of stuff and prints it to the screen.

now, main.pl is a regular human-readable code block.
#!/usr/bin/perl
# bunch of stuff here
print $bunchofstuff;
exit;


Am I doing something wrong? Would this be significantly faster / less load-intensive if I, say, precompiled it, so that they would instead go to main.pl as a block of code precompiled from main.txt or whatever?

If so, how is this done?

Thank you for any help!

Replies are listed 'Best First'.
Re: is precompiling possible / effective?
by moritz (Cardinal) on Feb 12, 2009 at 23:41 UTC
    The reason that perl doesn't store compiled bytecode on disk is that reading that bytecode is usually slower than re-compiling an application. So that's not an option here.

    What you can do is use a system like FastCGI or mod_perl where each request is a function call into a persistent process rather than an entirely new process.
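The shape of the FastCGI approach can be sketched roughly like this, assuming the FCGI module from CPAN is installed and your web server is configured to speak FastCGI (the names and the greeting are placeholders; the point is that everything above the loop compiles and initializes once, and each request is just another trip around the loop):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use FCGI;    # CPAN module, assumed installed

# Expensive setup (use statements, config parsing, DB handles)
# happens here, exactly once, when the process starts.

my $request = FCGI::Request();

# Each Accept() is a function call into this persistent process,
# not a fresh fork/exec/compile cycle.
while ($request->Accept() >= 0) {
    print "Content-type: text/html\r\n\r\n";
    print "Hello from a persistent process (pid $$)\n";
}
```

mod_perl gets you to a similar place by embedding the interpreter in Apache itself; which one fits depends on your hosting setup.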

    If you're lucky you don't always have to generate the file, but you can cache it.
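A minimal sketch of that kind of caching, assuming a file-based cache; the cache path, the one-hour lifetime, and the generate_page() placeholder are all arbitrary stand-ins for your actual "bunch of stuff":

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $cache_file = '/tmp/main_cache.html';    # assumed location
my $max_age    = 3600;                      # assumed lifetime, in seconds

sub get_page {
    # Serve the cached copy if it is fresh enough.
    if (-e $cache_file && (time() - (stat($cache_file))[9]) < $max_age) {
        open my $fh, '<', $cache_file or die "read $cache_file: $!";
        local $/;                           # slurp mode
        return <$fh>;
    }
    # Otherwise regenerate and refresh the cache.
    my $page = generate_page();
    open my $fh, '>', $cache_file or die "write $cache_file: $!";
    print {$fh} $page;
    close $fh;
    return $page;
}

# Placeholder for the expensive work main.pl really does.
sub generate_page { return "<html>...</html>\n" }

print get_page();
```

Whether this helps depends entirely on how much of your per-request time is spent generating the output versus compiling the script.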

      Is there a way to determine how effective that will be / if it is worth doing?
        Sure. Try it out, and measure the difference.

        You can also try to measure the startup and initialization phase, and see if it's large enough to warrant the effort.

        #!/usr/bin/perl
        use Time::HiRes qw(time);
        my $start;
        BEGIN { $start = time() };
        # rest of module loading here
        # then initialization (open database connections etc.)
        warn "Startup time: ", time() - $start;
        # rest of script goes here
Re: is precompiling possible / effective?
by chromatic (Archbishop) on Feb 12, 2009 at 23:27 UTC

    What do you mean by "precompiling"? If you mean something like using Apache::Registry to load Perl and your program and any modules into memory and then keep them there between requests, then your code can run much faster.

      What I mean is this: doing something to that human-readable .pl file to make it go faster, so perl doesn't have to compile it each time a user hits that page with their browser.

      1. is that worth it, and if so,
      2. is there a step-by-step guide to do so?

      (also, will this help server load, since it doesn't have to compile it?)
Re: is precompiling possible / effective?
by ww (Archbishop) on Feb 13, 2009 at 00:34 UTC
    Without knowing what "bunch of stuff" refers to, it's impossible for me to know how to answer the question you've actually asked (and no, I'm not clear what you mean by "precompile").

    So, with those caveats, an answer directed to another possible reading of your question...

    If main.pl is producing a fundamentally static page (say, something that says 'Welcome to whazit.com; our widgets are better than theirs. Buy now!' followed by links to additional content, such as pages targeted to the asserted advantages, prices, and/or an e-commerce ordering page), you might consider using an index.html for the content you're now producing with main.pl, and reserve your hits on cgi-bin for dynamic pages.

      Alternately, if it's only quasi-dynamic (changing perhaps weekly, daily, or even hourly), you could rewrite it to generate a new "static" version at whatever interval suits, and have cron or the like periodically re-create the static files which Apache serves.
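      That periodic regeneration can be as simple as a crontab entry; the paths and the hourly schedule here are placeholder assumptions:

      0 * * * * /usr/bin/perl /path/to/generate_main.pl > /var/www/html/index.html

      Apache then serves index.html as a plain static file, with no per-request Perl at all.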

      The cake is a lie.

Re: is precompiling possible / effective?
by doug (Pilgrim) on Feb 13, 2009 at 19:08 UTC

    perldoc -f dump

    People have been trying to precompile since the old days, but it doesn't seem to be worth the effort. The bang isn't worth the buck. It might be a better use of your time to work on optimizing your code and caching what you can. Good luck.


    - doug