Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hello, I have been working on a Perl file for quite some time now, and I keep adding new things to it. The file has gotten bigger: 400KB now, versus 50KB when I started. I have noticed it getting slower, and the script has gotten almost 100,000 hits in a single day. I was going to look at some of the more heavily used parts of the script to try and make them faster, and this question occurred to me. I am sure you are smarter than me :) Please let me know, thanks!
  • Comment on Is there a problem with how large the perl file gets?

Replies are listed 'Best First'.
Re: Is there a problem with how large the perl file gets?
by sgifford (Prior) on Feb 24, 2005 at 06:52 UTC
    Since Perl code is compiled every time it is run, the startup time for a very long script can be considerable. Running it under mod_perl is a solution for that, since the script is compiled only once, when Apache starts. Splitting it up into several smaller files won't help, though it may make your script more maintainable.
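    Moving an existing CGI script under mod_perl is mostly a configuration change. A minimal sketch of an httpd.conf excerpt for mod_perl 2's ModPerl::Registry (the paths here are assumptions, not from the original post; mod_perl 1 used Apache::Registry with a similar layout):

```apache
# httpd.conf excerpt: run legacy CGI scripts under ModPerl::Registry,
# which compiles each script once and caches the compiled code
# for subsequent requests.
Alias /perl/ /var/www/perl/
<Location /perl/>
    SetHandler perl-script
    PerlResponseHandler ModPerl::Registry
    PerlOptions +ParseHeaders
    Options +ExecCGI
</Location>
```

    With this in place, a script placed in /var/www/perl/ pays its compilation cost only on the first request per Apache child, not on every hit.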

    If the runtime isn't mostly spent on compilation, try using Devel::Profile to see which parts of your code are taking the most time, and see if you can speed them up.

    Good luck!

      Splitting it up into several smaller files won't help, though it may make your script more maintainable.

      It can help, if certain functions of the CGI are used more often than others. Splitting those off into a much smaller script will mean faster overall access times for the system as a whole.

      That's just a band-aid, of course. Something like mod_perl is definitely the way to go in the long term.

        Splitting those off into a much smaller script will mean faster overall access times for the system as a whole.

        Agreed. I meant that simply chopping one big script into several modules and then use-ing or require-ing all of them wouldn't speed anything up.

Re: Is there a problem with how large the perl file gets?
by inman (Curate) on Feb 24, 2005 at 09:32 UTC
    As mentioned earlier, running the script using mod_perl is the way to go. Out of interest though, if you are still using plain old CGI then you may get some benefit from splitting your code into modules and 'require'ing them only as needed. The idea is that not all of the code in your 100k file will be executed each time (depending on the choices made by the users). If you can divide up your code into a main script with modules that represent a path of execution then you only need to load up the code associated with current choices. See CGI::Application as an example of splitting up a script by execution path.

    In the following example, the user makes a choice and the module associated with that choice is require'd. The script works for $choice = 1, since Data::Dumper is almost certainly installed on your system. Change the choice and the script throws an error because it can't find the module.

    #!/usr/bin/perl -w

    use strict;
    use warnings;

    my $choice = 1;

    if ($choice) {
        if ($choice == 1) {
            # Do action 1
            require Data::Dumper;
            import Data::Dumper;
            print Dumper(\$choice);
        }
        else {
            # show the user a report
            require Action::Report;
            import Action::Report;
            print report();
        }
    }
    else {
        # No choice made, show help
        require Action::Help;
        print Action::Help::showHelp();
    }

    As a further enhancement, the module files can be pre-compiled into bytecode using B::Bytecode to reduce loading time.

Re: Is there a problem with how large the perl file gets?
by perlfan (Parson) on Feb 24, 2005 at 05:43 UTC
    So this is a mod_perl application? I think the issue has more to do with non-optimized routines and algorithmic inefficiencies than with the sheer size of the code. So, no, it is not the size of your script, but the things you are doing in your script. I'd suggest running some sort of profiler - there are many around, though I am not very familiar with any of them.

    Also, if it is a web app, then I am sure 100,000 hits per day would bring many systems to a crawl, especially if the code is as monolithic as you make it seem. Can you post some specifics, like what you are using it for, whether it is a CGI, etc.? That will get you much more help.
Re: Is there a problem with how large the perl file gets?
by thor (Priest) on Feb 24, 2005 at 12:33 UTC
    I was helping someone at work with a script that they had generated from a flat file. It turned out to be one long if..elsif..elsif.. block. IIRC, the file was over 10MB in size. That ended up core dumping perl v5.6.0. To be fair, it may not have been the size of the file but the number of related conditionals; still, I was surprised to see perl core dump. As has been mentioned elsewhere in this thread, an algorithmic change helped this script both in runtime (with fewer conditionals the script would run, but it took about an hour) and in its ability to run at all.
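    The usual algorithmic fix for a long if..elsif.. chain keyed on one value is a dispatch table: a single hash lookup replaces N comparisons. A minimal sketch (the operation names and handlers here are hypothetical, not from the script described above):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Dispatch table: each key maps to a code reference, so selecting a
# handler is one constant-time hash lookup instead of a chain of
# if/elsif comparisons.
my %dispatch = (
    add    => sub { my ($x, $y) = @_; return $x + $y },
    mul    => sub { my ($x, $y) = @_; return $x * $y },
    square => sub { my ($n) = @_; return $n * $n },
);

sub run {
    my ($op, @args) = @_;
    my $handler = $dispatch{$op}
        or die "Unknown operation: $op\n";
    return $handler->(@args);
}

print run('add', 2, 3), "\n";     # 5
print run('square', 4), "\n";     # 16
```

    Besides being faster for many branches, a dispatch table keeps each handler small and separately testable, which also helps with the compile-time problem a single giant conditional block creates.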

    thor

    Feel the white light, the light within
    Be your own disciple, fan the sparks of will
    For all of us waiting, your kingdom will come