G@SP@R has asked for the wisdom of the Perl Monks concerning the following question:

I have 10 or so Perl scripts that use code from a library.
I start my scripts with a require 'personal.lib'.
The thing is, personal.lib is growing every day and I'm starting to see my scripts run slowly, especially through HTTP requests...
My question is: should I turn personal.lib into a module? How should I use it? Should each script import only the functions it needs?
I guess the scripts are loading all of the library's functions and variables into memory and into the program's namespace, and with 2000 lines per script plus 700 from the lib, things get a little heavy, no? Suggestions are welcome!
Tkx.

G@SP@R

Replies are listed 'Best First'.
Re: (Modules || Libraries) usage
by holli (Abbot) on Jun 15, 2005 at 07:18 UTC
    Turning your file into a module won't save you any time by itself. I'd suggest you use AutoLoader to split your lib into multiple files, so each sub is compiled only the first time it is called.
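    A minimal sketch of that approach, assuming personal.lib is reworked into a module named Personal.pm (the module name and the example sub are made up for illustration):

        package Personal;

        use strict;
        use warnings;
        use Exporter   qw(import);
        use AutoLoader qw(AUTOLOAD);   # compile subs below __END__ on first call

        our @EXPORT_OK = qw(rarely_used_report);

        1;

        __END__

        # AutoSplit (normally run when the module is installed) moves each sub
        # below __END__ into its own auto/Personal/*.al file; AutoLoader then
        # compiles a sub only the first time it is called.

        sub rarely_used_report {
            my ($path) = @_;
            # ... expensive, rarely needed work would go here ...
            return "report for $path";
        }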


    holli, /regexed monk/
Re: (Modules || Libraries) usage
by jbrugger (Parson) on Jun 15, 2005 at 07:44 UTC
    Are you using mod_perl? If your lib is properly written and loaded (and compiled) during Apache's startup (in startup.pl, for example), it will probably be fast enough. The code then lives in the memory of the main Apache process, all of its children can use it, and they don't recompile it on every request. And/or use holli's suggestion.
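    A rough sketch of such a startup.pl, assuming the library has been turned into a module named Personal.pm living under /var/www/lib (both the path and the name are invented), and that httpd.conf loads this file with a PerlRequire directive:

        use strict;
        use warnings;

        use lib '/var/www/lib';   # directory holding your modules

        # Anything loaded here is compiled once in the parent Apache process
        # and shared (copy-on-write) by all of its children.
        use Personal ();

        1;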

    For other items that you don't use too often, put them in their own module and load it with 'require notSoOftenUsedModule;' so it is compiled only when it's actually needed.
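    For instance (a tiny sketch using the hypothetical module name from above; the sub and function names are invented too):

        # The module is loaded and compiled only the first time this sub runs,
        # not at script startup.
        sub monthly_report {
            require notSoOftenUsedModule;
            return notSoOftenUsedModule::build_report(@_);
        }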

    "We all agree on the necessity of compromise. We just can't agree on when it's necessary to compromise." - Larry Wall.
Re: (Modules || Libraries) usage
by Codon (Friar) on Jun 15, 2005 at 19:34 UTC

    The size of your codebase is definitely not the issue here. We have ~38000 lines of code spread across ~150 modules that load up under mod_perl. We maintain database connections to Postgres, MySQL, and Oracle and use each extensively. Our code executes very quickly and we have excellent site performance. However, because we are running under mod_perl, we are not going through the overhead of compiling the code with each request.

    You did not specify that this was CGI, so I'm going to assume that you are talking about command-line type scripts. You are losing time compiling the code every time you execute it. You could take the approach of breaking your libraries into smaller libraries that are grouped in some way that works well for your needs. You could then require only the needed libraries when the need arises.
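    For example (a rough sketch; the file names are invented and the grouping would follow whatever split makes sense for your code):

        # Each script pulls in only the groups it actually uses.
        require 'personal_db.lib';        # database helpers, used by most scripts
        require 'personal_reports.lib';   # only required by the reporting scripts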

    Ivan Heffner
    Sr. Software Engineer, DAS Lead
    WhitePages.com, Inc.
Re: (Modules || Libraries) usage
by chromatic (Archbishop) on Jun 15, 2005 at 17:34 UTC

    I'm not sure that the size of your program is making things slow. Compilation is fast compared to database work, filesystem use, and network traffic. 2700 lines isn't huge for an entire Perl program, but 2000 lines per script is. If I were you, I would factor common code out of the scripts and into separate modules. That tends to have the effect of making everything smaller.
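    As a rough illustration (the module name and sub are invented), factoring one shared routine out of the scripts might look like:

        package MyTools;                  # e.g. lib/MyTools.pm

        use strict;
        use warnings;
        use Exporter qw(import);

        our @EXPORT_OK = qw(slurp_file);

        # One shared routine instead of a copy pasted into every script.
        sub slurp_file {
            my ($path) = @_;
            open my $fh, '<', $path or die "Cannot open $path: $!";
            local $/;                     # slurp mode
            return <$fh>;
        }

        1;

    Each script then pulls in just what it needs with something like use MyTools qw(slurp_file);.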

      This was my first thought also--I don't think the size of your program is the issue. If you are use'ing any CPAN modules, you might be surprised how many lines of code get pulled in there, but this is seldom the source of major performance problems.

      You can convert your libraries into modules for various reasons, but speed isn't necessarily one of them.

      It would probably be better to do some benchmarking and figure out what is slow. Now, benchmarking might be easier if you had your code in small, digestible chunks in a module.
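      For example, the core Benchmark module can compare two suspects directly (the two subs here are trivial stand-ins):

          use strict;
          use warnings;
          use Benchmark qw(cmpthese);

          # Two trivial stand-ins for routines you suspect are slow.
          sub build_by_concat { my $s = ''; $s .= $_ for 1 .. 100; return $s }
          sub build_by_join   { return join '', 1 .. 100 }

          # Run each for at least one CPU second and print a comparison table.
          cmpthese(-1, {
              concat => \&build_by_concat,
              join   => \&build_by_join,
          });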

      As mentioned in the other responses, if you are running CGI code through a server, mod_perl will likely solve many of your speed problems.