in reply to Perl: friend or foe?
I was wondering just now how well Perl code can cope when it's under constant use. I'm referring to its ability to process a lot of information and cope better than some other languages like Java or C++.
What are you qualifying as constant use? If your program runs for hours or days at a time in a single execution, then the overhead of starting perl is going to be insignificant. If you're running thousands or millions of executions of the program in a day, then yes, the time to parse is going to add up. In the case of CGIs, you can turn the latter case into the former by using mod_perl, as merlyn has already suggested. If it's not a CGI, you might be able to make other similar optimizations, like running a persistent server process to do the heavy lifting (in Perl, or whatever you're most efficient at programming in), and a client (in C, or something else with low start-up overhead) that passes its params to the daemon/service/whatever to do its work.
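As a rough sketch of that daemon/client split: the long-running Perl process pays the interpreter start-up cost once and serves many requests, while each short-lived client just shuttles parameters over a socket. Everything here (the socket path, the uc() standing in for the "heavy lifting") is illustrative only, and the client is written in Perl rather than C just to keep the sketch self-contained.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::UNIX;

# Hypothetical socket path for this demo.
my $sock_path = "/tmp/demo_daemon.sock";
unlink $sock_path;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: the persistent daemon. In real use this would run for
    # days and serve many clients in a loop; here it serves one
    # request and exits, just to demonstrate the round trip.
    my $server = IO::Socket::UNIX->new(
        Type   => SOCK_STREAM,
        Local  => $sock_path,
        Listen => 5,
    ) or die "cannot bind $sock_path: $!";

    my $conn = $server->accept();
    my $request = <$conn>;           # one line of params from the client
    chomp $request;
    print $conn uc($request), "\n";  # stand-in for the heavy lifting
    close $conn;
    exit 0;
}

# Parent: the lightweight client. This is the part you might rewrite
# in C for minimal start-up overhead.
sleep 1;  # crude wait for the daemon to bind the socket
my $client = IO::Socket::UNIX->new(
    Type => SOCK_STREAM,
    Peer => $sock_path,
) or die "cannot connect: $!";

print $client "hello from the client\n";
my $reply = <$client>;
print $reply;
close $client;
waitpid($pid, 0);
unlink $sock_path;
```

A real version would have the daemon loop on accept() (or use a framework that handles that for you), but the shape is the same: parse and compile once, answer requests many times.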
Every language has its own little quirks, and sometimes you have to think about problems in slightly different ways to deal with them. I find that the best languages to write in are whichever ones seem the most natural to you and don't end up frustrating you to the point where you want to beat your head against the wall. (Can anyone explain to me why VBA had a bunch of things that were full-fledged functions, and others hidden as methods of a DoCmd object?)
Because of the type of projects that I work on (dozens of files, spread across modules and programs, that share much of their code), I like that I don't have to go back and recompile every one of my programs when I change one of the included modules. The cost of a programmer's time can add up -- sometimes it's more cost effective to make the programmer more efficient than to make the software more efficient.