Re: Perl: friend or foe ?
by merlyn (Sage) on Apr 24, 2005 at 13:36 UTC
Hi,
That was my point: should Perl be written so that it is pre-compiled from the very first time? Even if it is 'pre-compiled each time', surely that's a pointless task that wastes some resources.
I'm not knocking Perl here (or trying not to), but I'm just trying to understand some of the reasoning behind its implementation. Can the language be changed to accommodate this? Can it be made richer and more powerful than some of the other languages, whilst remaining 'human friendly'?
Keep the points of view coming.
You mean: should Perl somehow secretly cache the startup steps, so that it performs those only once? Sure, but then we run into problems specifying exactly what to cache and where to cache that translation step.
And hence, what then happens (in other programming languages) is that we make the users specify exactly when and how to cache the translation from the programming language to the intermediate language. And that would make Perl distinctly less "human friendly" for me. I'd hate having to develop a "makefile" for every Perl program I write. Those systems typically also separate the "compilation" environment from the "execution" environment, losing some of the meta information (like the names of all methods in this class), and losing the ability to "compile" while "executing" for some neat tricks.
No, I like the current system. When I don't need caching of the translation (which is what your "compilation" seeks to do), I can use Perl by simply saying "go". When I've decided that caching helps, I can do that explicitly using the mechanisms I gave earlier.
To force caching on all users is a premature optimization, which is an evil step.
For example, I'm developing a few applications for clients right now in CGI::Prototype. The eventual application will likely be executed in a mod_perl environment, but I'm running it as CGI because I don't want any caching to interfere with my clean-slate testing, especially as I tweak various parts that will eventually be cached. The fact that Perl lets me do this makes my development time shorter, not longer. (And in fact, some parts of CGI::Prototype are possible only because I can blur the lines between "compilation" and "execution", so I have a richer framework to do my work, even if I use those features only indirectly.)
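(As a minimal sketch of that compile/run blurring, not taken from CGI::Prototype itself: because the whole language is still available at run time, a class can manufacture its own methods on the fly. The Widget package and its fields below are invented for illustration.)

#!/usr/bin/perl
use strict;
use warnings;

# Generate accessor methods at run time by writing into the symbol
# table -- the kind of trick a framework can use because Perl does
# not wall off "compilation" from "execution".
package Widget;
for my $field (qw(name color size)) {
    no strict 'refs';
    *{"Widget::$field"} = sub {
        my $self = shift;
        $self->{$field} = shift if @_;
        return $self->{$field};
    };
}

package main;
my $w = bless {}, 'Widget';
$w->color('red');
print $w->color, "\n";    # prints "red"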
Re: Perl: friend or foe ?
by spurperl (Priest) on Apr 24, 2005 at 13:34 UTC
Your first question is unclear. Perl doesn't get tired from being used a lot, if that's what you mean :-)
Perl's not being a compiled language (at least in the classic sense) does affect its performance. True, for some tasks it will never beat tightly-written C, but for *most* tasks it is good enough. And given the immense productivity and power of Perl, especially in conjunction with CPAN modules, it's worth it.
Give it a try. Write some programs. When you feel Perl is too slow for you, ask here. Seasoned monks often come up with tricks that make Perl compare very nicely in speed with other languages.
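(When speed questions do come up, the first step is usually to measure rather than guess. A minimal sketch with the core Benchmark module; the two string-building approaches compared here are just placeholders:)

use strict;
use warnings;
use Benchmark qw(cmpthese);

# Run each sub for at least 2 CPU seconds and print a rate table
# so you can see which approach wins.
cmpthese(-2, {
    concat => sub { my $s = ''; $s .= 'x' for 1 .. 1_000 },
    join   => sub { my $s = join '', ('x') x 1_000 },
});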
Perl's not being a compiled language (at least in the classic sense)
I'm not sure what exactly you mean by "the classic sense," but Perl is definitely compiled in the sense of the word that I know. Perhaps you meant, "in the lazy way most people use the word." I certainly wouldn't call that the classic definition, though. :-)
PS: Sorry to pick a nit on a post whose substance I fully agree with, but I couldn't resist.
Re: Perl: friend or foe ?
by tlm (Prior) on Apr 24, 2005 at 13:48 UTC
In addition to all of the above, there are several ways (XS, Inline::C, SWIG) to enable Perl code to call, e.g., compiled C code. A common technique is to structure the program so that the computationally demanding parts can be written in C (for example); the entire program then consists of a Perl "wrapper"1 and one or more native subroutines to take care of the heavy lifting. A lot of modules on CPAN use this approach.
1The term "wrapper" here is a bit misleading, because this Perl code often does substantive work other than just calling the native subroutines.
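(A minimal sketch of the Inline::C route; the sum_to function is invented for illustration. Inline::C compiles the embedded C the first time the script runs and caches the result, so subsequent runs pay no C compilation cost:)

use strict;
use warnings;
use Inline C => <<'END_C';
long sum_to(long n) {
    long total = 0;
    long i;
    for (i = 1; i <= n; i++)
        total += i;
    return total;
}
END_C

# The C function is now callable like any Perl sub.
print sum_to(1_000_000), "\n";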
Precisely,
Having to call up the heavy-duty C code highlights the point of my question. The Perl code is not quick or 'strong' enough to cope with the heavy-duty tasks. WHY?
I'm not sure if I'm going out on a limb, but with Java it's easy to write the code, test, re-write, re-test, and release. The one disadvantage is that after re-writing it has to be compiled again, BUT it's quicker in the long run if a big application is running.
Feel free to spank me on this, but doesn't Perl only cope with much smaller tasks, i.e. something that does not demand a high amount of processing? If this is so, WHY?
MonkPaul
Re: Perl: friend or foe ?
by jhourcle (Prior) on Apr 24, 2005 at 15:14 UTC
I was wondering just now, how well Perl code can cope when it's under constant use. I'm referring to its ability to process a lot of information and cope better than some other languages like Java or C++.
What are you qualifying as constant use? If your program is run for hours or days at a time, in a single execution, then the overhead to start perl is going to be insignificant. If you're running thousands or millions of executions of the program in a day, then yes, the time to parse is going to add up. In the case of CGIs, you can change this to the former by using mod_perl, as merlyn has already suggested.

If it's not something that's a CGI, you might be able to do other similar optimizations, like running a persistent part of the server to do the heavy lifting (in Perl, or whatever you're most efficient at programming in), and a client (in C, or something else with low execution overhead) that passes its params into the daemon/service/whatever to do its work, as sketched below.

Every language has its own little quirks, and sometimes you have to think about the problems in slightly different ways to deal with them. I find that the best languages to write in are whichever ones seem the most natural to you, and don't end up frustrating you to the point where you want to beat your head against the wall. (Can anyone explain to me why VBA had a bunch of things that were full-fledged functions, and others hidden as methods of a DoCmd object?)
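(A bare-bones sketch of that persistent-worker idea; the socket path and the reverse-a-line "protocol" are invented for illustration. The worker starts once, pays Perl's startup cost once, and then serves requests forever; a thin client only needs to connect and write a line:)

use strict;
use warnings;
use Socket;
use IO::Socket::UNIX;

# Long-lived worker: compilation happens once, at startup.
my $path = '/tmp/worker.sock';    # hypothetical location
unlink $path;
my $server = IO::Socket::UNIX->new(
    Type   => SOCK_STREAM,
    Local  => $path,
    Listen => 5,
) or die "listen: $!";

while (my $client = $server->accept) {
    chomp(my $request = <$client>);
    # Stand-in for the real heavy lifting.
    print $client scalar reverse($request), "\n";
    close $client;
}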
Because of the type of projects that I work on (dozens of files, spread across modules and programs, that share so much of their code), I like that I don't have to go back and recompile every one of my programs when I change one of the included modules. The cost of a programmer's time can add up over time -- sometimes, it's more cost effective to make the programmer more efficient than to make the software more efficient.
Re: Perl: friend or foe ?
by starbolin (Hermit) on Apr 24, 2005 at 17:06 UTC
The compilation method is not the only thing that determines execution speed. I have extensive experience in another interpreted language, Forth, and it beats the pants off of anything but tightly written C code. This is not due to the compilation method but to the minimalist implementation of the bytecode and the design of the programs. To get raw speed you give up runtime checking, floating point math, and garbage collection, among other things. Not as a hard rule, but these are design choices that you make, just as these choices have to be made when writing in any language. The important thing is whether the language allows these choices, and if the language doesn't do what you want, you can usually do it in a library. So really, it makes little sense to even draw the distinction between languages.
Speed is not everything. As I'm writing this I'm looking at the process status of the many widgets sitting on my desktop. They are all running between 0% and 1% processor utilization. My whole Xorg desktop runs between 2% and 3%. So do I care what language they were written in? No.
I don't take it as a given that a C++ program runs faster than a Perl program. Some of the slowest-running junk that I've seen has been written in C++ using the MFC classes. Again, the determining factor is not the compilation method but the library implementation.
My next project involves numerical simulations using the ATLAS libraries. These libraries are written in Fortran, C and C++. The libraries do all the heavy work, and the application code just shuffles the results over to the display libraries. (Written in Python, I think.) So the whole Perl vs X thing is moot. This project mirrors most of my big projects in that I rely on libraries to do most of the work. So I really don't care what language I'm writing in, just so long as it stays out of the way.
s//----->\t/;$~="JAPH";s//\r<$~~/;{s|~$~-|-~$~|||s
|-$~~|$~~-|||s,<$~~,<~$~,,s,~$~>,$~~>,,
$|=1,select$,,$,,$,,1e-1;print;redo}
Re: Perl: friend or foe ?
by talexb (Chancellor) on Apr 24, 2005 at 18:23 UTC
I was wondering just now, how well Perl code can cope when it's under constant use. I'm referring to its ability to process a lot of information and cope better than some other languages like Java or C++.
You've lost me. It sounds like you're talking about Perl like it's an internal combustion engine or something similar. Certainly, engines wear out and eventually die. A piece of software isn't like that. If you're testing to see if the value of $foo falls inside a particular range, that's going to work whether you do it once or a million times a day, for as long as the platform has power.
Do you have an example where a particular language starts to wear out under heavy load? I'm just really not following the train of thought that led to this question.
Alex / talexb / Toronto
"Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds
Re: Perl: friend or foe ?
by dragonchild (Archbishop) on Apr 24, 2005 at 20:41 UTC
I was wondering just now, how well Perl code can cope when it's under constant use. I'm referring to its ability to process a lot of information and cope better than some other languages like Java or C++.
Does it leak memory? Yes, every non-trivial computer program written in any language will leak memory in some situation.
Your better question is "Is it more likely that a given application in Perl will leak memory than the same application in any other language?" The answer is a resounding No.
Reason: I don't do memory allocation by hand, so I, the programmer, can't screw it up.
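(One caveat worth knowing, as a minimal sketch: Perl's reference counting can be defeated by circular references, which is the classic way to leak memory in pure Perl. Scalar::Util's weaken breaks the cycle:)

use strict;
use warnings;
use Scalar::Util qw(weaken);

# Two structures pointing at each other never reach refcount zero,
# so without help they leak until the program exits.
my $parent = {};
my $child  = { parent => $parent };
$parent->{child} = $child;

# Weakening one side of the cycle lets both be freed normally.
weaken($child->{parent});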
Re: Perl: friend or foe ?
by TedPride (Priest) on Apr 24, 2005 at 23:27 UTC
Re: Perl: friend or foe ?
by Anonymous Monk on Apr 25, 2005 at 12:15 UTC
Also of interest to me is whether the fact that it's not precompiled could affect its performance, again with reference to other languages. Would it be better to compile the code first so that it doesn't have to be interpreted each time it is run?
That's two separate questions. First, let me clarify something. Perl code is compiled before it's run. The Perl code itself is never "interpreted each time" as with, for instance, shell code. Perl code is compiled when you start the program, and then the resulting code is run.
As for whether it impacts performance, yes, in some ways, both negatively and positively. Let me list some cons first:
- Because Perl code is compiled each time the program is run, compilation needs to be fast. Perl can't spend a lot of time optimizing the resulting byte code, sometimes resulting in code that might not be as fast as it could have been.
- If you run the same, unmodified code many times, compiling the same thing over and over can be perceived as "wasteful".
The pros:
- Perl source code takes much less disk space than the equivalent compiled code, so by doing "just in time" compiling, one saves disk space.
- There's no risk of having the source code and the binary code be out of sync, as there's no separate binary around.
- No need to generate binaries for different platforms. Most code I write on my OS will run on your OS without even knowing what your OS is.
- A much shorter write/test cycle - no separate compilation cycle needed.
In short, whether or not it's "better" to compile the code first is a matter of preference - you win some, but you lose some as well. It's a trade-off, and for now, it means you'll have to live with "just in time" compiling.
In Perl6, you probably will have the option to save the byte code after compilation.
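(A tiny demonstration of the compile-then-run model described above. A BEGIN block runs as soon as it has been compiled, before ordinary run time begins, so this script prints "compile" before "run" even though the BEGIN block comes last in the file. Relatedly, perl -c performs only the compile phase, which is handy as a syntax check:)

#!/usr/bin/perl
use strict;
use warnings;

# Ordinary statements run at run time, after the whole file compiles.
print "run\n";

# BEGIN runs at compile time, so its output appears first.
BEGIN { print "compile\n" }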
Monkers,
I think everybody has had their say on that one. I wasn't trying to kick up a fuss, but merely to answer some of my own questions by asking you lot. Cheers :)
Two more questions... If Perl is so neat and nifty for the things it's used for, why does it never get taught at a lower level? I'm doing a masters and only just heard about it after Christmas. I would like to see a choice of what languages you can learn when you begin your programming course. Is this the same for everybody?
Perl certainly wasn't around when I started learning programming. Heck, Larry Wall wasn't even programming then. {grin}
I spent the first six or seven years learning (and later teaching) BASIC. Then I discovered C, and Pascal, and FORTH, and Logo, and realized how much larger the world must be.
Perl is not necessarily going to be attractive as a pedagogical language, partly because of TIMTOWTDI. Also, there have been criticisms that Perl code is auto-obfuscated, which makes evaluation on pedagogical grounds challenging.
In my (limited) experience, Perl is used to get things done, and teaching languages are used for teaching. Perl exposure in academia is more likely to happen in the experimental sciences or occasionally in the Arts. If you are interfacing with telescopes, gathering data from sensor arrays or modelling bioinformatic systems, you'll likely hear about Perl before it would come up in CS.