in reply to Re^2: Perl: friend or foe ?
in thread Perl: friend or foe ?
To force caching on all users is a premature optimization, which is an evil step.
I disagree. First, I disagree that just because perl could cache the compiled bytecode, it must do so for everyone. There could be a command-line argument, or maybe a pragma ("use cachedbytecode;"), which would enable it. Of course, if there is no downside to the caching (it's smart enough to deal with non-writable filesystems in a reasonable, unobtrusive manner, i.e., disable bytecode caching automatically), then I see no reason not to enable it by default.
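For what it's worth, Python already does this kind of transparent bytecode caching with its __pycache__ directories, including the graceful fallback: if the cache location isn't writable, it silently skips writing the .pyc and just recompiles next time. A minimal sketch of that mechanism (Python rather than Perl, purely for illustration; the demo.py module is made up):

```python
import os
import py_compile
import tempfile

# Write a tiny throwaway module to compile.
src_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "demo.py")
with open(src, "w") as f:
    f.write("VALUE = 42\n")

# Byte-compile it; the cached file lands in a __pycache__
# directory next to the source, tagged with the interpreter version.
cached = py_compile.compile(src)
print(cached)
```

Subsequent imports of demo load the cached bytecode instead of reparsing the source, which is exactly the behaviour being proposed for perl here.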
Second, I disagree that it's premature optimisation. It's optimisation, yes. But premature? We can measure the overhead of compiling the code each time and compare it to the speed of loading from the cache. If the cache is slower than recompiling, then we throw away that code, don't commit it to the main trunk, and document why. If the cache is faster than recompiling each time, especially for small programs (where compilation is a larger percentage of the total runtime), then it's a proven optimisation, not a premature one.
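And that measurement is cheap to make. A rough sketch of the comparison I mean (again in Python, since its compile/marshal pair mirrors the compile-from-source vs. load-cached-bytecode split; the synthetic source and iteration counts are invented for illustration):

```python
import marshal
import timeit

# A synthetic "program": 200 small function definitions.
src = "\n".join(f"def f{i}(x): return x + {i}" for i in range(200))

# Cost of parsing + compiling from source on every run.
t_compile = timeit.timeit(lambda: compile(src, "<demo>", "exec"), number=50)

# Cost of loading the already-compiled, serialized bytecode instead.
code = compile(src, "<demo>", "exec")
blob = marshal.dumps(code)
t_load = timeit.timeit(lambda: marshal.loads(blob), number=50)

print(f"compile each time: {t_compile:.4f}s  load cached: {t_load:.4f}s")
```

If the second number isn't smaller than the first, the cache loses and the patch gets thrown away; if it is, the optimisation has paid for itself before it ships.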
Finally, I mostly disagree with the OP that this is even needed. I laugh at my cow-orkers who work in Java. By the time their code has finished recompiling, my code is almost done executing. And we have similar numbers of lines of code to work with. They won't even have their code loaded into memory by the time mine is finished running, the JVM load time is so slow.
My cow-orkers working in C/C++ are somewhere in the middle .. except that they're all writing JNI code, which again relies on that JVM load time :-)