Neither do() nor require alone is fast. Reducing Perl's start-up time might have some effect on speed, but it won't change the RAM consumption.
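A minimal sketch of the difference (task.pl is a hypothetical script that ends in a true value): do() re-reads and recompiles the file on every call, while require() records it in %INC and loads it only once, so neither helps if each script must actually run fresh every time.

    #!/usr/bin/perl
    use strict;
    use warnings;

    do './task.pl';        # read from disk and compiled now
    do './task.pl';        # read and compiled all over again
    require './task.pl';   # compiled once, noted in %INC
    require './task.pl';   # no-op: %INC says it is already loaded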
My bet for the biggest time consumer is the filesystem, not the compilation. Precompiling really paid off in the 90s, but does it now?
So using a RAM disk could have the best cost-benefit ratio.
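That theory is easy to test. A rough Benchmark sketch, assuming a Linux tmpfs mounted at /dev/shm and the same hypothetical task.pl:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Benchmark qw(timethese);
    use File::Copy qw(copy);

    my $disk = './task.pl';
    my $ram  = '/dev/shm/task.pl';   # tmpfs: the file lives in RAM
    copy( $disk, $ram ) or die "copy failed: $!";

    # compare do() from the normal filesystem vs. the RAM disk
    timethese( 1_000, {
        disk => sub { do $disk },
        ram  => sub { do $ram  },
    } );

If the two rates come out nearly equal, the OS page cache was already doing the RAM disk's job and the bottleneck is elsewhere.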
But we are all speculating here; as others have repeated over and over, the OP should be more explicit.
I have my doubts that refactoring 500 scripts is an option, and even then...
Precompiling them all into the master process would make them vulnerable to global side effects in the BEGIN phase.
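A contrived sketch of that risk, with two hypothetical scripts whose BEGIN blocks fight over the same global:

    # a.pl contains:  BEGIN { unshift @INC, '/opt/app_a/lib' }
    # b.pl contains:  BEGIN { @INC = ('/opt/app_b/lib') }   # wipes a.pl's entry

    do './a.pl';
    do './b.pl';   # from here on, a.pl's private modules can no longer be found

In 500 separate processes each BEGIN effect dies with its process; compiled into one master process, they accumulate.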
Cheers Rolf
(addicted to the Perl Programming Language :)
Wikisyntax for the Monastery
In reply to Re^4: Use of do() to run lots of perl scripts
by LanX
in thread Use of do() to run lots of perl scripts
by chrestomanci