Yes, I overlooked the fork part until choroba posted his other benchmark.
do() or even require alone are not fast. Reducing Perl's start-up time might help, but it won't change the RAM consumption. My bet is that the biggest time consumer is the filesystem, not the compilation. Precompiling really paid off in the 90s, but now? So using a RAM disk could have the best cost-benefit ratio. But we are all speculating here; as others have repeated over and over, the OP should be more explicit.
I have my doubts that refactoring 500 scripts is an option, and even then: precompiling them all into the master process would make them vulnerable to global effects in the BEGIN phase.
Cheers, Rolf

In reply to Re^4: Use of do() to run lots of perl scripts by LanX