Not necessarily, but what the other posters have been telling
you is that your approach to keeping the system load
down (keeping several identical but differently named copies of your script around, then
invoking one of them at random) isn't
going to solve the problem.
You're going to have the same load problems with 10 users
running differently named scripts as with 10 users running
the same script. The load on the machine will be
essentially the same, so your method won't work.
Also, you didn't answer jbert's question: are you doing
this because of a shared resource? Having 10 differently named scripts won't help with that, either.
Is this a CGI script? A mod_perl script? A system tool?
What you need to do, I think, is figure out exactly
what the problem is. Have you, for example, actually
experienced a high system load, or are you just anticipating
that such a load might exist in the future?
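If you're not sure, measure before you optimize. A minimal sketch for checking the load while your script runs (this assumes Linux, where /proc/loadavg is available):

    use strict;
    use warnings;

    # Read the 1-minute load average so you have real numbers
    # instead of a guess (Linux-specific: /proc/loadavg).
    open my $fh, '<', '/proc/loadavg'
        or die "can't read /proc/loadavg: $!";
    my ($one_min) = split ' ', scalar <$fh>;
    print "1-minute load average: $one_min\n";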
If
you're really having problems with system load, you may
need to investigate your algorithm: is there a less-taxing
way you can do the same thing? Could you make the machine
work less by modifying your code?
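For instance (a hypothetical sketch, not a claim about what your script does): if the script recomputes the same expensive result on every call, caching it once can cut the per-invocation work dramatically.

    use strict;
    use warnings;

    my %cache;
    sub lookup {
        my ($key) = @_;
        # Compute once, reuse on every later call with the same key.
        $cache{$key} = expensive_lookup($key)
            unless exists $cache{$key};
        return $cache{$key};
    }

    # Stand-in for whatever costly work your script really does
    # (a database query, a big computation, ...).
    sub expensive_lookup {
        my ($key) = @_;
        my $sum = 0;
        $sum += $_ for 1 .. 100_000;
        return $sum;
    }

(The Memoize module on CPAN automates exactly this caching pattern.)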
Do some benchmarking
and profiling of your code (see the perlfaq entry "How do I profile my Perl programs?"). If you have an idea where your code may be
spending most of its time, use Benchmark and
investigate alternative algorithms. If you really don't
know why it's slow, use Devel::DProf to find out.
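As a sketch of how Benchmark helps (the two string-building approaches here are placeholders; swap in the alternatives you actually care about):

    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my @words = ('perl') x 1_000;

    # Run each alternative for at least 3 CPU seconds and
    # print a table comparing their rates.
    cmpthese(-3, {
        concat => sub { my $s = ''; $s .= $_ for @words },
        join   => sub { my $s = join '', @words },
    });

Profiling with Devel::DProf needs no changes to your code: run the script under the profiler, then summarize the results (yourscript.pl is a placeholder for your actual script):

    perl -d:DProf yourscript.pl
    dprofpp            # reads tmon.out, shows where the time went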