Re: Perl and multi processors
by grantm (Parson) on Oct 14, 2002 at 09:02 UTC
You haven't told us much about your script. If it is a web script, then multiple hits on the script will kick off multiple copies of it, which can run simultaneously on different processors. Under Apache 1.x, each hit is handled by a different httpd process (and potentially a different processor). Even if your script runs under mod_perl, it will still not be multi-threaded, so two invocations will safely run in parallel and will not share variables.
If it's not a web script, then it will most likely run single-threaded on a single CPU. You can compile Perl to support threading, but you'd also need to write your Perl script to take advantage of threads.
Well, it's a linear script, but I'm rewriting it to fork children to handle some tasks. Will this help on a dual-CPU system? Otherwise, my question might have been: how much more powerful is a dual-CPU server under Linux (because we're having an argument about it in my company)?
In my opinion, having two CPUs means that processes like apache/mysql et al. will balance the load between the CPUs, so my script will automatically get more CPU time, even if it's not specifically written to take advantage of two processors. Am I correct?
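The forking rewrite I have in mind looks roughly like this (an untested sketch; the task list and the work inside each child are just placeholders for the real jobs):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical work units -- replace with your real tasks.
my @tasks = (1 .. 4);

my @pids;
for my $task (@tasks) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child process: the kernel is free to schedule each
        # child on whichever CPU is least busy.
        my $result = $task * $task;    # stand-in for real work
        exit 0;
    }
    push @pids, $pid;
}

# Parent: reap all the children so we don't leave zombies.
waitpid($_, 0) for @pids;
print "all children done\n";
```

Each child is an independent process, so on a dual-CPU box two of them really can run at the same time.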
Yes, two processors can run two programs simultaneously even if each process is single threaded.
Of course, if your script is 'IO bound' (most of its time is spent waiting for data from disk or network), then extra CPUs aren't going to do much for it.
Without knowing what your script is doing, I can't say what benefit you might get from forking children. Of course, if you fork too many you'll use up the available physical memory, and once your system starts to swap, performance will get worse.
Discussions/arguments are probably not going to help you much. You need to do some profiling to work out what your script is spending its time doing (the same goes for your system in general) before you can make any sensible decisions about steps to improve performance.
Re: Perl and multi processors
by submersible_toaster (Chaplain) on Oct 14, 2002 at 08:54 UTC
In anticipation of <angry> replies </angry> : ) Do you REALLY need to worry? If your program is being spawned by something like Apache (CGI etc.), then the creation of separate processes for many instances of your program is being taken care of at another level.
Is your program 'linear' (wrong word) in its execution, or are there things it should be doing whilst waiting on something else? A recent discussion, Perl Multithreading, is relevant to scheduling.
Re: Perl and multi processors
by PetaMem (Priest) on Oct 14, 2002 at 09:29 UTC
Hi,
If you have a long-running type of script, the script itself will not run (significantly) faster on an SMP system than on a single-CPU machine.
The granularity of Linux's SMP support is per-process, so
you gain a speedup if you run two of your scripts in parallel (on a dual machine) etc. etc.
However, if you use the magic of threads (practical since 5.8.0), you may gain a speedup for a single process on an SMP system.
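A minimal sketch of the threads idea (this assumes your perl was built with ithreads; the busy-work sums are just placeholders for real computation):

```perl
use strict;
use warnings;
use threads;

# Two worker threads; on an SMP box the OS can run them on
# separate CPUs at the same time.
my $t1 = threads->create(sub {
    my $s = 0;
    $s += $_ for 1 .. 1_000_000;   # stand-in for real CPU work
    return $s;
});
my $t2 = threads->create(sub {
    my $s = 0;
    $s += $_ for 1 .. 1_000_000;
    return $s;
});

# join() waits for each thread and collects its return value.
my $sum1 = $t1->join;
my $sum2 = $t2->join;
print "total: ", $sum1 + $sum2, "\n";
```

Unlike fork, the threads share one process, and with threads::shared you can even share variables between them.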
Bye
PetaMem
But everyone seems to agree that the load will be distributed evenly between the two processors, which probably means more resources for my script, since all the load generated by Apache and MySQL can be distributed between the processors...
Re: Perl and multi processors
by erasei (Pilgrim) on Oct 14, 2002 at 13:15 UTC
I think most of this has already been said, but let me give you my spin and some real-world examples.
A CGI Script: Apache calling your script, which accesses a MySQL database, will take care of most of the SMP concerns for you. Since most CGIs are pretty short and simple scripts, you wouldn't see much, if any, speed increase with more than one CPU (from the script's standpoint).
Stand-Alone Scripts: Say you have a script that parses text files; just for fun, let's say they total a couple hundred megs. Then let's say you want to parse those files and insert info about them into a database. Here is where you could use a threaded program: have one thread parsing the text files and another thread writing to the database. Running on a multi-CPU system would help you some in this case, but to parse a text file and write to a database, your bottleneck isn't going to be the CPU in most cases, but your disk, or your network if writing to a remote database.
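That parse-in-one-thread, write-in-another shape can be sketched with Thread::Queue (assumes a threaded perl; the "records" and the database write are stand-ins for real parsing and a real $dbh->do):

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $queue = Thread::Queue->new;

# Writer thread: drains the queue; in a real script this is
# where the database inserts would happen.
my $writer = threads->create(sub {
    my $written = 0;
    while (defined(my $record = $queue->dequeue)) {
        # $dbh->do('INSERT ...') would go here
        $written++;
    }
    return $written;
});

# Main thread plays the parser: push "records" onto the queue.
for my $line (1 .. 100) {
    $queue->enqueue("record $line");
}
$queue->enqueue(undef);    # undef tells the writer we're done

my $count = $writer->join;
print "wrote $count records\n";
```

The queue decouples the two sides, so the parser never waits on the database unless the queue backs up.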
CPU Intensive Apps: Say the several-hundred-meg files we had above are filled with user connection information, like timestamps, and your boss wants to know the maximum number of users on at any given point with one-minute granularity. All you have is connection info, so you are going to have to walk through those several hundred megs and compare them all. Massively CPU intensive. In this case, you would benefit GREATLY from threads and a multi-CPU host.
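One way to spread that kind of scan across CPUs is to fork one child per processor and collect each child's partial result through a pipe. A sketch (the list of numbers stands in for your real connection records, and summing stands in for the real comparison work):

```perl
use strict;
use warnings;

my $n_cpus  = 2;                     # one child per CPU
my @numbers = (1 .. 1000);           # stand-in for the log records
my $chunk   = int(@numbers / $n_cpus);

my @readers;
for my $i (0 .. $n_cpus - 1) {
    pipe(my $r, my $w) or die "pipe: $!";
    my $pid = fork();
    die "fork: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: scan only our slice of the data.
        close $r;
        my $start = $i * $chunk;
        my $end   = ($i == $n_cpus - 1) ? $#numbers : $start + $chunk - 1;
        my $total = 0;
        $total += $numbers[$_] for $start .. $end;
        print {$w} "$total\n";       # report the partial result
        close $w;
        exit 0;
    }
    close $w;
    push @readers, $r;
}

# Parent: combine the partial results from each child.
my $grand = 0;
for my $r (@readers) {
    chomp(my $partial = <$r>);
    $grand += $partial;
}
wait() for 1 .. $n_cpus;
print "grand total: $grand\n";
```

Since the children are separate processes, this gets you real parallelism even on a perl built without threads.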
So in summary: multiple CPUs are obviously only going to help you when your bottleneck is the CPU. In most cases on a modern server, it is your disk or your network.
Re: Perl and multi processors
by Sihal (Pilgrim) on Oct 14, 2002 at 08:51 UTC
thanks for ++ing me, but I really do care more for an answer ;-)