As given, the benchmark still hadn't completed after more than half an hour (I'm not sure why it's so excruciatingly slow), so I stopped it and reduced the string length:
my $str = 'x' x 1_000_000; ... cmpthese 100, ...
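For reference, here is a minimal, self-contained sketch of the kind of comparison being run. The exact per-iteration work (how the string is touched in the thread/child) is my guess at the benchmark's shape, not necessarily the original code:

use strict;
use warnings;
use threads;
use Benchmark qw(cmpthese);

my $str = 'x' x 1_000_000;   # reduced length

cmpthese 100, {
    Threads => sub {
        # the thread gets its own copy of the interpreter data at creation
        threads->create(sub { substr($str, 0, 1) = 'y' })->join;
    },
    Forks => sub {
        defined(my $pid = fork) or die "fork failed: $!";
        if ($pid == 0) {
            substr($str, 0, 1) = 'y';   # writing forces the copy-on-write copy
            exit 0;
        }
        waitpid $pid, 0;
    },
};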
Results (similar on both of the machines/Perl versions mentioned above):
           Rate Threads   Forks
Threads  2.02/s      --     -4%
Forks    2.11/s      5%      --
(which is not all that surprising, I guess, as modifying the string in the child forces the copy-on-write copy to be made, which destroys fork's memory-sharing advantage)
Also, the time taken to fiddle with the string here far outweighs the time needed to create a new thread or process...
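To separate the string-fiddling cost from the bare creation overhead, one could also time empty workers. This is my own illustrative sketch, not something from the original node:

use strict;
use warnings;
use threads;
use Benchmark qw(cmpthese);

# bare create/teardown cost only: the workers do no string work at all
cmpthese 1000, {
    Threads => sub { threads->create(sub {})->join },
    Forks   => sub {
        defined(my $pid = fork) or die "fork failed: $!";
        exit 0 if $pid == 0;    # child does nothing
        waitpid $pid, 0;
    },
};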
Update: with a significantly shorter string, threads actually come out faster, with the biggest advantage at string lengths of around 100 to 10000 (for this particular code sample):
          Rate   Forks Threads
Forks   67.1/s      --    -64%
Threads  189/s    181%      --

(v5.10.1, Ubuntu 8.04 (kernel 2.6.24), string length 100)

          Rate   Forks Threads
Forks   84.7/s      --    -44%
Threads  152/s     79%      --

(v5.12.0, SUSE 11.1 (kernel 2.6.27), string length 10000)
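Since the update only changes the string length, a quick way to look for that sweet spot is to wrap the same comparison in a loop over lengths (again just a sketch; the lengths chosen here are arbitrary):

use strict;
use warnings;
use threads;
use Benchmark qw(cmpthese);

for my $len (10, 100, 1_000, 10_000, 100_000, 1_000_000) {
    my $str = 'x' x $len;
    print "--- string length $len ---\n";
    cmpthese 100, {
        Threads => sub {
            threads->create(sub { substr($str, 0, 1) = 'y' })->join;
        },
        Forks => sub {
            defined(my $pid = fork) or die "fork failed: $!";
            if ($pid == 0) { substr($str, 0, 1) = 'y'; exit 0 }
            waitpid $pid, 0;
        },
    };
}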