Those are some really surprising results.
With fork, as you say, COW means the big string gets copied on demand in pages--probably 4k or 64k chunks--as the string is processed, so there are lots of page faults.
With threads, the copy is done once up front.
Funny how they come out almost the same for the 1MB string.
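If anyone wants to see the COW cost directly, here's a rough, Linux-only sketch (not part of the original benchmarks) that counts the child's minor page faults while it touches one byte per 4k page of the inherited string; each first write to a shared page shows up as a minor fault while the kernel copies that page into the child:

use strict;
use warnings;

my $str = 'x' x 32_000_000;

## Minor (soft) page-fault count for this process, read from /proc/self/stat.
## Linux-specific; COW copies show up here as minor faults.
sub minflt {
    open my $fh, '<', '/proc/self/stat' or return -1;
    my @fields = split ' ', scalar <$fh>;
    return $fields[9];    # 10th field is minflt
}

my $pid = fork // die "fork failed: $!";
if( $pid ) {
    wait;
}
else {
    my $before = minflt();
    ## Touch one byte per 4k page; the first write to each COW page
    ## forces the kernel to copy that page into the child.
    substr( $str, $_ * 4096, 1 ) = 'y'
        for 0 .. int( length( $str ) / 4096 ) - 1;
    printf "child: %d extra minor faults while dirtying ~%d pages\n",
        minflt() - $before, int( length( $str ) / 4096 );
    exit 0;
}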
As for why it's so slow, 32e6 calls to substr (which isn't the quickest built-in in Perl) explain a lot of it. The version below, doing 1000 bytes per substr call instead of 1, would probably run hugely faster. (Which is a sneaky way of asking for one more 'final' test, but just for curiosity's sake :):
use strict;
use warnings;
use threads;

my $str = 'x' x 32_000_000;

sub fork_wait {
    my $pid = fork;
    if( $pid ) {
        wait;
    }
    else {
        ## 32e3 substr calls of 1000 bytes each, instead of 32e6 calls of 1 byte
        substr( $str, $_ * 1000, 1000 ) &= ~( ' ' x 1000 )
            for 0 .. length( $str ) / 1000 - 1;
        exit 0;
    }
}

sub create_thread_join {
    threads->create( sub {
        substr( $str, $_ * 1000, 1000 ) &= ~( ' ' x 1000 )
            for 0 .. length( $str ) / 1000 - 1;
    } )->join;
}

use Benchmark qw( cmpthese );

cmpthese -3, {
    Forks   => \&fork_wait,
    Threads => \&create_thread_join,
};
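(Aside, in case the &= ~( ' ' x 1000 ) line looks cryptic: Perl's bitwise string operators work byte by byte, and ANDing an ASCII letter with ~' ' (0xDF) clears the lowercase bit, so the loop is just uppercasing the string a kilobyte at a time. A tiny illustration, not part of the benchmark above:

use strict;
use warnings;

my $s    = 'hello';
my $mask = ~( ' ' x length $s );   # "\xDF" repeated: each byte has bit 5 cleared
$s &= $mask;                       # string-wise AND: 'hello' -> 'HELLO'
print "$s\n";

On the all-'x' benchmark string it flips every byte from 'x' to 'X', so the forked child and the thread do the same amount of real work.)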
In reply to Re^9: while reading a file, transfer the same data to two different processes by BrowserUk
in thread while reading a file, transfer the same data to two different processes by avanta