in reply to Re^2: Perl Multi Processing.
in thread Perl Multi Processing.

Here is MCE::Flow, where workers read from the input file directly. MCE->gather is called to append each reachable IP to an array.

#!/usr/bin/env perl

use strict;
use warnings;

use MCE::Flow;
use IO::Socket;

my $nthreads = 20;
my $in_file2 = 'rang.txt';

sub ip_checker {
    my $ip = $_;
    chomp($ip);
    my $host = IO::Socket::INET->new(
        PeerAddr => $ip,
        PeerPort => 80,
        Proto    => 'tcp',
        Timeout  => 1,
    );
    if ( defined $host ) {
        MCE->gather($ip);
    }
}

my @result = MCE::Flow->run_file(
    { chunk_size => 1, max_workers => $nthreads },
    \&ip_checker, $in_file2,
);

MCE::Flow->finish();

open( my $output, ">", "port.txt" ) or die $!;
print {$output} "$_\n" for @result;
close($output);

For this one, workers write serially to the output file via MCE->print, so no extra locking is needed.

use strict;
use warnings;

use MCE::Flow;
use IO::Socket;

my $nthreads = 20;
my $in_file2 = 'rang.txt';

open( my $output, ">", "port.txt" ) or die $!;

sub ip_checker {
    my $ip = $_;
    chomp($ip);
    my $host = IO::Socket::INET->new(
        PeerAddr => $ip,
        PeerPort => 80,
        Proto    => 'tcp',
        Timeout  => 1,
    );
    if ( defined $host ) {
        MCE->print($output, "$ip\n");
    }
}

MCE::Flow->run_file(
    { chunk_size => 1, max_workers => $nthreads },
    \&ip_checker, $in_file2,
);

MCE::Flow->finish();
close($output);

Perl provides several ways to run code in parallel. There is Parallel::ForkManager. For a threads-like API, there are MCE::Child and MCE::Hobo. MCE itself provides chunking capabilities, along with sugar syntax to gather results into an array or hash. For output, MCE->print, MCE->printf, and MCE->say write serially to a file handle (STDOUT if the handle is omitted).
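As a small sketch of the threads-like API mentioned above (assuming MCE::Hobo is installed), each worker runs a code ref in a child process and the parent joins it to collect the return value, much like threads->create/join:

```perl
use strict;
use warnings;

use MCE::Hobo;

# Spawn one worker per task; each worker returns its result.
my @hobos = map {
    MCE::Hobo->create( sub { my ($n) = @_; return $n * $n }, $_ );
} 1 .. 4;

# join() waits for a worker and returns its value, in creation order here.
my @squares = map { $_->join() } @hobos;

print "@squares\n";   # 1 4 9 16
```

MCE::Child has essentially the same interface; both spawn real child processes, so they work on a non-threaded Perl as well.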

Basically, there is no problem if the Perl binary lacks threads support: MCE spawns its workers as child processes.