ryantate has asked for the wisdom of the Perl Monks concerning the following question:
The error looks like:
Error in tempfile() using /tmp/1yUjjuyGJn/sappyXXXXXXXX: Have exceeded the maximum number of attempts (10) to open temp file/dir at /home/sappy/dev/pfork.pl line 19
Apparently, File::Temp will only try 10 times to come up with a unique random filename, then gives up. When one forks off processes, File::Temp somehow comes up with the same "random" filenames for each process. Is this expected behavior, given how fork works, or is this a bug in File::Temp?
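The likely mechanism (my sketch, not part of the original post): fork() copies the parent's memory, including Perl's rand() state, and Perl does not re-seed the RNG in the child. So every child forked from the same parent draws the same "random" sequence, which is what File::Temp's name generation relies on. A minimal demonstration, assuming nothing beyond core Perl and File::Temp:

```perl
use strict;
use warnings;
use POSIX qw(_exit);
use File::Temp qw(tempdir);

# Each child inherits the parent's rand() state at fork time,
# so all of them draw the same first "random" number.
srand(42);           # seed in the parent
rand();              # advance the state once before forking

my $dir = tempdir(CLEANUP => 1);
for my $i (1 .. 3) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                      # child
        open my $fh, '>', "$dir/$i" or _exit(1);
        printf $fh "%.10f", rand();       # first rand() after the fork
        close $fh;
        _exit(0);                         # skip END blocks in the child
    }
}
wait() for 1 .. 3;

my %seen;
for my $i (1 .. 3) {
    open my $fh, '<', "$dir/$i" or die "open: $!";
    $seen{ scalar <$fh> }++;
    close $fh;
}
print scalar(keys %seen), " distinct value(s) across 3 children\n";
# prints "1 distinct value(s) across 3 children"
```

Note the children exit via POSIX::_exit so they do not run the parent's END blocks (File::Temp registers cleanup handlers there).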
FWIW, I am on File::Temp 0.14, and I noticed the following in the ChangeLog entry for 0.15:
    * Temp.pm: Increase maximum number of tries before aborting.

MAX_TRIES has been taken from 10 to 1000! Is this the only way to get around this problem, just have the module keep trying, or is there a more elegant solution?
I have a workaround, which is to include $$, the PID, in the template I pass to File::Temp for generating filenames. But I think this should be at least noted in the File::Temp documentation (which even has a section on forking).
Example code giving errors (derived, by the way, from code found in Parallel::ForkManager docs and/or Perlmonks):
```perl
use strict;
use warnings;
use Parallel::ForkManager;
use HTTP::GHTTP;
use Time::HiRes qw(time);
use File::Temp qw(tempdir tempfile);
use File::Path;

my $start = time;
my $pm = new Parallel::ForkManager(15);
my $temp_dir = tempdir();

for my $link (map { chomp; $_ } <DATA>) {
    $pm->start and next;
    my $getter = HTTP::GHTTP->new;
    $getter->set_uri("http://$link/");
    $getter->process_request;
    my $page = $getter->get_body;
    my $fh = File::Temp->new(
        TEMPLATE => "sappyXXXXXXXX",
        DIR      => $temp_dir,
        UNLINK   => 0,
    ) or die "Could not make tempfile: $!";
    print $fh $page or die "Could not print to tempfile: $!";
    close $fh or die "Could not close tempfile: $!";
    print "$link downloaded.\n";
    $pm->finish;
}
$pm->wait_all_children;

#rmtree([$temp_dir]);
print "Removed temp dir '$temp_dir'\n";
print 'Done in: ', time - $start, ' seconds.';

__DATA__
www.google.com
www.yahoo.com
www.amazon.com
www.ebay.com
www.perlmonks.com
news.yahoo.com
news.google.com
www.msn.com
www.slashdot.org
www.indymedia.org
www.sfgate.com
www.nytimes.com
www.cnn.com
```
Output, including errors:
```
blah@blah [534] perl -wT /home/sappy/dev/pfork.pl
www.amazon.com downloaded.
www.yahoo.com downloaded.
news.google.com downloaded.
www.google.com downloaded.
news.yahoo.com downloaded.
www.slashdot.org downloaded.
www.indymedia.org downloaded.
www.ebay.com downloaded.
www.cnn.com downloaded.
www.sfgate.com downloaded.
Error in tempfile() using /tmp/1yUjjuyGJn/sappyXXXXXXXX: Have exceeded the maximum number of attempts (10) to open temp file/dir at /home/sappy/dev/pfork.pl line 19
Error in tempfile() using /tmp/1yUjjuyGJn/sappyXXXXXXXX: Have exceeded the maximum number of attempts (10) to open temp file/dir at /home/sappy/dev/pfork.pl line 19
```
To fix the code, I change one line as follows:

```perl
my $fh = File::Temp->new(
    TEMPLATE => "sappy" . $$ . "XXXXXXXX",
    DIR      => $temp_dir,
    UNLINK   => 0,
) or die "Could not make tempfile: $!";
```
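Another workaround (my sketch, not from the thread) is to re-seed the RNG in each child right after the fork, so every process draws its own sequence; with Parallel::ForkManager this would go immediately after `$pm->start`. A self-contained illustration:

```perl
use strict;
use warnings;
use POSIX qw(_exit);
use File::Temp qw(tempdir);

# Calling srand() in the child replaces the inherited RNG state with
# a fresh per-process seed, so the children's sequences diverge.
srand(42);
rand();              # advance the parent's state before forking

my $dir = tempdir(CLEANUP => 1);
for my $i (1 .. 3) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                      # child
        srand();                          # fresh seed in each child
        open my $fh, '>', "$dir/$i" or _exit(1);
        printf $fh "%.10f", rand();
        close $fh;
        _exit(0);                         # skip END blocks in the child
    }
}
wait() for 1 .. 3;

my %seen;
for my $i (1 .. 3) {
    open my $fh, '<', "$dir/$i" or die "open: $!";
    $seen{ scalar <$fh> }++;
    close $fh;
}
print scalar(keys %seen), " distinct value(s) across 3 children\n";
```

This should print 3 distinct values, since srand() with no argument mixes in per-process entropy (including the PID).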
Replies are listed 'Best First'.
Re: File::Temp randomness when forking
- by tirwhan (Abbot) on Nov 29, 2005 at 11:16 UTC
- by ryantate (Friar) on Nov 29, 2005 at 17:47 UTC
- by tirwhan (Abbot) on Nov 29, 2005 at 17:57 UTC
- by ryantate (Friar) on Nov 29, 2005 at 18:07 UTC

Re: File::Temp randomness when forking
- by Moron (Curate) on Nov 29, 2005 at 10:11 UTC
- by tirwhan (Abbot) on Nov 29, 2005 at 10:23 UTC
- by ryantate (Friar) on Nov 29, 2005 at 18:03 UTC
- by thor (Priest) on Nov 29, 2005 at 12:56 UTC
- by ryantate (Friar) on Nov 29, 2005 at 17:59 UTC