Since timing is an issue, the general slowness of File::Find relative to the compiled `find` command could pose a real problem, especially on very large directory trees. So the challenge is: how do you get the `find` command to do the right thing, and quit as soon as there are no more links to be found?
The cool thing here is how the IPC is done: the child sends a HUP signal to the parent each time a link is found (via the `-exec` option on unix find), and the parent shuts everything down as soon as the expected link count is reached.
#!/usr/bin/perl
use strict;
use warnings;

my ( $path, $file ) = @ARGV;
die "Usage: $0 search/path data.file\n" unless ( -d $path and -f $file );

# stat _ reuses the stat buffer from the -f test above;
# field 1 is the inode number, field 3 is the hard-link count.
my ( $inode, $nlinks ) = ( stat _ )[1,3];
die "$file has no hard links\n" if $nlinks == 1;

my ( $chld, $nfound, @found );

# Each HUP from the child bumps the count; once all expected links
# have been reported, kill the find process so we don't keep scanning.
$SIG{HUP} = sub { $nfound++; `kill $chld` if $nfound == $nlinks };

# find signals us (-exec kill -HUP $$) every time it prints a match.
$chld = open( FIND, "-|", "find $path -inum $inode -print0 -exec kill -HUP $$ \\;" )
    or die "find: $!\n";

$/ = chr(0);    # records are NUL-terminated, thanks to -print0
while ( <FIND> ) {
    chomp;
    push @found, $_;
}

printf( "found %d of %d links for %s in %s:\n",
        scalar @found, $nlinks, $inode, $path );
print join( "\n", @found ), "\n";
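To see the mechanics the script relies on, here is a quick shell sketch (file names and the temp directory are illustrative, not from the post): two hard-linked names share one inode, so `stat` reports a link count of 2 and the same `find -inum` search the Perl code builds turns up both names. Note `stat -c` is the GNU coreutils form; BSD/macOS `stat` uses different flags.

```shell
#!/bin/sh
# Illustrative sketch: build a tiny tree with two hard links and show
# what the Perl script's stat/find calls would see.
tmp=$(mktemp -d)
echo data > "$tmp/a"
ln "$tmp/a" "$tmp/b"            # second name for the same inode

inode=$(stat -c %i "$tmp/a")    # inode number  -- (stat)[1] in the Perl code
nlinks=$(stat -c %h "$tmp/a")   # link count    -- (stat)[3] in the Perl code
echo "inode=$inode nlinks=$nlinks"

# The same search the Perl script hands off to find:
find "$tmp" -inum "$inode" -print

rm -r "$tmp"
```

Running this prints both `$tmp/a` and `$tmp/b`, which is exactly why the parent can stop as soon as it has counted `nlinks` HUP signals.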
Re: Another way to avoid File::Find
by merlyn (Sage) on Nov 18, 2006 at 10:59 UTC
by graff (Chancellor) on Nov 18, 2006 at 20:01 UTC
by MidLifeXis (Monsignor) on Nov 20, 2006 at 18:41 UTC