http://qs1969.pair.com?node_id=506619

jesuashok has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

Is there any Perl module available to return the list of hard-linked files? I ask because lstat only deals with soft (symbolic) links.

Given an absolute path, it should return the hard-linked files found under that path.

"Keep pouring your ideas"

Re: Finding Hard links
by saintmike (Vicar) on Nov 08, 2005 at 06:35 UTC
Re: Finding Hard links
by Ultra (Hermit) on Nov 08, 2005 at 06:40 UTC

    Basically "hard link files" are not different files, there is just one file, and the other file names point to the same file. So, your problem is not finding "the files" but finding the names.

    This may be a little tricky, and here's why:

  • First you need to find out whether the file has any extra hard links (use Perl's builtin stat; its fourth field is the link count).
  • If the link count is greater than one, you need to test the other files and see which ones point to the same inode (remember, there is only one file and multiple names); for this, again, you may use stat.
  • The tricky part is that there may be hard links that your user can't access (directory/file permissions etc.)

    I don't know if there is a CPAN module for this, but using File::Find and stat you could create one yourself ;-)
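
    For instance, here is a minimal, untested sketch of that idea (the starting-directory argument and the output format are my own assumptions, not something from this thread):

    use strict;
    use warnings;
    use File::Find;

    # Group every plain file under the given directory by "device:inode";
    # names that land in the same group are hard links to the same file.
    my $top = shift @ARGV || '.';    # starting directory (my assumption)
    my %names;

    find(
        sub {
            my @st = lstat $_;                          # lstat: don't follow symlinks
            return unless @st && -f _ && $st[3] > 1;    # plain file with link count > 1
            push @{ $names{"$st[0]:$st[1]"} }, $File::Find::name;
        },
        $top,
    );

    for my $key (sort keys %names) {
        next unless @{ $names{$key} } > 1;    # report only inodes seen under 2+ names
        print "$key\n";
        print "    $_\n" for @{ $names{$key} };
    }

    Note that, as per the last point above, names that sit outside the directory you scan, or that your user can't read, simply won't show up in any group.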

    Dodge This!
Re: Finding Hard links
by blazar (Canon) on Nov 08, 2005 at 11:26 UTC
    I don't know if there's a module available to do that. But I suppose that as a general rule it won't be easy, since AFAIK the filesystem and the related system calls return information in the direction filename => inode, but not the other way around. So you can only gather hard links by comparing files. Of course this is made easier by the fact that
    • you only have to compare files on the same filesystem,
    • you only have to compare files that have the same link count, larger than 1, as reported by stat; hopefully those are not that many.

    Update: ok, here I try my hand at this with a minimal example/proof of concept:

    #!/usr/bin/perl -ln
    use strict;
    use warnings;

    # Read the output of find(1), one file name per line
    BEGIN { @ARGV = 'find . -xdev|' }

    our %files;
    my ($ino, $nlink) = (stat)[1, 3];    # inode and link count of $_
    next unless $nlink and $nlink > 1;

    # Once as many names as the link count have been collected, report them
    if ($nlink == push @{ $files{$ino} }, $_) {
        print "$_ => $ino" for @{ $files{$ino} };
        delete $files{$ino};
    }
    __END__
      The original question was about a Perl module... but if you start using the `find` command anyway, then:
      my $original_file = '/path/to/file';
      my $dir = '/where/to/find/hardlinks/';
      chomp for @hard_links = `find \Q$dir\E -inum \$(find \Q$original_file\E -printf "%i")`;

        Oh, well, of course! I used an external find command because, as I stressed, I just wanted to keep it minimal, and using File::Find, as I would most probably do in a more realistic case [1], was not essential to the technique I wanted to explore...

        You'll also notice that I used -ln on a slightly longer script than I would usually do in "production", whatever that is.

        But in the above you're fundamentally using Perl as a shell script, which is really "kinda too much". Incidentally, you're using the "inner" find just to print the inode number; for that, the external stat command alone would suffice:

        $ stat -c %i work/
        18972756
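
        (And since the original question was about doing it in Perl: the same number is available from the builtin stat, with no external command at all; a trivial sketch, using the same directory as above:)

        my $path  = 'work/';                # same directory as in the stat example above
        my $inode = (stat $path)[1];        # element 1 of stat's return list is the inode
        print "$path => $inode\n";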

        OTOH I didn't know about -inum, so I thank you; I know it's out there in the docs, but I tend to learn by example...

        Also, as a final observation, I think the OP was indeed asking how to find the hard links sharing the same inode as a given file, which is what you actually do. But if you want to search a whole directory hierarchy like I did, a "pure-find" approach along the lines of your example would be impractical, since I think it would necessarily take two nested finds.



        [1] And if I called an external find command, I'd probably use an explicit open.
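
        Something like this, for instance (just a minimal sketch of what that explicit open might look like; the -xdev option is simply carried over from my earlier example):

        use strict;
        use warnings;

        # Read find's output through an explicit pipe open
        # instead of the @ARGV trick used above.
        open my $find, '-|', 'find', '.', '-xdev'
            or die "Can't run find: $!";
        while (my $path = <$find>) {
            chomp $path;
            # ... stat $path and collect names by inode, as above ...
        }
        close $find or warn "find exited abnormally: $?";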