in reply to Re^2: Remove Duplicate Files
in thread Remove Duplicate Files

Then again, hardlinks are less of a concern for cleanup, because they don't waste disk space.

Re^4: Remove Duplicate Files
by Anonymous Monk on Oct 29, 2004 at 09:32 UTC
    Well, any program that compares files and removes duplicates without checking whether they are links will remove the extra links. By looking at the inode and device numbers to detect links, you can gain one of two things: the option to *keep* links, which can be pretty useful for binaries that behave differently depending on how they are invoked, or a speedier comparison, since for paths that share an inode you don't have to calculate the MD5 hash and then compare the entire file.
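    A minimal sketch of that idea (assuming a POSIX-like filesystem and the core Digest::MD5 module; the sub name find_duplicates and its return shape are my own invention, not from the thread): group files by (device, inode) first, so hardlinks are recognized without any hashing, and only hash same-sized files with distinct inodes.

    ```perl
    use strict;
    use warnings;
    use File::Find;
    use Digest::MD5;

    # Returns two array refs: groups of duplicate paths (distinct inodes,
    # identical content) and pairs of paths that are hardlinks to each other.
    sub find_duplicates {
        my @dirs = @_;
        my (%seen_inode, %by_size, @hardlinks);

        find(sub {
            return unless -f $_;
            my ($dev, $ino, $size) = (stat _)[0, 1, 7];
            my $key = "$dev:$ino";
            if (exists $seen_inode{$key}) {
                # Same inode on the same device: a hardlink, not a copy,
                # so no content comparison is needed at all.
                push @hardlinks, [ $seen_inode{$key}, $File::Find::name ];
                return;
            }
            $seen_inode{$key} = $File::Find::name;
            push @{ $by_size{$size} }, $File::Find::name;
        }, @dirs);

        my @dups;
        for my $size (keys %by_size) {
            next if @{ $by_size{$size} } < 2;   # a unique size can't be a duplicate
            my %by_md5;
            for my $path (@{ $by_size{$size} }) {
                open my $fh, '<', $path or next;
                binmode $fh;
                push @{ $by_md5{ Digest::MD5->new->addfile($fh)->hexdigest } }, $path;
            }
            push @dups, $by_md5{$_}
                for grep { @{ $by_md5{$_} } > 1 } keys %by_md5;
        }
        return (\@dups, \@hardlinks);
    }
    ```

    A deduper built this way can then either report the hardlink pairs separately or skip them entirely, depending on which of the two behaviours you want.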
      Of course. I was pointing out that if the purpose of the tool is to reduce disk usage, keeping hardlinks wouldn't hurt its functionality. You are right that hardlinks can often be a good thing, but without further information about the environment this is supposed to run in, we can't tell whether leaving them alone is the right choice. (Probably it's just irrelevant, and fine to leave undefined.)