Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
Hi Monks,

Could someone please advise why the following code reports the same files multiple times?
#!/usr/bin/perl -w
use strict;
use File::Find;
no warnings 'File::Find';
use Digest::MD5;

local $| = 1;

my $path = $ARGV[0];
#my $testpath = 'C:/Temp/';

print "Searching for duplicate files in $path\n";
#find(\&check_file, $testpath);
find(\&check_file, $path);

local $" = "";

my %files;
my %md5;
my $wasted = 0;
my $size   = 0;

foreach my $size (sort { $b <=> $a } keys %files) {
    next unless @{ $files{$size} } > 1;
    foreach my $file (@{ $files{$size} }) {
        open(FILE, $file) or next;
        binmode(FILE);
        push @{ $md5{ Digest::MD5->new->addfile(*FILE)->hexdigest } },
            $file . "\n";
    }
    foreach my $hash (keys %md5) {
        next unless @{ $md5{$hash} } > 1;
        print "\n@{$md5{$hash}}";
        print "File size $size\n \n";
        $wasted += $size * (@{ $md5{$hash} } - 1);
    }
}

1 while $wasted =~ s/^([-+]?\d+)(\d{3})/$1,$2/;
print "\n$wasted bytes in duplicated files\n";

sub check_file {
    (my $fn = $File::Find::name) =~ tr#/#\\#;
    -f && push @{ $files{ (stat(_))[7] } }, $fn;
}
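As far as I can tell, the repetition comes from `my %md5;` being declared outside the size loop: every pass over a new size group re-walks all hashes accumulated so far, so earlier duplicate groups get printed again. Below is a minimal, self-contained sketch of the size-then-MD5 grouping with `%md5` scoped inside the loop; the three throw-away files in a temp directory (`a.txt`, `b.txt`, `c.txt`) are hypothetical test data, not part of the original script:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use Digest::MD5;

# Hypothetical test data: two identical files and one different one.
my $dir = tempdir(CLEANUP => 1);
for my $pair (['a.txt', 'same'], ['b.txt', 'same'], ['c.txt', 'other']) {
    open my $out, '>', "$dir/$pair->[0]" or die $!;
    print $out $pair->[1];
    close $out;
}

# Group files by size, as the original script does via check_file().
my %files;
push @{ $files{ -s $_ } }, $_ for glob "$dir/*";

my @groups;
foreach my $size (sort { $b <=> $a } keys %files) {
    next unless @{ $files{$size} } > 1;

    my %md5;    # scoped per size group, so earlier groups are not re-reported
    foreach my $file (@{ $files{$size} }) {
        open my $fh, '<', $file or next;
        binmode $fh;
        push @{ $md5{ Digest::MD5->new->addfile($fh)->hexdigest } }, $file;
    }
    foreach my $hash (keys %md5) {
        push @groups, $md5{$hash} if @{ $md5{$hash} } > 1;
    }
}

printf "%d duplicate group(s) found\n", scalar @groups;
```

With the hash declared inside the loop, each size group is hashed and reported exactly once; the two identical files above yield a single duplicate group.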
Cheers
Replies are listed 'Best First'.
Re: Duplicate File Finder script reporting multiples
by Athanasius (Archbishop) on Mar 30, 2015 at 08:00 UTC
    by GotToBTru (Prior) on Mar 30, 2015 at 14:32 UTC
    by choroba (Cardinal) on Mar 30, 2015 at 14:39 UTC
    by Anonymous Monk on Mar 30, 2015 at 09:42 UTC
Re: Duplicate File Finder script reporting multiples
by Anonymous Monk on Mar 30, 2015 at 07:13 UTC
    by Anonymous Monk on Mar 30, 2015 at 09:41 UTC