In case you want to compare files with the same name (that exist in both lists/directories), you could compute the intersection of both lists, and then iterate over the resulting list of file names, simply prepending the appropriate paths... Something like:
use File::Compare qw(compare_text);   # compare_text() comes from File::Compare (core module)

my %seen;
$seen{$_}++ for @return, @remoteFilelist;
my @files_in_both_lists = grep { $seen{$_} > 1 } keys %seen;

for my $fname (@files_in_both_lists) {
    if (compare_text("$path1/$fname", "$path2/$fname") == 0) {
        # files have identical content
        #...
    }
}
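One caveat: the %seen counting idiom assumes each list contains a given name at most once — a name repeated within one list would push its count past 1 on its own. To see the idiom in isolation (sample lists made up here for demonstration):

```perl
use strict;
use warnings;

my @list1 = qw(a.txt b.txt c.txt);   # made-up sample data
my @list2 = qw(b.txt c.txt d.txt);

# count occurrences across both lists; names seen more than once
# are in the intersection
my %seen;
$seen{$_}++ for @list1, @list2;
my @in_both = sort grep { $seen{$_} > 1 } keys %seen;

print "@in_both\n";   # b.txt c.txt
```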
Otherwise (if you want to compare every file in list 1 with every file in list 2), I would compute checksums (e.g. MD5) for all files, and use the checksums as keys in a hash, with a list of filenames as the associated value. Those entries with more than one file in that list will indicate identical files...
Update: sample code for the latter approach:
#!/usr/bin/perl

use strict;
use warnings;
use Digest::MD5;

my @allfiles = ...;   # your file lists merged (including paths)

my %by_md5;
for my $file (@allfiles) {
    open my $fh, "<", $file or die "Couldn't open '$file': $!";
    binmode $fh;
    my $md5 = Digest::MD5->new();
    $md5->addfile($fh);
    my $digest = $md5->hexdigest();   # or ->digest() -- hexdigest is just more "dumping-friendly"...
    push @{ $by_md5{$digest} }, $file;
}

for my $digest (grep @{ $by_md5{$_} } > 1, keys %by_md5) {
    print "duplicates: @{ $by_md5{$digest} }\n";
}
(In case you're paranoid and worry about the — very unlikely — event of a digest collision, you can always follow up with a byte-for-byte comparison of the files sharing a digest, i.e. those reported as duplicates by the above snippet...)
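For that extra check, File::Compare's compare() (also a core module) already does a byte-for-byte comparison, returning 0 when two files have identical content. A minimal sketch — the confirm_duplicates() helper name is made up here:

```perl
use strict;
use warnings;
use File::Compare;

# Hypothetical helper: given a group of files that share one MD5
# digest, keep only those whose content really matches the first
# file byte for byte.
sub confirm_duplicates {
    my ($first, @candidates) = @_;
    my @confirmed = ($first);
    for my $other (@candidates) {
        push @confirmed, $other
            if compare($first, $other) == 0;   # 0 => contents identical
    }
    return @confirmed;
}
```

You'd call this once per digest group from the snippet above; since MD5 has already done the bucketing, each group is small, so the extra comparisons cost next to nothing.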
In reply to Re: Assistance with file compare
by almut
in thread Assistance with file compare
by Karger78