in reply to in Perl find call looking to exclude folder and ignore duplicate finds.

Here's a solution without using File::Find that's pretty simple and I think covers all your requirements (that is, if you are on Linux). TIMTOWTDI!

#!/usr/bin/perl
use strict; # https://perlmonks.org/?node_id=11151755
use warnings;

my @paths = @ARGV;
my %seen;

while ( my $path = shift @paths ) {
  $path eq '/data/logs/master' and next;
  unshift @paths, grep -d, <$path/*>;
  -f and $seen{ join ' ', (stat)[0,1] }++ == 0 and save_file($_)
    for <$path/*auth.log>;
}
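A note on the two tricks above: directories found by the <$path/*> glob are pushed back onto @paths, so the while loop walks the tree without File::Find, and the %seen key built from (stat)[0,1] (device and inode numbers) ensures a file reached more than once, e.g. through a hard link or overlapping directories, is saved only once. save_file() itself is not defined in the snippet; a minimal, hypothetical stub (the name and behavior are assumptions about the OP's own code) could look like:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for the OP's save_file(); replace the body with
# whatever "save" should actually do (copy, parse, archive, ...).
sub save_file {
    my ($file) = @_;
    print "keeping: $file\n";
}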

Re^2: in Perl find call looking to exclude folder and ignore duplicate finds.
by parv (Parson) on Apr 24, 2023 at 06:13 UTC
    Here's a solution without using File::Find that's pretty simple and I think covers all your requirements (that is, if you are on Linux) ...

    What in the code you posted would make it not work on UNIX or other Unix-like OSen?

      I think, in this case, "Linux" is meant as pars pro toto for "not Windows".