in reply to Re: Pruning directory searches with File::Find
in thread Pruning directory searches with File::Find

It would seem to me that that particular module's paradigm for extracting files from the file system could pose some serious memory issues if the rules you specify end up matching nearly all of the files in the tree, which may well be the case if you're pruning with just a lenient "not" rule. This is akin to slurping an entire file instead of reading it line by line. Often you can get away with it, as the file will be of a reasonable length, but sometimes you'll get burned trying to blast an enormous file into memory. Slurp a short config file and nobody will notice; slurp a SQL transaction log that hasn't been rotated recently and you could bring the system to its knees. Caveat Slurpor.
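To make the contrast concrete, here is a minimal sketch using only core Perl's File::Find (not the module under discussion) and a throwaway temp directory; the file names and counts are purely illustrative:

```perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);
use File::Spec;

# Build a small throwaway tree so the example is self-contained.
my $dir = tempdir(CLEANUP => 1);
for my $name (qw(a.txt b.txt)) {
    open my $fh, '>', File::Spec->catfile($dir, $name) or die $!;
    print $fh "x\n";
    close $fh;
}

# "Slurp" style: collect every match into an array before doing
# anything with it. Memory grows with the number of matches.
my @all;
find(sub { push @all, $File::Find::name if -f }, $dir);

# Streaming style: handle each file inside the callback and keep
# nothing around, so memory stays flat however big the tree is.
my $count = 0;
find(sub { $count++ if -f }, $dir);

print "collected ", scalar @all, ", streamed $count\n";
```

The second call does the same traversal but never accumulates a list, which is the directory-tree analogue of reading a file line by line.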


Replies are listed 'Best First'.
Re: Re: Re: Pruning directory searches with File::Find
by broquaint (Abbot) on Jul 25, 2003 at 23:52 UTC
    It would seem to me that that particular module's paradigm for extracting files from the file system could pose some serious memory issues if the rules that you specify result in returning nearly all of the files in the tree you specified
    Er yes, but I wouldn't say this is a fault of the module so much as a consequence of its grand ability to let you get on with it. Much as SQL will happily let you perform a SELECT *, that doesn't necessarily condemn SQL (the various issues of SQL are for another node, I'm sure). With great power comes great responsibility, and all that :) Anyhow, you could always just use the iterative approach, like so:
    use File::Find::Rule;

    my $dir_rule = rule(
        directory =>
        not_name  => qr/^_/,
        start     => @ARGV,   ## or whatever
    );

    while (my $dir = $dir_rule->match) {
        ...
    }
    Lovely.
    HTH

    _________
    broquaint