in reply to Limitations to chmod and performance

Working one file at a time reduces the chance that something changes in the filesystem between the moment you see a file and the moment you change its permissions. It also means that there is no upper limit on how many files you can handle.

Working with a list of files and letting chmod do the loop is probably faster than looping over chmod statements. But if your list of files is very large, you could run out of memory. (Very unlikely, but there is a theoretical limit there.)
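
For instance, the two styles look something like this minimal sketch (the directory and mode here are only placeholders):

    use strict;
    use warnings;

    # Style 1: one file at a time, as each name comes in -- nothing accumulates
    opendir my $dh, '/some/dir' or die "opendir: $!";
    while (defined(my $name = readdir $dh)) {
        next if $name eq '.' or $name eq '..';
        chmod 0644, "/some/dir/$name" or warn "chmod $name: $!";
    }
    closedir $dh;

    # Style 2: build the whole list first, then let chmod loop over it for you
    my @files = glob('/some/dir/*');
    my $changed = chmod 0644, @files;   # returns the number of files changed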

Personally I would suggest working with whatever style fits the surrounding code best. With File::Find that would be one file at a time: just do the chmod in the callback.
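
Something along these lines (again just a sketch; the path and modes are made up):

    use strict;
    use warnings;
    use File::Find;

    # File::Find calls this once per entry; $_ is the bare file name and we
    # have already been chdir'ed into its directory, so a plain chmod works.
    find(sub {
        my $mode = -d $_ ? 0755 : 0644;   # directories keep their execute bit
        chmod $mode, $_
            or warn "chmod $File::Find::name: $!";
    }, '/some/dir');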

(Actually I would be inclined to go to the shell and just do a chmod -R ..., but it sounds like you have been down that route...)

Re: Re (tilly) 1: Limitations to chmod and performance
by arhuman (Vicar) on Nov 12, 2001 at 18:48 UTC
    Working with a list of files and letting chmod do the loop is probably faster than looping over chmod statements. But if your list of files is very large, you could run out of memory. (Very unlikely, but there is a theoretical limit there.)

    I'll furthermore add that with many implementations of 'chmod' the number of arguments is limited,
    and you'll probably hit an 'arg list too long' error way before you exhaust your memory...
    (For example, on my Linux box I'm limited to around 12900 files as arguments...)

    To my mind you'd definitely be better off using the one-at-a-time strategy...

    Note: This problem also occurs with a lot of other file-related tools: rm, ls...
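
    If you do end up handing a big list to an external chmod (the arg-list limit comes from exec'ing an external command, not from Perl's built-in chmod), one workaround is to feed it in fixed-size batches. A sketch, where the batch size, directory and mode are arbitrary:

        use strict;
        use warnings;

        my @files = glob('/some/dir/*');

        # splice off up to 1000 names at a time so no single exec of the
        # external chmod comes anywhere near the argument-list limit
        while (my @batch = splice @files, 0, 1000) {
            system('chmod', '0644', @batch) == 0
                or warn "chmod batch exited with status $?";
        }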

    "Only Bad Coders Code Badly In Perl" (OBC2BIP)