in reply to Re: Noob could use advice on simplification & optimization
in thread Noob could use advice on simplification & optimization

Thanks for the suggestions. I read the files into memory to speed up the searches; would this actually slow things down if I run the script on more memory-restricted machines?

Re^3: Noob could use advice on simplification & optimization
by temporal (Pilgrim) on May 04, 2012 at 14:25 UTC

    When a recursive directory search encounters a large binary file tucked away in some subdirectory, you're going to load quite a bit into memory. It gets really bad when you hit a file larger than your system's memory (or, more accurately, the memory allocated to your Perl process).

    The easiest way to avoid this is to read the files line by line. But you could also write a smart read method that buffers your reads in a limited-length array, giving you something of the best of both worlds.
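    For instance, here's a minimal line-by-line sketch (it assumes File::Find for the recursive walk; the pattern and starting directory are placeholders, not taken from your script):

        use strict;
        use warnings;
        use File::Find;

        my $pattern = qr/foo/;    # placeholder search pattern

        find(sub {
            return unless -f $_ && -T _;    # plain text files only; skips binaries
            open my $fh, '<', $_
                or do { warn "can't open $File::Find::name: $!"; return };
            while (my $line = <$fh>) {      # one line in memory at a time
                print "$File::Find::name:$.: $line" if $line =~ $pattern;
            }
            close $fh;
        }, '.');

    Because the while loop pulls one record at a time, peak memory stays flat no matter how big the file is.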

    To be fair, the other advantage to slurping the file into a single scalar is that you avoid splitting it on newlines into an array.

    You might want to add some filename filtering so the user can exclude/include certain file types.
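    Something along these lines would work, kept as a small helper you could call from the wanted sub above (the extension list is just an example):

        # hypothetical include list; invert the test for an exclude list
        my %include = map { $_ => 1 } qw(txt log pl pm);

        sub wanted_ext {
            my ($name) = @_;
            my ($ext)  = $name =~ /\.([^.]+)\z/;
            return defined $ext && $include{ lc $ext };
        }

        # e.g. at the top of the File::Find callback:
        #   return unless wanted_ext($_);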

    Strange things are afoot at the Circle-K.