in reply to Re: Need to search for a string in a file
in thread Need to search for a string in a file

Will my @list = grep /re/, <FH> necessarily pull the entire file into memory? (I don't know and would like to know for sure!)

It sure seems to!

open my $fh, "<", "/dev/urandom" or die "$!";
my @a = grep /huh, whaddayamean/, <$fh>;
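
A quick way to rule out grep itself (just a sketch of the same test with the grep dropped) would be the bare list-context read, which on an endless stream never returns:

open my $fh, "<", "/dev/urandom" or die "$!";
my @lines = <$fh>;   # list context: tries to slurp every line before anything else runs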

Re^3: Need to search for a string in a file
by aaron_baugher (Curate) on Oct 20, 2011 at 22:29 UTC

    Someone please correct me if I'm wrong, but I don't think your example will pull the entire file into memory. I think it will read lines from $fh one at a time, passing them to grep, which will return matching lines to @a.

    However, you're reading from /dev/urandom, which is an endless stream of random bytes, not a file which has an end. So eventually @a is going to get very large, yes. Also, depending on what your end-of-line delimiter is set to, <$fh> may return some very long lines for grep to deal with.
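
    If those long lines ever become the issue, one workaround (just a sketch, with an arbitrary 4096-byte record size) is to set $/ to a reference to an integer, so <$fh> hands back fixed-length records instead of waiting for a newline:

        open my $fh, "<", "/dev/urandom" or die "$!";
        local $/ = \4096;              # readline now returns fixed 4096-byte records
        while (my $chunk = <$fh>) {    # scalar context: one record in memory at a time
            print "found it\n" if $chunk =~ /huh, whaddayamean/;
        }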

      I thought (hoped) that grep would have the behaviour you described, but even when I use a very unlikely regular expression, perl's memory usage increases at a constant rate.

      The rate doesn't appear to differ depending on what regular expression I use, which seems to imply it's not the matches (of which I doubt there are any) filling memory, but just the input from /dev/urandom (which, as you mention, is endless).

      The EOL delimiter is probably \n, which should turn up in /dev/urandom about once every 256 bytes, so the lines should average around 256 bytes: long-ish, but not that long.

        Interesting. Maybe someone who knows how to use the debugger better than I do could try it and see where the memory is going.

        Perhaps the <$fh> syntax only works as I'm thinking when used in scalar context? I do seem to remember reading somewhere (camel book?) that there's some extra magic when it's used in a while loop that it doesn't have otherwise.
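
        If that's the case, something like this (an untested sketch, with "somefile" standing in for a real filename and /re/ from the original question) should keep memory flat, since each scalar-context <$fh> pulls in just one line:

            open my $fh, "<", "somefile" or die "$!";
            my @matches;
            while (my $line = <$fh>) {             # scalar context: one line per read
                push @matches, $line if $line =~ /re/;
            }
            close $fh;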