in reply to Re: Using a lookup table(?)
in thread Using a lookup table(?)

You ignored the fact that the search was case-insensitive.

my $host = lc(shift(@_));

Your grep is wrong.

my @lines = grep { lc($_) eq $host } @ry;

Why read the whole file every time?

use List::Util qw( first );

my $line = first { lc($_) eq $host } @ry;

Why use Tie::File if you're not going to take advantage of it?

use List::Util qw( first );

my $line = first { lc($_) eq $host } <$fh>;

Why load up the entire file into memory if you look at it one line at a time?

sub get_comments {
    my $host = lc(shift(@_));

    my $file_name = "myfile.txt";
    open(my $fh, '<', $file_name)
        or die("Unable to open server info file \"$file_name\": $!\n");

    while (<$fh>) {
        # \Q...\E protects the dots in host names; /i keeps the
        # match case-insensitive even though $host was lowercased.
        return $1 if /^\Q$host\E:\s*(.*)/i;
    }

    return;
}

Of course, that's still very expensive if get_comments is called repeatedly. If the list is really that long, some sort of index (or database) would be very useful. A sorted file with fixed length records would also be fine (allowing for a binary search).
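The fixed-length-record idea could be sketched like this. Everything here is an assumption for illustration, not part of the original post: each record is $REC_LEN bytes (trailing newline included), formatted as "host: comment" and padded with spaces, and the file is sorted case-insensitively by host.

```perl
use strict;
use warnings;

# Assumed record length in bytes, including the trailing newline.
my $REC_LEN = 32;

sub get_comments_binary {
    my ($file_name, $host) = @_;
    $host = lc $host;

    open(my $fh, '<', $file_name)
        or die("Unable to open server info file \"$file_name\": $!\n");

    my $size = -s $fh;
    my ($lo, $hi) = (0, int($size / $REC_LEN) - 1);

    while ($lo <= $hi) {
        my $mid = int(($lo + $hi) / 2);

        # Fixed-length records let us seek straight to record $mid.
        seek($fh, $mid * $REC_LEN, 0) or die("seek failed: $!\n");
        read($fh, my $rec, $REC_LEN) == $REC_LEN
            or die("short read at record $mid\n");

        my ($key, $comment) = $rec =~ /^([^:]*):\s*(.*?)\s*$/;
        $key = lc $key;

        if    ($key lt $host) { $lo = $mid + 1 }
        elsif ($key gt $host) { $hi = $mid - 1 }
        else                  { return $comment }
    }
    return;   # not found
}
```

That's O(log n) seeks per lookup instead of a scan of the whole file, at the cost of keeping the file sorted and padded.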

Re^3: Using a lookup table(?)
by blue_cowdawg (Monsignor) on Feb 07, 2007 at 19:22 UTC
        Why load up the entire file into memory if you look at it one line at a time?

    Last I checked... Tie::File does not load the entire file into memory...


    Peter L. Berghold -- Unix Professional
    Peter -at- Berghold -dot- Net; AOL IM redcowdawg Yahoo IM: blue_cowdawg

      That comment referred to the intermediate step toward the optimization that used <$fh> in list context.

      Tie::File does not load the entire file into memory (although it does keep in memory the offset of every line it has encountered in the file, regardless of any limit you have placed on its cache size).
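For illustration, a minimal sketch of the distinction: Tie::File's memory option caps its record cache in bytes, but the per-line offset table it builds as lines are touched is not subject to that cap. The file name and cache size here are made up for the example.

```perl
use strict;
use warnings;
use Tie::File;

# Cap the record cache at ~2 MB; the offset index Tie::File builds
# for each line it has seen grows independently of this limit.
tie my @lines, 'Tie::File', 'myfile.txt', memory => 2_000_000
    or die("Cannot tie myfile.txt: $!\n");

# Reading a late line forces Tie::File to index every line before it,
# even though only a cache-limited number of records stay in memory.
my $last = $lines[-1];

untie @lines;
```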