bahadur has asked for the wisdom of the Perl Monks concerning the following question:

I am designing a CLI. Once a user starts the script, he has to enter a command; based on that command, the script performs the required action.
Here is my code:
$myhash{"set time"}=\&settime; $myhash{"show time"}=\&showtime; $myhash{default}=\&error; print "Welcome to nMetrics application monitor\n"; print "Please type in a command\n"; while(<STDIN>) { chomp; $res="false"; foreach $k (keys %myhash) { if ( $k eq "$_" ) { &{$myhash{$k}}(); $res="true"; } } if ($res eq "false") { &{$myhash{default}}(); } }
So basically I put all the commands in a hash, read from STDIN, and then invoke the required command.
But later on these commands will increase to more than 20, and having such a big hash would not be a good idea.
So is there a more efficient way of handling this problem? Previously I was doing it with if/else statements; thanks to one of the monks for this wonderful idea. Is there an even smarter approach than this?

Replies are listed 'Best First'.
Re: CLI using hashes
by revdiablo (Prior) on May 30, 2005 at 01:08 UTC

    For one thing, a hash with 20 keys is not very big at all. Another thing is that you shouldn't be iterating over the hash keys and matching with eq. The hash can do near-O(1) lookups by itself -- that's one of the major reasons to use a hash. You are completely eliminating that advantage, and you're making the code much more complicated than it needs to be. The entire inside of your while loop can be replaced with this:

    chomp;
    if (exists $myhash{$_}) {
        $myhash{$_}->();
    }
    else {
        $myhash{default}->();
    }

    Update: in case it wasn't clear from my reply, I suggest you stick with the hash-based approach. It should work fine for this kind of thing.
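
    In case a full picture helps, here is a minimal self-contained sketch of the program with that change applied. The subroutine bodies below are placeholder stubs (the originals weren't posted); only the dispatch structure matters:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Dispatch table: command string => code reference
    my %myhash = (
        'set time'  => \&settime,
        'show time' => \&showtime,
        'default'   => \&error,
    );

    print "Welcome to nMetrics application monitor\n";
    print "Please type in a command\n";

    while (<STDIN>) {
        chomp;
        # Direct O(1) lookup instead of scanning every key with eq
        if (exists $myhash{$_}) {
            $myhash{$_}->();
        }
        else {
            $myhash{default}->();
        }
    }

    # Placeholder stubs; substitute the real implementations
    sub settime  { print "settime called\n" }
    sub showtime { print "the time is ", scalar localtime, "\n" }
    sub error    { print "unrecognized command, try again\n" }

    Adding a new command is then just one more line in the hash, no matter how many commands you end up with.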

      Thanks for the wonderful tip, this was so nice. By the way, if 20 commands are not such a big thing, then how many commands would make it big? I am asking just in case I have to make another hash.
        You would need to be well into the tens of thousands of entries before the hash started having problems. And at that point, the problems would be memory-related, not CPU-related, so splitting the hash wouldn't help. In short, you should never need to worry about whether a hash is getting too big, up until the point at which you should have your data in a database anyway.
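
        If you want to see for yourself, a quick sketch along these lines (assuming the Devel::Size module from CPAN is installed) shows the memory footprint of even a huge dispatch table:

        use strict;
        use warnings;
        use Devel::Size qw(total_size);

        # Build a 50,000-entry dispatch table of dummy code refs
        my %dispatch;
        $dispatch{"command$_"} = sub { } for 1 .. 50_000;

        # Prints the total bytes the hash occupies -- a few
        # megabytes, which is why memory, not lookup speed,
        # is the eventual limit
        print total_size(\%dispatch), " bytes\n";

        Lookups stay effectively constant-time no matter how many entries the hash holds.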