in reply to Iterating hashes safely and efficiently
Here is a completely safe way to write a loop that involves deletions and/or insertions, without skipping members or losing track of where you are. This code iterates over a copy of the original set of hash keys:

    for my $key (my @temp = keys %my_hash) {
        my $value = $my_hash{$key};   # and no need to change the code below
        .
        .    # some processing on the element
        .
    }
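A runnable sketch of that snapshot approach (the hash name and values here are illustrative): because the key list is copied before the loop starts, deletions and insertions made inside the loop body cannot derail the iteration, and keys inserted mid-loop are simply not visited.

```perl
use strict;
use warnings;

my %my_hash = (a => 1, b => 2, c => 3);
my @visited;

# The key list is snapshotted into @temp before the loop body runs.
for my $key (my @temp = keys %my_hash) {
    push @visited, $key;
    delete $my_hash{$key} if $key eq 'b';   # deletion mid-loop: fine
    $my_hash{new} = 99    if $key eq 'a';   # insertion mid-loop: not visited
}

# All three original keys are visited; 'new' is skipped because it was
# added after the snapshot was taken.
print scalar(@visited), "\n";               # 3
print join(',', sort keys %my_hash), "\n";  # a,c,new
```

One caveat: if another part of the loop deleted a key before you reach it, `$my_hash{$key}` will be `undef` at that point, so check with `exists` if that matters.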
When I see someone doing this, I begin to wonder whether they've really thought about what they're trying to do. Deleting elements from, or inserting elements into, the hash (or array) you're iterating over in foreach OUGHT to cause you problems! What do you do when you get to an element that you've since deleted? Why wouldn't you want to perform the operation on the element you've just inserted?
I understand why you might use each and I understand the problems you can have with that, but I feel that if you can't perform your actions by using foreach and keys, then you should be refactoring your loop.
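For completeness, there is one modification that perldoc -f each documents as safe while iterating with each: deleting the key most recently returned. A minimal sketch (example data is mine):

```perl
use strict;
use warnings;

my %h = (a => 1, b => 2, c => 3);

while (my ($k, $v) = each %h) {
    # Safe per perldoc -f each: $k is the key each() just returned.
    delete $h{$k} if $v > 1;
}

print join(',', sort keys %h), "\n";   # a
```

Anything beyond that single case (inserting keys, deleting other keys) leaves the each() iterator in an undefined state, which is exactly why the foreach-over-keys style above is easier to reason about.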
Perhaps what you really want to be doing is having 2 loops, but you can probably just get by with one. For example:
    my %new;
    foreach my $key (keys %old) {
        if (....) {
            $new{$key} = $old{$key};
            # do further processing.
        }
        else {
            print STDERR "dropped key: $key\n";
        }
    }
I can't think of a good reason that a good programmer would mess around with the list they're iterating over unless they felt that they just had to avoid extra variables, and that isn't really a good reason.
All the best,
jarich