in reply to Re^3: perl sort issue while going through article at perl.com
in thread perl sort issue while going through article at perl.com

Why doesn't the program below produce the count for each key?
#!/usr/bin/perl -w
use strict;
use diagnostics;

my $hash = {
    key1 => 'value1',
    key2 => 'value2',
    key3 => 'value3',
    key1 => 'value3',
    key1 => 'value33',
    key1 => 'value3',
    key2 => 'value3',
    key2 => 'value3',
    key4 => 'value3',
    key4 => '23232',
};

#foreach (keys %{$hash}) {
#    print "$_ => ${$hash}{$_}\n";
#}

my %hist;
foreach (keys %{$hash}) {
    $hist{$_}++ for $_;
}

foreach (keys %hist) {
    print "$_ , $hist{$_}\n";
}
print "size of hash: " , keys(%{$hash}) . "\n";

Output of ./perl.hash.basic.2:
key2 , 1
key1 , 1
key4 , 1
key3 , 1
size of hash: 4
I am trying to find out how many times key1, key2, and so on appear in the hash. The tutorial quoted below does not explain how to count how many of each key exists in the hash.
Now there are two passes over the list, and the situation isn't going to get any prettier from here. What you want is basically a histogram of the data, and you can get that with a hash:

    my %histogram;
    $histogram{$_}++ for @list;

This hash associates each individual item with its count, and it only traverses the list once.

In a recent case, I was looking at a list of tags associated with various photographs. To lay out the data for display, it was useful to know how many different tags there are in the list. I could get that simply from the number of keys in the histogram:

    $unique = keys %histogram;

I could also delete the duplicates and end up with a list of the unique tags:

    @unique = keys %histogram;
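The histogram idiom from the quoted tutorial can be seen end to end in a short script; the @tags list here is invented for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A list with duplicates -- the sample tags are made up for this sketch.
my @tags = qw(sunset beach sunset mountain beach sunset);

# One pass builds the histogram: key = item, value = number of occurrences.
my %histogram;
$histogram{$_}++ for @tags;

# Number of distinct items, and the distinct items themselves.
my $unique_count = keys %histogram;
my @unique_tags  = sort keys %histogram;

print "$_ => $histogram{$_}\n" for @unique_tags;
print "unique: $unique_count\n";
# Prints:
#   beach => 2
#   mountain => 1
#   sunset => 3
#   unique: 3
```

The counting works because the list still contains the duplicates; once the data is collapsed into hash keys, the duplicates are gone.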

Re^5: perl sort issue while going through article at perl.com
by shmem (Chancellor) on Oct 23, 2007 at 05:51 UTC
    Number of keys is correct. Hash keys are unique. While you can say
    my %hash = (
        key => 'foo',
        key => 'bar',
        key => 'quux',
        key => 'fizz',
        key => 'buzz',
    );

    in the sense that it's legal and perl won't complain, you end up with %hash containing just one entry. Each assignment overrides the previous one, so in the end you have

    %hash = (
        key => 'buzz',
    );
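shmem's point is easy to check directly: repeated keys in a hash-assignment list are legal, but each later pair silently overwrites the earlier one.

```perl
use strict;
use warnings;

# Five pairs with the same key: only the last assignment survives.
my %hash = (
    key => 'foo',
    key => 'bar',
    key => 'quux',
    key => 'fizz',
    key => 'buzz',
);

print scalar(keys %hash), "\n";   # prints 1
print $hash{key}, "\n";           # prints buzz
```

By the time the hash exists, there is no record that 'foo' through 'fizz' were ever assigned.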

    --shmem

Re^5: perl sort issue while going through article at perl.com
by convenientstore (Pilgrim) on Oct 23, 2007 at 22:14 UTC
    Can someone please help me with this? I do not understand why it's not giving me the count for each key; it's just giving me 1 for every one. What am I missing?
    #!/usr/bin/perl -w
    use strict;
    use diagnostics;
    use Data::Dumper;

    my %hash = (
        key1 => 'value1',
        key2 => 'value2',
        key3 => 'value3',
        key3 => 'value3',
        key3 => 'value3',
        key1 => 'value3',
        key1 => 'value33',
        key1 => 'value3',
        key1 => 'value3',
        key2 => 'value3',
        key2 => 'value3',
        key2 => 'value3',
        key4 => 'value3',
        key4 => '23232',
        key5 => '23232',
        key6 => '23232',
        key6 => '23232',
    );

    my %hist;
    foreach (sort keys %hash) {
        $hist{$_}++;
    }
    print Dumper(%hist);

    __END__
    foreach (sort keys %hist) {
        print "$_ => $hist{$_}\n";
    }

    Output:
    $VAR1 = 'key5';
    $VAR2 = 1;
    $VAR3 = 'key2';
    $VAR4 = 1;
    $VAR5 = 'key6';
    $VAR6 = 1;
    $VAR7 = 'key4';
    $VAR8 = 1;
    $VAR9 = 'key1';
    $VAR10 = 1;
    $VAR11 = 'key3';
    $VAR12 = 1;
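As a side note on the output above: passing %hist to Dumper flattens the hash into a list of alternating keys and values, which is why every key and every count shows up as its own $VARn. Passing a reference instead keeps the structure together as one dump (the small %hist here is a made-up stand-in for the poster's data):

```perl
use strict;
use warnings;
use Data::Dumper;

# Illustrative data only.
my %hist = ( key1 => 1, key2 => 1 );

# Dumper(%hist) would see a flat 4-element list and emit $VAR1..$VAR4.
# Dumper(\%hist) sees a single hash reference and emits one structure.
print Dumper(\%hist);
```

Either way, the counts of 1 in the original output are correct for what the code computes; the Dumper call only changes how they are displayed.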
      A hash is like an array, but the index can be text instead of a number. So when you write
      my %hash = (
          key1 => 'value1',
          key2 => 'value2',
          key3 => 'value3',
          key3 => 'value3',
          key3 => 'value3',
          key1 => 'value3',

      the second time you do key1 => ... it will overwrite any earlier value of key1.

      There will only ever be one value associated with each key, so when you

      foreach (sort keys %hash) {
          $hist{$_}++;
      }

      each key will already be unique, so the count will be one for each of them.

      You need to study hashes more. Perhaps perldsc will be of help.
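To tie the thread together: the counts have to be taken from the data while it is still a list, before it is collapsed into a hash. One way, sketched here with a trimmed-down version of the poster's pairs, is to keep the raw key/value pairs in an array (arrays preserve duplicates) and histogram the keys:

```perl
use strict;
use warnings;

# The raw pairs, kept in an array so the duplicate keys survive.
# (List trimmed from the poster's data for brevity.)
my @pairs = (
    key1 => 'value1', key2 => 'value2', key3 => 'value3',
    key1 => 'value3', key1 => 'value33',
    key2 => 'value3', key4 => '23232',
);

# Every even-indexed element is a key; count those.
my %hist;
for (my $i = 0; $i < @pairs; $i += 2) {
    $hist{ $pairs[$i] }++;
}

print "$_ , $hist{$_}\n" for sort keys %hist;
# Prints:
#   key1 , 3
#   key2 , 2
#   key3 , 1
#   key4 , 1
```

Once the same pairs are assigned into a hash, the duplicates are gone and every key necessarily counts as 1, which is exactly what the poster observed.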

        Thank you so much. I was reading all the Perl books and going crazy, but thank you, and I will also read perldsc again to make sure I get a better understanding. This is huge.