http://qs1969.pair.com?node_id=53031

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I'm posting anonymously to defeat being voted down...

This is probably really easy. I just don't know how to make this any better: I am trying to make a hash of row references:

foreach my $row (@$array_ref) { $hash{$row->[0]} = $row; }
Where $array_ref is a reference to a two-dimensional array (an array of array references). Does anyone know a one-liner that would do this? Thanks in advance .... :o)
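For concreteness, here is a self-contained run of that loop (the row data below is made up purely for illustration; any rows whose first element is a usable key will do):

```perl
use strict;
use warnings;

# Hypothetical sample data: each row is an array ref whose
# first element serves as the hash key.
my $array_ref = [
    [ 'alice', 42, 'admin' ],
    [ 'bob',   17, 'user'  ],
];

# Index each row by its first element.
my %hash;
foreach my $row (@$array_ref) {
    $hash{ $row->[0] } = $row;
}

print $hash{alice}[1], "\n";   # prints 42
```

Each hash value is the original row reference, not a copy, so modifying $hash{alice}[1] also changes $array_ref->[0][1].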

Replies are listed 'Best First'.
Re: optimizing code - need help
by davorg (Chancellor) on Jan 19, 2001 at 21:43 UTC
      While that looks nice, be very careful not to use a map that returns multiple entries per input on a long list until 5.6.1 is out. Unfortunately it currently scales quadratically:
      %hash = map { $_, $_ } 1..20_000;
      If that doesn't make you a believer, s/20/100/ and try again.

      This applies to all versions of Perl with map until the upcoming 5.6.1.
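On affected Perls, the same hash can be built without map ever returning one giant list; a sketch of the for-based equivalent of the example above:

```perl
use strict;
use warnings;

# Equivalent to %hash = map { $_, $_ } 1..20_000, but built one
# pair at a time, so no 40_000-element list is pushed onto the
# argument stack by a map block.
my %hash;
$hash{$_} = $_ for 1 .. 20_000;

print scalar(keys %hash), "\n";   # prints 20000
```

The assignment happens inside the loop body, so the work per element stays constant regardless of list size.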

        Thanks tilly ! and everyone else. You guys rock! =D
Re: optimizing code - need help
by danger (Priest) on Jan 19, 2001 at 21:45 UTC

    What's wrong with how you are doing it? Making a one liner of it will only make it shorter -- not necessarily better:

    my %hash;
    $hash{$_->[0]} = $_ for @$array_ref;

    # or

    my %hash = map { $_->[0] => $_ } @$array_ref;
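To confirm the two forms really are interchangeable, a small self-contained check (the two-row data set is made up):

```perl
use strict;
use warnings;

my $array_ref = [ [ a => 1 ], [ b => 2 ] ];   # hypothetical rows

# Form 1: postfix for with individual assignments.
my %by_for;
$by_for{ $_->[0] } = $_ for @$array_ref;

# Form 2: map building a key/value list in one shot.
my %by_map = map { $_->[0] => $_ } @$array_ref;

# Both hashes hold the very same row references:
print $by_for{a} == $by_map{a} ? "same\n" : "different\n";   # prints "same"
```

Numeric == on the values compares reference addresses, so "same" means both hashes point at the identical underlying rows.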
      As has been mentioned in previous questions of this sort, for is faster and requires less memory than map, because map creates a temporary in-memory list to return, which is then transformed into a hash. This effect is much more pronounced for large arrays than small ones... in fact, let's run some benchmarks:
      use strict;
      use warnings;
      use Benchmark qw(cmpthese);

      for my $size (5, 50, 500) {
          my @lol = map { [$_, qw(some random useless data)] } (1 .. $size);
          print "\n\nComparing 'map' and 'for' hash-assignment from array of $size elements\n";
          cmpthese(-2, {
              for => sub { my %hash; $hash{$_->[0]} = $_ for @lol; },
              map => sub { my %hash; %hash = map { $_->[0] => $_ } @lol; },
          });
      }

      ## OUTPUTS ##

      Comparing 'map' and 'for' hash-assignment from array of 5 elements
      Benchmark: running for, map, each for at least 2 CPU seconds...
             for:  3 wallclock secs ( 2.22 usr +  0.00 sys =  2.22 CPU) @ 77311.15/s (n=171940)
             map:  1 wallclock secs ( 2.19 usr +  0.00 sys =  2.19 CPU) @ 55606.20/s (n=122000)
              Rate  map  for
      map 55606/s   -- -28%
      for 77311/s  39%   --

      Comparing 'map' and 'for' hash-assignment from array of 50 elements
      Benchmark: running for, map, each for at least 2 CPU seconds...
             for:  2 wallclock secs ( 2.04 usr +  0.00 sys =  2.04 CPU) @ 9992.17/s (n=20414)
             map:  2 wallclock secs ( 2.12 usr +  0.00 sys =  2.12 CPU) @ 5657.09/s (n=12010)
              Rate  map  for
      map  5657/s   -- -43%
      for  9992/s  77%   --

      Comparing 'map' and 'for' hash-assignment from array of 500 elements
      Benchmark: running for, map, each for at least 2 CPU seconds...
             for:  3 wallclock secs ( 2.01 usr +  0.00 sys =  2.01 CPU) @ 948.83/s (n=1910)
             map:  2 wallclock secs ( 2.07 usr +  0.00 sys =  2.07 CPU) @ 335.75/s (n=696)
              Rate  map  for
      map   336/s   -- -65%
      for   949/s 183%   --
      For large arrays of 500 elements, using the seemingly innocuous map technique is nearly three times slower. For your typical 50-element array, map is almost twice as slow.

      I'm actually surprised the performance penalty was so pronounced.

        Your analysis, while good in theory, is off in the details. Judging by a comparison with a development Perl, map will always be about twice as slow, probably because of all the lists it creates, which then have to be assigned into the hash internally.

        However, the slow-down you wondered at with 500 elements is the quadratic effect from my note. The problem is that the entire argument stack was getting moved on each call of the block. The first call moves n elements, the second n-1, the third n-2, and so on, so the total work is quadratic in n.
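The arithmetic behind that claim can be sketched directly, since the per-call moves form a triangular sum:

```perl
use strict;
use warnings;

# Total stack movement for n block calls under the pre-5.6.1 map:
# n + (n-1) + ... + 1 = n*(n+1)/2, i.e. O(n^2).
sub stack_moves {
    my $n = shift;
    return $n * ($n + 1) / 2;
}

print stack_moves(500),  "\n";   # prints 125250
print stack_moves(5000), "\n";   # prints 12502500 -- roughly 100x the work for 10x the data
```

That roughly-100x-for-10x growth is why the map penalty that looks mild at 50 elements becomes dramatic at 500.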

        If you are willing to recompile Perl, there is a very simple patch available.

Re: optimizing code - need help
by Blue (Hermit) on Jan 19, 2001 at 23:58 UTC
    I'm posting anonymously to defeat being voted down...

    Something just strikes me as very wrong about that. You're willing to post things to get ++ed, but when you have something potentially controversial you want to duck out of the whole system that we use, just because it may harm you.

    while (upset()) {$calming_breath++;}

    Ya know, asking how to improve something that is already working but you feel you can make better usually gains respect here at the monastery.

    Look, not to come down on you hard, but the XP doesn't really mean that much; it's fun. When reading a node, very few of us look up the author to see their XP. And if you've spent enough time here to have a feel for who is high level or not, whether they have a few more or fewer XP doesn't change their opinions. Come out, don't worry about XP, and just be a person who enjoys Perl.

    =Blue
    ...you might be eaten by a grue...

        Yes, damn those cookies, *sniff* I was trying to be all stealthy and got snagged in the act ;) I would have posted under my own name, although I am trying very hard not to get voted down, I fear and respect the -- !!!!!!!!! lol
(redmist) Re: optimizing code - need help
by redmist (Deacon) on Jan 20, 2001 at 02:14 UTC
    I would agree with you two under different circumstances, but this was a reasonable question asked in a pleasant fashion. Why should the poster have to fear being voted down just because the question may be "easy"? That's ridiculous.

    I fully support the poster's actions, and would like to say that this wouldn't have to happen if people would just loosen up and not down-vote every beginner that posts an easy question.

    redmist
    Silicon Cowboy
    redmist::webnode