PerlMonks
Re: Re: map-like hash iterator

by jdporter (Paladin)
on Nov 06, 2002 at 20:51 UTC [id://210901]


in reply to Re: map-like hash iterator
in thread map-like hash iterator

Yeah... but if the hash has 10_000_000 keys, it's probably no better to have a list of 10_000_000 undefs than a list of 10_000_000 actual key values.

Maybe this instead:

    my $n = keys %$h;
    for ( my $i = 0; $i < $n; $i++ ) {
        local( $a, $b ) = each %$h;
        $c->();
    }
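A minimal runnable sketch of that approach, wrapped into a hypothetical hashmap() function ($h and $c stand in for the hash ref and callback discussed in the thread; the function name is mine, not from the thread):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical hashmap(): walks the hash one pair at a time with
# each(), exposing each key/value as $a/$b to the callback, and
# collects whatever the callback returns.
sub hashmap {
    my ( $c, $h ) = @_;
    my @out;
    my $n = keys %$h;    # counting the keys also resets the each() iterator
    for ( my $i = 0; $i < $n; $i++ ) {
        local ( $a, $b ) = each %$h;
        push @out, $c->();
    }
    return @out;
}

my %h = ( a => 1, b => 2, c => 3 );
my @pairs = hashmap( sub { "$a=$b" }, \%h );
print "$_\n" for sort @pairs;    # prints a=1, b=2, c=3, one per line
```

Note that $a and $b are the same package globals sort uses, so they are exempt from strict 'vars'; local() restores their previous values after each call.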

Replies are listed 'Best First'.
Re^3: map-like hash iterator
by Aristotle (Chancellor) on Nov 06, 2002 at 21:26 UTC

    The idea was to be able to use map to build the return list. Obviously that makes little sense in void context, which is all your for proposal can provide.

    undefs are actually less wasteful than actual values: my @x = (undef) x 10_000_000; consumes 130MB for me, while my @x = ('a' x 10) x 10_000_000; nearly hits the 500MB mark.

    Of course, if you're not in void context and actually intent on returning the resulting list from processing a 10,000,000 key hash, you'll have to be able to fit that in memory anyway. Since you're throwing around big chunks of data, memory can't be a huge concern, otherwise you'd be walking the hash manually and chewing the bits more carefully.

    You can't have your cake and eat it - you can't be using an iterator when you're concerned about memory usage.

    Makeshifts last the longest.

      Of course, if you're not in void context and actually intent on returning the resulting list from processing a 10,000,000 key hash, you'll have to be able to fit that in memory anyway.
      Not necessarily always the case, though. The callback routine might never return anything -- except one time when it returns one thing. A jillion-key hash in, a one-element list out.

      Just like map.
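That shape is easy to see with map itself: a large list in, a one-element list out, because the block returns the empty list for every input but one (a small sketch; the 1000-key hash stands in for the jillion-key one):

```perl
use strict;
use warnings;

# Stand-in for the huge hash: keys 1..1000, values their squares.
my %h = map { $_ => $_ * $_ } 1 .. 1000;

# The block yields () for every pair except the one of interest,
# so the result is a one-element list no matter how big %h is.
my @hits = map { $h{$_} == 625 ? $_ : () } keys %h;

print "@hits\n";    # 25
```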


      UPDATE

      You can't have your cake and eat it - you can't be using an iterator when you're concerned about memory usage.
      That is patently false. In fact, the built-in hash iterator (each) is all about efficiency -- in both space and time.

      There is no reason why iterators can't be efficient.
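For instance, the built-in each() iterator hands back one key/value pair per call, so a whole-hash walk never materializes the full key or value list the way keys() or values() in list context does (a sketch; the hash contents are illustrative):

```perl
use strict;
use warnings;

my %h = map { $_ => $_ * 2 } 1 .. 10_000;

# each() returns one pair at a time; memory overhead of the loop is
# constant in the number of keys.
my $sum = 0;
while ( my ( $k, $v ) = each %h ) {
    $sum += $v;
}
print "$sum\n";    # 100010000
```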

        That would be grep ;-) Still, that can be pretty wasteful. Why iterate over all 10,000,000 records when the single one of interest is found at the very beginning? The iterators do not offer any early-bailout mechanism.
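For contrast, a hand-rolled each() loop can bail out as soon as the match is found, which is exactly what a grep or a fixed-count iterator cannot do (a sketch under the same illustrative squares-hash assumption):

```perl
use strict;
use warnings;

my %h = map { $_ => $_ * $_ } 1 .. 1000;

keys %h;    # reset the each() iterator before walking
my $found;
while ( my ( $k, $v ) = each %h ) {
    if ( $v == 625 ) {    # stop at the first hit;
        $found = $k;      # the rest of the hash is never visited
        last;
    }
}
print "$found\n";    # 25
```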

        You have to look at the larger picture.

        Iterators mainly offer convenience. There are very few situations where iterators are useful under big efficiency concerns (be it memory or time), in the absence of lazy lists. Even so, you can't go wrong with the explicit loop construct.

        This ain't Ruby. :-) Perl 6 will, however, have lazy lists. (Is there anything Perl 6 won't fix? :-))

        Makeshifts last the longest.
