PerlMonks

Re: Splitting a hash

by tadman (Prior)
on Mar 11, 2003 at 09:43 UTC ( [id://241978] )


in reply to Re: Splitting a hash
in thread Splitting a hash

Or, in the interest of not creating variables you don't use:
foreach my $key (sort keys %dbm) { print "$_\n" foreach (split("\0", $dbm{key})); }
The advantage to this is that if another variable is added, at least you won't have to update this code to handle it.
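For anyone trying this outside the original tied-DBM context, here is a self-contained sketch of the loop above. The `%dbm` data here is a hypothetical in-memory stand-in (the real hash would be tied to a DBM file), with fields joined on the NUL byte as in the thread:

```perl
use strict;
use warnings;

# Stand-in for the tied DBM hash from the thread: each value holds
# several fields joined on "\0", a separator that cannot occur in
# the field data itself.
my %dbm = (
    alice => join("\0", 'reading', 'chess'),
    bob   => 'golf',
);

# Walk the keys in sorted order and print each NUL-separated field
# on its own line (note the sigil: $dbm{$key}, not $dbm{key}).
foreach my $key (sort keys %dbm) {
    print "$_\n" foreach split /\0/, $dbm{$key};
}
```

With this sample data it prints `reading`, `chess`, and `golf`, one per line.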

Replies are listed 'Best First'.
Re^2: Splitting a hash
by Aristotle (Chancellor) on Mar 15, 2003 at 12:08 UTC
    Or (usually) preferably
    foreach my $key (sort keys %dbm) { print map "$_\n", split /\0/, $dbm{$key}; }
    (You have a typo: $dbm{key} is missing a sigil.)

    Makeshifts last the longest.

      Good catch on the error, by the way. Still, in this case I'm not sure that map is the best approach. You end up creating a list with split and then reprocessing it with map and then finally printing it. The foreach variation actually omits a step, which I would think makes it just slightly better.

      That is not to say you couldn't do something like this:
      foreach my $key (sort keys %dbm) { print join("\n", split(/\0/, $dbm{$key})), "\n"; }
      This is probably slower because you create a huge scalar in the process, but still, it should work.
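The performance guesses above (that foreach omits a step, and that building one huge scalar with join is probably slower) are easy to check with the core Benchmark module. A minimal sketch, using made-up stand-in data rather than a real tied DBM hash:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Made-up data: 100 keys, each with ten NUL-joined fields.
my %dbm = map { $_ => join("\0", map { "field$_" } 1 .. 10) } 1 .. 100;

# Inner foreach pushing one line at a time.
my $nested_foreach = sub {
    my @out;
    foreach my $key (sort keys %dbm) {
        push @out, "$_\n" foreach split /\0/, $dbm{$key};
    }
    return \@out;
};

# map over the split results instead of a second foreach.
my $map_variant = sub {
    my @out;
    foreach my $key (sort keys %dbm) {
        push @out, map "$_\n", split /\0/, $dbm{$key};
    }
    return \@out;
};

# join into one scalar per key.
my $join_variant = sub {
    my @out;
    foreach my $key (sort keys %dbm) {
        push @out, join("\n", split /\0/, $dbm{$key}), "\n";
    }
    return \@out;
};

# Relative speed varies with perl version and data shape; measure,
# don't assume. 500 iterations keeps the run short.
cmpthese(500, {
    nested_foreach => sub { $nested_foreach->() },
    map_variant    => sub { $map_variant->() },
    join_variant   => sub { $join_variant->() },
});
```

All three variants produce the same output; only the intermediate steps differ.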
        Or maybe
        print join "\n", split(/\0/, $dbm{$key}), '';
        The for version does omit a step, but at the cost of a lot of calls to print. The map version only calls it once. Thinking about it, you can even pull that call entirely out of the loop:
        print map "$_\n", map { split /\0/ } @dbm{sort keys %dbm};
        The double foreach version is the least memory intensive; the version with no foreach at all is the most. But, it is trivial to replace the print then:
        my $cache = freeze [ map "$_\n", map { split /\0/ } @dbm{sort keys %dbm} ];

        It wouldn't have been quite so trivial a transition with the double foreach code. That's why I prefer, and believe it is generally preferable, to build large structures and process them all at once, rather than nibble at bits of them. All data is in one place at every step of the process, and its internal correlations don't get lost.

        I'm not going to worry about memory either so long as I have no indication that it's going to become a problem. If and when it does, Perl's tying mechanism makes it easy to opt for the hard disk anyway.

        Makeshifts last the longest.
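The `freeze` in the snippet above comes from the core Storable module. A runnable sketch of the build-then-serialize idea, again with hypothetical stand-in data in place of the tied DBM hash (and with the hash slice fed through map, since split wants a single scalar expression):

```perl
use strict;
use warnings;
use Storable qw(freeze thaw);

# Stand-in for the tied DBM hash: fields joined on "\0".
my %dbm = (
    alice => join("\0", 'reading', 'chess'),
    bob   => 'golf',
);

# Build the entire list of output lines in one pass and serialize it
# with a single call; split is applied per value inside map, because
# split takes one scalar, not a list.
my $cache = freeze [ map { map "$_\n", split /\0/ } @dbm{sort keys %dbm} ];

# The whole structure round-trips intact, internal correlations and all.
my @lines = @{ thaw $cache };
print @lines;
```

With this sample data it prints `reading`, `chess`, and `golf`, one per line, just like the loop versions.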
