Misusing it the way you do, "to make clearer what it does", actually leads people who do know the construct away from the familiar path and into unknown territory.
I've been thinking about this for a while now and well, I'm not convinced that you're right. Nor that you're wrong!
I did a scan around and found that I have used $a op= $b in my code on many occasions. "Yeah well. You do lots of weird things in your code. What's new about that?" I hear the assembled populace cry :)
The point is, I've never set out to misuse the construct (at least not in this particular way). Nor do I recall ever explicitly deciding to do so, either for a particular usage/program or in general. But the fact that I have done it without thinking, on several occasions, suggests to me that there is an underlying reason.
Having thought about it, I reached the conclusion that it's because that's how I think about what happens, or needs to happen, inside a reduce block: at each iteration, $a needs to become whatever combination of this $a and this $b will allow me to combine that result (when it's passed back to me) with the next $b.
Now I know that reads as a very clumsy description when you consider it in the light of the simple uses of reduce--sum, prod, etc.--but it helps (me) see extended uses that allow me to do some fairly sophisticated things with it.
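To make that description concrete, here is a minimal illustration of the two styles (my own sketch, using List::Util's reduce; not code from the thread):

```perl
use strict;
use warnings;
use List::Util qw( reduce );

# Conventional style: the block *returns* the combination; reduce then
# feeds that result back in as the next $a.
my $sum  = reduce { $a + $b } 1 .. 10;    # 55

# The style under discussion: make the "next $a" explicit by assigning
# to $a directly. The value of the assignment is also the block's
# return value, so the result is identical.
my $sum2 = reduce { $a += $b } 1 .. 10;   # 55

print "$sum $sum2\n";
```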
For example, compare my Perl 5 solution to Re: 99 Problems in Perl6 with those others in that thread. It's far simpler than those in Perl 5, Perl 6, or Lisp. Dare I say, it even compares favourably with the Haskell solution, but that's open to debate :)
In that example, and another here (though you'll have to look hard to see it), the external, implicit assignment of the block's result back to $a is almost an unnecessary fact of life rather than the raison d'etre for using reduce.
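For flavour, here is a sketch of the kind of extended use I mean--run-length encoding, in the spirit of the 99 Problems exercise, though this is my own illustration and not the solution linked above. The accumulator $a is mutated in place on every call, and the implicit re-assignment of the block's result is incidental:

```perl
use strict;
use warnings;
use List::Util qw( reduce );

# Run-length encode a list of characters into [ char, count ] pairs.
my $rle = reduce {
    # Seed: on the first call, $a is a plain character, not yet an
    # accumulator; turn it into an array of pairs.
    ref $a or $a = [ [ $a, 1 ] ];

    # Extend the current run, or start a new one.
    $a->[-1][0] eq $b ? $a->[-1][1]++ : push @$a, [ $b, 1 ];

    $a;   # return the same (mutated) accumulator
} split //, 'aaabccdd';

print map { "$$_[0]$$_[1]" } @$rle;   # a3b1c2d2
print "\n";
```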
Now calling those uses "misuses" may be the bottom line as far as you are concerned, but they allow the concise and efficient expression of those algorithms where the alternatives are far less concise, and therefore far less clear--and also less efficient.
As a final test (after writing the above paragraph; I'd never considered it before), I wondered what effect the 'duplicate assignment' was having upon performance. Assuming it would be detrimental, I was really quite surprised to find that in all the tests I ran, for all the operators I tried, it actually ran faster, and sometimes substantially so.
    cmpthese -1, {
        A => q[ my @a = reduce{ $a += $b } 1 .. 1e4 ],
        B => q[ my @a = reduce{ $a +  $b } 1 .. 1e4 ],
    };

          Rate    B    A
      B  315/s   -- -21%
      A  399/s  27%   --

    cmpthese -1, {
        A => q[ my @a = reduce{ $a *= $b } 1 .. 1e5 ],
        B => q[ my @a = reduce{ $a *  $b } 1 .. 1e5 ],
    };

          Rate    B    A
      B  13.3/s   -- -11%
      A  15.1/s  13%   --
I am at a loss to explain why that should be, which of course means I must suspect the results and question the validity of the benchmark, but if there is anything wrong with it, I cannot see it.
Anyway, I'm not advocating the performance issue as a reason for doing it. But I do think that the way it causes me to think about what is going on inside the code block has led me to discovering some unusual and beneficial uses for reduce. And that may be enough to persuade some people that the 'misuse' is only a misuse if you choose to think about it that way. $a is going to be mutated after each iteration anyway, so mutating it early doesn't have any additional side-effects. Not even, it seems, a performance hit.
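One way to see that the early mutation is benign is to log what the block actually receives on each call (a throwaway sketch; @seen is my own name). The mutated $a is exactly what the next call would have received anyway:

```perl
use strict;
use warnings;
use List::Util qw( reduce );

my @seen;
my $total = reduce {
    push @seen, [ $a, $b ];   # record the inputs to each call
    $a += $b;                 # mutate early; also the return value
} 1 .. 4;

# Each call sees the already-accumulated $a:
# [1,2] -> [3,3] -> [6,4], total 10.
print "$$_[0]+$$_[1] " for @seen;
print "= $total\n";
```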
In reply to Re^4: Bugs? Or only in my expectations?
by BrowserUk
in thread Bugs? Or only in my expectations?
by BrowserUk