in reply to Re^5: Understanding 'Multiple Inheritance' (hindsight)
in thread Understanding 'Multiple Inheritance'

I wanted to reply to the issues you raised with P6's handling of multimethod dispatch.

In the first case, all the co-inheriters . . . have to consider the possibility, and handle it.

Actually, they don't. There will be a usable set of default behaviors. And, just like in all things, you only have to define something if you don't like the default. If you like it, don't redefine it!

(As an aside, I wish people would follow this advice for most overloaded stuff. If you have a numerically overloaded object with no goofiness (it doesn't subtract when it should add, or the like), just define numify, set fallback to true, and let Perl handle the rest.)
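A minimal sketch of that advice, using Perl 5's overload pragma (the Counter class and its field names are hypothetical): define only the numification handler and let fallback => 1 derive the arithmetic and comparison operators from it.

```perl
package Counter;
use strict;
use warnings;
use overload
    '0+'     => sub { $_[0]->{count} },  # numify: the only handler we write
    fallback => 1;                       # derive +, -, <, ==, etc. from it

sub new {
    my ($class, $n) = @_;
    return bless { count => $n }, $class;
}

package main;
my $c = Counter->new(40);
print $c + 2, "\n";    # prints 42: '+' falls back to numify-then-add
print "yes\n" if $c < 41;
```

With fallback enabled, Perl numifies the object via '0+' whenever an undefined operator is needed, so one handler covers the whole numeric interface.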

you have to add a mechanism to the language to allow the superclass writer to hand code the inheritance/dispatch ordering.

And, since Larry wants it, Larry's gonna get it. Re-read A12 for how he will do this. The short version is you'll get to do it in all the places you mention, and more.

In any case, I think that roles (if they turn out to be what I think they should be) will obviate the need for the above manually controllable dispatch ordering and render MI a little used feature.

Absolutely! There will be little need for MI at all, once roles/traits/mixins/etc/etc/ad nauseam are grokked. But! (and there's always a but) . . . in those random cases where MI is actually the easier WTDI, you will have the option. Remember - roles/traits/whatever really give you the generality of multiple inheritance through single inheritance, without all the baggage that MI comes with.

Being right, does not endow the right to be rude; politeness costs nothing.
Being unknowing, is not the same as being stupid.
Expressing a contrary opinion, whether to the individual or the group, is more often a sign of deeper thought than of cantankerous belligerence.
Do not mistake your goals as the only goals; your opinion as the only opinion; your confidence as correctness. Saying you know better is not the same as explaining you know better.


Re^7: Understanding 'Multiple Inheritance' (hindsight)
by BrowserUk (Patriarch) on Mar 07, 2005 at 18:21 UTC
    In the first case, all the co-inheriters . . . have to consider the possibility, and handle it.
    Actually, they don't....

    What I mean by that is: say you have a method that has two effects upon the current state of the object. For simplicity we'll say that the foo() method increments or decrements a variable depending on its parameters, and left-truncates a string.

    A subclass comes along and wants to alter the points at which the variable is incremented and decremented, but it cannot easily do so independently of the truncation: if it overrides the method and doesn't call the supermethod, the truncation doesn't happen; if it does call the supermethod, it loses control of the variable setting.
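    The dilemma can be sketched in Perl 5 (all class, method and field names here are hypothetical, chosen just to illustrate the coupling):

```perl
package Base;
use strict;
use warnings;

sub new { bless { count => 0, str => 'hello world' }, shift }

sub foo {
    my ($self, $delta) = @_;
    $self->{count} += $delta > 0 ? 1 : -1;   # effect 1: bump the counter
    substr($self->{str}, 0, 1) = '';         # effect 2: left-truncate the string
}

package Derived;
our @ISA = ('Base');

# The subclass wants a different counting rule but the same truncation.
sub foo {
    my ($self, $delta) = @_;
    $self->{count} += $delta;      # new rule: add the delta itself
    # Skip SUPER::foo() and the truncation is lost;
    # call it and the base counting logic fires as well:
    $self->SUPER::foo($delta);     # counter is now bumped a second time
}
```

    Calling Derived->new->foo(5) leaves count at 6, not 5: the base class's increment cannot be suppressed without also suppressing the truncation.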

    "Oh, but the two actions should be independently overridable!", I hear someone say. Except that, when the class is written, there is no reason for that. Also, the two actions are inextricably linked: when you do the first, you have to do the second, and you cannot do either without the other. Making them independent methods and having one call the other every time works, but only if you have the knowledge or foresight to predict the breakpoints within the overall actions required by the original class that might be useful in a subclass.

    In extremis, this would require every independent function, statement and clause of every line of every method to be rendered as its own independent method. In effect, you would need to be able to override every opcode independently.

    This is obviously ludicrous, as 99% of them will never be overridden. And that's the point. No matter how you decide to go beyond the requirements of the current need to cater for "future possibilities", you may make--are quite likely to make--the wrong choices. And so your attempts fail to help future developments and may even hinder them.

    The alternative is that you make no such concessions to the future and code just that which is required. When the future arrives and the requirements are known, the base class may need modifying, but in the meantime you have saved costs by avoiding unnecessary effort, and probably simplified the tasks of all those applications that use the original base class, by not having to cater for those unnecessary future requirements.

    When the original base class is modified to meet those future requirements, it may impose changes on existing users also, but at least you will know that they are required.

    If you attempt to predict the future requirements and cater to them now, in order to minimise change in the future, you are vulnerable to making the wrong choices, which Sod's Law says will happen more than 50% of the time.


    Examine what is said, not who speaks.
    Silence betokens consent.
    Love the truth but pardon error.