Actually, they don't....
What I mean by that is: say you have a method that has two effects upon the current state of the object. For simplicity we'll say that the foo() method increments or decrements a variable dependent upon its parameters, and left-truncates a string.
A subclass comes along and wants to alter the points at which the variable is incremented and decremented, but it cannot easily do so independently of the truncation: if it overrides the method and doesn't call the supermethod, the truncation doesn't happen; if it does call the supermethod, it loses control of the variable setting.
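A minimal Perl sketch of that dilemma, with the class names, the counter field and the name field all invented here purely for illustration:

    use strict;
    use warnings;

    package Base;

    sub new { return bless { count => 0, name => 'perlmonks' }, shift }

    # foo() has two effects that happen together: it adjusts the counter
    # according to its argument, and it left-truncates the name.
    sub foo {
        my( $self, $delta ) = @_;
        $self->{count} += ( $delta > 0 ? 1 : -1 );   # effect 1: adjust the counter
        substr( $self->{name}, 0, 1, '' );           # effect 2: left-truncate the string
        return $self;
    }

    package Derived;
    our @ISA = ( 'Base' );

    # The dilemma: skip SUPER::foo() and the truncation never happens;
    # call SUPER::foo() and the counter is adjusted a second time, so the
    # subclass no longer controls when or how it changes.
    sub foo {
        my( $self, $delta ) = @_;
        $self->{count} += 2 * $delta;    # the behaviour the subclass actually wants
        $self->SUPER::foo( $delta );     # ...but this adjusts the counter again
        return $self;
    }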
"Oh, but the two actions should be independantly overidable!", I hear someone say. Except, that when the class is written, there is no reason for that. Also, the two actions are inextricably linked. When you do first, you have to do the second, and you cannot do either without the other. Making them independant methods and having one call the other every time works, but only if you have the knowledge or foresight to predict the breakpoints within overall actions required by the original class that might be useful in a superclass.
In extremis, this would require every independent function, statement and clause of every line of every method to be rendered as its own independent method. In effect, you would need to be able to override every opcode independently.
This is obviously ludicrous, as 99% of them will never be overridden. And that's the point. No matter how you decide to go beyond the requirements of the current need to cater for "future possibilities", you may make--are quite likely to make--the wrong choices. And so your attempts fail to help future developments and may even hinder them.
The alternative is that you make no such concessions to the future and code just that which is required. When the future arrives and the requirements are known, the base class may need modifying, but in the meantime you have saved costs by avoiding unnecessary effort, and probably simplified the tasks of all those applications that use the original base class by not having to cater for those unnecessary future requirements.
When the original base class is modified to meet those future requirements, it may impose changes on existing users too, but at least you will know that those changes are required.
If you attempt to predict the future requirements and cater for them now, in order to minimise change in the future, you leave yourself vulnerable to making the wrong choices, which Sod's Law says will happen more than 50% of the time.