Thanks, Victor. This is the point.
I have stopped commenting on this kind of "counting fewer classes = better" argument, and I have stopped considering all the feedback useful: we changed Nile because the first design was not good. If people like the Stream hierarchy, they can just keep it. I think that we are focusing on the wrong problem, and in the end, frankly, I believe that we did our job, and did it pretty well. We will continue, and we want to build a collection hierarchy based on traits; but don't worry, this will not be in Squeak.
Stef
On May 17, 2008, at 3:28 AM, Victor Rodriguez wrote:
Hi,
On Fri, May 16, 2008 at 4:38 PM, Andreas Raab <andreas.raab@gmx.de> wrote: ...
> It is interesting to do a quick check to see how much this might change matters: First, combining these three classes into one means that the traits version now has twice the number of entities of the non-traits version (6 vs. 3). This view is also supported by counting the "backward compatible" part of Figure 12 (which is directly comparable with the Squeak version), which results in 11 classes and traits (compared to 5 classes in Squeak).

Fewer entities are not necessarily better than more, as I'm sure you know. Generally, more classes, each with a clear responsibility, are better than fewer, harder-to-understand classes.
> Next, if we take the total number of methods in these three classes:
>
> ReadStream selectors size + WriteStream selectors size + ReadWriteStream selectors size
> 68
>
> (the measure was taken in 3.9 to be roughly comparable with the paper, and I'm not sure why the paper claims 55 methods) and compare this with the number of unique selectors (discounting all re-implemented methods):
>
> (Set withAll: (ReadStream selectors asArray), (WriteStream selectors asArray), (ReadWriteStream selectors asArray)) size
> 59
>
> What we get is a 15% improvement *minimum* by folding these three classes (very likely more if one looks at the details).
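(As a quick check of that arithmetic: the 15% figure presumably comes from measuring the 9 duplicated selectors against the 59 unique ones, i.e.:

```smalltalk
"Sanity-checking the 15% figure: 68 selectors in total, 59 unique,
 so 9 are re-implementations; measured relative to the 59 unique
 selectors that is just over 15%."
(68 - 59) / 59 * 100.0
```

which evaluates to roughly 15.25, matching the claim.)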
> Next, let's look at "canceled methods" (those that use #shouldNotImplement). The paper lists 2 canceled methods, which happen to be WriteStream>>next and ReadStream>>nextPut:. And of course those wouldn't exist in a single-inheritance implementation either. Etc.
>
> In other words, the measures change *dramatically* as soon as we apply the original idea, regardless of whether traits are used or not. This speaks clearly for the original idea of folding these three classes into one, but concluding that traits have anything to do with it would require a very different comparison.
>
> If the paper wants to make any claims regarding traits, it really needs to distinguish improvements that are due to traits from general improvements (i.e., improvements that are just as applicable to single-inheritance implementations). Otherwise it is comparing apples to oranges and can't be taken seriously in this regard.
But there *are* limits to what you can achieve with single inheritance. It is not very hard to come up with an example:
The Magnitude class is the perfect candidate for being converted into a trait, if you ask me. Here is its class comment:
I'm the abstract class Magnitude that provides common protocol for objects that have the ability to be compared along a linear dimension, such as dates or times. Subclasses of Magnitude include Date, ArithmeticValue, and Time, as well as Character and LookupKey.
My subclasses should implement
    < aMagnitude
    = aMagnitude
    hash
Subclasses of Magnitude, by implementing #< #= #hash, gain methods #<= #> #>= #between:and: #hashMappedBy: #max: #min: #min:max:. The subclasses of Magnitude are Number, Character, DateAndTime, etc.
String does not subclass Magnitude; it subclasses ArrayedCollection, and yet it does implement #<, #=, and #hash. It could clearly benefit from using Magnitude as a trait (indeed, it already implements #hashMappedBy: exactly as Magnitude does).
Having traits like Magnitude leaves you more options for defining a better inheritance hierarchy.
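A hypothetical sketch of what this could look like with the Squeak traits syntax (the name TMagnitude and the method bodies below are purely illustrative, not code from the paper or the image):

```smalltalk
"Illustrative only: extract Magnitude's derived protocol into a trait.
 Users of TMagnitude must provide #< and #= themselves."
Trait named: #TMagnitude
	uses: {}
	category: 'Kernel-Magnitudes'.

"Derived methods, written once in terms of the required #<:"
TMagnitude compile: '<= aMagnitude
	^ (aMagnitude < self) not'.

TMagnitude compile: 'max: aMagnitude
	^ self < aMagnitude ifTrue: [aMagnitude] ifFalse: [self]'.

"String could then keep ArrayedCollection as its superclass and
 still share the comparison protocol:"
ArrayedCollection subclass: #String
	uses: TMagnitude
	instanceVariableNames: ''
	classVariableNames: ''
	poolDictionaries: ''
	category: 'Collections-Strings'.
```

With something like this, String would gain #<=, #max:, and friends for free while keeping its collection superclass, which is exactly the kind of reuse single inheritance cannot express.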
Saludos,
Víctor Rodríguez.
> Cheers,
> - Andreas