It's running, see: http://www.gliebe.de/self/ :)
This project uses code (mainly the i386 assembler and parts of the NIC) from an earlier Linux port started by Gordon Cichon in 1999.
Bye Torsten
Torsten.Bergmann@phaidros.com wrote:
It's running, see: http://www.gliebe.de/self/ :)
This project uses code (mainly the i386 assembler and parts of the NIC) from an earlier Linux port started by Gordon Cichon in 1999.
I've been playing around with it (and reading all the Self papers I can find) for a few weeks. I'd read a little about Self before, but this was the first chance I've had to actually run it (many thanks to all involved!).
I'm very impressed. Morphic in Squeak makes more sense to me now after experiencing the Self version, and I find the lack of classes and objects-are-just-collections-of-slots aspects of Self to be very appealing.
I have a question about Self that I have been anxious to post to the Squeak mailing list (because I know there are people here with the right experience to answer it). I've been wondering: what is wrong with Self? Why hasn't it taken over the world? :)
In many ways, Self seems like an improvement over Smalltalk, and I get the impression that the "Programming as Experience" inspiration for Self has a lot to do with the direction Squeak is headed. But, other than Morphic and "HotSpot(tm)", it looks like many of the Self concepts have been ignored. Is it because it turns out to be hard to organize a big system without classes? Is it because implementations exist for only a few types of computers, so many people have never experienced it? Or is there something else?
Ted
I guess one of the most qualified people to answer your question is John Maloney, and we discussed this in detail in an interview we did with him. You can read the relevant parts of this interview in the August issue of Squeak News, which you can download from http://www.squeaknews.com/download/index.html. Unfortunately, the only way to read the interview at the moment is to download the whole issue, although we are trying to address this. I believe you can find an answer to your question in that article.
Cheers,
Tansel
There is a lengthy discussion of possible answers to your question in
I have a question about Self that I have been anxious to post to the Squeak mailing list (because I know there are people here with the right experience to answer it). I've been wondering: what is wrong with Self? Why hasn't it taken over the world? :)
In many ways, Self seems like an improvement over Smalltalk, and I get the impression that the "Programming as Experience" inspiration for Self has a lot to do with the direction Squeak is headed. But, other than Morphic and "HotSpot(tm)", it looks like many of the Self concepts have been ignored. Is it because it turns out to be hard to organize a big system without classes? Is it because implementations exist for only a few types of computers, so many people have never experienced it? Or is there something else?
Ted
Despite all the hype about the Self compiler, it actually runs like a total dog (Self 4.1 on the Mac anyway). It has to be ten times slower than Squeak and is too slow to be useful. (Which is a shame)
On Sunday, January 20, 2002, at 03:02 PM, Editor - Squeak News wrote:
On Monday 28 January 2002 02:48, alban read wrote:
Despite all the hype about the Self compiler, it actually runs like a total dog (Self 4.1 on the Mac anyway). It has to be ten times slower than Squeak and is too slow to be useful. (Which is a shame)
There is a lot of confusion about this. Since this is getting off topic for Squeak, I am sending a copy to the Self list so any discussion can continue over there. On the other hand, the subject might be of interest to jittery Squeakers ;-)
The famous Self compiler that ran numerical benchmarks at 50% of the speed of optimized C was introduced in Self 2.0 and was created by Craig Chambers. The result is simply amazing when you consider that C does no index or overflow checking and its compiler had a far better code generator. Craig's compiler was a huge leap forward in type analysis, and he has continued this work in his Vortex compiler project (for the Modula-3 and Cecil languages). The problem was that compilation was slow, so interactive use of Self 2 wasn't very nice. The compiler also didn't do such a great job on highly polymorphic (very OO styled) benchmarks.
In Self 3 and 4 an entirely different direction was taken by Urs Hölzle. Two simple compilers replaced the complex one used previously, and type analysis was replaced with type feedback. The first compiler (NIC, the Non Inlining Compiler) did a quick and dirty job that allowed code to start running. Data structures called PICs (Polymorphic Inline Caches) improved the performance of message sending and, as a side effect, collected information about the types of objects actually used in the various call sites in the code. When a given method was identified as being critical to system performance (a "hot spot", hence the name of the technology given at Animorphics and included in Java 2) the second compiler (SIC, the Simple Inlining Compiler) was called to generate much better code. Though the type information available in the PICs was just a subset of what could be obtained with type analysis, it was good enough for the SIC to do a good job very quickly.
This new system did a better job on real applications and is far better for interactive use, but it doesn't get the fantastic benchmark results that Self 2 did.
When porting Self to a new system, you can get it running by just implementing the NIC. Performance will be terrible since no inlining or type feedback will be used and these two are the key technologies in Self. For example, a simple "[...] whileTrue: [...]" will involve many message sends and block context creations *per loop*. Contrast that with Squeak where the compiler will generate optimized jump bytecodes instead.
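For example (a sketch in Self syntax; "countTo:" is a made-up selector, not something from the real image), a NIC-only system really performs every one of those sends:

```
"Hypothetical method, Self syntax. With only the NIC, each
 iteration sends value to the [i < n] condition block and to
 the body block, creating fresh block activations every time
 around the loop."
(| countTo: n = ( | i <- 0 |
      [i < n] whileTrue: [ i: i + 1 ].
      i ) |)
```

Squeak's compiler instead turns whileTrue: into conditional jump bytecodes, so an unoptimized Self VM loses badly on exactly this kind of code.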
Unfortunately, neither the Power Mac nor the Linux PC ports implement the SIC, yet. David Ungar says performance is good enough on his PowerBook and a previous version (4.1.2) seemed almost reasonable on a 233 MHz iMac. The current version (4.1.4) is supposed to improve performance, but it seemed very slow on the 600 MHz iBook I tried it on (it wouldn't run on the iMac's OS 8.6).
So I am afraid you haven't had a chance to see Self either as it was (Self 2) or how it is meant to be (NIC+SIC). At least for now, I am keeping my UltraSparc :-)
-- Jecel
At 7:48 AM +0000 1/28/02, alban read wrote:
Despite all the hype about the Self compiler, it actually runs like a total dog (Self 4.1 on the Mac anyway). It has to be ten times slower than Squeak and is too slow to be useful. (Which is a shame)
You *do* realize that the Macintosh port is only a basic port of the bytecode interpreter, don't you? None of the compiler technology was translated at all, and the Self VM design was such that the bytecode interpreter was left devoid of any optimizations for simplicity's sake. The blitter also seems to have been designed with X11 in mind, and that may be the largest source of hiccups and slow-downs.
On Sunday 20 January 2002 09:33, Ted Wright wrote:
I have a question about Self that I have been anxious to post to the Squeak mailing list (because I know there are people here with the right experience to answer it). I've been wondering: what is wrong with Self? Why hasn't it taken over the world? :)
It was explicitly killed by Sun to make room for Java, just as SpringOS was killed around the same time in order to not bother Solaris. While I was not happy with either decision, I am the first to admit that there was good business sense behind them.
In many ways, Self seems like an improvement over Smalltalk,
Smalltalk means different things to different people. To some of us, it does indeed seem more Smalltalkish than Smalltalk-80. Other people don't like that direction at all.
and I get the impression that the "Programming as Experience" inspiration for Self has a lot to do with the direction Squeak is headed. But, other than Morphic and "HotSpot(tm)", it looks like many of the Self concepts have been ignored. Is it because it turns out to be hard to organize a big system without classes?
No - you can simulate classes whenever you need them. I am not sure if I understood John Maloney's objections in the great interview that Tansel pointed out to you, but I got the impression that he missed the global and centralized system view that the browser gives you. Self's outliners are like the Object Inspector in Squeak and are better for giving you local views. Note that Squeak-like browsers could be built in Self in an hour or so (as Mario Wolczko did in his Smalltalk emulator).
Is it because implementations exist for only a few types of computers, so many people have never experienced it?
This is a major factor. Until December 1999 it only ran on Sparc machines. Then it was released for the PowerMac. And now it is available (still early alpha) for PCs running Linux.
Or is there something else?
The virtual machine is large and very complex, much like commercial Smalltalks. That is the main reason why ports have been slow and few. And don't expect to see Self running on (existing) PDAs.
This problem was not lost on Squeak's creators, and the simple "self hosted", interpreter based VM has enabled it to spread everywhere. As always, there are tradeoffs. While I can write some complex code in Self and let the compilers do their magic, to get something usable in Squeak I will have to restrict myself to a subset of the language (Slang) and after debugging the logic the code must be translated into C and externally compiled into a plugin.
On the language design front, Alan Kay has complained in this list about the "mixing of levels" in Self due to parent slots. Program structure, which should be a meta-level issue (handled by mirror objects, perhaps) becomes visible and changeable at the base-level. This was actually a design goal for Self's creators, but it has little impact in practice - if it were eliminated the code would only have to be changed in about a dozen places in the image, half of which are there mostly to demonstrate this feature.
-- Jecel
Jecel Assumpcao Jr jecel@merlintec.com wrote:
...Is it [...] hard to organize a big system without classes?
No - you can simulate classes whenever you need them.
That is one of the things I liked. I found it quite mind expanding to look at classes in this light.
I am not sure if I understood John Maloney's objections in the great interview that Tansel pointed out to you,
(Thanks Tansel!)
but I got the impression that he missed the global and centralized system view that the browser gives you. Self's outliners are like the Object Inspector in Squeak and are better for giving you local views. Note that Squeak-like browsers could be built in Self in an hour or so (as Mario Wolczko did in his Smalltalk emulator).
John Maloney had one objection that was not obvious to me (probably because I'm just starting to look at Self and I'm not ready to think about how system tools would be built): "Since the object relationships used to build the Self equivalent of Smalltalk's class library are merely a matter of convention, system tools cannot rely on them. Programmers are free to write programs that don't follow the conventions. I think this is why, for example, Self has no equivalent to Smalltalk's changes log and change set managers. These tools are much easier to write if you have the notion of "class" built into the language."
...On the language design front, Alan Kay has complained in this list about the "mixing of levels" in Self due to parent slots. Program structure, which should be a meta-level issue (handled by mirror objects, perhaps) becomes visible and changeable at the base-level. This was actually a design goal for Self's creators, but it has little impact in practice - if it were eliminated the code would only have to be changed in about a dozen places in the image, half of which are there mostly to demonstrate this feature.
I don't understand this issue (maybe I have not noticed any of these places in the "Demo.snap" image that is distributed with the Self for Linux release).
Ted
I guess this can still squeak by as not too off topic ;-)
On Tuesday 22 January 2002 17:30, Ted Wright wrote:
Jecel Assumpcao Jr jecel@merlintec.com wrote:
No - you can simulate classes whenever you need them.
That is one of the things I liked. I found it quite mind expanding to look at classes in this light.
One danger is that this flexibility could increase the typical newbie confusion between the "is-a-kind-of" and "has-as-part" relations. This is particularly true if they start new programs by copying the structure they see in existing applications (lots of empty "traits" objects right from the beginning).
The solution I have found is a "lazy classification" development style as exemplified by the short tutorial in Demo.snap. You create each object as a stand-alone entity and keep them that way until you absolutely need to refactor them into common elements in order to avoid a lot of duplication. Keep refactoring aggressively (I doubt extreme programmers would complain about this...) until you end up with code that has as few class-like objects as possible. By not trying to anticipate the role of an object before you are actually working on it you reduce the chance of getting "is-a-kind-of"/"has-as-part" wrong.
Note that in eToys in Squeak you have the tools to use this development style. You can choose to use per instance properties and delay creating new classes until this approach becomes too awkward.
John Maloney had one objection that was not obvious to me (probably because I'm just starting looking at Self and I'm not ready to think about how system tools would be built): "Since the object relationships used to build the Self equivalent of Smalltalk's class library are merely a matter of convention, system tools cannot rely on them. Programmers are free to write programs that don't follow the conventions.
A Self programmer can make an object be a copy of a "myType" prototype object stored in "globals" and include in it a constant "parent*" slot pointing to an object called "traits myType". But he can also come up with something entirely different. So if you created a browser that depended on the "prototypes/traits in globals" convention your tools would break on many real applications.
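The convention described above looks roughly like this in Self syntax (all the names here are illustrative, not taken from the real image):

```
"Shared, class-like behavior lives in a traits object."
traits _AddSlots: (| myType = (|
    parent* = traits clonable.
    double  = ( size + size ) |) |)

"The prototype holds the per-object state and inherits the traits."
globals _AddSlots: (| myType = (|
    parent* = traits myType.
    size <- 0 |) |)

"New instances are simply copies of the prototype."
myType copy size: 3
```

Nothing in the language enforces this layout, which is exactly the point: a browser that assumes it will misread any program organized differently.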
I think this is why, for example, Self has no equivalent to Smalltalk's changes log and change set managers. These tools are much easier to write if you have the notion of "class" built into the language."
While David Ungar's "Transporter" application requires more manual care than I would like, it not only does roughly the same job as a change set manager but also covers much of the stuff planned for the upcoming packages system for Squeak. A primitive change log was added in Self 2 and then removed in newer versions; the Transporter could be extended to handle this (I use the external Unix tool RCS instead).
Here is what is complicated by not having classes: imagine that you have an object A, you change some code in it, and then you make a copy of it (object B). How can you tell that apart from first copying A and then making the same change in both A and B? When you "file out" your changes, the actual text you generate might be rather different in each case. With classes you have a canonical view of what is being changed (instances don't count). When all you have are instances, there are several possible interpretations. But this isn't important - as long as you get the right objects in the end, who cares how they were built?
["mixing of levels" in Self due to parent slots]
I don't understand this issue (maybe I have not noticed any of these places in the "Demo.snap" image that is distributed with the Self for Linux release).
See the various tree collections and check out how the #at:Put: and #deleteNode methods (among others) change the inheritance structure at runtime. All it takes is an innocent-looking "... parent: ..." in the middle of your code. You can do the same thing in Squeak, but that requires going very deep into meta-programming land, and by the time you find Behavior>>superclass:, Class>>addSubclass: and friends you have probably seen enough to make you come to your senses and turn back.
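A minimal sketch of the pattern (with made-up names; the trailing * marks a parent slot, and <- makes it assignable):

```
"node starts out inheriting the behavior of an empty node."
_AddSlots: (| node = (| parent* <- traits emptyNode |) |)

"One innocent-looking send, buried in a method like at:Put:,
 rewires the object's inheritance at run time."
node parent: traits filledNode
```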
A lot of people get excited about this "dynamic inheritance" feature when they discover Self, but it probably isn't a good idea after all.
The other place where this "parent slots are like regular slots" thing makes the meta level poke through is in "directed resends", which are like "super" in Squeak. Since you can inherit from multiple parents, when you want to resend to one in particular you must explicitly name which one you are interested in.
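In Self syntax (again with illustrative names), a directed resend simply prefixes the message with the parent slot's name:

```
_AddSlots: (| child = (|
    mother* = traits alpha.     "two parent slots, so an"
    father* = traits beta.      "unqualified lookup could be ambiguous"
    report  = ( father.report ) "directed resend: start lookup in father"
|) |)
```

An undirected "resend.report" is also legal, but only when the lookup can be resolved unambiguously.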
-- Jecel