A Question about Croquet's Philosophy...
David P. Reed
dpreed at reed.com
Tue Jan 7 15:10:05 UTC 2003
This is a useful discussion.
The main point about Croquet is that the default is interactive
sharing, rather than the personal computer default of no sharing.
Starting from that default, the goal is to discourage and prevent, if
possible, harmful interactions.
But it isn't possible to distinguish at a low level in the system
what is harmful and what is not.
In the context of a highly adaptive distributed system, linking cause
and effect is not a local computation. So there *will* be actions
that I can take that fall into two classes:
a) they will affect others in ways that no local test can
predict, yet which I can deliberately control;
b) they will affect others in ways that no local test can
predict, and of whose potential for harm I am not even aware.
The "system" (that is, Croquet) cannot provide such protection. So
"rules of the road" must be developed for users and for code added by
users. Code in the system can assist in deploying such rules and
using them to deter bad behavior or limit its effects.
The sensible rules of the road in a system that is about behavior,
not documents, and interaction, not storage, are yet to be discovered.
So for example, if my Avatar punches your Avatar, is that bad? Or
isn't it much worse if my Avatar misleads your Avatar?
These are contextual, social things.
At 03:26 PM 1/7/2003 +0100, Andreas Raab wrote:
>My feeling here is essentially that there have to be some rules of what
>people can and can not do with Croquet. The most important one (which we
>will have to enforce) is that you can "do what you want to yourself"
>(including crashing your machine if you like ;-) but not "to" others.
>This is what David was referring to with the name space architecture -
>in effect it means that I can ship any code I want to you but it will
>not (it MUST not) harm your machine or your work in any "direct" way.
>However, the name space architecture does not enforce the kind of
>"social protection" that you refer to. For example, it may not prevent
>me from making a script that spies on you by simply following you
>around, seeing where you go and what you like to do - this is exactly the
>kind of stuff that for example Sen. Fritz Hollings tries to protect with
>the so-called "online personal privacy act" - see for example
>In order to cope with these issues we need much more fine-grained
>control over the authority I wish to grant some other party (code or
>person) in Croquet - and that's the place where many of the arguments that are
>discussed at www.erights.org are highly relevant. So my essential
>feeling is that we have to have a two-level approach here. One that
>enforces "basic security" (which really means that it makes sure your
>system is not compromised) and one that is able to deal with (err...
>this sounds weird even to a non-native speaker but I don't know a better
>term...) "social security" - governing the ways in which other people
>directly or indirectly get access to your environment and to any kind
>of personal information.
>Some of this may (in fact, WILL) limit what you can do with the "always
>visible, always accessible" code/data on your machine, at least as far
>as remote representations of that code or data are concerned. There
>_will_ be objects that are never being shipped to a remote place (for
>example your credit card info) but only be provided for certain
>transactions (again, www.erights.org has lots of information in
>particular about the way financial transactions need to work in a
>secure manner).
> - Andreas
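The fine-grained authority Andreas describes is the core idea behind the capability patterns discussed at www.erights.org: instead of handing a visitor's script a direct reference to my avatar, I hand it an attenuated proxy that forwards only the methods I have explicitly granted. The sketch below is my own illustration of that pattern, not actual Croquet code; the `Avatar` class and its methods are hypothetical.

```python
# Minimal capability-style sketch (an illustration, not Croquet code).
# A remote script receives a Capability wrapping my avatar, so it can
# exercise only the authority I explicitly granted it.

class Avatar:
    def __init__(self, name):
        self.name = name
        self.location = "lobby"

    def wave(self):
        return f"{self.name} waves"

    def teleport(self, place):       # authority I may not want to share
        self.location = place
        return self.location

class Capability:
    """Attenuated reference: only granted method names pass through."""
    def __init__(self, target, granted):
        self._target = target
        self._granted = frozenset(granted)

    def __getattr__(self, name):
        # Called only for names not defined on Capability itself.
        if name not in self._granted:
            raise PermissionError(f"no authority for '{name}'")
        return getattr(self._target, name)

me = Avatar("Darius")
cap = Capability(me, {"wave"})       # a visitor's script gets `cap`, never `me`
print(cap.wave())                    # allowed
try:
    cap.teleport("void")             # denied: no such authority granted
except PermissionError as err:
    print(err)
```

The point of the pattern is that "basic security" and "social security" reduce to the same mechanism: what you can do is exactly what references you hold, and nothing more.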
>> -----Original Message-----
>> From: Darius Clarke [mailto:darius at inglang.com]
>> Sent: Tuesday, January 07, 2003 3:08 AM
>> To: andreas.raab at squeakland.org;
>> Kim.Rose at viewpointsresearch.org; alan.kay at squeakland.org;
>> dastrs at bellsouth.net; dpreed at reed.com
>> Subject: RE: A Question about Croquet's Philosophy...
>> Thank you for the rapid & accurate reply!
>> While the original Alan quote was regarding the OS wars & I
>> seemed to allude to
>> Open Source (in total) as being flawed, that debate really
>> doesn't worry me.
>> The argument by analogy seemed to confuse my question as well.
>> Perhaps I should have said more specifically in a "late
>> bound, collaborative
>> environment, where all symbols & code are visible and
>> accessible at run
>> time"... any mouse can roar - as well as squeak.
>> It's the social engineering that our youth are so adept at
>> (and have time for)
>> that worries me, not Croquet's technology.
>> By the way, I really want Croquet & Squeak to succeed and
>> fully meet your
>> aspirations for them, especially in self motivated learning
>> for our youth.