Proselytization (was RE: Programming with Tuples (was On Reuse) - still reuse)

Steve Wart swart at home.com
Tue Jul 6 13:44:27 UTC 1999


The 7 +/- 2 rule is equally true of anyone, whether they have been programming
for 20 years or for 20 weeks. It is *which* 7 chunks sit in the programmer's
forebrain (and which 7 chunks beneath each of those 7 chunks, and so on)
that make the difference.

It has been said before that the User Interface is the hardest technical
problem: harder than codec development or compiler design or anything else
that is close to being solved. This is why it is no big deal that the Squeak
UI is about 3 degrees askew from the MS "standard" UI: the windowing
interface that we all know and love (or love to hate) is two dimensions off
from what humans routinely cope with. The UI wars to date have been about
alignment in two-dimensional space of a four-dimensional problem.

People can learn to deal with anything you can think of. If you are a
programmer, you may often think "this is so complex, only the person who
wrote it [i.e. yourself] will ever be able to deal with it." Wrong. Code is
two-dimensional (linear in space and time), and people manage
four-dimensional reality from birth. People are really smart. It is
convenient to be cynical and say "users are stupid" or "the person who wrote
this code I have to maintain was stupid". It helps us to survive when we can
express confidence in our superiority over our environment. People say
"Microsoft sucks" (or Microsoft people say "Linux sucks" or "Smalltalk
sucks") as a way of coping with uncertainty.

So the question is, how do we get people to believe in X, to express
confidence in X? Understanding and learning are part of this process, but
there are many aspects to what must be understood and learned.

Look at some examples: SQL, COM, Visual Basic, Windows, CORBA. The list is
obviously endless; think of any standard (or brand) you want, and then ask
yourself who makes the most noise about the standard. Is it the people who
understand it? Nope. Is it the marketing people? Not really. It is the
average person in a taxicab, boardroom or coffee shop who knows diddly squat
about the subject, but expresses strong confidence that they are right about
something. It is actually quite a scary phenomenon, but everyone wants to
harness it, because it is the power of belief.

I was in a bookstore the other day (probably others of you have been as well
:-), although for me it was the first time in a while. I was totally
flabbergasted by the books I saw. Half a bookshelf's worth of Visual Studio
books in one shrink-wrapped package. Multiple volumes of Oracle training
books, each 6 inches thick. Who would read this stuff, let alone spend 200
bucks on it? People who believe that it will do them good. The Squeak
environment is only intimidating to someone who uses it only in his spare
time. If it will help you survive, you learn it as quickly as you can.

So how do we get people to understand Squeak? Easy: make it the Next Big
Thing. Disney could do it. There have already been tentative articles in
Wired and Business Week. ParcPlace almost did it with Smalltalk in 91, and
they learned about playing with fire. Look at Java: Sun and IBM have put
tons of cash behind it and may yet get burned. So a strategy is needed.

Maybe a co-marketing agreement with McDonalds so kids can get McSqueak VMs
with their Happy Meals? Okay, maybe too expensive. How about this:

1. Windows and the Mac suck because they are too inflexible
2. The web sucks because it hasn't got enough semantics

Therefore, there is a need for an environment that is super flexible and
semantically rich, but that will also let people build super flashy
presentations of information tightly integrated with sophisticated behavior.

The good thing is that most of the pieces are there already. The bad thing
is that the noise level is somewhere between almost unbearable and way
past unbearable. But this is the world we live in, it is not going to get
any easier, and the potential is awesome.

The first step is to identify the need. Then you set a goal to fulfill the
need. And then you build a plan to meet your goal. And then, if all goes
well, you throw the plan away and watch in distress as your ideas are taken
over by people who have no idea what you really meant in the first place.

But you have to believe. And if you believe strongly enough, others will
follow. And if you take responsibility for what you have done, then you will
truly stand out from the charlatans who have plagued this industry from its
inception.

Regards,

Steve

[Apologies in advance for the rant]


> -----Original Message-----
> From: Terry Raymond [mailto:traymond at ids.net]
> Sent: July 4, 1999 7:24 PM
> To: squeak at cs.uiuc.edu
> Subject: Re: Programming with Tuples (was On Reuse) - still reuse
>
>
> On Fri, 2 Jul 1999 23:13:16 -0700, Dan Ingalls wrote:
>
> >"Terry Raymond" <traymond at IDS.NET> wrote...
> >>On the system that I am working on, some people have highly factored
> >>code, which is easier to reuse.  But it is more difficult to understand;
> >>you have to look at a lot of trees to figure out what the forest looks
> >>like.
> >
> >Factoring is certainly good.  In an ideal system, no independent concept
> >or behavior would be implemented in more than one place.
> >
> >However, in grappling with the reality of end-user programming, it's not
> >clear that maximum factoring is the most important thing.  Consider our
> >collection hierarchy: reasonably well factored from a computer science
> >perspective, but totally overwhelming to a newbie.  Instead consider
> >mashing a lot of this together into a single kind of collection that can
> >do almost all of that stuff with clear protocols for indexing, set
> >operations, streaming, sorting, etc.  This is taking GENERALITY to be the
> >highest figure of merit.  Generality is important because it helps to
> >reduce the number of things you need to learn about, and the number of
> >things you need to keep in your head at one time.  Factoring is still
> >important, of course, and you should see it in the design of the various
> >protocols.
> >
> >I like the "7 +/- 2 rule" mentioned earlier in this thread -- that you can
> >only deal with about 7 independent "chunks" at once in your brain.  I
> >think it's even less when any of them has added complication that
> >consumes valuable attention.  That's why it's important for things to be
> >SIMPLE as well as general.
> >
> >In my ideal computing environment (and we're a long way from it yet),
> >almost anything that goes on could be diagrammed in a picture with not
> >more than 7 independent elements, or understood with not more than 7
> >pieces of code on the screen.  More concreteness helps, too, as you don't
> >have to use precious "chunks" of attention on what isn't being shown.
>
> Well, I guess my problem is related to the number of chunks.  Frequently,
> when I am trying to understand a complex operation that has been highly
> factored, I end up with a large number of methods I have to put together
> in order to understand the whole picture.  Even if the methods are well
> named to permit chunking, many times the problem is that some leaf method
> performs an operation that is depended upon by some other leaf method.
> This requires one to develop a relationship model among the methods.
> However, if all the small methods are defactored into one large method,
> the dependencies frequently become much more apparent.
>
> Maybe the problem is that we can factor only on one dimension and
> complex problems involve many dimensions.
>
>
> Terry Raymond
> Crafted Smalltalk
> (401) 846-6573    http://www.craftedsmalltalk.com
>
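
To make Terry's point about hidden dependencies concrete, here is a minimal
Smalltalk sketch. The Invoice class, its instance variables (lineItems,
discountRate, subtotal, total) and its selectors are all invented for
illustration; the sketch only shows the shape of the trade-off, not
anyone's real code.

"Factored version: two well-named leaf methods, but applyDiscount silently
 assumes that computeSubtotal has already run and set the subtotal ivar."
Invoice >> computeSubtotal
	subtotal := lineItems inject: 0 into: [:sum :each | sum + each price]

Invoice >> applyDiscount
	total := subtotal - (subtotal * discountRate)

Invoice >> total
	self computeSubtotal.
	self applyDiscount.
	^ total

"Defactored version: one larger method, but the flow from line items to
 subtotal to total is visible on a single screen, with no hidden ordering."
Invoice >> total
	| subtotal |
	subtotal := lineItems inject: 0 into: [:sum :each | sum + each price].
	^ subtotal - (subtotal * discountRate)

Both compute the same total; the question the thread raises is which
version costs fewer of those 7 +/- 2 chunks to read.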




