Some bizarre thoughts on the nature of programming...

Peter Smet peter.smet at flinders.edu.au
Tue Jun 22 22:23:12 UTC 1999


David,

The reason I think OO has not helped or progressed as much as it could has
to do with 'locality' - or 'coupling' as it is called by software engineers.
If you look at most biological systems (and this is what the Swarm program
was trying to demonstrate), each object ONLY knows its nearest neighbours.
To use an extreme example, if every neuron in the brain were connected to
every other neuron, the sheer volume of the connections would have caused it
to explode millennia ago. While every good programmer strives to
minimize coupling, the level of coupling in programs is surely orders of
magnitude greater than that seen in biological systems. The challenge is to
design programs where every object only knows its immediate neighbours.
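As a toy sketch of what strict locality might look like (in Python purely
for illustration - every name here is made up, not from any real system),
each object holds references only to its immediate neighbours and can talk
to nothing else:

```python
# Hypothetical sketch of strict locality: a Cell may only interact with
# the neighbours it is directly wired to, never with the system at large.

class Cell:
    def __init__(self, name):
        self.name = name
        self.neighbours = []   # the ONLY objects this cell may talk to
        self.energy = 0

    def connect(self, other):
        # Wire two cells together, symmetrically.
        self.neighbours.append(other)
        other.neighbours.append(self)

    def pulse(self, amount):
        # React locally, then pass a weakened signal to the neighbours.
        self.energy += amount
        for n in self.neighbours:
            n.receive(amount // 2)

    def receive(self, amount):
        self.energy += amount


# Wire three cells in a line: a - b - c.  b knows only a and c.
a, b, c = Cell("a"), Cell("b"), Cell("c")
a.connect(b)
b.connect(c)

b.pulse(4)
print(a.energy, b.energy, c.energy)   # -> 2 4 2
```

Nothing global coordinates the cells; any larger behaviour would have to
emerge from these neighbour-to-neighbour interactions alone.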

While I love all of Kent Beck's ideas with regard to refactoring, I worry
that every time you refactor, the coupling seems to increase. To put this into
an extreme form, in a perfectly factored program, method xxx will only be
specified in ONE place, in ONE object. Any object that wants to do anything
even vaguely similar to method xxx MUST call method xxx to do it. By
definition, this means it is dependent on the other object providing method
xxx. If it wanted to be independent, then it should implement method xxx
itself, but this would introduce unnecessary and evil code duplication. So a
well factored program will potentially show more coupling than a poorly
factored one. The other issue is that 'emergent properties' are by
definition difficult to anticipate and predict. How hard would it be to
design a termite nest (without ever having seen one) by specifying the
behaviour of individual termites? With hindsight, it is a simple task;
without it, near impossible.
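To make the dependency concrete, here is a toy Python sketch (all the class
names are invented for illustration): Invoice stays well factored by
delegating to the one and only Formatter, while Receipt stays independent
of Formatter only by duplicating the code:

```python
# Hypothetical illustration of the trade-off: once method "xxx" lives in
# exactly ONE place, every object that needs it is coupled to that place.

class Formatter:
    """The single home of the formatting method in a well-factored program."""
    @staticmethod
    def as_currency(cents):
        return f"${cents // 100}.{cents % 100:02d}"

class Invoice:
    # Well factored: no duplication, but now coupled to Formatter.
    def __init__(self, cents):
        self.cents = cents
    def total(self):
        return Formatter.as_currency(self.cents)

class Receipt:
    # "Independent": knows nothing about Formatter, at the price of
    # evil code duplication.
    def __init__(self, cents):
        self.cents = cents
    def total(self):
        return f"${self.cents // 100}.{self.cents % 100:02d}"

print(Invoice(1999).total())   # -> $19.99  (via the shared Formatter)
print(Receipt(1999).total())   # -> $19.99  (via its own duplicate copy)
```

Both answers are identical; the only thing refactoring changes is who
depends on whom.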

Whilst redundancy is seen as evil, perhaps it should be embraced. Locality
may be more important than lack of redundancy. Every real complex system is
full of redundancy - perhaps it is the only way it can be done. Biologists
were stunned when they first discovered how much DNA is never used (Junk
DNA). What is all this rubbish doing lying around in efficient,
well-designed organisms? If you want to modify anything (including a computer
program) it is always safer to modify a copy. Once you have a copy of a
vital DNA sequence, it can mutate as much as it likes. So finding these bits
of Junk DNA is somewhat analogous to finding bits of ancient DOS code in
Windows, or Smalltalk-72 in the Squeak image.
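In program terms, the "modify a copy" move is just this (a Python sketch,
purely illustrative - the data and names are made up):

```python
# It is always safer to mutate a duplicate: the "vital sequence" stays
# intact while the copy is free to mutate as much as it likes.
import copy

vital = {"sequence": ["A", "T", "G", "C"]}

experiment = copy.deepcopy(vital)   # duplicate first...
experiment["sequence"][2] = "X"     # ...then mutate freely

print(vital["sequence"])       # -> ['A', 'T', 'G', 'C']  (untouched)
print(experiment["sequence"])  # -> ['A', 'T', 'X', 'C']  (mutated copy)
```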


>
>Traditional hierarchical-control-centric centralized programming is
>"fundamentally wrong."
>
>Objects - as a philosophy, a decomposition method, a culture, a programming
>style and to a minimal degree as a programming language like Squeak -
>"helps a LOT in this regard."
>
>The reason you see it helping only a little - based on your examples - is
>that you are looking at the OO contribution too close to the machine - i.e.
>at the implementation language and programming structure level.
>


I really don't think so. If you look at the programming structure level, the
Classes and Objects in the image are heavily intertwined. That is, each
object assumes the existence of, and uses, many other objects. This may be a
result of good factoring, but it totally destroys locality (ideally, each
object would know only four or so neighbours).

>The real object difference arises at the analysis and decomposition
>levels -
>but only if the analyst is steeped in object philosophy and culture.
>
>Alan Kay's object is metaphorically a cell and like a biological cell
>perfectly capable of participating in what Maturana and Varela called
>"structural coupling" to evolve more complicated and complex (there is an
>important difference between complicated and complex) organisms
>(applications).


The difference is that a liver cell doesn't assume the existence of aStream
cell when it screams "self printOn: aStream". It only knows harry, jack, and
gladys, the three liver cells that live next door. Don't get me wrong, I
think OO is the best we have for dealing with complexity in computer
problems at the moment - I was just musing on how it could be extended to be
composed of objects orders of magnitude simpler than what we have now.

>
>The people at the Santa Fe Institute that are working on complex systems
>(your second example) have created a program called Swarm using Objective-C
>(I believe) to simulate such systems.  What they could (should) do is take
>the object metaphor more seriously as a foundation for their simulations -
>as should the intelligent agent, distributed agent, and genetic algorithm
>communities.  Then you would see some potentially (r)evolutionary systems
>emerge.


I think Rod Brooks and the people at MIT are on the right track. I really
hope so, because the stuff they are doing is incredibly exciting.

Peter




