idle speculation (was: Face down, nine-edge first)

Jan Bottorff janb at pmatrix.com
Mon May 15 10:20:19 UTC 2000


>That is my point - we still suffer from the early popularity of half
>baked ideas. Were we just unlucky or was that the inescapable path of
>computer evolution (and the sad rule of mainframe->mini->micro
>recapitulation)?

Many of those half-baked ideas were popular because they were low-cost
enough to reach critical mass.

If we assume that low cost will rule, the question is how to get low cost
and quality. High volume is one answer. The companies that do really high
volume tend to be the chip companies. Let's assume that technology evolves
a bit more, and companies like Intel eventually start putting EVERYTHING on
the die, with an RF or optical connection to the outside world. You buy
your Intel (or AMD or IBM) computing block and insert it into the block
holder on your optical router. Things like your plasma (or LCD or whatever)
wall display panel also plug into your optical router. I assume your
optical router will connect to the worldwide optical network, so you won't
even really need the compute block; you can just use some time on some
compute farm (though perhaps computing locally will still be cheaper than
the cost of communicating).

I think the basic functions of processing, storage, I/O, and communication
will still need to happen for a long time to come. PCs were an artifact of
processing and storage being far cheaper than communication. I think what
our systems look like will have a lot to do with the ratios of these costs.
I can't offhand think of ANY software that would be highly appropriate to
control such a future system, although something more Smalltalk-like than
Windows-like seems like the ticket. I can't offhand see Perl scripts
spitting out HTML as the foundation for this either.

I could easily imagine a system where software becomes free, although the
worldwide license to use it doesn't. ALL code would live redundantly on
"code servers", and your user credentials would authorize which variations
you get to use. Say I write a lot: I might want to pay for access to the
spiffy editing module, which gets activated ANYTIME I need to edit. There
might also be spiffy free versions of things, with no license required.
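
Just to make that concrete, here's a toy sketch of the idea (Python for
brevity; CodeServer, Credential, and the module names are all invented for
illustration, not anything that exists today). The server holds the code,
and your credentials decide which variation you get when you ask for a
module:

    # Toy sketch of the "code server" idea: modules live on the server,
    # and a user's credentials decide which variation they may activate.
    class Credential:
        def __init__(self, user, licensed_modules):
            self.user = user
            self.licensed_modules = set(licensed_modules)

    class CodeServer:
        def __init__(self):
            # module name -> {"free": source, "licensed": source}
            self.modules = {}

        def publish(self, name, tier, source):
            self.modules.setdefault(name, {})[tier] = source

        def activate(self, name, credential):
            versions = self.modules.get(name, {})
            # Licensed users get the spiffy version; everyone else falls
            # back to the free one, if it exists.
            if name in credential.licensed_modules and "licensed" in versions:
                return versions["licensed"]
            return versions.get("free")

    # Example: a writer pays for the spiffy editor, activated on demand.
    server = CodeServer()
    server.publish("editor", "free", "<basic editor code>")
    server.publish("editor", "licensed", "<spiffy editor code>")
    writer = Credential("jan", ["editor"])
    print(server.activate("editor", writer))  # spiffy editor
    guest = Credential("guest", [])
    print(server.activate("editor", guest))   # free editor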

I'd also like to suggest that this will probably run on some sort of
virtual machine, and that as processor performance increases, any
performance degradation from the virtual machine will become unimportant.

Perhaps we could get Ted Nelson (or his new generation clone) to write a
book (I still have an original edition of Dream Machines) about how great
computing's going to be.

I think we still see the remains of half-baked ideas because we're only in
the early years of computing.

To give a concrete example of how slow change is: most of our homes still
have a copper wire pair going all the way to the phone company switch. With
current technology, I just can't imagine how it's more efficient to run
20,000 pairs of wire to one spot than to have a hierarchical tree of router
things, with my home connected to some router box just up the street. You
would think these parts would become dirt cheap in the kinds of quantities
it would take to install them everywhere. Unfortunately, the companies that
control this infrastructure seem to have squandered the money needed to
update all this stuff. It could be they underpriced phone service for the
last 20 years and never made enough to pay for evolving things now. I think
the computer industry has EXACTLY the same problem: it has such a large
investment in PCs and associated software that it can't easily evolve to
something dramatically better. This was also EXACTLY the same problem in
the '70s and '80s, with a huge investment in "mainframe" infrastructure.
Change did happen, though.

I don't think the danger we're in right now is PC legacy infrastructure;
it's yet again making a huge investment in new half-baked technology sold
by the Internet BS artists. It seems like every 20 years a new group comes
along with the hot new stuff, and everybody jumps on the bandwagon, only to
find afterward that the hot new stuff has a lot of problems. It's possible
the masses will always have technology that's 20 years behind what's
possible. Ironically, about 20 years ago, Bucky Fuller gave a talk I saw
where he said this is exactly how technology has worked. He also suggested
that if we want to solve a lot of world problems, we might want to get the
technology deployment time down to less than 20 years.

My observation is that perhaps we should either accept that it's a cycle,
expect to throw away all our infrastructure every so often, and plan
accordingly, or else try to deploy technology that can cope with evolution
better and break the cycle.

- Jan




