idle speculation (was: Face down, nine-edge first)

Dean_Swan at Mitel.COM
Mon May 15 16:16:31 UTC 2000



From:  Dean Swan at MITEL on 05/15/2000 12:16 PM

Jan Bortoff <janb at pmatrix.com> wrote:

>To give a real concrete example of how slow change is: most of our homes
>still have a copper wire pair going all the way to the phone company
>switch. With current technology, I just can't imagine how it's more
>efficient to run 20,000 pairs of wire to one spot, than to have a
>hierarchical tree of router things, with my home connected to some router
>box just up the street.

     Well, given that I work for a fairly large telecomm equipment manufacturer,
I'll share my viewpoint on this topic (although it is quite tangential to Squeak
in general).

     20,000 pairs of wire to one spot may or may not be "efficient", depending
on your definition of "efficient".  The paramount design concern for PSTN
(public switched telephone network) equipment is *RELIABILITY*.  This can't be
overstressed.  Some examples of standards and legislation that reflect this:

     1) In the US and Canada (and I suspect other countries have
        similar legislation), Central Office switches are required
        to be able to operate for a full three days without any
        external power.  They're all equipped with rooms full of
        wet-cell batteries, and diesel or gasoline generators to
        meet this requirement.

        A relatively recent example of where this really mattered
        was the "Ice Storm of 1998", which devastated a large part
        of the North-Eastern US, and Southern Canada.  Where I live,
        we were without electricity for 8 days.  In some areas,
        people were without electricity for up to 6 weeks, in the
        middle of January, but the telephone network continued to
        operate normally throughout this period (barring cases where
        the "pairs-of-copper-wire" were severed by ice.

        This level of reliability is, in large part, facilitated by
        the simplicity of having mostly copper wire directly from
        the subscriber's premises back to the local CO, which for
        80% to 90% of all homes in North America is less than
        18,000 feet.

        Admittedly, a large number of new installs are serviced by
        digital loop carriers, which are fed by fiber optics to the
        CO, with copper wire only making up the last 1000 feet or so
        to subscribers, but the DLCs are all powered from the CO or
        by local UPS systems.

     2) Line equipment, such as pole-mounted transformers, is
        required to withstand a shotgun blast from 100 feet.  This
        is a Bellcore/Telcordia standard for outside plant equipment.

        When I first encountered this standard, I was more than a
        little surprised, but in retrospect, it makes sense.  This
        equipment could easily come in the path of a hunter's shot,
        or mischievous teenagers with pellet guns, so it is designed
        with this in mind.


>You would think these parts would become dirt cheap
>in the kinds of quantities it would take to install it everywhere.
>Unfortunately, the companies that control this infrastructure seem to have
>squandered the money needed to update all this stuff.  It could be they
>underpriced phone service for the last 20 years, and never made enough to
>evolve things now.

     There's a less obvious, and more compelling, reason why this is done.  PSTN
equipment is typically amortized on a 17-year schedule, and thus the equipment
is designed to have a useful lifespan of at least that long.  This, of course,
increases the cost of equipment, but is, in large part, a side-effect of the
Rural Electrification Act.  Many small, independent telephone companies are
funded under grants or loans created by this act, and it imposes a lot of
strict requirements designed to ensure "universal access" for telephone
service.

> I think the computer industry has EXACTLY the same
>problem, it has such a large investment in PC's and associated software, it
>can't easily evolve to something dramatically better. This was also EXACTLY
>the same problem in the 70's-80's, a huge investment in "mainframe"
>infrastructure. Change did happen though.

     Yes, and slowly.  This is nature's way, and it is unlikely that we can do
better in technology.  As it is, the timescales are infinitesimal for technology
compared to nature.  Nature runs so slowly that we have no recorded examples of
speciation occurring, in spite of strong evidence that new species have appeared.
The best we can do is continue to contribute to the advancement of our craft,
and let social/industrial Darwinism sort out what is "best".

     This has been the subject of a lot of research in the "Artificial Life"
community, and Danny Hillis (another Disney Fellow), along with many others, has
noted that there seems to be a "punctuated equilibrium" effect in the evolution
of complex systems.  While fitness levels of individuals (for example, different
technologies in the marketplace) in a population usually span a fairly wide
range, the average fitness of the population tends to remain stable for "long"
periods.  Every "once-in-a-while", the average fitness of the entire population
will jump significantly, then plateau again for a relatively long period.  This
phenomenon has been observed in many systems, both natural and man-made.
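
     (An illustrative aside: the effect is easy to reproduce in even a trivial
simulated population.  Below is a minimal sketch in Python; the "royal road"
style fitness function and all of the parameters are arbitrary choices made
purely for illustration, not anything specific from the A-Life literature.
Fitness only increases when a whole block of bits is discovered, so the
population's average fitness tends to sit on a plateau for many generations
and then jump when a new block appears and spreads.)

import random

# Toy genetic algorithm on a "royal road" style fitness function: fitness only
# increases when an entire 8-bit block is all ones, so average fitness tends to
# plateau until a new block is discovered and spreads through the population.
# All parameters are arbitrary, illustrative choices.

BLOCKS = 8                        # number of blocks in each genome
BLOCK_LEN = 8                     # bits per block
GENOME_LEN = BLOCKS * BLOCK_LEN
POP_SIZE = 200
MUTATION_RATE = 1.0 / GENOME_LEN
GENERATIONS = 400

def fitness(genome):
    """One point per block that is entirely ones."""
    return sum(
        all(genome[b * BLOCK_LEN + j] for j in range(BLOCK_LEN))
        for b in range(BLOCKS)
    )

def mutate(genome):
    """Flip each bit independently with probability MUTATION_RATE."""
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

def tournament(population):
    """Return the fitter of two randomly chosen individuals."""
    a, b = random.choice(population), random.choice(population)
    return a if fitness(a) >= fitness(b) else b

random.seed(1)
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    population = [mutate(tournament(population)) for _ in range(POP_SIZE)]
    if gen % 20 == 0:
        avg = sum(fitness(g) for g in population) / POP_SIZE
        print("generation %3d   average fitness %.2f" % (gen, avg))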

>I don't think the danger we're in right now is PC legacy infrastructure,
>it's yet again making a huge investment in new half-baked technology sold
>by the Internet BS artists. It seems like every 20 years, a new group comes
>along with the hot new stuff, and everybody jumps on the bandwagon, only to
>find after all the hot new stuff has a lot of problems.

     Sure, the new stuff has a lot of problems, and this will always be true,
but as long as it is marginally "better", it will supplant current technology.
Take the automobile - residents of Los Angeles can attest to some of the
"problems" generated by automobile exhaust, but the automobile solved a *HUGE*
public health problem by eliminating the problem of horse manure in the streets
of our cities.  This has also (speculatively) made a large contribution to
reducing the spread of disease and increasing the average lifespan of people.
New technology always presents us the challange of making tradeoffs and choosing
the "lesser evil".

> It's possible, the
>masses will always have technology that's 20 years behind what's possible.
>Ironically, about 20 years ago, Bucky Fuller gave a talk I saw where he
>said this is exactly how technology has worked. He also suggested that if
>we want to solve a lot of world problems, we might want to get the
>technology deployment time down to less than 20 years.

     Shortening the "deployment" cycle for new technology would certainly solve
many problems, but it would just as surely generate entirely new problems.  I do,
however, think that the deployment cycle is shortening in many areas.  In the
electronics industry in general, and the computer industry in particular, design
cycles for new products have shrunk to as short as 3 months, which is equal to or
shorter than the supply chain lead time to manufacture new products.  12 weeks is
a typical lead time for large semiconductor orders, and currently the lead time
for things like tantalum capacitors, a common part costing only pennies, is well
over a year!

     The telecomm industry's design cycles tend to run around 18 months, and
the new products tend to persist in the market for years.  When I started at
Mitel 9 years ago, my boss told me that the product line our division produces
was expected to last only a couple more years.  Nine years and over 1,000,000
units later, that boss has retired, we're still cranking out call controllers
for alternate carriers, and our market is expanding due to the development of
Competitive Local Exchange Carriers, Internet Service Providers, and the
deregulation of long distance service around the world.  The technology inside
the boxes is evolving, but the basic need for access devices to get traffic
onto carriers' networks is still there.

     The balance of the deployment time is set by the rate at which society can
absorb and apply the new technology, which is closely related to the velocity of
money, general education level, and many other social factors.  Another telecomm
example: ISDN is only available to about 60% of telephone customers in North
America, and touch-tone dialing is still not available in some areas.  This is
largely due to economics.  There has to either be a customer base that can
absorb the costs of upgrading the infrastructure, or everyone else has to be
willing to subsidize the smaller markets.

     Many changes take generations simply because the older generation of people
sees no need to adopt newer technologies.  (Personal example: my 83-year-old
grandmother still doesn't have cable TV, and my 57-year-old mother sees no need
to get e-mail, while my sister and I wouldn't want to do without them, and my
sister's children have never known life without them.)


>My observation is perhaps we should either accept that it's a cycle, and
>expect to throw away all our infrastructure every so often, and plan
>accordingly, or else try to deploy technology that can cope with evolution
>better, and try to break the cycle.

     Actually, I think what really happens is that we are constantly throwing
away the oldest bits of our infrastructure and adding newer technology, so the
periodicity is only an artifact of the gradual turnover.


     Well, enough exposition of my POV.  Comments?  Rebuttals?


                                   -Dean Swan
                                   dean_swan at mitel.com




P.S.  Of course I'm speaking for myself here, and not on the behalf of my
employer.








