Millisecond clock resolution for DateAndTime now

Richard A. O'Keefe ok at cs.otago.ac.nz
Mon Sep 27 06:30:51 UTC 2004


Tim Rowledge <tim at sumeru.stanford.edu> wrote:
	Now here's a thought - do all OSs (any) successfully deal with those 22
	or so leap seconds? Are they algorithmic or declared by fiat? 
	
Leap seconds are not algorithmic.  They are based on the actual rotation
of the earth, which is not predictable to that level of precision.  So
they are declared by fiat (by the IERS, with about six months' notice),
but are not totally arbitrary.

This means that accurate-to-within-one-second timestamps in the future
are IMPOSSIBLE to calculate.  I mean that
    "CalendarWithLeapSeconds now addSeconds: (183*24*60*60)"
is IMPOSSIBLE to determine; there might be a leap second within the
next six months, and then again, there might not be.  What's more,
even if the rotation of the earth would justify one, there might be a
revolt in timekeeping standards: the relevant body might no longer
exist, or might decide not to decree one.  It's impossible to predict.
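
By way of contrast, a leap-second-blind clock has no trouble with that
sum.  A minimal sketch, assuming the Chronology DateAndTime and
Duration classes in a recent Squeak image:

    "Perfectly computable, because DateAndTime counts calendar seconds
     and knows nothing of leap seconds.  Whether the resulting instant
     really lies 183*24*60*60 SI seconds from now cannot be known today."
    DateAndTime now + (Duration days: 183 hours: 0 minutes: 0 seconds: 0)

The answer names a calendar instant about six months from now; how far
away that instant is in real (SI) seconds depends on a decision that
has not been made yet.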

The POSIX standard deals with the issue by explicitly refusing to
acknowledge leap seconds: every POSIX day is exactly 86400 seconds
long.  In effect, if a six-month period contains a leap second (and it
will contain at most one), the POSIX calendar for that period contains
one second that is twice as long as the others.
(Which makes "millisecond accuracy" for that period rather, um,
interesting to contemplate.)
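
Squeak's DateAndTime does, as far as I can tell, the same
leap-second-blind reckoning, so an interval that straddles a real leap
second comes out one second short.  A minimal sketch, assuming the
ANSI-style factory method year:month:day:hour:minute:second:offset:
and using the leap second inserted at the end of 1998:

    | before after |
    before := DateAndTime year: 1998 month: 12 day: 31
                 hour: 23 minute: 59 second: 59 offset: Duration zero.
    after := DateAndTime year: 1999 month: 1 day: 1
                 hour: 0 minute: 0 second: 0 offset: Duration zero.
    after - before
        "=> a Duration of one second, although two SI seconds elapsed
         on the real UTC timeline, because 23:59:60 existed in between."

Which is, I think, exactly the "one second twice as long" effect
described above.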

Note also that leap seconds have been around for less than 50 years
(since 1972), and when a committee of experts was asked whether they
would still be in use 50 years from now, half said yes and half said no.

Sooner or later someone is going to propose using self-levitating
electric cables in some way to make the Earth a better clock...
