Use for TimeZones (was: Time bug under Windows)

Mike Klein mike at
Tue Dec 23 00:53:57 UTC 1997

> From martin at Mon Dec 22 09:15:11 1997

> It seems like a time
> without a frame of reference would be a time *interval*, probably stored as
> a number of seconds, with accessors for seconds, hours, and days. These
> objects wouldn't translate to dates, as the concept of 'Date' implies a
> specific calendar, which in turn defines a frame of reference.

This can be problematic.  It is best to store intervals as "broken-down"
time intervals -- i.e. years, months, days, hours, minutes, and seconds.

For instance, the time interval of one month is not the same as a constant
number of days.  For the *really* annoying problems, keep reading.

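To make that concrete (sketched in Python rather than Smalltalk, purely
because it is easy to run; the dates are arbitrary examples):

```python
from datetime import date

# "One month later" is not a fixed number of days: the length of the
# interval depends on where in the calendar it starts.
jan_to_feb = (date(1998, 2, 1) - date(1998, 1, 1)).days  # 31 days
feb_to_mar = (date(1998, 3, 1) - date(1998, 2, 1)).days  # 28 days

print(jan_to_feb, feb_to_mar)  # 31 28
```

So an interval of "one month" cannot be collapsed into a constant number
of seconds without losing its meaning.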
> >    - UniversalTime - a TimeEvent with a universal reference. GMT for
> >all intents and purposes. (Does anyone know offhand what the differences
> >are between GMT and UTC?)
> GMT and UTC are, as far as I know, the same thing for most purposes (there
> may be some technical differences in definition, which I'd be interested to
> know). The primary difference is terminology; the GMT label hasn't been
> officially used for a number of years.

GMT, technically, is a time zone that is no longer in use.  It was last
used in England, many decades ago.  GMT is often used as a synonym
for UTC, and to within about one second, it is.  UTC, however, has the
annoying detail of the leap second... a second inserted into the calendar
about once every year or two.  This means that in UTC not all minutes
are 60 seconds long.  Every now and then, one has 61 seconds.
The situation is not too different from the one mentioned above with
months.  The problem is that nobody expects months to be the same length,
but almost everybody expects minutes to be, including most of the OSes
that will be hosting Squeak.
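A quick Python illustration of that mismatch.  The leap second below is
a real one (inserted at 1997-06-30 23:59:60 UTC), but the arithmetic is
the ordinary 60-seconds-per-minute model that the host OS presents:

```python
from datetime import datetime, timezone

# A leap second was inserted at 1997-06-30 23:59:60 UTC, so the true
# elapsed UTC time between these two instants is 2 seconds.  Naive
# arithmetic reports 1, because the OS clock model has no 61st second.
before = datetime(1997, 6, 30, 23, 59, 59, tzinfo=timezone.utc)
after = datetime(1997, 7, 1, 0, 0, 0, tzinfo=timezone.utc)

print((after - before).total_seconds())  # 1.0, not 2.0
```

Any timekeeping design that promises second-exact intervals across leap
seconds has to carry its own leap-second table; the host clock won't help.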

For details, see:

Tim Rowledge wrote:

> Take care for what you ask, Bill. If you have it available, take a
> look at the code required in VisualWorks to handle this stuff.

Beware: VisualWorks' TimeZone class is a very rough implementation,
and is incorrect for the US for dates before 1987, the last time
the US changed its DST rules.

Having done quite a bit with Timestamps in Smalltalk, let me assure you:
Timestamps should be based on UTC/GMT (the difference is irrelevant for
most applications).  Using local time for the internal state of said
stamps will cause nothing but headaches.  And if you ever have to write
code that deals with more than one time zone, you will be in *big* trouble.
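A sketch of the UTC-internal approach (Python again; the fixed offsets
and zone names below are illustrative stand-ins, not real zone rules):

```python
from datetime import datetime, timezone, timedelta

# Store the instant once, in UTC; derive local renderings on demand.
stamp = datetime(1997, 12, 23, 0, 53, 57, tzinfo=timezone.utc)

# Fixed illustrative offsets -- a real system would consult zone rules.
pacific = timezone(timedelta(hours=-8), "PST")
eastern = timezone(timedelta(hours=-5), "EST")

# Different wall-clock renderings of the *same* instant still compare
# equal, because comparison is done on the underlying UTC instant.
assert stamp.astimezone(pacific) == stamp.astimezone(eastern)
print(stamp.astimezone(pacific))  # 1997-12-22 16:53:57-08:00
```

With local time as the internal state, that equality check (and any
cross-zone arithmetic) would need per-comparison offset juggling.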

Having the OS/VM/computer know about the "local" time zone often causes
people to write code that assumes this is the only relevant time zone.
Next thing you know, this "default" time zone becomes a global variable
riddled through your code.  This is bad.

The best approach I've found for Timestamp classes that wish to deal with
UTC and leap seconds (as an X3J20-conformant application must) is to
base the stamp internally on a 'minutes' instance variable, and use
(possibly fractional) seconds only when necessary.  Usually, for most
applications, minutes resolution is all that is needed, and you don't
need to worry about the difference between UTC and GMT.  Forcing the
application developer to explicitly think about the seconds either makes
them aware of the problems, or prevents them from doing stupid things like
using equality of timestamps to determine if something has changed.  (Yes,
in these days of fast computers, the response to a change can actually
happen in the same second as the change was made.)
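A hypothetical sketch of such a minutes-based stamp (Python; the class
name, epoch, and accessors are all invented for illustration):

```python
# The stamp stores whole minutes since an epoch; seconds are a separate,
# optional (possibly fractional) field the developer must ask for
# explicitly.  Epoch choice is arbitrary here.
class Timestamp:
    def __init__(self, minutes, seconds=None):
        self.minutes = minutes   # integer minutes since the epoch
        self.seconds = seconds   # None unless sub-minute detail matters

    def same_minute_as(self, other):
        # Coarse comparison: the safe default for "has this changed?"
        return self.minutes == other.minutes

    def total_seconds(self):
        # Only here does the UTC/GMT leap-second question arise at all.
        return self.minutes * 60 + (self.seconds or 0)

a = Timestamp(1_000_000)
b = Timestamp(1_000_000, seconds=30.5)
print(a.same_minute_as(b), b.total_seconds())  # True 60000030.5
```

The point of the design is that sub-minute precision is opt-in, so the
UTC-vs-GMT question only surfaces for code that genuinely needs seconds.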

-- Mike Klein
