<Chris Muller> The real problem is that the millisecond clock rolls over back to 0, isn't it? That, and the fact that there's no way to detect when it does. </Chris Muller>
The problem is the lack of an epoch to use as a well-known, absolute origin point on the timeline. Clock rollover ("non-monotonicity") is inimical to having an absolute epoch. Best practice is to use some well-known moment in Universal Time--although using Universal Time for the system clock also requires the ability to convert between Universal Time and local time. Chronos, Windows, Symbian, REXX, and Rata Die all use midnight of 0001-01-01 (1 January 0001), which is the natural epoch of the Gregorian calendar.
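To make the epoch arithmetic concrete, here is a minimal sketch in Python (the discussion is about Smalltalk primitives, but the arithmetic is language-independent). The function name and the choice of microsecond ticks are illustrative assumptions, not anything proposed above:

  from datetime import datetime, timezone

  # Seconds from the Gregorian epoch (0001-01-01T00:00 UT) to the Unix
  # epoch (1970-01-01T00:00 UT). toordinal() returns the Rata Die day
  # number (1 for 0001-01-01), so the gap is 719163 - 1 = 719162 days.
  EPOCH_OFFSET_SECONDS = (datetime(1970, 1, 1).toordinal() - 1) * 86400

  def microseconds_since_year_1(dt: datetime) -> int:
      """Ticks (here: microseconds) since 0001-01-01T00:00 UT."""
      unix_micros = int(dt.astimezone(timezone.utc).timestamp()) * 1_000_000
      return EPOCH_OFFSET_SECONDS * 1_000_000 + unix_micros

  # The Unix epoch itself lands at 62135596800000000 microseconds:
  print(microseconds_since_year_1(datetime(1970, 1, 1, tzinfo=timezone.utc)))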
Given a system clock primitive that reports ticks since a well-known epoch, everything else can be added as needed from dynamically loadable packages. The resolution of the system clock (the duration of one tick, which is also the minimum representable difference between two timepoints as reported by the clock**) can either be a well-known value, such as one microsecond, or be obtainable from yet another primitive that reports it.
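As a sketch of what that pair of primitives might look like (again in Python, wrapping time.time_ns(); the names clock_ticks and clock_resolution_ns are hypothetical, not an existing API):

  import time

  # Offset from 0001-01-01T00:00 UT to the Unix epoch, in nanoseconds
  # (62,135,596,800 seconds; see the computation above).
  GREGORIAN_TO_UNIX_NS = 62_135_596_800 * 1_000_000_000

  def clock_ticks() -> int:
      """The 'system clock' primitive: ticks since 0001-01-01T00:00 UT."""
      return GREGORIAN_TO_UNIX_NS + time.time_ns()

  def clock_resolution_ns() -> int:
      """The companion primitive: the duration of one tick, in nanoseconds."""
      return 1  # time.time_ns() counts in nanosecond units

  # Date, Time, DateAndTime, time zones, etc. can all be built in
  # loadable packages on top of these two primitives.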
--Alan
** The precision of a clock is the minimum difference between two timepoints that the clock is competent to measure (a functional characteristic of the clock hardware). The resolution of a clock is the minimum representable difference between two timepoints that might be reported by the clock, which is determined by the semantics of the notation used to represent timepoints. The former is determined by physics, the latter by math/syntax.
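The distinction is easy to observe empirically. This illustrative sketch (observed_precision_ns is not a standard API) uses a clock whose notation has nanosecond resolution, then measures the smallest step the clock actually takes, which is typically far coarser than one nanosecond:

  import time

  def observed_precision_ns(samples: int = 100_000) -> int:
      """Estimate precision: the smallest nonzero step the clock actually takes."""
      smallest = None
      prev = time.time_ns()
      for _ in range(samples):
          now = time.time_ns()
          delta = now - prev
          if delta > 0 and (smallest is None or delta < smallest):
              smallest = delta
          prev = now
      return smallest

  # time.time_ns() has a resolution of 1 ns (fixed by its notation), but
  # the measured precision is usually tens of nanoseconds or worse.
  print("resolution: 1 ns, observed precision:", observed_precision_ns(), "ns")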