[Vm-dev] Re: How does Time>millisecondClockValue get a resolution of 1 millisecond?

Eliot Miranda eliot.miranda at gmail.com
Thu Aug 9 00:31:47 UTC 2012

On Wed, Aug 8, 2012 at 3:22 PM, Louis LaBrunda <Lou at keystone-software.com> wrote:

> Hi Eliot,
> >Oh, and importantly, the VM ups the resolution of timeGetTime() via
> timeBeginPeriod to 1 millisecond if possible.
> Thanks for the replies.  One more question, what is used for Linux or UNIX
> systems?  I would like to recommend to Instantiations (VA Smalltalk) that
> they change to functions that give a finer resolution than GetTickCount
> (which is what I think they or IBM use).

gettimeofday.  On Linux this doesn't necessarily have great resolution.  On
Mac OS it has better than 1 ms resolution.

> The current VA Smalltalk code in this area asks for a timer interrupt every
> 100 milliseconds.  It then checks delays and callbacks that have been posted
> to see if any need to expire.  So, you can't really do a delay for less
> than 100 milliseconds, even though there are places in the base code that
> set delays at less than 100 milliseconds.
> I have a few programs where this is a problem.  There is a method where I
> can drop the interrupt period and I have used it to set the interrupt
> period to 10 milliseconds and that helps my programs greatly.  But it
> really only drops the resolution to 15 milliseconds.
> I would like to point Instantiations to the functions that will give a 1
> millisecond resolution on all the systems they support.
> Many thanks.
> Lou
> -----------------------------------------------------------
> Louis LaBrunda
> Keystone Software Corp.
> SkypeMe callto://PhotonDemon
> mailto:Lou at Keystone-Software.com http://www.Keystone-Software.com
