[Vm-dev] Re: How does Time>millisecondClockValue get a resolution of 1 millisecond?

Louis LaBrunda Lou at Keystone-Software.com
Wed Aug 8 22:22:29 UTC 2012


Hi Eliot,

>Oh, and importantly, the VM ups the resolution of timeGetTime() via timeBeginPeriod to 1 millisecond if possible. 
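
If I understand that correctly, it amounts to something like this little
sketch (my own toy, not the VM's actual code; link with winmm.lib):

  #include <windows.h>
  #include <mmsystem.h>  /* timeBeginPeriod/timeGetTime */
  #include <stdio.h>

  int main(void)
  {
      /* Ask the OS for 1 ms timer granularity, as the VM does. */
      if (timeBeginPeriod(1) == TIMERR_NOERROR) {
          DWORD t0 = timeGetTime();
          Sleep(5);  /* should now wake close to 5 ms later */
          printf("elapsed: %lu ms\n", (unsigned long)(timeGetTime() - t0));
          timeEndPeriod(1);  /* always pair with timeBeginPeriod */
      }
      return 0;
  }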

Thanks for the replies.  One more question: what is used on Linux or UNIX
systems?  I would like to recommend to Instantiations (VA Smalltalk) that
they switch to functions with a finer resolution than GetTickCount (which
is what I think they or IBM use).
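
From what I can tell, on Linux/UNIX the usual building block is the POSIX
call clock_gettime(), which reports nanoseconds; a millisecond clock would
then look roughly like this (my guess at the shape, not the Squeak VM
source; older glibc needs -lrt):

  #include <stdio.h>
  #include <time.h>  /* clock_gettime */

  /* Monotonic milliseconds, immune to wall-clock adjustments. */
  static long long millisecondClock(void)
  {
      struct timespec ts;
      clock_gettime(CLOCK_MONOTONIC, &ts);
      return (long long)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
  }

  int main(void)
  {
      printf("%lld ms since an arbitrary epoch\n", millisecondClock());
      return 0;
  }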

The current VA Smalltalk code in this area asks for a timer interrupt
every 100 milliseconds.  It then checks the delays and callbacks that have
been posted to see if any need to expire.  So you can't really get a delay
of less than 100 milliseconds, even though there are places in the base
code that set delays of less than 100 milliseconds.
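
To make the granularity problem concrete, here is a toy model of that
scheme (my simplification; the tick period and delay values are made up,
this is not the actual VA Smalltalk code):

  #include <stdio.h>

  #define TICK_MS 100  /* the interrupt period the VM asks for */

  int main(void)
  {
      long now = 0;              /* simulated clock, in ms */
      long delayExpiresAt = 30;  /* a delay shorter than one tick */

      /* Delays are only checked when the tick fires, so a 30 ms
         delay cannot expire before the 100 ms mark. */
      while (1) {
          now += TICK_MS;
          if (now >= delayExpiresAt) {
              printf("30 ms delay actually expired at %ld ms\n", now);
              break;
          }
      }
      return 0;
  }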

I have a few programs where this is a problem.  There is a method that
lets me shorten the interrupt period, and I have used it to set the period
to 10 milliseconds, which helps my programs greatly.  But it really only
drops the resolution to about 15 milliseconds, which I gather is the
default Windows timer granularity (roughly 15.6 ms) that timeBeginPeriod
is meant to lower.

I would like to point Instantiations to functions that will give
1 millisecond resolution on all the systems they support.

Many thanks.

Lou
-----------------------------------------------------------
Louis LaBrunda
Keystone Software Corp.
SkypeMe callto://PhotonDemon
mailto:Lou at Keystone-Software.com http://www.Keystone-Software.com


