[squeak-dev] Re: Delay time question

Andreas Raab andreas.raab at gmx.de
Thu Feb 28 07:38:02 UTC 2008


Robert F. Scheer wrote:
> It's clear that the robot program architecture must change drastically
> as a result.  Currently, there are perhaps 10 concurrent processes
> looping 50 times per second servicing various serial lines carrying
> sensor information or actuating motors and so forth.  Although real time
> processing isn't required, it is mandatory that all processes and the
> main decision loop complete every 20ms, no matter what.

I am not sure why your architecture has to change dramatically as a 
result. If your main problem is to make sure that a loop is done every 
20msecs, and you need to adjust for statistical delay variations, you 
can simply compute the actual time the wait took and do something like:

  | nextDesiredTick waitTime |
  nextDesiredTick := Time millisecondClockValue + 20.
  [true] whileTrue:[
    self doControl.
    "note: for a very, very long doControl the waitTime could be negative.
     you have to decide whether it's better to skip or to run an extra
     doControl in this case; the guard below simply skips the wait."
    waitTime := nextDesiredTick - Time millisecondClockValue.
    waitTime > 0 ifTrue:[(Delay forMilliseconds: waitTime) wait].
    "next desired tick is twenty msecs after the last one"
    nextDesiredTick := nextDesiredTick + 20.
  ].

Not only will this adjust for statistical variations in the delay, it 
will also take into account the amount of time spent in doControl, 
including any time spent in other processes due to preemption. In other 
words, the above is as close to 20msecs as it possibly gets.
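
If you want the loop to hold its schedule even while other processes 
are busy, you can run it in its own process above the normal user 
priority. A minimal sketch, assuming a hypothetical #runControlLoop 
method that contains the loop above (forkAt: and the Processor 
priorities are standard Squeak):

  "run the control loop in a separate, high-priority process so that
   ordinary user-priority work cannot delay its 20msec ticks"
  controlProcess := [self runControlLoop]
    forkAt: Processor userInterruptPriority.

  "later, to shut the loop down cleanly"
  controlProcess terminate.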

Also, keep in mind that no amount of C coding will increase the accuracy 
of your OS's default timer. But if you happen to have a high-quality 
timer, it would be quite trivial to link that timer to the 
TimerSemaphore and use it instead of whatever the VM gives you.
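
To see what your particular VM/OS combination actually delivers, a 
quick throwaway snippet like the one below measures the tick-to-tick 
jitter of a 20msec delay (it uses only Delay, Time, OrderedCollection 
and Transcript, all standard). If the clock granularity is coarse, say 
10 or 15msecs on some OSes, you will see it immediately:

  | samples last now |
  samples := OrderedCollection new.
  last := Time millisecondClockValue.
  100 timesRepeat:[
    (Delay forMilliseconds: 20) wait.
    now := Time millisecondClockValue.
    samples add: now - last.
    last := now].
  Transcript show: 'min ', samples min printString,
    ' / max ', samples max printString, ' msecs'; cr.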

Cheers,
   - Andreas


