Hi Yoshiki,<br><br><div class="gmail_quote">On Thu, Jan 6, 2011 at 2:58 PM, Yoshiki Ohshima <span dir="ltr"><<a href="mailto:yoshiki@vpri.org">yoshiki@vpri.org</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Hello,<br>
<br>
I'm looking at primMicrosecondClock on Windows with the 4.1.1 VM and Cog<br>
2341, just to profile parts of my code (so when the epoch is, etc.,<br>
does not matter here).<br>
<br>
I use a code snippet like this:<br>
<br>
| t |<br>
t := Time primMicrosecondClock.<br>
(Delay forSeconds: 1) wait.<br>
Time primMicrosecondClock - t<br>
<br>
and I get "1001" or "1000" on 4.1.1 and "0" on Cog... Where do things<br>
stand?<br></blockquote><div><br></div><div>Cog supports a 64-bit microsecond clock with an epoch of 1st Jan 1901. Primitive 240 is utcMicrosecondsFrom1901; primitive 241 is localMicrosecondsFrom1901. Cog doesn't support the standard VM's 32-bit microsecond clock, because a clock that wraps in at most 11 hours makes no sense to me, whereas a single clock that serves both for high-resolution timing and won't wrap for a few thousand years does. Hopefully we'll harmonise on the 64-bit clock.</div>
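<div><br></div><div>For example, a sketch of the equivalent measurement on Cog, assuming an image-side accessor for primitive 240 (the selector here, Time class>>utcMicrosecondClock, is from trunk and may differ in older images):<br><br>| t |<br>t := Time utcMicrosecondClock.<br>(Delay forSeconds: 1) wait.<br>Time utcMicrosecondClock - t "answers roughly 1000000, i.e. one second in microseconds"<br><br>Because the 64-bit value is a positive integer that won't wrap, the subtraction needs no wrap-around correction.</div>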
<div><br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<br>
Searching the email archive, it appears that it "works" on the Unix<br>
VM. Does it work reliably? How about the Mac?<br>
<font color="#888888"><br>
-- Yoshiki<br>
<br>
</font></blockquote></div><br>