Ah OK, I wasn't paying enough attention when I read Eliot's post :)
Eliot, this is not a bug, and normal code would statistically create other objects in between... But isn't there a risk that tools like Fuel populate a small collection of simple objects (say, an IdentitySet of Symbols), with each pair of consecutively allocated objects sharing the same identityHash? That would noticeably increase the number of collisions well before the 4096 wall.
Nicolas
2012/7/27 Bert Freudenberg bert@freudenbergs.de:
On 27.07.2012, at 05:18, Nicolas Cellier wrote:
2012/7/27 Frank Shearar frank.shearar@gmail.com:
On 26 July 2012 22:35, Eliot Miranda eliot.miranda@gmail.com wrote:
On Thu, Jul 26, 2012 at 3:59 AM, Levente Uzonyi leves@elte.hu wrote:
On Thu, 26 Jul 2012, Frank Shearar wrote:
In a freshly updated trunk we have 3184 out of 3157 tests passing. We have 16 expected failures, and 11 failures, the latter being:
- BecomeTest>>#testBecomeIdentityHash
This is failing due to a VM bug. There's a fix for it somewhere, but it seems like it's not integrated into Cog yet. Explore this to see that two consecutive objects share the same identityHash:
Array new: 10 streamContents: [ :stream |
    1 to: 10 do: [ :e |
        stream nextPut: Object new identityHash ] ]
IMO this isn't a bug. The identity hash changes at least every other object. Hashes don't have to be unique, but they do have to be well-distributed. With 12 bits of identityHash, Cog does fine basing its identityHash on the allocation pointer. The above will wrap around after 8192 allocations and provide 4096 distinct hashes (the maximum available). So the test needs rewriting to be more statistical.
The rationale for this is to speed up allocation: instead of a read-modify-write cycle to turn the crank of a pseudo-random generator, there's a masking of the allocation pointer, which has to be read anyway to allocate an object.
BTW, the *right* way to implement this is to lazily allocate hashes, but for that there needs to be a flag (e.g. an identityHash of 0) to mark an object as not yet having a hash. Existing Squeak images (because of the old definition) use 0 as a valid hash, so lazy hashes require either a header bit (not enough of those) or an image change (which is my plan, as part of the new object representation).
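To make the pairing concrete, here is a rough Python sketch of the allocation-pointer scheme Eliot describes. The constants (8-byte minimal objects, a shift of 4) are illustrative assumptions, not Cog's actual values; the point is only that masking the allocation pointer yields pairs of small objects with equal hashes, 4096 distinct values, and a wrap after 8192 allocations:

```python
OBJECT_BYTES = 8   # assumed minimal object size (illustrative, not Cog's)
HASH_BITS = 12     # Cog uses a 12-bit identityHash
SHIFT = 4          # assumed shift: the hash changes every other small object

def identity_hash(alloc_ptr):
    # No pseudo-random generator: just mask bits of the allocation
    # pointer, a value the allocator already has in hand.
    return (alloc_ptr >> SHIFT) & ((1 << HASH_BITS) - 1)

ptr = 0
hashes = []
for _ in range(10):
    hashes.append(identity_hash(ptr))
    ptr += OBJECT_BYTES

# Consecutive small objects come in pairs sharing one hash...
assert hashes[0] == hashes[1] and hashes[2] == hashes[3]
# ...the sequence wraps after 8192 allocations...
assert identity_hash(0) == identity_hash(8192 * OBJECT_BYTES)
# ...and exactly 4096 distinct hashes are produced along the way.
assert len({identity_hash(i * OBJECT_BYTES) for i in range(8192)}) == 4096
```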
If it's not a bug, let's nuke the test. We need to get to a position where we have a green light.
frank
If I understood Eliot correctly, it would suffice to keep a pointer alive to the created objects...
preserveObjectsFromGarbageCollection := IdentitySet new.
Array new: 10 streamContents: [ :stream |
    1 to: 10 do: [ :e |
        stream nextPut: (preserveObjectsFromGarbageCollection add: Object new) identityHash ] ]
Nicolas
Makes no difference. GC does not happen after each allocation. Here's one that would work with Cog because each allocation is larger:
(1 to: 10) collect: [ :e | (Array new: 4) identityHash ]
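Under the same illustrative model as above (hash = masked allocation pointer; all sizes are assumptions, not the VM's actual ones), the reason this works is visible: a larger allocation advances the pointer far enough per object that consecutive masked values no longer coincide:

```python
HASH_BITS = 12
SHIFT = 4

def identity_hash(alloc_ptr):
    # Same illustrative scheme: mask bits of the allocation pointer.
    return (alloc_ptr >> SHIFT) & ((1 << HASH_BITS) - 1)

SMALL = 8    # assumed bytes for an Object new
LARGE = 40   # assumed bytes for (Array new: 4): header plus slots

def hashes_for(size, n):
    ptr, out = 0, []
    for _ in range(n):
        out.append(identity_hash(ptr))
        ptr += size  # each allocation bumps the pointer by the object size
    return out

# Small objects pair up (5 distinct hashes in 10);
# the larger arrays each land on a fresh hash.
assert len(set(hashes_for(SMALL, 10))) == 5
assert len(set(hashes_for(LARGE, 10))) == 10
```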
But as Eliot said, the test is somewhat meaningless in its current form.
- Bert -