A Better (different) WeakArray
Andreas Raab
andreas.raab at gmx.de
Sat Feb 18 21:56:50 UTC 2006
Unfortunately, this doesn't work. The reason for using #rehash was that
when a key in a weak key dictionary gets collected, that key's hash
changes. Your implementation doesn't account for that and will break as
soon as the dictionary actually contains finalized keys.
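
To illustrate the point, here is a Python analogue (not Squeak code; the
class and method names are invented for this sketch): a tiny
open-addressing table whose keys are held weakly. Once a key is collected,
its slot can no longer be re-placed by hash, so dead slots accumulate until
a rehash drops them and re-inserts the survivors.

```python
import weakref

class Key:
    pass

class TinyWeakKeyTable:
    """Illustrative sketch, not Squeak code: an open-addressing table
    with weakly held keys. When a key dies, the slot holds a dead
    reference whose original hash is unrecoverable, so the table must
    be rehashed to drop dead entries and re-place live ones."""

    def __init__(self, size=8):
        self.slots = [None] * size  # each slot: (weakref-to-key, value)

    def _index(self, h):
        return h % len(self.slots)

    def put(self, key, value):
        i = self._index(hash(key))
        while self.slots[i] is not None and self.slots[i][0]() is not key:
            i = (i + 1) % len(self.slots)  # linear probing
        self.slots[i] = (weakref.ref(key), value)

    def get(self, key):
        i = self._index(hash(key))
        while self.slots[i] is not None:
            if self.slots[i][0]() is key:
                return self.slots[i][1]
            i = (i + 1) % len(self.slots)
        return None

    def rehash(self):
        # Drop slots whose key has been collected, then re-place the
        # survivors by their (still valid) hashes.
        survivors = [s for s in self.slots
                     if s is not None and s[0]() is not None]
        self.slots = [None] * len(self.slots)
        for ref, value in survivors:
            self.put(ref(), value)
```

The sketch mirrors the Squeak situation only loosely (CPython uses
reference counting and `weakref` callbacks), but it shows why rehashing,
rather than per-slot deletion by hash, is the safe cleanup step.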
Cheers,
- Andreas
David T. Lewis wrote:
> I entered Mantis #2910 with an enhancement that greatly improves
> performance of explicit deletion from a WeakKeyDictionary. This is
> the bottleneck for e.g. a large WeakRegistry with objects being
> added and removed frequently. It does *not* address finalization
> performance for the reason explained by Andreas below.
>
> If your application is able to explicitly remove objects most of
> the time, as opposed to letting the finalization process take
> care of it, you may find that WeakKeyDictionarySpeedup-dtl.cs
> helps quite a bit.
>
> Please do not use this on a critical image such as a Seaside
> server until some qualified person (aka Andreas) says it's OK.
>
> Dave
>
> On Fri, Feb 10, 2006 at 09:54:54PM -0500, Andreas Raab wrote:
>> Hi William -
>>
>> William Harford wrote:
>>> I am interested to know why WeakArray finalization eats up the CPU when
>>> standard gc does not. The only gc methods I have implemented have been
>>> standard reference-counting ones, so I know little of the details of
>>> other garbage collectors.
>> The main issue is that the garbage collector does not tell the image
>> which element of which array has been finalized. All it does is signal
>> an event that such a finalization *has* occurred (but not where).
>> Therefore, if you want to react to it, you have to scan the elements of
>> "your" weak array, and the time spent in that scan depends on both the
>> number of objects finalized and the number of slots scanned.
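
The scan Andreas describes can be sketched in Python (an illustrative
analogue; `scan_for_finalized` and its bookkeeping are invented here and
are not a Squeak API):

```python
import weakref

class Resource:
    pass

def scan_for_finalized(weak_slots, seen_alive):
    """Walk every slot of a weak array and report the indices whose
    referent has died since the last scan. The collector's event says
    only that *some* slot somewhere died, so the whole array must be
    walked: the cost is proportional to the number of slots, not to
    the number of deaths."""
    died = []
    for i, ref in enumerate(weak_slots):
        if ref() is None and seen_alive[i]:
            died.append(i)
            seen_alive[i] = False
    return died
```

Even if only one object was finalized, the full array is traversed, which
is exactly why large weak collections make finalization expensive.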
>>
>>> My second question is:
>>> Is there a simple way to implement a WeakArray (WeakValueDictionary)
>>> that handles larger value sets better? Maybe a WeakValueDictionary that
>>> only ever looks at 1000 elements at a time for finalization and so would
>>> run less often.
>> There is no "simple" way, for a particular meaning of "simple": the
>> current mechanism (which I wrote years and years back) was by far the
>> simplest thing that could possibly work, and it has served its intended
>> purposes (mainly avoiding dangling resource handles like files or
>> sockets) very well. It was never intended to scale up to thousands of
>> objects. Any other mechanism will necessarily be more complex than the
>> current one.
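
For what it's worth, the bounded scan William suggests could be sketched
like this in Python (`IncrementalScanner` is invented for this sketch and
is not an existing Squeak mechanism; note it only spreads the cost over
several calls, it does not remove it):

```python
import weakref

class Obj:
    pass

class IncrementalScanner:
    """Sketch of a bounded scan: each call examines at most `budget`
    slots, resuming where the previous call stopped, so a single
    finalization event never triggers a full scan of a huge array."""

    def __init__(self, weak_slots, budget=1000):
        self.slots = weak_slots
        self.budget = budget
        self.cursor = 0
        # Remember which slots we last saw alive.
        self.seen_alive = [r() is not None for r in weak_slots]

    def scan_step(self):
        died = []
        n = len(self.slots)
        for _ in range(min(self.budget, n)):
            i = self.cursor
            if self.slots[i]() is None and self.seen_alive[i]:
                died.append(i)
                self.seen_alive[i] = False
            self.cursor = (i + 1) % n
        return died
```

The trade-off is that a dead slot may go unnoticed for up to
`len(slots) / budget` scan steps, so resource handles would be released
later than with a full scan.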
>>
>> Cheers,
>> - Andreas
>
>
More information about the Squeak-dev mailing list