do I have to garbageCollect every time I create a large object?

Bob Arning arning at charm.net
Fri Aug 10 14:21:48 UTC 2001


Stephen,

One aspect of the large object wrt garbage collection is that, once it is in old space, any young-object pointer stored into it will cause it to be added to the root table. When incremental GCs are done, the only objects scanned are those in young space, PLUS those in old space which might contain a young-space pointer. Scanning a million-entry array takes some time, especially when repeated tens of times per second. Breaking your large array into a number of small ones might help reduce your incremental GC times; see the sketch below. The stats further below are for a 1M array in old space with a single young-space pointer. Incremental GC is taking 24% of cycles. :-(
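To make the splitting idea concrete, here is a minimal sketch (SegmentedArray is a hypothetical name, not something in the image): each chunk is its own object, so a young pointer stored into one chunk adds only that chunk to the root table, and an incremental GC then scans a thousand slots instead of a million.

Object subclass: #SegmentedArray
	instanceVariableNames: 'chunks chunkSize'
	classVariableNames: ''
	poolDictionaries: ''
	category: 'Example-GC'

SegmentedArray >> totalSize: total chunkSize: n
	"Allocate ceiling(total / n) chunks of n slots each."
	chunkSize := n.
	chunks := (1 to: (total + n - 1) // n) collect: [:i | Array new: n]

SegmentedArray >> at: index
	"Map a flat index onto the right chunk and slot."
	^ (chunks at: (index - 1) // chunkSize + 1)
		at: (index - 1) \\ chunkSize + 1

SegmentedArray >> at: index put: anObject
	"Only the 1000-slot chunk that receives the store can end up on the root table."
	^ (chunks at: (index - 1) // chunkSize + 1)
		at: (index - 1) \\ chunkSize + 1
		put: anObject

With total 1000000 and chunkSize 1000, writing a young object into slot 123456 dirties only the small Array that holds it. The same reasoning would apply to a bucketed Set: many small buckets rather than one huge table.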

Since last view	257 (95ms between GCs)
	uptime		24.4s
	full			0 totalling 0ms (0.0% uptime)
	incr		257 totalling 5,765ms (24.0% uptime), avg 22.0ms
	tenures		0
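
For anyone who wants to reproduce this kind of measurement, a rough sketch of the setup (assuming your image has SystemDictionary>>vmStatisticsReportString, which prints a report like the one above; check your own image):

	| big |
	big := Array new: 1000000.	"million-slot array, initially young"
	Smalltalk garbageCollect.	"full GC: survivors, including big, end up in old space"
	big at: 1 put: Object new.	"one young pointer into the old array puts big on the root table"
	Transcript show: Smalltalk vmStatisticsReportString	"GC report like the one above"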

Cheers,
Bob

On Fri, 10 Aug 2001 09:38:37 -0400 "Stephen Pair" <spair at advantive.com> wrote:
>But, would that buy me anything in terms of Squeak performance (except
>for the initial allocation and tenuring overhead)?
>
>Assuming the following:
>- I have enough memory for the huge object
>- I have time to allocate it and tenure it
>- It will stay around for a long time
>- There will only be one of these monsters, and it won't be recreated
>very often (and when it is, it will not be time critical)
>- The collection is a Set and the hash functions of the elements are
>built to scale (to at least the size of the Set)
>- that the Set will be filled quickly
>
>With these assumptions, I don't see where I would benefit from a bucket
>hash, except if there were some benefit to GC (after the initial
>allocation and tenuring).  Is there?
>
>Of course...this is really hypothetical, a reasonable cache would
>probably be in the hundreds of thousands to say 1 million, not in the
>hundreds of millions.
