On Tue, 19 Feb 2002, Marcel Weiher wrote:
On Monday, February 18, 2002, at 11:50 PM, Scott A Crosby wrote:
Well, do we have consistent numbers anywhere? If the numbers are so inconsistent, what conclusions can we draw from them?
The numbers vary with the workload and the types of objects involved; the 4x-8x figure is only an estimate.
Sure. So? How does the difference between incremental and full GC relate to my observation that the numbers posted above show a linear relationship?
Oh, duh, yes. When I was doing quick benchmarks of GC performance, I got the raw number 'can GC 60 MB in 170 ms', from which I calculated 'or about 360 MB (6M objects) in a second.' That explains your observation of the linear dependence.
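A quick sanity check of that arithmetic, as a Python sketch (the per-object size is inferred here from the quoted figures, not stated in the thread):

```python
# Sanity-check the GC throughput numbers quoted above.
# Raw measurement from the message: 60 MB collected in 170 ms.
mb_scanned = 60
seconds = 0.170

mb_per_second = mb_scanned / seconds
print(f"{mb_per_second:.0f} MB/s")  # ~353 MB/s, i.e. "about 360 MB in a second"

# The "6M objects" figure implies an average object size of roughly
# 360 MB / 6M ~= 60 bytes each (inferred from the quoted numbers).
objects_per_second = 6_000_000
avg_object_bytes = mb_per_second * 1_000_000 / objects_per_second
print(f"~{avg_object_bytes:.0f} bytes/object")
```

So the rounded "360 MB/s" figure and the "6M objects/s" figure are mutually consistent if objects average about 60 bytes.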
We *know* that increasing these parameters makes the macrobenchmarks go faster.
We *really* do?
Yep. Try increasing them, giving Squeak a couple hundred megabytes of RAM, and running the macrobenchmarks.
Scott