[Challenge] large files smart compare (was: Re: Squeak for I/O and Memory Intensive tasks )

Bob Arning arning at charm.net
Tue Jan 29 14:52:48 UTC 2002


On Tue, 29 Jan 2002 15:57:39 +0200 Yoel Jacobsen <yoel at emet.co.il> wrote:
>Oddly, it now takes only 23 sec. 
>
>This is an ASUS L8400C Notebook with PIII 850MHz with 384MB RAM. From the Spy output:

Ah, a notebook! Some notebook users have reported performance problems due to the processor going into low-power mode after a while. There is a preference (created for the Mac, but *may* work for Windows) to disable this. See #turnOffPowerManager in the performance section. Or, if your OS allows you to override the power-saver feature, try that.

>75205640 bytecodes/sec; 2277950 sends/sec
>
>    Seems okay. 
>
>    I have no problem with small tasks, only with very large ones. Try it on 500000 entries and report your findings (if you still have a valid image after that...)

I may not have enough memory for that many, but I'll see what I can come up with. From your numbers, extrapolating my previous observation (10K entries = 10M bytes used), 500K entries would need about 500M bytes -- more than the real memory either of us has. Part of your problem with large datasets may be heavy swapping. Reworking the algorithm to stay within the real memory available may make a big difference.
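To illustrate the idea (in Python rather than Squeak, just as a sketch): if the two files are sorted first, they can be compared in a single merge-style pass that holds only one entry from each file in memory, so memory use stays flat no matter how many entries there are. Everything below is hypothetical, not code from the thread:

```python
# Sketch: streaming compare of two *sorted* sequences of lines.
# Memory use is O(1) in the number of entries, so no swapping is
# needed even for 500K entries -- only sequential reads.

def compare_sorted(a_lines, b_lines):
    """Yield ('-', line) for lines only in a, ('+', line) for lines only in b."""
    a_iter, b_iter = iter(a_lines), iter(b_lines)
    a = next(a_iter, None)
    b = next(b_iter, None)
    while a is not None and b is not None:
        if a == b:                      # present in both: advance both sides
            a = next(a_iter, None)
            b = next(b_iter, None)
        elif a < b:                     # only in a
            yield ('-', a)
            a = next(a_iter, None)
        else:                           # only in b
            yield ('+', b)
            b = next(b_iter, None)
    while a is not None:                # drain the leftovers
        yield ('-', a)
        a = next(a_iter, None)
    while b is not None:
        yield ('+', b)
        b = next(b_iter, None)

# The same generator works on open file objects, which iterate line
# by line without loading the whole file:
#   diffs = compare_sorted(open('old.txt'), open('new.txt'))
```

Sorting the files can be done with an external merge sort (or the OS `sort` utility) in bounded memory too, so the whole pipeline stays within real memory.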

Cheers,
Bob




