[Challenge] large files smart compare (was: Re: Squeak for I/O and Memory Intensive tasks )

Jon Hylands jon at huv.com
Tue Jan 29 14:46:15 UTC 2002


On Tue, 29 Jan 2002 15:57:39 +0200, Yoel Jacobsen <yoel at emet.co.il> wrote:

>     I have no problem with small tasks, only with very large ones. Try to
> do it on 500000 entries and report your findings (if you still have a
> valid image after that...)

Well, there are much better ways to do text processing than reading the
entire file into memory.

If you look at FileStream, you will see that there are easy ways to read in
a file line-by-line...
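For instance, something along these lines in a workspace (a minimal sketch;
'entries.txt' is a hypothetical file name, and older images may want
"upTo: Character cr" in place of #nextLine):

  "Process a large file one line at a time, so memory use stays
   small regardless of file size."
  | stream line |
  stream := FileStream readOnlyFileNamed: 'entries.txt'.
  [[stream atEnd] whileFalse:
      [line := stream upTo: Character cr.
       "handle one line here, e.g.:"
       Transcript show: line; cr]]
    ensure: [stream close].

The #ensure: block guarantees the file handle is closed even if line
processing signals an error.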

You could set up a very simple finite state machine to handle parsing each
block of lines in the file. That way, memory use stays pretty much
constant and processing time grows only linearly, regardless of the size
of the file.
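As a sketch, here is a two-state machine that groups lines into records
separated by blank lines (the record format and the #processRecord:
handler are hypothetical; adapt both to your actual data):

  "States: #outside (between records) and #inRecord (accumulating lines).
   A blank line ends the current record."
  | stream line state record |
  state := #outside.
  record := OrderedCollection new.
  stream := FileStream readOnlyFileNamed: 'entries.txt'.
  [[stream atEnd] whileFalse:
      [line := stream upTo: Character cr.
       state = #outside
         ifTrue: [line isEmpty ifFalse:
                    [state := #inRecord.
                     record add: line]]
         ifFalse: [line isEmpty
                    ifTrue: ["end of record: hand it off and reset"
                            self processRecord: record.
                            record := OrderedCollection new.
                            state := #outside]
                    ifFalse: [record add: line]]].
   record isEmpty ifFalse: [self processRecord: record]]
    ensure: [stream close].

Only one record is ever held in memory at a time, which is what keeps the
footprint flat for arbitrarily large files.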

Later,
Jon

--------------------------------------------------------------
   Jon Hylands      Jon at huv.com      http://www.huv.com/jon

  Project: Micro Seeker (Micro Autonomous Underwater Vehicle)
           http://www.huv.com

