On Sun, Dec 01, 2002 at 09:02:54PM -0800, Tim Rowledge wrote:
Until recently it has been possible to argue against standard parts in software because computers were too slow to run anything other than minutely tweaked code for any useful job. I rather think that time has passed, and we should admit that, in a time of 400 MIPS handheld machines with 64 MB of RAM, we should be concentrating on the quality of what they do rather than almost solely on the speed with which they screw up. A long time ago, Dan Ingalls (I think - maybe Larry Tesler?) was quoted as saying "We have reached the point of computational affluence where we should consider the quality of cycles rather than the number of them". Or something very like that.
All of which sounds pretty amusing from someone who has spent the last twenty years working on making Smalltalk faster...
Not so surprising, really. You chose to spend those 20 years working on something scalable and designed to support higher level abstractions. That means that there is some small chance that your efforts will not have been entirely discarded ten years hence. It sounds like a reasonable engineering choice to me ;-)
By the way, I would hope that the transition to higher level software abstractions (patterns, etc.) would have happened regardless of increases in computational power. If anything, I suspect that the constant improvement of computer hardware may have slowed things down, as it has encouraged people (and corporations, and schools) to repeatedly re-implement the same old ideas, often poorly, and often without appreciating that it was just the same old stuff.
This coming from someone who thinks it's entertaining to re-implement old unix command shells. Go figure.
Dave