On interfaces..

Kevin Fisher kgf at golden.net
Fri May 25 13:08:15 UTC 2001


On the subject of pen-based interfaces, I've had some general thoughts
and questions about interfaces overall.

We've been discussing making the Squeak environment more friendly towards
stylus-based input.  I think it is safe to say that the standard interface
needs some massaging to make it work better with a pen.

The current Squeak interface is very text-based, even under Morphic.
When we create new objects, we do it textually, i.e.:

foo := SomeObject new.

We then access and manipulate the object textually as well:

foo someMessage: 'testing'.

In Morphic, I have a few other options available...I can open an inspector
window on the object which gives me...a container for more text.

Now, Morphs are a bit different...these can be inspected and manipulated
non-textually with menus and halos to a certain extent.

However, in general, working _in_ Squeak -- creating objects, instantiating
objects, combining objects -- is still done textually.  You might say
that the primary interface to the continuum of objects is still through
the keyboard.  The Workspace, the Browser, the inspectors... all are
containers for text.

Now MY question...can we do better?  Is there a better interface we can
create for Squeak where we can do everything we do with text in other
ways, with different input devices and methods?  It seems that the trend
today is to force-fit everything into a common metaphor... on Windows,
everything must be a 'document view'.  We force all kinds of data to
conform to a single metaphor...and this, in the end, forces us to 
change our data.

In general, computers still follow the old office-clerk metaphor:
desktops, trash bins, keyboards, files.  Any 'new' input interfaces
are always turned into mouse emulators and paper simulators.

WinCE is a great example of this... they shrunk the desktop metaphor down
to a palmtop/stylus device without asking whether that metaphor even
made SENSE on such a device.   I don't know about anyone else, but
doing stuff like press-and-hold to get the right-click menu is pretty
counter-intuitive to me.

As an example, on a palmtop device (no keyboard, just a stylus) I think
it would be great to be able to "program" it in a graphical manner... for
example, I could connect my "address book" object to my "IR port" object and
enable the sending of my address book over the infrared emitter.
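Under the hood, that graphical "wire" could still reduce to an ordinary
message send that the environment composes for you.  A purely hypothetical
sketch in Smalltalk -- AddressBook, IRPort, and the #beam: protocol are all
invented names for illustration, not classes in the Squeak image:

  "Assumed objects: an address book and an infrared port wrapper."
  | addressBook irPort |
  addressBook := AddressBook new.
  irPort := IRPort default.

  "Drawing a wire from the address-book morph to the IR-port morph
   would amount to the environment building this send for you:"
  irPort beam: addressBook contents.

The point being: the pen gesture and the keyboard expression could denote
the same message send -- the stylus just skips the text.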

(Now I'd like to say that all of this emerged from my fiddling with
palmtop environments...but much of the credit goes to Ted Nelson.  
If I've learned anything from Ted's writings it's that we should constantly
challenge the interface metaphors we take for granted.  You may not
agree with him, but he _does_ make you think twice about things.)





More information about the Squeak-dev mailing list