the World, its Hand, Morphs, and event handling

Ned Konz ned at bike-nomad.com
Tue Oct 24 16:39:19 UTC 2000


I've been working on a graphical editor framework based on Morphic.
It's been more difficult than it should be. Before I go off and write
something that discards a lot of what's already available,
I'd like to share some thoughts on how user interaction could
be interpreted.

Currently, a Hand decides what a user event (mouse or keystroke) does.
There is some rather complex logic to decide the destination of, say,
a mouse click; the Hand may decide that the click means to pick up
a Morph, or it may send the mouse click to the Morph underneath, or
it may ignore the Morph's desire for mouse clicks and send the click
instead to the PasteUpMorph underneath.
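
Roughly, that fixed policy amounts to something like the sketch below. This
is only a paraphrase of the decision, not the actual HandMorph code:
dispatchMouseDown:over:in: and wantsToPreemptMouseDown are made-up selectors,
while handlesMouseDown:, mouseDown: and grabMorph: are the existing hooks.

HandMorph >> dispatchMouseDown: evt over: aMorph in: aPasteUpMorph
	"Paraphrase only: the Hand itself decides what the click means."
	aPasteUpMorph wantsToPreemptMouseDown
		ifTrue: [^ aPasteUpMorph mouseDown: evt].	"e.g. a parts bin"
	(aMorph handlesMouseDown: evt)
		ifTrue: [^ aMorph mouseDown: evt].	"deliver the click to the Morph"
	^ self grabMorph: aMorph	"otherwise the click means 'pick it up'"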

All of this establishes fixed semantics for user interaction.

However, what user gestures mean _should_ depend on the context. 

I'd like to be able to make editors that interpret mouse gestures and
keystrokes differently -- without having to make special Morphs to use
in those editors. Imagine an editor that allows you to connect Morphs
together. I should be able to use any kind of Morph, whether or not
its author has programmed it to work in my editor.

For instance, consider an editor with modal tools. In an editor like
this, the interpretation of user gestures depends on the current mode.
This already happens to an extent (look at the parts bins, for instance),
but its implementation is, um, a bit diffuse. And it's not at all
pluggable; if I want to change the interpretation of a mouse down event
to mean "start connecting two Morphs" or "grab a Morph" or "resize a Morph",
I have to either make the Morph itself know about the current tool mode,
make a new Hand that knows, or make a PasteUpMorph that preempts mouse
down events (and that preemption support is incomplete: it only covers
mouse down events, not other kinds of events).
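
To make the first of those options concrete: with the current protocol the
only per-Morph hook is for the Morph to claim the event itself, which drags
knowledge of the editor's tool mode into the widget. In this sketch,
handlesMouseDown:, mouseDown: and cursorPoint are the real hooks;
MyWidgetMorph, currentToolMode and startConnectionAt: are made up.

MyWidgetMorph >> handlesMouseDown: evt
	^ true

MyWidgetMorph >> mouseDown: evt
	"The widget has to ask about the editor's current tool mode,
	 which is exactly the coupling I'd like to avoid."
	self currentToolMode == #connect
		ifTrue: [^ self startConnectionAt: evt cursorPoint].
	super mouseDown: evt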

Now, there is support for Worlds inside Worlds, and I could make my own
Hand that has pluggable behavior, but I thought that I'd suggest something
that would work everywhere.

Having the Morphs themselves respond to events (as I think someone said
the current SqC effort is doing right now) is not necessarily a good
idea, depending on how it's done.

I should be able to use a RectangleMorph, for instance, in any editor
I want to use one in, without change. A RectangleMorph shouldn't have to
know about user interaction.

Likewise, a Morph that usually wants mouse events (like a TextMorph) should
be usable in, say, a GUI designer without change. In the GUI designer, a mouse
down may mean to drag the Morph, and a tab key may mean to go to the next
GUI element (not merely the next text field).

Because there may be more than one hand in a World, and because I'd like
a modal editor to allow each hand to have its own mode, I think that the
interpretation of user gestures should be in a policy object plugged into
the Hand, rather than in the World. This represents a refactoring of the
current Hand behavior; I think that the Hand should:
	* run its event loop
	* provide pointer behavior (a visible Morph that gets moved about)
	* allow for drag/drop by providing a framework for it
	* defer all other behavior to a policy object
(this looks a lot like a Controller, no?)
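
To make the policy-object idea concrete, here is a minimal sketch. HandMorph,
handlesMouseDown:, mouseDown: and grabMorph: exist today; everything else
(EventPolicy, ConnectionTool, the eventPolicy plug on the Hand, and the
selectors below) is hypothetical.

Object subclass: #EventPolicy
	instanceVariableNames: '' classVariableNames: '' poolDictionaries: '' category: 'Morphic-Editors'

EventPolicy >> mouseDown: evt hand: aHand over: aMorph
	"Default policy: roughly today's behavior."
	(aMorph handlesMouseDown: evt)
		ifTrue: [^ aMorph mouseDown: evt].
	^ aHand grabMorph: aMorph

EventPolicy subclass: #ConnectionTool
	instanceVariableNames: '' classVariableNames: '' poolDictionaries: '' category: 'Morphic-Editors'

ConnectionTool >> mouseDown: evt hand: aHand over: aMorph
	"In a connection editor, every click starts a connection,
	 no matter what the Morph itself would do with it."
	^ aHand startConnectionFrom: aMorph	"hypothetical selector"

HandMorph >> handleMouseDown: evt over: aMorph
	"Hypothetical hook: the Hand no longer interprets the gesture itself."
	^ self eventPolicy mouseDown: evt hand: self over: aMorph

With something like this, a RectangleMorph or TextMorph dropped into the
connection editor needs no changes; the ConnectionTool simply never consults
the Morph's own mouse handling.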

Some things that should be easy to do but aren't:

* Editors with modal tools
* Editors that work well with multiple hands
* GUI builders that use any widget available

A modal editor would work by changing the Hand's current tool (as in
HotDraw). Default behavior could be provided by a default tool defined
on the Hand class.
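
Continuing the hypothetical sketch above: switching modes is just swapping
the Hand's policy object, and because each Hand holds its own policy, every
hand can be in a different mode. World primaryHand is a real expression;
eventPolicy: and defaultTool are made up.

HandMorph class >> defaultTool
	"Hypothetical: the default behavior lives in one place, on the Hand class."
	^ EventPolicy new

"Put this hand into connection mode; other hands keep their own tools."
World primaryHand eventPolicy: ConnectionTool new.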

Ideas/suggestions?

-- 
Ned Konz
currently: Stanwood, WA
email:     ned at bike-nomad.com
homepage:  http://bike-nomad.com




