Morphic needs proper Tools support (was: Re: the World, its Hand, Morphs, and event handling)

Henrik Gedenryd Henrik.Gedenryd at lucs.lu.se
Wed Oct 25 13:00:44 UTC 2000


General rationale:

Since Morphic is really trying to do (allow for doing) all things for all
people, a very large amount of flexibility has to be built into it. The
lack of such flexibility is the cause of the current state of Morphic: a
lot of code residing in completely the wrong places, e.g. the HandMorph.

As I see it, interaction behavior needs to (potentially) be negotiated
among all the involved parties: speaking in non-computer terms, (1) the
person, (2) the material acted upon, and (3) the thing the person applies
to the material, the "tool". Think hobbyist; piece of wood; power drill.
(Possibly also the thing applied by the tool to the material: think glue;
nail; etc.--but I think this can be made part of the tool's
responsibility.)

In Morphic these correspond to (1) the source of events, currently more or
less the HandMorph, (2) "normal" morphs as such, and (3) something that
doesn't exist in the current Morphic. Please, let's call these "Tools",
just like in MacDraw, Photoshop, etc., instead of "modal event modifiers"
or anything of the sort.

Trying to do without all of this will work for limited purposes, but not
for something as universally aimed as Morphic. The greatest omission at
present is the complete lack of tools. This was the first thing I had to
implement for my own framework; it was _not_ easy or straightforward.
Morphic is not currently built to be extensible or flexible, really,
although it can be done, and things have improved a lot.


Ok, what does this mean in terms of changes to Morphic?

Right now, the lack of tools means that code has to work around it in
kludgy ways. Hence the ugliness of the HandMorph; hence also the Halo,
which has required patches all over the Squeak image to work well. Code in
the wrong places, as in both these cases, is the #1 sign of a flaw in the
basic architecture.

Adding Tools means delegating much or most of the event logic to a
pluggable object--the currently active tool for the hand. This leaves
little need to modify the Hand's remaining code. And it follows
automatically that each individual hand has its own active tool.
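
In sketch form (all class and selector names invented here, just to fix
ideas--this is not existing Morphic API):

    HandMorph>>handleMouseDown: anEvent
        "Sketch: rather than deciding itself what a mouse-down
        means, the hand hands the event over to its active tool."
        ^ self activeTool mouseDown: anEvent from: self

    HandMorph>>activeTool
        "Each hand carries its own tool; fall back on a default
        tool that reproduces today's fixed behavior."
        activeTool isNil ifTrue: [activeTool := MorphicTool default].
        ^ activeTool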

(However, I've recognized a need to also factor some code out of the hand
into a general sort of "events policy object/class". The clearest example
is allowing for pen-based input; the code that tacitly assumes mouse-based
input ought to be factored out to there. Additionally, the core Hand code
should not even appropriate any modifier keys or mouse buttons for menus
or special behaviors--all of this should reside in a policy. You don't
notice the problems with this until you try to do something radically
different, as I've been doing. But this is a minor point, as, say, 95% of
all code shouldn't touch such basic behavior. The remaining 5% is for
radical experiments with alternative paradigms--but remember that such use
was the express intention behind releasing Squeak to the public! So the
policy is like the "blue plug", whereas the tool is the "pink plug", for
what is currently the Hand's fixed behavior.)
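
A minimal sketch of the split (again, the selectors are invented for
illustration):

    HandMorph>>handleEvent: anEvent
        "Sketch: a pluggable policy decides what raw input means,
        instead of the hand hard-coding mouse and modifier-key
        assumptions."
        self eventPolicy interpret: anEvent for: self

    MouseEventPolicy>>interpret: anEvent for: aHand
        "The default policy reproduces current behavior, e.g.
        claiming the yellow button for the meta menus; a pen-based
        policy would subclass this and decide otherwise."
        anEvent yellowButtonPressed
            ifTrue: [^ aHand invokeMetaMenu].
        aHand activeTool handle: anEvent from: aHand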

Allowing the "target" morph to modify the behavior is necessary for such
things as part bins. However, they should modify higher-level behaviors
defined by tools, rather than the raw, tool-less events like mouseDown. Now,
such low-level events are often modified as a way to work around the lack of
tools. Example: the parts bin should modify the behavior related to
composing objects, rather than general drag-n-drop behavior.
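
Concretely, something like this (a sketch; grabMorph: and veryDeepCopy
exist today, the rest is invented):

    ComposingTool>>grab: aMorph from: aHand
        "Sketch: 'grab' is a behavior the tool defines and the
        target may refine, instead of the target hacking raw
        mouseDown."
        aHand grabMorph: (aMorph morphToGrabFor: self)

    Morph>>morphToGrabFor: aTool
        "Default: hand over the receiver itself."
        ^ self

    PartsBinMorph>>morphToGrabFor: aTool
        "Schematically: a parts bin hands out a fresh copy, so
        its stock is never depleted."
        ^ self veryDeepCopy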


Ned Konz wrote:

> I've been working on a graphical editor framework based on Morphic.
> It's been more difficult than it should be. Before I go off and write
> something that discards a lot of what's already available,
> I'd like to share some thoughts on how the user interaction could
> be interpreted.
> 
> Currently, a Hand decides what a user event (mouse or keystroke) does.
> There is some rather complex logic to decide the destination of, say,
> a mouse click; the Hand may decide that the click means to pick up
> a Morph, or it may send the mouse click to the Morph underneath, or
> it may ignore the Morph's desire for mouse clicks and send the click
> instead to the PasteUpMorph underneath.
> 
> All of this establishes fixed semantics for user interaction.
> 
> However, what user gestures mean _should_ depend on the context.

Right, the active Tool.

> I'd like to be able to make editors that interpret mouse gestures and
> keystrokes differently -- without having to make special Morphs to use
> in those editors. Imagine an editor that allows you to connect Morphs
> together. I should be able to use any kind of Morph, whether or not
> its author has programmed it to work in my editor.

Yes, but this needs to be done in the right way. Having the editor itself
do it amounts to a mode, and that is a _bad_ idea, as is widely known in
the Smalltalk community (remember the August 1981 issue of Byte). However,
tools are not modes, since they clearly indicate (with the pointer image)
that pointer actions will be interpreted in a certain way.

So the GUI "editor" should not be an editor (window), but one or more tools,
and sets of components. Hence, when you activate the "compose UI" tool, a
Button will always be grabbed and dragged, not clicked. And so forth. And
the active tool is global. You select a tool, period. Not "a tool that only
applies to this morph here; when I work on that morph there I will want to
use that tool there."
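
In sketch form (invented selectors again):

    ComposeUITool>>mouseDown: anEvent on: aMorph from: aHand
        "Sketch: under this tool a mouse-down always starts a
        grab, no matter what the morph would normally do with
        clicks."
        aHand grabMorph: aMorph

    PointerTool>>mouseDown: anEvent on: aMorph from: aHand
        "The default tool passes the event through unchanged,
        reproducing today's behavior."
        aMorph mouseDown: anEvent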

The idea of an Editor in its own window is an artifact of the Application
way of thinking, and runs counter to Morphic philosophy as well as good
interaction design. Let's not fall into Application-style thinking when we
aren't forced to by the OS.

< many points where I agree with Ned advocating Tools >

> Likewise, a Morph that usually wants mouse events (like a TextMorph) should
> be usable in, say, a GUI designer without change. In the GUI designer, a mouse
> down may mean to drag the Morph, and a tab key may mean to go to the next
> GUI element (not merely the next text field).

Yes, it may be appropriate for a tool to give special behavior to, e.g.,
arrow keys and the like.

> I think that the Hand should:
> * run its event loop

I don't see a need for this; events are not even polled any longer. Got an
example?

> * provide pointer behavior (a visible Morph that gets moved about)
> * allow for drag/drop by providing a framework for it
> * defer all other behavior to a policy object

Right: provide non-changeable default behavior in the hand, and/or in a
generic MorphicTool class from which each tool then inherits.
Drag-and-drop is a great thing to have there.
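
Sketched out (the class and its selectors are hypothetical, per the
above):

    Object subclass: #MorphicTool
        instanceVariableNames: ''
        classVariableNames: ''
        poolDictionaries: ''
        category: 'Morphic-Tools'

    MorphicTool>>drag: aMorph from: aHand
        "Default drag-and-drop lives here once; every tool
        inherits it and overrides only what should differ."
        aHand grabMorph: aMorph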

Raab, Andreas wrote:

> Use by "composition and parametrization" requires a lot of flexibility from
> the outside. That is what (I think) you are referring to with something like
> the "event handling policy". The task of the framework is to provide
> powerful enough hooks for the user to change the behavior of objects within
> the limits of the framework itself.

Yes, tools are given hooks from the hand, where they can plug in
higher-level behaviors for such things as drags, clicks, etc.

> However, because our framework *is* broken in so many places it often
> seems to be easier to bypass all those places that are broken by having some
> "external global strategy" which doesn't even bother to call the appropriate
> methods. But this is even more dangerous because it leads to the pollution
> that we find in HandMorph with #handlesMouseDown: #preemptsMouseDown: and
> #trumpsMouseDown:.

Right -- these are examples of having to work around the lack of proper tool
support.

> I find it very likely that any global policy will run
> into situations where it needs to do things a little differently and if the
> methods that are used by one global policy don't match exactly the
> intentions of another global policy you'll start to introduce more hacks to
> get around this. Then somebody will see that the two existing global
> strategies don't quite do what she wants to do and since both are already so
> ugly you better define a new one. And so on...

A tool's behavior is not a global policy in this sense (cf. above). There
is only one active at a time for each hand, so there can be no such
conflicts (disregarding very badly written code). One behavior should
reside entirely within the tool's methods, with the possible exception of
some methods in morphs that respond to the specific high-level events
defined by the tool--this is the crucial difference. I.e. not mouseDown,
wantsMouseDown, etc. but, say, #addGUIelement: (or whatever) in the case
of a GUI editing tool.

In this manner, tools should take lower-level events--preferably not even
mouseDown etc. but click, drag-and-drop, etc.--and transform these into
things like dragAGUIElementAndAddItToThisOneHereOrWhatever. (One should be
able to hide a respondsTo: check in the default tool class, to avoid
cluttering Morph with default methods.)
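
For example, something along these lines in the default tool class
(respondsTo: and perform:with: are standard Smalltalk; the rest is a
sketch):

    MorphicTool>>dispatch: aSelector to: aMorph with: anArgument
        "Sketch: send the tool-specific high-level event only if
        the morph actually implements it; otherwise fall back on
        the tool's own default. This keeps Morph free of piles
        of empty default methods."
        (aMorph respondsTo: aSelector)
            ifTrue: [aMorph perform: aSelector with: anArgument]
            ifFalse: [self performDefaultFor: aSelector with: anArgument]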

> Of course, even if the framework would be okay, there is a question of
> whether something like your hand-based policy for event handling should be
> doable or not. ... If we would have an event handling policy that simply
> does move
> morphs from A to B no matter what this morph would usually do with events,
> how would you ever get out of this mode?! ...

Yep, you're right--there always needs to be one meta-level thing. For
convenience, e.g. a rule that, whatever double-click means for the active
tool, double-clicking on another tool always activates it--plus a "secret
handshake" to select a new tool, for the cases where you forgot to call
super in your new Tool subclass. Can you tell that I've been there and
made that mistake?

> In the end, I think that having any "outside" strategy for
> framework-critical behavior is a Very Bad Idea. HandMorph's event handling
> rules have clearly shown that.

To summarize: the hand (with possibly a pluggable global policy)
transforms low-level events into mid-level ones, like click, drag, etc.,
and delegates them to a tool, which transforms them into tool-specific
high-level events that morphs may, but rarely will, intercept. It is much
more logical for behavior to be defined by the tool than by the material
acted upon. With a tool scheme, hopefully the eventHandler will not be
needed much.
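
The whole chain, in sketch form (invented selectors, with #addGUIelement:
standing in for whatever high-level event a concrete tool defines):

    "1. The hand, via its policy, turns raw events into
    mid-level gestures."
    HandMorph>>mouseMove: anEvent
        (self eventPolicy startsDrag: anEvent) ifTrue:
            [^ self activeTool drag: self targetMorph from: self]

    "2. The tool turns mid-level gestures into tool-specific
    high-level events, e.g. via the dispatch sketched above;
    only rarely does a morph intercept these."
    GUIEditingTool>>drop: aMorph on: aTarget from: aHand
        self dispatch: #addGUIelement: to: aTarget with: aMorph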

Currently, Morphic interaction is handled at too low a level; you're
right that messing around with _every_ mouseDown etc. that comes to a
morph, no matter what, is a Bad Idea. Countless are the times when I've
failed to get rid of the dragged morphs on a mouseUp.

> 
> Do I need to mention what I'm working at these days?! :-)
> 

I'll happily share my code, but since it's not written for standard
Morphic purposes it can probably only be used for ideas & inspiration.

Henrik





