Custom Event Handler

Andreas Raab Andreas.Raab at gmx.de
Wed Aug 8 01:15:09 UTC 2001


Josh,

The entire issue of tools is a very interesting one in object systems. In
effect, it raises the question of whether a tool "should be applied" to an
object or whether an object "should provide a service" when stimulated with
a certain tool. In my understanding, the latter is almost always the better
way to go, although I do understand that changing existing (e.g.,
predefined) classes is not always a good thing to do.

Leaving the above issue aside, I do understand that there's some interest in
having some sort of "passive" mode for morphs where they're basically just
"probed" and control is handled by some specialized mechanism (call it a
tool, a custom event handler, or whatever). The way I would go about this is
to define a common protocol that's hooked right into the core of Morphic.
For example, one could define that a hand can hold one or more tools and, if
so, uses a well-defined interface for doing the right thing. The protocol
should probably be capable of figuring out whether a certain tool is
applicable to some object (for entering, leaving, clicking, or dragging) and
of telling the tool what it is being applied to. At that point it is the
responsibility of the tool either to apply itself to the object or to
request a service from the object.
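
Just to make this a bit more concrete, here is a rough sketch of what such
a protocol might look like. None of this exists today; all class and
selector names below are made up, and the hookup into the hand's dispatch
is only indicated, not implemented:

    Object subclass: #HandTool
        instanceVariableNames: ''
        classVariableNames: ''
        poolDictionaries: ''
        category: 'Morphic-Tools'

    HandTool>>isApplicableTo: aMorph forEvent: anEvent
        "Answer whether this tool wants to handle anEvent (enter, leave,
        click, or drag) for aMorph. Subclasses override this."
        ^ false

    HandTool>>applyTo: aMorph event: anEvent
        "Either apply the tool to aMorph directly or request a service
        from aMorph. Subclasses override this."
        ^ self

    "The hand would then give its tools the first shot at an event,
    roughly like this:"

    HandMorph>>dispatchEvent: anEvent toToolsFor: aMorph
        self tools do: [:tool |
            (tool isApplicableTo: aMorph forEvent: anEvent)
                ifTrue: [^ tool applyTo: aMorph event: anEvent]].
        ^ #rejected

The point is simply that the applicability test and the application itself
both live on the tool's side, so the tool can decide whether to poke at the
morph directly or to ask it for a service.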

Any takers?!

Cheers,
  - Andreas


> -----Original Message-----
> From: squeak-dev-admin at lists.squeakfoundation.org
> [mailto:squeak-dev-admin at lists.squeakfoundation.org] On Behalf Of
> Joshua 'Schwa' Gargus
> Sent: Tuesday, August 07, 2001 9:20 AM
> To: squeak-dev at lists.squeakfoundation.org
> Subject: Re: Custom Event Handler
>
>
> On Tue, Aug 07, 2001 at 08:43:37AM -0700, Andreas Raab wrote:
> > Josh,
> >
> > > However, I can't figure out how to get access to
> > > mouseEnter/Leave events without generating my own
> > > by watching when and where mouseOver events happen.
> > > It seems that the processEvent:using: hook just provides
> > > access to the raw hardware input events, and not to
> > > events that are subsequently derived from
> > > these events.  Is this correct?
> >
> > Almost. Enter/leave events are generated based on which morphs the
> > hand is currently "over". This is determined by sending mouse over
> > events which are handled by morphs in such a way that they tell the
> > hand if the event ends up inside. Based on this information, the
> > hand sends enter/leave events directly to those morphs.
>
> Yes, this was my understanding after reading the code.
>
> > Note that we really can't go through the event dispatcher here - we
> > need to figure out what has changed since the last mouse position,
> > which is different from just delivering an event to the right
> > receiver.
>
> Perhaps I'm a bit confused about the intended purpose of the hook
> provided by processEvent:using:
>
> > > Can anyone with experience with custom event handlers suggest
> > > another way?
> >
> > For what? I don't quite see how one would reasonably implement a
> > dispatch strategy for enter/leave. I'm also a bit confused about the
> > subject of your message - what you're describing above means you're
> > changing the way events are dispatched (i.e., delivered), not how
> > they are handled. Perhaps you can explain what you're trying to
> > achieve?!
>
> Your confusion is entirely based on my confusion.
>
> I'm implementing a drawing app where there are multiple 'tools'
> available, one of which is active at any given time.  Depending on the
> tool selected, mouse actions have different effects.  For example, if
> the 'selection' tool is active, clicking on a morph will toggle
> whether it is selected.  If the 'delete' tool is active, clicking on a
> morph will delete it (and possibly other morphs in its group).
>
> One reason that I care about mouseEnter/Leave is that I want all
> morphs in a group to be highlighted whenever the pointer enters any
> morph in the group.
>
> An alternate approach that I discarded involved installing my own
> event handlers.  In effect, edited morphs would be told: "Whenever the
> mouse enters or leaves you, tell the drawing app and let it decide
> what to do."  This is problematic because I might want to edit the
> graphical properties of morphs that already have their own event
> handlers.
>
> I suppose that I could hack EventHandler so that each event type is
> associated with a collection of target/selector/argument tuples,
> instead of a single one.  #on:send:to: would then add the tuple to
> the event handler instead of replacing it.
>
> Does this sound reasonable?  Do you have another suggestion?
>
> Thanks,
> Joshua
>
>
> >
> > Cheers,
> >   - Andreas
> >
> >
>
>
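
P.S. The "collection of target/selector tuples" variant of EventHandler
that you describe might look roughly like the sketch below. This is only a
sketch: to keep it self-contained it is a stand-alone class rather than a
change to EventHandler itself, the names are made up, and it ignores
EventHandler's handling of selectors with different argument counts.

    Object subclass: #MultiEventHandler
        instanceVariableNames: 'recipients'
        classVariableNames: ''
        poolDictionaries: ''
        category: 'Morphic-Events'

    MultiEventHandler>>recipients
        "Lazily initialize the map from event type to registered pairs."
        recipients isNil ifTrue: [recipients := Dictionary new].
        ^ recipients

    MultiEventHandler>>on: eventName send: selector to: recipient
        "Add a (recipient, selector) pair for eventName instead of
        replacing whatever was registered before."
        (self recipients at: eventName ifAbsentPut: [OrderedCollection new])
            add: (Array with: recipient with: selector)

    MultiEventHandler>>mouseEnter: anEvent fromMorph: aMorph
        "Forward the enter event to every registered recipient."
        (self recipients at: #mouseEnter ifAbsent: [^ self]) do:
            [:pair | pair first perform: pair last with: anEvent with: aMorph]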




