[Vm-dev] Touch/MultiTouch Events

Phil B pbpublist at gmail.com
Sat Sep 19 18:16:42 UTC 2020


It would be appreciated if we didn't shoehorn touches the way some of
the mouse events are (e.g. the scroll wheel).  At a low level, it would
be nice to be able to access any and all touch event data (position,
radius, and pressure for each touch, if available; with in-screen
fingerprint readers becoming more common, also consider a day when
you'll know which finger, and possibly which user, is registering which
touch) and to treat them as touches rather than pseudo-mouse events.
For example, when you lift your finger(s) from the screen, there is no
valid current mouse position as far as touch is concerned.  At a higher
level, a single touch could be synthesized into a click/select or move
event... but for some applications you don't want touches treated that
way.  I'm basically just asking that we don't 'cook' the touch events
too much in the VM: pass a (somewhat abstracted, so it can be
platform-neutral) touch event along and let the image decide how to
process it.

On Sat, Sep 19, 2020 at 10:44 AM <ken.dickey at whidbey.com> wrote:

>
> Greetings all,
>
> I am cross posting to vm-dev because of wide potential interest, e.g.
> for Squeak/Cuis/Whatever.. on smartphones and touchscreen tablets.
>
> I would like to have a solution across X11, vm-display-fbdev, MacOS,
> Windose, et al.
>
> I think we need to approach from both ends of the problem: user gesture
> recognition and hardware driver/library availability.  What the user
> sees, and what the VM sees and delivers.
>
> My thought is to start by looking at what convergence in thinking and
> APIs has already been done.
>
> A useful set of user gestures is captured in
>    https://static.lukew.com/TouchGestureGuide.pdf
>
> One description of basic touch event properties is
>    https://developer.mozilla.org/en-US/docs/Web/API/Touch
>
> For low level vm event input on Linux, I am looking at libinput, which
> is used by Wayland (the successor to X11, though libinput is also
> usable under X11); a rough sketch of reading touch events with it
> follows after these links:
>    https://wayland.freedesktop.org/libinput/doc/latest/index.html
>
> How gestures are recognized on Android:
>    https://developer.android.com/training/gestures
>
> Gestures recognized using libinput:
>    https://github.com/bulletmark/libinput-gestures
>
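> To make the libinput idea above concrete, here is a minimal, untested
> sketch of a touch event pump.  The context setup
> (libinput_udev_create_context / libinput_udev_assign_seat) is omitted,
> and deliverTouchToImage(), screenWidth, and screenHeight are
> hypothetical names standing in for whatever hooks the VM actually uses:
>
>   /* Minimal sketch: drain pending libinput events and forward the
>      touch ones.  'li' is an already-configured struct libinput *. */
>   #include <errno.h>
>   #include <fcntl.h>
>   #include <unistd.h>
>   #include <libinput.h>
>
>   extern void deliverTouchToImage(int kind, int slot, double x, double y);
>   extern int screenWidth, screenHeight;   /* hypothetical globals */
>
>   static int open_restricted(const char *path, int flags, void *data)
>   { int fd = open(path, flags); return fd < 0 ? -errno : fd; }
>   static void close_restricted(int fd, void *data)
>   { close(fd); }
>
>   /* Pass this interface to libinput_udev_create_context(). */
>   const struct libinput_interface sqTouchInterface = {
>     .open_restricted  = open_restricted,
>     .close_restricted = close_restricted,
>   };
>
>   void pumpTouchEvents(struct libinput *li)
>   {
>     struct libinput_event *ev;
>     libinput_dispatch(li);
>     while ((ev = libinput_get_event(li)) != NULL) {
>       enum libinput_event_type t = libinput_event_get_type(ev);
>       if (t == LIBINPUT_EVENT_TOUCH_DOWN
>        || t == LIBINPUT_EVENT_TOUCH_MOTION
>        || t == LIBINPUT_EVENT_TOUCH_UP) {
>         struct libinput_event_touch *tev = libinput_event_get_touch_event(ev);
>         int slot = libinput_event_touch_get_seat_slot(tev); /* per-finger id */
>         double x = 0, y = 0;
>         if (t != LIBINPUT_EVENT_TOUCH_UP) {  /* UP carries no coordinates */
>           x = libinput_event_touch_get_x_transformed(tev, screenWidth);
>           y = libinput_event_touch_get_y_transformed(tev, screenHeight);
>         }
>         deliverTouchToImage(t, slot, x, y);
>       }
>       libinput_event_destroy(ev);
>     }
>   }
>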
> I am just getting oriented, but if we can largely agree on gesture
> usage in the UI and what gesture events the VM delivers, I suspect the
> implementation convergence details will get worked out as we
> experiment/prototype.
>
> My tummy is now full and I need to digest and think about this..
> -KenD
>
>
> On 2020-09-19 00:16, Beckmann, Tom wrote:
> > Hi all,
> >
> > thank you Ken for bringing this up.
> >
> > I'll go ahead and share my thoughts thus far. Maybe you could check,
> > against the respective APIs you use/know of, whether this protocol
> > looks like something that would be compatible.
> >
> > VM Side
> > -----------
> >
> > In [1] is the definition of the VM-side sqTouchEvent. Compared to,
> > for example, the JavaScript/browser API, we would not be able to
> > represent the touch area fields radiusX, radiusY, rotationAngle, and
> > force [2] with this definition.
> >
> > I noticed that there is a sqComplexEvent [3] that appears to have been
> > used for touch events on the iPhone. While the constant for
> > EventTypeComplex is defined in my image, I see no code handling this
> > type of event. I'd be very curious to learn how the objectPointer was
> > handled on the image-side. This may also be an option for us to
> > support more properties such as the touch area.
> >
> > Looking at the properties provided by the iPhone API [4], I would
> > prefer to derive some of those on the image-side (e.g.
> > phase=stationary or tapCount). The current implementation in [5] seems
> > to also bundle all active touch points in each event (not quite sure
> > about this, since it also assigns a single phase as the event type?);
> > I'd be more in favor of identifying ongoing touch sequences of one
> > finger via an identifier. With XInput2, the sequence field is a simple
> > integer that increments each time a finger touches the screen anew
> > (see the XInput2 sketch below).
> > One more consideration for the info provided by the VM: the iPhone API
> > also provides info on the device type [6], which I think could be an
> > interesting addition, allowing us to react appropriately to stylus
> > input. This may then also require us to provide not only radius
> > information but also the tilt angle of the pen.
> >
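> > As a concrete illustration, here is a rough, untested sketch of an
> > XInput2 (>= 2.2) handler that maps XI_TouchBegin/Update/End onto such
> > phase + sequence pairs. queueSqTouchEvent() and the numeric phase
> > codes are hypothetical placeholders; xi_opcode is assumed to come
> > from XQueryExtension, and touch events to have been selected on the
> > window via XISelectEvents:
> >
> >   #include <X11/Xlib.h>
> >   #include <X11/extensions/XInput2.h>
> >
> >   extern int xi_opcode;   /* major opcode from XQueryExtension */
> >   extern void queueSqTouchEvent(int phase, int sequence,
> >                                 double x, double y);
> >
> >   static void handleGenericEvent(Display *dpy, XEvent *ev)
> >   {
> >     XGenericEventCookie *cookie = &ev->xcookie;
> >     if (cookie->type != GenericEvent || cookie->extension != xi_opcode
> >         || !XGetEventData(dpy, cookie))
> >       return;
> >     XIDeviceEvent *xev = (XIDeviceEvent *)cookie->data;
> >     switch (cookie->evtype) {
> >       case XI_TouchBegin:   /* detail is the per-stroke touch id */
> >         queueSqTouchEvent(0 /* begin */, xev->detail,
> >                           xev->event_x, xev->event_y);
> >         break;
> >       case XI_TouchUpdate:
> >         queueSqTouchEvent(1 /* update */, xev->detail,
> >                           xev->event_x, xev->event_y);
> >         break;
> >       case XI_TouchEnd:
> >         queueSqTouchEvent(2 /* end */, xev->detail,
> >                           xev->event_x, xev->event_y);
> >         break;
> >     }
> >     XFreeEventData(dpy, cookie);
> >   }
> >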
> > The properties I would currently see of interest to us:
> > 1. event type (EventTypeTouch)
> > 2. timestamp
> > 3. x coordinate
> > 4. y coordinate
> > 5. phase/touch type (begin,update,end,cancel)
> > 6. sequence (identifier for continuous events for the same finger
> > stroke)
> > 7. windowIndex (for host window plugin)
> > 8. radiusX
> > 9. radiusY
> > 10. rotationAngle ("Returns the angle (in degrees) that the ellipse
> > described by radiusX and radiusY must be rotated, clockwise, to most
> > accurately cover the area of contact between the user and the
> > surface." [2])
> > 11. force
> > 12. tiltX
> > 13. tiltY
> > 14. deviceType (touch,pen)
> >
> > It could be considered to make the interpretation of fields 8 and 9
> > depend on the deviceType and thus merge the radius and tilt fields.
> > In practice, field 6 would likely turn into an objectPointer, as for
> > the ComplexEvent, and bundle the fields >= 8; a rough sketch follows
> > below.
> >
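> > As a rough sketch (my own, not the actual sq.h definition) of how the
> > properties above could be laid out, with the extended fields bundled
> > behind a pointer as in the ComplexEvent idea; the types are
> > simplified plain C and the phase/deviceType encodings are
> > placeholders:
> >
> >   typedef struct sqExtendedTouchProperties {   /* fields 8-14 */
> >     double radiusX;        /* touch ellipse radii, in pixels */
> >     double radiusY;
> >     double rotationAngle;  /* degrees, clockwise, as in the DOM Touch API */
> >     double force;          /* 0.0 .. 1.0 */
> >     double tiltX;          /* stylus tilt in degrees; 0 for plain touch */
> >     double tiltY;
> >     long   deviceType;     /* e.g. 0 = touch, 1 = pen -- encoding TBD */
> >   } sqExtendedTouchProperties;
> >
> >   typedef struct sqExtendedTouchEvent {        /* fields 1-7 */
> >     long type;             /* EventTypeTouch */
> >     unsigned long timeStamp;
> >     long x;                /* window-relative position */
> >     long y;
> >     long phase;            /* begin, update, end, cancel -- encoding TBD */
> >     long sequence;         /* identifies one continuous finger stroke */
> >     long windowIndex;      /* for the host window plugin */
> >     sqExtendedTouchProperties *extended;  /* or an image-side objectPointer */
> >   } sqExtendedTouchEvent;
> >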
> > Image Side
> > ---------------
> > I have not invested much thought in the image-side handling just yet.
> > The suggestion of mapping touches to multiple hands sounds sensible. I
> > would assume we would still call our touch events mouse events, so
> > that existing handler code keeps working on a touch-only device? The
> > touch event classes could then also simply extend the existing mouse
> > event classes.
> >
> > An alternative could be to check for the implementation of
> > touchBegin:/Move:/End: methods on the receiving Morph, similar to
> > `Morph>>#wantsStep`, but I would prefer not to. I think treating touch
> > and mouse as synonymous for most purposes avoids a lot of confusion
> > for the user. I might be wrong though :)
> >
> > In terms of what would break with this implementation: I have on
> > various occasions written event handling code that remembers the
> > lastX/Y position of a mouseMove: event to, for example, paint a
> > stroke. This would no longer work with multiple hands sending
> > interleaved events to the same Morph. I suppose relying on
> > MouseMoveEvent's startPoint and endPoint could be a better pattern
> > here. It will also be interesting to see how our current keyboard
> > focus system will cope.
> >
> > Looking forward to reading your thoughts! If you feel like this is
> > appropriate, please also include the squeak-dev list in your reply.
> >
> > Best,
> > Tom
> >
> > (please excuse that I linked to my fork each time, the only changes to
> > upstream are in the X11 event plugin and sq.h)
> > [1] https://github.com/tom95/opensmalltalk-vm/blob/xi-experiment/platforms/Cross/vm/sq.h#L489
> > [2] https://developer.mozilla.org/en-US/docs/Web/API/Touch
> > [3] https://github.com/tom95/opensmalltalk-vm/blob/xi-experiment/platforms/Cross/vm/sq.h#L568
> > [4] https://developer.apple.com/documentation/uikit/uitouch/phase
> > [5] https://github.com/tom95/opensmalltalk-vm/blob/xi-experiment/platforms/iOS/vm/iPhone/Classes/sqSqueakIPhoneApplication+events.m#L145
> > [6] https://developer.apple.com/documentation/uikit/uitouch/touchtype
> > ________________________________________
> > From: ken.dickey at whidbey.com <ken.dickey at whidbey.com>
> > Sent: Monday, September 14, 2020 5:15:17 PM
> > To: Beckmann, Tom
> > Cc: Tonyg; Eliot Miranda; Ken.Dickey at whidbey.com
> > Subject: Touch/MultiTouch Events
> >
> > Tom,
> >
> > I noticed your recent post on "cellphone responds to touchscreen".
> >
> > Tony Garnock-Jones has gotten vm-display-fbdev up on postmarketOS
> > (Alpine Linux) and I was wondering about getting touch event gestures
> > working using libevdev.
> >
> > Early days, but perhaps we can share some thoughts about
> > InputSensor/EventSensor and gesture strategy?
> >
> > -KenD
>