[squeak-dev] Squeak/Pharo on touchscreen: requesting opinions

Dimitry Golubovsky golubovsky at gmail.com
Thu Aug 18 18:40:20 UTC 2011


As I am progressing through the Stack Cog port for Android (basically
I can load a recent PharoCore or Squeak image, although some stability
issues still exist), I would like to collect opinions from prospective
users about which is the better way to interact with Squeak (meaning
Pharo and others as well) on a touch screen.

A 9" tablet screen is not much smaller than, say, an EEE PC netbook's,
so at 800x480 px resolution menus and other Squeak GUI elements are
quite readable and can easily be pointed at with a stylus. So the
Squeak environment itself can be used even on my rather cheap device.

Squeak requires a lot of mousing to interact with. On an Android
device there is a touch screen and possibly several hardware buttons.
In the current implementation, taps on the screen are treated as "red"
clicks, and pressing one of the hardware buttons prior to tapping the
screen changes the click color for just that one click: see this picture:


That is, in order to get a context menu in Squeak, one presses the
"Yellow" hardware button (in PharoCore it is the "Blue" button) and
then taps the screen; the menu appears.
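The "click color held for one tap" behavior can be sketched as a small
latch. This is only an illustration, not the port's actual code; the
class and method names are hypothetical, while the button-mask bits
(red=4, yellow=2, blue=1) follow the classic Squeak convention.

```java
// Hypothetical sketch: a tap normally reports a red click, but a
// hardware button press arms a different color for the NEXT tap only.
public class ClickColorLatch {
    public static final int RED = 4, YELLOW = 2, BLUE = 1;

    private int nextColor = RED;      // default: taps are red clicks

    // Called when a hardware button is pressed before a tap.
    public void armColor(int color) {
        nextColor = color;
    }

    // Called on a screen tap; returns the button mask to report to
    // the image, then falls back to red for subsequent taps.
    public int tap() {
        int mask = nextColor;
        nextColor = RED;              // the latch holds for one click only
        return mask;
    }
}
```

So pressing the "Yellow" button and then tapping would report mask 2
(a yellow click), and the very next plain tap is red again.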

See the whole explanation at

This is how the classic Squeak VM port (based on Andreas' code) works
now, and I have the same working in Stack Cog.

Advantages of this method: it is close to the way traditional mobile
applications interact. In a Squeak application, a Morph has to set up
proper handling of red button clicks in order to enable interaction
with the user. Morph drag can be done with one hand (however, for a
reason I could not explain, morphs disappear from the screen while
being dragged and reappear at their new location only when the tap is
released).

Disadvantages: the current mouse position is unknown/invisible to the
user, and mouse-over is impossible (hence no balloon help). Also, due
to high touchscreen sensitivity, holding a finger/stylus on the screen
may generate many events within a short period of time, since even a
one-pixel change in touch position causes an event to be generated,
and some involuntary finger movements always take place. Such
frequently reported events may "choke" the interpreter, given the CPU
is slow, and Android OS quickly kills the application as unresponsive.
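One obvious way to tame this event flood (an idea for illustration,
not something the port currently does) is to drop move events that
stay within a small jitter threshold of the last reported position:

```java
// Hypothetical move-event filter: swallow touch moves that differ from
// the last forwarded position by less than a threshold, so involuntary
// one-pixel jitters never reach the interpreter.
public class MoveFilter {
    private final int threshold;
    private int lastX = Integer.MIN_VALUE / 2;  // force first move through
    private int lastY = Integer.MIN_VALUE / 2;

    public MoveFilter(int thresholdPx) {
        this.threshold = thresholdPx;
    }

    // Returns true if the move should be forwarded to the image.
    public boolean accept(int x, int y) {
        if (Math.abs(x - lastX) < threshold
                && Math.abs(y - lastY) < threshold) {
            return false;             // jitter: drop the event
        }
        lastX = x;
        lastY = y;
        return true;
    }
}
```

With a threshold of a few pixels, a resting finger generates no events
at all, while deliberate drags still get through.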

While I am trying to address these issues in various ways, there seems
to be another way to interact via the touchscreen. This method has not
been implemented yet, and I would like to hear from the community
whether it would be good to have.

A mouse pointer is displayed as part of the activity interface (maybe
even done entirely in Java, so mouse movement itself will not put any
load on the interpreter). Finger movements on the screen move the
pointer, but the new position is reported (with button mask 0) only
when the screen tap is released. So mouse-over becomes possible: just
leave the pointer where needed. The hardware buttons are used as
before (one becomes Red, another Blue, and a chord would be Yellow),
but clicks are reported only when those buttons are pressed or
released*. To drag a Morph, one would have to hold one of the hardware
buttons and slide their finger/stylus on the screen.
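The event discipline of this hypothetical mode could look roughly like
the sketch below (all names invented for illustration): finger motion
only updates a Java-side pointer, the image sees a position report on
finger-up, and button transitions are reported at the current pointer
position.

```java
// Hypothetical trackpad-like mode: each int[] {x, y, mask} stands for
// one event forwarded to the interpreter.
public class TrackpadMode {
    private int x, y;   // Java-side pointer; moving it costs the image nothing
    private final java.util.List<int[]> reported = new java.util.ArrayList<>();

    // Finger slides on the screen: move the pointer, report nothing.
    public void fingerMove(int dx, int dy) {
        x += dx;
        y += dy;
    }

    // Finger lifts: report the new position with button mask 0,
    // which is what makes mouse-over (balloon help) possible.
    public void fingerUp() {
        reported.add(new int[] { x, y, 0 });
    }

    // Hardware button transition: report a click at the pointer
    // position (mask on press, 0 on release).
    public void hardwareButton(int mask, boolean down) {
        reported.add(new int[] { x, y, down ? mask : 0 });
    }

    public java.util.List<int[]> reportedEvents() {
        return reported;
    }
}
```

Note that an arbitrarily long slide produces exactly one event for the
interpreter, which is where the stability gain would come from.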

So this hypothetical mode is more like the way laptop trackpads work.
One obvious advantage of this method is less load on the interpreter
in terms of event frequency, and hence greater stability, although at
the cost of some convenience. Again, to run an end-user application,
the former mouse-tracking method could be re-enabled.

I am asking for your opinion here (as much of an opinion as can be had
about something only imaginary): is the latter method worth
implementing, and would you use it if it were available?


* Or long taps could be recognized as red clicks.

Dimitry Golubovsky

Anywhere on the Web
