[squeak-dev] touch-screen interfaces

Casey Ransberger casey.obrien.r at gmail.com
Thu Mar 29 23:25:30 UTC 2012


Top post:

I feel you. 

OTOH, touch really shines in some areas. E.g., in a synth. The UI can be presented identically to real equipment: knobs turned, toggles toggled, etc. Patching mixers and effects is done by dragging cables and plugging them into jacks. If you know how to use the real equipment that's under simulation, you can quickly figure out how to work the simulator without caring much that you're interacting with a computer. Obviously I'm not playing an instrument on the screen, but a hardware keyboard/Variax/etc. can be plugged into the software trivially. Bluetooth would be fast enough.

One problem I do see for us specifically, though, is consensus about what various gestures mean. Another: these small gadgets are pushing a lot of pixels. I hope we can get a bit more flexible about the size of fonts and the surrounding UI. I don't think #bigFonts or whatever is going to stand the test of time.

As far as touch being intrinsically hard to learn and use... I'd say maybe that depends as much on UI design and problem domain as on touch itself (like you said with in-car nav and such).

There's also something to be said for what you're familiar with. Touch definitely still sucks for conventional programming (though I do think stuff like tile scripting works great).

The most underrated feature of touch-based interfaces, IMHO, is how immediately intuitive and accessible they are to kittens:

http://www.youtube.com/watch?v=CdEBgZ5Y46U&feature=youtube_gdata_player

On Mar 29, 2012, at 8:00 AM, Chris Muller <asqueaker at gmail.com> wrote:

>> And the current Squeak interface is made for a mouse. There would have to be a serious UI redesign to make a nice touch-based Smalltalk development environment.
> 
> I think there is a place for touch-screen interfaces:  Like in-car
> navigation systems or airport kiosks, or libraries, etc.
> 
> However, I think they really are a downgrade in user-interface power.
> Most obviously, the power of pointing is gone.  *Pointing* at
> something provides a non-mutative way to interact with another entity
> (person or computer).  You can't do that with an iPad -- you can only
> "click" (i.e., agitate the interface).
> 
> I think this is one reason I find Android / iPad so inherently
> non-intuitive to learn and use.  I can't "hover" over the icons for a
> description of what they do -- I can only click on them and, if it
> isn't what I wanted, frantically look for the "back" button before I
> get signed up for something I didn't want.
> 
> Which leads me straight to the other big problem I've noticed with TS
> interfaces:  Modality.  Regardless whether there are multiple programs
> running, all I ever SEE are people looking/working on one thing at a
> time.  Shit, we've gone back to the days of 8-bit computers with all of
> this modality.
> 
> Also, with touch-screen interfaces, the back of my hand is constantly
> occluding the UI itself and the screen gets finger prints.  This
> reduces immersive effectiveness by reminding you that you're just
> looking and interacting with a *screen*, not the world of objects
> rendered on that screen.
> 
> Apple made touch-screen interfaces "take over the world" even though
> they're bad on several levels.  But, like I said, I think they're
> appropriate for in-car nav or limited-path apps.  Not creative
> development though.
> 
>  - Chris
> 

