[UI] From the text-mode corner.

Blake blake at kingdomrpg.com
Sun Sep 9 20:09:38 UTC 2007


On Fri, 07 Sep 2007 09:48:15 -0700, Bert Freudenberg  
<bert at freudenbergs.de> wrote:

> On Sep 7, 2007, at 2:20 , Blake wrote:
>
>> Just a couple of pennies:
>>
>> It would be swell, at least in my opinion, to ensure that the model  
>> used for the UI was such that the layer immediately under it could be  
>> easily adapted to other UIs, like a text-mode UI, a voice UI or a GUI  
>> based on a different principle than the one finally settled upon.
>
> Do you have examples for this? For GUIs the ToolBuilder framework  
> provides that abstraction, but it wouldn't extend to non-graphical UIs.

Well, what I envision is that a UI--any UI--is connected to actions in the  
program (or in the UI itself, as in, say, the Squeak mouse's eyes  
following the mouse pointer around).

In that sense, controls are controls and events are events. Most UI  
builders I have used do not allow you to swap control A for control B,  
even if control B has all the needed functionality.

As the simplest case, let's say you design a screen that acts as the
control panel of a machine: it has a button and a light. The button
toggles the power. When the power is on, the light goes on. When the
power is off, the light goes off. We have a line coming in from the UI
(powerButtonClicked) and a line going out to the UI (powerStateChanged).

It shouldn't matter what form that screen takes, whether the button is an
animated GIF, some ### characters lined up, or a voice that says "say
'power on' to turn on the power", as long as it can provide the necessary
events.
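A text-mode front end, for instance, would just be another dependent
that hooks into those same two lines -- again, only a sketch, with
made-up names:

  Object subclass: #TextPanel
      instanceVariableNames: 'machine'
      classVariableNames: ''
      poolDictionaries: ''
      category: 'PowerPanel-Example'

  TextPanel >> machine: aMachine
      "Plug this front end into the model and listen for changes."
      machine := aMachine.
      machine addDependent: self

  TextPanel >> powerPressed
      "However the user 'pressed' it -- a keystroke, a typed command --
       it all funnels into the same incoming line."
      machine powerButtonClicked

  TextPanel >> update: anAspect
      "The line coming back out of the model."
      anAspect == #powerStateChanged ifTrue:
          [Transcript show: (machine isPowerOn
              ifTrue: ['### POWER ON ###']
              ifFalse: ['--- power off ---']); cr]

Swapping in a Morphic panel, or a speech one, would mean writing another
small class like that, without touching Machine at all.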

I guess I'm really just advocating for a protocol and a structure that
encourage a very clean separation.
