lessphic? may be a future for morphic

Igor Stasenko siguctua at gmail.com
Thu Jan 31 20:06:00 UTC 2008


On 31/01/2008, Juan Vuletich <juan at jvuletich.org> wrote:
> >>
> >
> > What I'm against is binding the rendering subsystem to specific
> > hardware. There should be a layer that offers rendering services to
> > the application, and a number of layers to deliver the graphics to
> > the device(s).
> > Ideally, it should be able to render using any device: a screen, a
> > printer, or a remote (networked) canvas.
> > There can also be different options for what the rendering medium
> > is: it's wrong to assume that the rendering surface is planar (it
> > could be a 3D holo-projector, for instance).
> > What is hard is to design such a system to be fast and optimal
> > while still being generic enough to render anywhere.
> >
> >
> >
>
> Morphic 3 is not tied to any hardware! It only assumes the Display in
> Squeak, and it will not be too hard to separate it from the rendering
> engine. Non-planar targets could be addressed by a custom coordinate
> system.
>

I haven't examined your Morphic 3 design closely, but using a Display
(as it is currently represented in Squeak) is exactly what I'm against.
It's like interacting with the hardware directly, bypassing the
software drivers.
Display should represent a device with its own set of capabilities,
with Canvas providing a generic abstraction layer for interacting with
it. Morphs should use a canvas, but assume nothing about the existence
of a Display or Printer or whatever.
Otherwise, once you start using Display, everything soon becomes too
tightly coupled and you start mixing things in one cup: because you
assume that everything you do will be rendered on the Display, you
care less and less about other devices/surfaces, limit the design, and
finally bury it under a heap of optimizations :)
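
To illustrate what I mean, here is a minimal sketch (my own
illustration, not Morphic 3 code; GenericCanvas, DisplayCanvas,
PrinterCanvas and SketchBoxMorph are made-up names, and only the
drawOn: / fillRectangle:color: protocol mirrors what Squeak's Canvas
already offers):

  "Abstract rendering service. Concrete subclasses such as
   DisplayCanvas, PrinterCanvas or RemoteCanvas know about their own
   device; nothing else does."
  Object subclass: #GenericCanvas
      instanceVariableNames: ''
      classVariableNames: ''
      poolDictionaries: ''
      category: 'Sketch-Rendering'

  GenericCanvas>>fillRectangle: aRectangle color: aColor
      "Each device-specific subclass translates this generic request
       into its own drawing operations."
      ^ self subclassResponsibility

  "A morph draws only against the canvas protocol and never mentions
   Display, so the very same code can render to the screen, a printer,
   or a networked surface."
  SketchBoxMorph>>drawOn: aCanvas
      aCanvas fillRectangle: self bounds color: self color

The point is that the morph's drawing code runs without any reference
to a concrete device; you choose the device by handing it a different
canvas.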

> Cheers,
> Juan Vuletich
>
>


-- 
Best regards,
Igor Stasenko AKA sig.


