[Morphic] Morph 3.0

Juan Vuletich juan at jvuletich.org
Fri Jun 22 01:49:58 UTC 2007


Hi Jerome,

Jerome Peace wrote:
> --- Juan Vuletich <juan at jvuletich.org> wrote:
>
>   
>> Hi Jerome!
>>
>> I'm so glad you're looking at my work! 
>>
>> Jerome Peace wrote:
>>     
>>> Hi Juan,
>>>
>>> Two thoughts have just come together and I am excited.
>>>
>> :)
>>
>>> From Morphic 3.0 I was looking at the ivars for
>>> Location and comparing them to problems I've been
>>> working through with making polygons tilt just like
>>> rectangles and ellipses.
>>>
>>> The problem I noticed with Location was that you had
>>> angle but no scale.
>>>
>>> From my work with polygons I know that angle and scale
>>> are natural partners.
>>>
>> Well, having scale means that somewhere you also have
>> the natural size, i.e. the size when scale is 1. I think
>> that rotating an object is something natural in the real
>> world, but applying a scale factor is not.
>>
>
> Hmm. What I am using for inspiration is the contrast
> between the yellow growth handle on morphs and the
> outer handle on a star morph.
>
> With a star, as you use the outer handle, the
> invariant is the shape. The parts stay the same
> relative to each other but the scale and the angle may
> both change.
>
> This is a really useful transformation. It allows you
> to have all sorts of objects of similar shape but
> different scale and orientation.
>
> It is done by matching the scale and angle to the
> polar coordinates of the vector from center to outer
> handle.
>
>   
Yes, it is very cool.
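
To make that concrete: the manipulation is just the polar decomposition of 
the drag vector. Here is a small sketch in plain Python (not Squeak or M3 
code; all names are invented for illustration):

import math

def rotation_and_scale(center, handle_start, handle_now):
    """Interpret a handle drag as a combined rotation + uniform scale.

    center       -- the morph's reference point (x, y)
    handle_start -- handle position when the drag began
    handle_now   -- current handle position

    Returns (angle delta in radians, scale factor), i.e. the polar
    coordinates of the drag relative to the starting vector.
    """
    sx, sy = handle_start[0] - center[0], handle_start[1] - center[1]
    nx, ny = handle_now[0] - center[0], handle_now[1] - center[1]
    start_len = math.hypot(sx, sy)
    if start_len == 0:
        raise ValueError("handle must start away from the center")
    angle_delta = math.atan2(ny, nx) - math.atan2(sy, sx)
    scale = math.hypot(nx, ny) / start_len
    return angle_delta, scale

# Dragging the handle from (10, 0) to (0, 20) around the origin
# rotates by 90 degrees and doubles the size:
print(rotation_and_scale((0, 0), (10, 0), (0, 20)))   # (1.5707..., 2.0)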
> The yellow growth handle on the other hand affects the
> aspect ratio of x and y coordinates.  If left
> completely unfettered you would have a morph transform
> (around its reference point) in response to the x and
> y magnitude (and sign) of the vector from reference
> point to yellow handle.  
>
> The two interpretations of the vector are mutually
> exclusive though they can overlap. You could constrain
> the growth handle to just scale (as the orange handle
> does). 
>
>   
I believe it is better to have separate handles for rotation and 
"scale". With the star outer handle it is hard to do just rotation or 
just rescale. But I believe the default should be to keep the aspect 
ratio. Perhaps a modifier key could allow changing it.
> With the yellow handle presently there are limits put
> on the range of growth and shrinkage.
>
> The blue handle is similar to the star's outer handle
> but constrains the rotation of morphs to maintain the
> same size. A rotated morph can only be scaled (orange
> handle); it can no longer be stretched. 
>
>  
>   
Well, in current Morphic a morph can be both scaled and stretched. I 
think this is confusing. I believe the default operation is scaling. 
That is what the yellow handle does in Morphic 3: it changes the space 
the morph occupies in its owner, but does not change the morph's real 
(or internal) size, that is, its own coordinate system.

Only some special morphs (currently none) will support scrolling and a 
new operation that zooms the morph's contents, but not its external 
size, i.e. the extent in the location (that is what makes scrolling 
necessary). On those morphs, there will be yet another new operation 
that changes both the external extent and the internal scale factor, to 
keep the apparent size of the contents constant. Think of a text editor, 
where resizing it does not change the size of the text. Forgive me if 
I'm not clear enough on this. I'll write a TWJ (The Weekly Juan) on this 
when it starts working. This is related to 
http://www.jvuletich.org/issues/Issue0009.htm but my ideas have changed 
quite a bit since then, thanks to feedback from Bert and Tim.
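
A rough way to see the two resize flavours, as a toy model in plain Python 
(invented names, not the actual M3 protocol): a plain resize changes only 
the extent used in the owner, a zoom changes only the internal scale, and 
the third operation changes both so that the apparent size of the contents 
stays constant.

class ScrollableMorphSketch:
    """Toy model (not M3 code) of a morph with an external extent in its
    owner and an internal coordinate range for its contents."""

    def __init__(self, external_extent, internal_range):
        self.external_extent = external_extent   # size used in the owner
        self.internal_range = internal_range     # content units visible

    def apparent_unit_size(self):
        # How big one internal unit (e.g. one line of text) looks on screen.
        return self.external_extent / self.internal_range

    def resize(self, new_extent):
        # Plain resize: the contents are magnified or shrunk with the morph.
        self.external_extent = new_extent

    def zoom(self, factor):
        # Zoom: external size untouched, contents look bigger or smaller,
        # which is exactly what makes scrolling necessary.
        self.internal_range /= factor

    def resize_keeping_content_size(self, new_extent):
        # Change extent and visible range together, so apparent_unit_size()
        # stays constant: a text editor that shows more text, not bigger text.
        self.internal_range *= new_extent / self.external_extent
        self.external_extent = new_extent

m = ScrollableMorphSketch(external_extent=300.0, internal_range=30.0)
before = m.apparent_unit_size()           # 10.0
m.resize_keeping_content_size(600.0)
assert m.apparent_unit_size() == before   # still 10.0, but 60 units visible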
>> What would be the meaning for a scale factor in M3
>> morphs?
>>     
>
> angle and scale would belong to the location.
>
> xaspect and yaspect would not belong to location but
> be a parameter of a Cartesian coordinate system.
> In my conception I would have a class of orthogonal
> coordinates which would have points coded in the range
> -1 <= coord <= +1, then multiplied by each coord's aspect.
>
> So the aspects could both be 0.5 and you would get
> your unit square. Or they could be some other aspect
> ratio.
>   
But doing this you restrict yourself to Cartesian coordinate systems, in 
particular to [-1 .. 1] x [-1 .. 1]. To be as general as possible on 
this, I allow nonlinear coordinate systems too. I want to use "real 
world" coordinates. For example, if I have a morph that represents the 
map of Argentina, the "x" (longitude) could go from 80 west to 48 west, 
and the "y" (latitude) could go from 56 south to 20 south or something. 
This coordinate system (the one that is good for plotting a map of 
Argentina) is specified by the bounds of the coordinates (the valid 
coordinate values) and the transformation it does to a Cartesian system, 
i.e. the kind of projection. If you see it this way, it doesn't make 
sense to modify the coordinate system just to stretch the map. The 
coordinate system is exactly the same. All that changes is the extent it 
will use in its owner (the sheet of paper where it is drawn). That's why 
I consider the coordinate system something intrinsic to the morph, part 
of its definition, and I consider the location an "accident": something 
that happens to be the way it is, but could change without affecting the 
essence of the morph.
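
To put the map example in code, a minimal sketch in plain Python 
(hypothetical helpers, not the M3 classes): the coordinate system is 
defined once by its real-world bounds plus a projection, and stretching 
the map only changes the extent used in the owner.

def make_coordinate_system(x_min, x_max, y_min, y_max,
                           project=lambda x, y: (x, y)):
    """The morph's intrinsic coordinate system: the valid coordinate
    bounds plus the transformation (projection) to a Cartesian system."""
    def to_unit(x, y):
        # Normalise against the projected corners; for a nonlinear
        # projection this is a simplification, but it shows the idea.
        px, py = project(x, y)
        x0, y0 = project(x_min, y_min)
        x1, y1 = project(x_max, y_max)
        return (px - x0) / (x1 - x0), (py - y0) / (y1 - y0)
    return to_unit

# Argentina-ish bounds: longitude 80 W .. 48 W, latitude 56 S .. 20 S.
argentina = make_coordinate_system(-80.0, -48.0, -56.0, -20.0)

def to_owner(coord_system, x, y, extent):
    # The "accidental" part: the extent the morph uses in its owner.
    # Stretching the map changes only extent, never the coord system.
    u, v = coord_system(x, y)
    return u * extent[0], v * extent[1]

# Buenos Aires is near 58.4 W, 34.6 S; draw the map in 320 x 360 owner units:
print(to_owner(argentina, -58.4, -34.6, (320, 360)))   # about (216, 214)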
> In manipulating the morph you could put a yellow
> handle to adjust (stretch) the aspect ratio of the
> coord system. Circles become ellipses. Squares become
> rectangles. The yellow handle growth would not affect
> the angle and it would always (for cartesian coords)
> apply to the coord system. Furthermore you could honor
> negative aspects as directions to reflect one or more
> axes. 
>
>   
>>> xextent and yextent are not substitutes. Indeed they
>>> cause ambiguity because unless you look at the works
>>> you wonder if you turn the morph first then apply the
>>> extent scalers or the other way around.
>>>
>> In my opinion, rotating an object does not change width
>> and height. Let's suppose you have a morph with angle
>> zero. You adjust the width to be w1 and the height to be
>> h1. The width w1 is measured along the X axis of the owner
>> and the height h1 is measured along the Y axis of the
>> owner. Then we rotate the morph, for example, by 30
>> degrees. If we measure the extent along the X axis of the
>> owner, it is no longer w1, and the extent along the Y axis
>> is no longer h1. However, these are not the width and
>> height of the morph, which are the same as before. In
>> other words: my height is about 5'6". If I lie on a bed,
>> my height is not 1'. It is the same as before. Do you agree?
>>
>
> It would depend on whether you had this-end-up stamped
> on your forehead or along your left arm. :-)
>   
Sure, but that wouldn't change just because I lie down!
> In the reference system you take with you to bed it's
> one way. In the reference system gravity supplies to
> the world it's the other. And looked at from a more
> distant reference it will change in 6 hours' time while
> you sleep.
>
>   
Well, I see a decision is needed here. I believe it is 
simpler to ignore the reference system and just focus on the morph (or 
on my body in this example).
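
The bed example can even be put in numbers. A throwaway sketch in plain 
Python (nothing to do with the M3 classes): rotating a 40 x 20 morph 
changes the extent measured along the owner's axes, while the morph's own 
width and height stay 40 and 20.

import math

def owner_aligned_extent(width, height, angle):
    # Extent of a width x height morph measured along its owner's X and Y
    # axes after rotating the morph by angle (in radians).
    c, s = abs(math.cos(angle)), abs(math.sin(angle))
    return width * c + height * s, width * s + height * c

print(owner_aligned_extent(40, 20, 0.0))                # (40.0, 20.0)
print(owner_aligned_extent(40, 20, math.radians(30)))   # about (44.6, 37.3)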
> The bug in morphic is that the words we use can't
> distinguish.
>
> When I rotate a polygon, its submorphs stay in their
> upright orientation and translate with respect to the
> changing #topLeft boundary point. 
>
> When I rotate an image or a rectangle all the
> submorphs rotate around their refpoints which in turn
> rotate around the rectangle's reference point.
>
> When I set out to fix the difference I came to the
> understanding that rotate and scale, which must
> maintain the aspect ratio of things, differ from
> stretch and reflect, which can change the aspect ratio
> of things but must maintain the orientation.
>   
I agree.
> The way they are done now (with stretch implemented by
> extent:) rotation and stretch are not commutative
> operations. You can see this with polygons most
> easily.
>
> What I've been striving for is elegance and the lack
> of limitations on transform operations. And that
> starts to come when I recognise that I can ask for
> rotate and scale in the same manipulation. Or I can
> ask for stretch/reflect at the same time. But I cannot
> mix the two.
>
>   
I agree, but I believe it is good to distinguish the intrinsics of the 
morph (its extent expressed in its own coordinate system) from the 
accidental (position, angle, "zoom"). I believe you think that rotate 
and scale don't change the morph (that's why the contents rotate and 
scale as a whole). But I don't see how you view stretch / reflect. Do 
they change the morph? Do you think they are operations we can perform 
on any object in the real world?

It would be great if you could experiment with this in M3.
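
On the commutativity point above, the difference is easy to check 
numerically. A tiny sketch in plain Python (no Squeak involved): composing 
a 30-degree rotation with a non-uniform stretch gives different results 
depending on the order, while a uniform scale commutes with the rotation.

import math

def mat_mul(a, b):
    # Multiply two 2x2 matrices given as ((a, b), (c, d)).
    return ((a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]),
            (a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]))

def rotation(theta):
    return ((math.cos(theta), -math.sin(theta)),
            (math.sin(theta),  math.cos(theta)))

def scaling(sx, sy):
    return ((sx, 0.0), (0.0, sy))

R = rotation(math.radians(30))
stretch = scaling(2.0, 1.0)   # non-uniform: changes the aspect ratio
uniform = scaling(2.0, 2.0)   # uniform: keeps the aspect ratio

print(mat_mul(R, stretch) == mat_mul(stretch, R))   # False: order matters
print(mat_mul(R, uniform) == mat_mul(uniform, R))   # True: they commute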
> So, seeing the ivars for location, the lack of the scale
> ivar jumped out, as did the presence of the aspect
> ivars, clearly there to facilitate scale. Aspect
> belongs to the coord system. Even though aspect and
> scale can overlap, achieving generality is eased when
> they are both present and can be treated separately.
>
>   
I hope my view on coordinate systems (i.e. the map of Argentina) above 
is clear enough on this.
>>> What I found was in any given operation you had to
>>> choose rotate and scale or stretch and reflect (that's
>>> what should happen when one of the extent scalers goes
>>> negative).
>>>
>> I don't understand this. Please elaborate.
>>
>>> The other decision I realized from my work with polygons
>>> was that to get polygons to truly rotate (with their
>>> submorphs doing the same thing) I had to collect the
>>> submorphs' reference locations and subject them to the
>>> same transformation as the polygon (rotate and scale or
>>> stretch and reflect).
>>>
>> Well, this is not needed in M3, as the location is always
>> relative to the owner. If you change the ivar angle in the
>> location in the owner, all the submorphs rotate correctly.
>> You can try it yourself in my image, with the halo.
>>
>
> I did. That's why I know you are on the right track.
>
>   
:)
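
The "location relative to the owner" idea is what makes this work without 
touching the submorphs, and a few lines are enough to see why. Again plain 
Python with invented names, only meant to mirror the structure, not M3 
itself: each morph stores just its own position and angle relative to its 
owner, and screen positions are obtained by composing up the ownership 
chain, so rotating the owner carries every submorph along for free.

import math

class LocationSketch:
    """A morph's location relative to its owner: translation + rotation.
    (Invented stand-in, not the real M3 Location class.)"""

    def __init__(self, x, y, angle=0.0, owner=None):
        self.x, self.y, self.angle, self.owner = x, y, angle, owner

    def to_world(self, px, py):
        # Map a point from this morph's coordinates to its owner's,
        # then keep going up the ownership chain.
        c, s = math.cos(self.angle), math.sin(self.angle)
        wx = self.x + c * px - s * py
        wy = self.y + s * px + c * py
        return self.owner.to_world(wx, wy) if self.owner else (wx, wy)

owner = LocationSketch(100, 100)
child = LocationSketch(10, 0, owner=owner)   # stored relative to owner only

print(child.to_world(0, 0))      # (110.0, 100.0)
owner.angle = math.radians(90)   # rotate just the owner...
print(child.to_world(0, 0))      # about (100.0, 110.0): the child follows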
> I believe there is a very easy way of doing morphic
> which people didn't see at first. We can see farther
> because we stand on the shoulders of giants. :-)
>
> Yours in curiosity and service, --Jerome Peace
>   
Indeed. BTW, John Maloney himself said (in the SqueakNews interview) 
that rotation and flexing were added as afterthoughts, and that a 
reimplementation where each morph defined its own coordinate system was 
needed.

Cheers,
Juan Vuletich

