BitBlt endianness

tim Rowledge tim at rowledge.org
Sat Jun 17 16:13:00 UTC 2006


On 17-Jun-06, at 8:12 AM, Gerardo Richarte wrote:

> tim said:
>
>> On all current platform VMs there are functions to handle the pixel
>> transformation and copying transparently without any user involvement.
>> What are you doing that needs to handle endianness?
>
> 	I guess that's the main problem. I should have said it before: the
> question is related to SqueakNOS, so I guess I'm implementing a new
> platform.

That does make a bit of a difference!


> The native side answers ioScreenDepth() with one of 32, 16 or 8.
> Then DisplayScreen>>startUp fixes Squeak's Display depth to match it:
> Display>>setExtent:depth: calls Display>>supportsDisplayDepth:, which
> will be true only for a single depth (the native depth). So there is
> one single depth across the system (the native one).
The supportsDisplayDepth: stuff is a way to let the image know which
depths you can support; on a direct frame buffer device I'd suggest
declaring that you only support the frame buffer depth. You could of
course support several if your device can be tweaked.

Take a look at some of the implementations on other platforms to see
how they handle it. I think Windows supports a variety of -ve depths
as well as the usual 4/8/16/24/32 values; RISC OS only supports +ve
values. Does the unix fb stuff not help as well?
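
To make that concrete, a minimal sketch of the native-side hooks in
C, assuming a framebuffer fixed at 16bpp. ioScreenDepth() and
ioHasDisplayDepth() are the usual platform interface functions;
FB_DEPTH is just an illustrative stand-in for however you discover
the video mode:

    /* Illustrative stand-in for the probed video mode depth. */
    #define FB_DEPTH 16

    /* Depth the image should adopt for Display at startup. */
    int ioScreenDepth(void)
    {
        return FB_DEPTH;
    }

    /* Answer true only for depths the hardware can really show.
       By convention a negative depth means little-endian pixels. */
    int ioHasDisplayDepth(int depth)
    {
        return depth == FB_DEPTH || depth == -FB_DEPTH;
    }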


>
> 	I'm developing on Intel PCs, with what I think is direct video  
> memory access (frame buffer?).
>
> 	I know all this:
>
> If depth is 32, a simple copy from Squeak's Display to the screen  
> is perfect, same colors as linux squeak.
So it seems clear that the device's 32bpp mode is known and agrees
with the Squeak format.
>
> If depth is 16, a simple copy will swap odd and even pixels, and  
> use a "blue palette" for colors.
You did set the device up to expect 16bpp first, didn't you? :-)
>
> If depth is 16 and I turn the pixels around during the copy
> (exchange the 2 lower bytes with the 2 higher bytes), the shape of
> the screen image is fine, but the colors still look weird.
>
> If depth is 16 and I swap the bytes around (change the byte
> endianness of each 32-bit word), it looks a little bit better, but
> the colors are still weird.
You need to find documentation of the expected pixel formats.
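
To make the 16bpp pixel-order problem concrete: Squeak packs two
16bpp pixels per 32-bit word with the leftmost pixel in the high
halfword, so on a little-endian framebuffer each pair comes out
swapped unless you rotate the word. A sketch of a corrected copy loop
(the names are made up, not anything from the VM tree). Note this
fixes only the pixel order; if the colours stay weird it may well be
because Squeak's 16bpp pixels are x-5-5-5 RGB while many PC modes
expect 5-6-5:

    #include <stdint.h>

    /* Copy one row of 16bpp pixels from the Squeak Display bitmap to
       a little-endian framebuffer, rotating each 32-bit word by 16
       bits to undo the odd/even pixel swap. */
    static void copy16Swapped(uint32_t *dst, const uint32_t *src,
                              int nWords)
    {
        int i;
        for (i = 0; i < nWords; i++) {
            uint32_t w = src[i];
            dst[i] = (w >> 16) | (w << 16);
        }
    }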

>
> The colors may be just a problem with the palette, although it could
> be something else (like bit ordering, which I doubt).
If you have a settable palette in the video hardware you should be
able to correct the RGB/BGR/BRG type problems with that. Just create
a suitable mapping.
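
On a bare PC in an 8bpp indexed mode, for instance, that mapping can
be loaded through the standard VGA DAC ports. A sketch, assuming you
have (or write) an outb() port-output primitive in your native code:

    #include <stdint.h>

    /* Load one palette slot into the VGA DAC. r/g/b are 0-255 and
       get scaled down to the DAC's 6 bits per channel. outb(port,
       value) is assumed to exist elsewhere in your native code. */
    void setPaletteEntry(uint8_t index, uint8_t r, uint8_t g, uint8_t b)
    {
        outb(0x3C8, index);   /* select the DAC write index */
        outb(0x3C9, r >> 2);  /* red: 6 significant bits    */
        outb(0x3C9, g >> 2);  /* green                      */
        outb(0x3C9, b >> 2);  /* blue                       */
    }
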
>
> Also, similar things happen in 8-bit mode (if straight copied, 4
> pixels are swapped and the colors are wrong; if byte-swapped, just
> wrong colors).
>
> 	I have two different questions: I know how to solve the pixel  
> ordering (swapping).
It's not hard for 8/16/32 bpp but gets more interesting for 4/2/1 -
take a look at my code in the RISC OS tree for details of how I
managed it.
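
For 8bpp the same idea is a full byte reversal of each 32-bit word; a
sketch in the same spirit as the 16bpp loop above:

    #include <stdint.h>

    /* Reverse the four bytes of each word so that four 8bpp pixels
       land in the right left-to-right order on a little-endian
       framebuffer. */
    static void copy8Swapped(uint32_t *dst, const uint32_t *src,
                             int nWords)
    {
        int i;
        for (i = 0; i < nWords; i++) {
            uint32_t w = src[i];
            dst[i] = (w >> 24) | ((w >> 8) & 0x0000FF00)
                   | ((w << 8) & 0x00FF0000) | (w << 24);
        }
    }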

> Any idea what's going on with the colors? And then: I don't think
> the best way to solve this is to swap the bytes around when
> rendering, but better just to tell BitBlt to always use a different
> representation, so bytes are never swapped (for performance). Do you
> think this is true? And how do I do it?
It ought to work just by using a -16 depth (for example), though as
I've previously mentioned it never did for RISC OS.
As long as you are copying from Display to the actual display you can
correct the format any way that works for you, and you never have to
worry about converting back since the Squeak Display is left alone.
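
If the image does run Display at the negative depth, the payoff is
that all the per-word swapping drops out of the screen update;
roughly (again, the names are mine):

    #include <string.h>
    #include <stdint.h>

    /* With Display at depth -16 the image already stores pixels in
       little-endian order, so pushing the bits to the hardware
       framebuffer is a plain copy. */
    void showDisplayBits(uint8_t *fb, const uint8_t *displayBits,
                         size_t byteCount)
    {
        memcpy(fb, displayBits, byteCount);
    }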

If you want the Display bitmap to be the actual memory rendered by
the real display hardware then things can get more complicated, and
you need to look into the previously mentioned palette manipulation
or, better yet, more clever tricks that can be done with the video
card. Or you look into the SurfacePlugin to use an external memory
chunk as the Display bitmap....

... or you go completely insane and rewrite the BitBlt to work
little-endian and then rewrite a lot of image code to understand
endianness and pixel format properly. I actually did a little-endian
BitBlt back in the 1.1x days but it would need a lot of updating to
be of any use now. I think that with all the other bits of code that
implicitly rely upon pixel format it would be horribly impractical to
try that route, but you may have different opinions and tolerances
for labour than I do.

Hmph, I've rambled a bit but it *is* early morning. Basically, you
really need to find out the pixel format your video hardware wants,
or better yet find a way to *set* the video hardware to accept
Squeak's format. With such info we can be a lot more helpful.

tim
--
tim Rowledge; tim at rowledge.org; http://www.rowledge.org/tim
ASCII to  ASCII, DOS to DOS.




