BitBlt endianness

Gerardo Richarte gera at corest.com
Sat Jun 17 15:12:00 UTC 2006


Tim said:

> On all current platform VMs there are functions to handle the pixel
> transforming and copying transparently without any user involvement.
> What are you doing that needs to handle endianness?

	I guess that's the main problem. I should have said it before: the question relates
to SqueakNOS, so I guess I'm implementing a new platform. The native side answers
ioScreenDepth() with one of 32, 16 or 8. Then DisplayScreen>>startUp fixes Squeak's
Display depth to match it: Display>>setExtent:depth: calls Display>>supportsDisplayDepth:,
which answers true only for a single depth (the native one). So there is one single depth
across the whole system (the native one).
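
	To make this concrete, the native side boils down to something like the sketch below
(framebufferDepth is only an illustrative name for whatever value the mode-setup code
records; this is not the exact SqueakNOS code):

    /* Sketch only: "framebufferDepth" stands for whatever depth the
       video-mode setup recorded (8, 16 or 32). */
    static int framebufferDepth = 16;

    int ioScreenDepth(void)
    {
        /* DisplayScreen>>startUp fixes Squeak's Display depth to this value */
        return framebufferDepth;
    }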

	I'm developing on Intel PCs, with what I think is direct video memory access (frame buffer?).

	So far I know all this (a rough sketch of the copy variants I mention follows below):

If the depth is 32, a simple copy from Squeak's Display to the screen is perfect: same colors as Squeak on Linux.

If the depth is 16, a simple copy swaps odd and even pixels and shows everything with a "blue palette".

If the depth is 16 and I turn the pixels around during the copy (exchanging the two lower bytes
with the two higher bytes of each 32-bit word), the shape of the screen is fine, but the colors still look weird.

If the depth is 16 and I swap the bytes around (reversing the byte order of each 32-bit word),
it looks a little better, but the colors are still weird.

The colors may just be a palette problem, although it could be something else (like bit ordering,
which I doubt).

Something similar happens in 8-bit mode: with a straight copy, each group of 4 pixels is reversed
and the colors are wrong; with the byte swap, the ordering is right but the colors are still wrong.
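
Roughly, the copy variants above look like this (a sketch only: dst, src and nWords are
illustrative names for the frame buffer, the Display bits and the word count, not the
actual SqueakNOS code):

    #include <stddef.h>
    #include <stdint.h>

    /* straight word copy: looks perfect at depth 32 */
    void copyStraight(uint32_t *dst, const uint32_t *src, size_t nWords)
    {
        for (size_t i = 0; i < nWords; i++)
            dst[i] = src[i];
    }

    /* exchange the two 16-bit halves of each word:
       fixes the odd/even pixel swap at depth 16 (colors still weird) */
    void copySwappingHalfwords(uint32_t *dst, const uint32_t *src, size_t nWords)
    {
        for (size_t i = 0; i < nWords; i++) {
            uint32_t w = src[i];
            dst[i] = (w >> 16) | (w << 16);
        }
    }

    /* reverse the byte order of each word:
       fixes the 4-pixel reversal at depth 8 (colors still wrong) */
    void copySwappingBytes(uint32_t *dst, const uint32_t *src, size_t nWords)
    {
        for (size_t i = 0; i < nWords; i++) {
            uint32_t w = src[i];
            dst[i] =  (w >> 24)
                   | ((w >>  8) & 0x0000FF00u)
                   | ((w <<  8) & 0x00FF0000u)
                   |  (w << 24);
        }
    }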

	I have two different questions. I know how to solve the pixel ordering (the swapping);
any idea what's going on with the colors? And then: I don't think the best way to solve this is
to swap the bytes around while rendering; it seems better to tell BitBlt to always use a different
representation, so that bytes never have to be swapped (for performance). Do you think this is
right? And if so, how do I do it?

	I tried declaring -16 as the supported depth (changing the display primitives), but then
nothing is shown on the screen. And if I make -16 supported by the primitives and then do
Display newDepth: -16; restore, the screen is never redrawn again...

I uploaded a partial screenshot to http://minnow.cc.gatech.edu/squeak/uploads/1762/SqueakNOS-16bits.png.
This is the 16-bit straight-copy version. I also uploaded a zip with 5 images to
http://minnow.cc.gatech.edu/squeak/uploads/1762/SqueakNOS-Display.zip (I didn't know whether to
upload it to the swiki or not, but I don't have another place and it's going to help the project,
so I guess it's fine, even if it can't be deleted later... actually, it could also be useful for
other people).

	Thanks everybody for your answers.

	gera
