problems with OpenGL hardware acceleration on Linux

Joshua 'Schwa' Gargus schwa at cc.gatech.edu
Fri Jun 28 21:44:31 UTC 2002


Hi Felix,

I'm engaged in the same struggle as you.

I was able to hack hardware acceleration into working by making the
following changes to sqUnixOpenGL.c:

1) Change the "static" in front of stDisplay and stWindow to "extern".
2) Comment out the first two sections of glInitialize (where stDisplay
and stWindow are assigned values); see the rough sketch below.
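
The edited part of sqUnixOpenGL.c ends up looking roughly like this
(the exact types and surrounding code may differ in your source tree;
I'm assuming stDisplay and stWindow are the X11 Display* and Window
that the main VM code already defines):

  #include <X11/Xlib.h>   /* Display, Window */

  /* (1) formerly:
   *       static Display *stDisplay;
   *       static Window   stWindow;
   *     now declared extern, so the plugin shares the VM's variables
   *     instead of keeping private copies: */
  extern Display *stDisplay;
  extern Window   stWindow;

  /* (2) in glInitialize(), the first two sections (the code that opens
   *     a separate display connection and assigns stDisplay and
   *     stWindow) get commented out, so the plugin renders into the
   *     window the VM has already created. */

The idea is simply that the plugin stops opening its own X connection
and instead draws into the VM's existing window.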

I then built a VM with B3DAcceleratorPlugin as an internal plugin.
I'm not sure whether it would work as an external plugin; if you
later built another VM from different sources, the external plugin
would likely fail.

Joshua


On Fri, Jun 28, 2002 at 11:11:20PM +0200, Felix Franz wrote:
> Hi all,
> 
> after installing the nvidia Accelerated Linux drivers (and watching a 
> fast teapot :) I tried Balloon 3D in Squeak: the performance is the same 
> with or without "enable hardware acceleration" in the 
> WonderlandCameraMorph menu. I rebuilt the VM (I am using Ian's 3.2.2 
> release). ldd said it linked against the right libGL (the one provided by 
> the nvidia driver).
> 
> B3DAcceleratorPlugin is listed in "Smalltalk listBuiltinModules" (I don't 
> know if this is enough; it is the first time I have had problems with a plugin).
> 
> I tried "B3DHardwareEngine primitiveSetVerboseLevel: 5", which should 
> print some info, but I see no debugging information.

These debug messages are appended to SqueakDebug.log.
> 
> Now I don't know what to do ...
> 
> Is anyone enjoying a hardware-accelerated 3D Squeak and can give me some 
> advice? If you need more information: just ask, I'll be happy to provide 
> it.
> 
> Thanks in advance,
> 
> 
> felix


