Croquet ARB_imaging

Andreas Raab andreas.raab at gmx.de
Sun Dec 22 00:26:33 UTC 2002


Joshua,

> Unfortunately, I wasn't able to try it because wglGetProcAddress: (as
> you defined it below) always returns 0, even when I try it on
> functions that I know are there, such as glColor3f().  I don't
> understand why this would be; 'Smalltalk listLoadedModules' includes
> opengl32.dll.

wglGetProcAddress works _exclusively_ for extension functions. Trying it
on any of the "standard" OpenGL functions will not work because those
are resolved through the "standard" OS lookup mechanisms (and yes, this
all sucks, but I didn't invent any of it).

In short, if the function you are using is part of the OpenGL standard
(modulo versions, of course) then you can and must use a "regular" ffi
call. If it's not (i.e., it's part of an extension) then you must use
wglGetProcAddress to get at it. For some functions there are actually
two variants (due to an extension becoming part of the standard) - for
example, I think you can still use glBindTextureEXT (it used to be part
of the texture object extension) as well as glBindTexture (which became
part of OpenGL 1.1).

Generally, extensions have those "EXT", "ARB", "NV", "SGI", "ATI",
"APPLE" (or whatever) suffixes which tell you that they're extensions.
So the rule of thumb is: if it has a suffix, it's an extension; if it
doesn't, it's part of the standard. (Unfortunately, this rule is not
_always_ true - ARB_imaging, for example, adds functions without any
suffix.)

I know it's confusing.

> About your hack... when you say "if your renderer changes", you mean
> like if I have a dual-head setup with two different graphics cards,
> and I drag the Squeak window from one to the other?  Wouldn't this
> be a problem even if the function could be found directly in 
> opengl32.dll?  Oh, I think I see... the function in opengl32.dll
> probably redirects the call to whichever driver is currently active.
> But when you use wglGetProcAddress, that indirection no longer
> occurs, so if the renderer changes, you're sunk.  Does that sound
> right?

That sounds right, but I can't say whether there are other cases in
which the "renderer changes". I would suspect that if it falls back to
software rendering for any reason, you might crash too. In short, if
the renderer is destroyed and then recreated, there's a chance that
it'll crash. Oh, and if you use *TWO* GL contexts which are allocated
differently (e.g., on two screens, or one hardware and one software) I
would almost swear that it'll go nuts...

Cheers,
  - Andreas




More information about the Squeak-dev mailing list