-
GLUT in 16-bit RGB?
I finally sat down and really started studying OpenGL. I love it, and questions are already popping up. Is there a way to tell OpenGL's GLUT to display in 16-bit RGB instead of always using GLUT_RGBA? I'm figuring there's a way with glutInitDisplayString(). Thanks in advance.
-
Pass GLUT_RGB instead of GLUT_RGBA to glutInitDisplayMode().
Ex:
Code:
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
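If it helps, here's a complete minimal program around that call, just as a sketch (the window size and title are arbitrary placeholders):
Code:
#include <GL/glut.h>

/* Clear to black and present the empty back buffer. */
static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(640, 480);      /* arbitrary size */
    glutCreateWindow("GLUT_RGB test"); /* arbitrary title */
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}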
-
Hmm, cool. I always assumed that was X.8.8.8. Thanks for the help!
-
Yeah, I was going to say pass GLUT_RGB, but I thought that would be 24-bit. Would've made me look quite the fool. Sure side-stepped that landmine!
-
Eh, actually, now that I think about it, GLUT_RGB probably would be 24-bit. I'm not sure if there's a 16-bit flag for GLUT or not. I'll look it up if I have time (at school ATM).
-
There is the following...
GLUT_RGB, assuming X.8.8.8, still aligned to 32-bit
GLUT_RGBA (A.8.8.8), also 32-bit but with alpha
I doubt there is a flag for 16-bit (1.5.5.5 or 5.6.5), but glutInitDisplayString() is giving me some hope.
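If glutInitDisplayString() works the way the docs suggest, maybe something like this would request a 5.6.5 buffer (just a guess on my part, untested):
Code:
/* Ask for a 16-bit 5.6.5 color buffer, double-buffered, with a depth buffer. */
glutInitDisplayString("red=5 green=6 blue=5 double depth");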
I have another question related to display modes. How do I query the user's video card to determine whether I can use a certain resolution, such as 800x600x32, 640x480x16, and so on? I just hate making blind calls. Do I need to drop into the Win32 API to check on this?
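From poking around MSDN, it looks like EnumDisplaySettings() might be what I need on the Win32 side. Something like this, though I haven't tested it:
Code:
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD   i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Walk every mode the display driver reports. */
    for (i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
        printf("%lux%lu, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmBitsPerPel);

    return 0;
}
Thanks again.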