Thread: GLUT in 16-bit RGB?

  1. #1
    Software engineer
    Join Date: Aug 2005
    Location: Oregon
    Posts: 283

    GLUT in 16-bit RGB?

    I've finally sat down and started studying OpenGL. I love it, and questions are already popping up. Is there a way to tell GLUT to display in 16-bit RGB instead of always using GLUT_RGBA? I'm figuring there's a way with glutInitDisplayString(). Thanks in advance.

  2. #2
    The Right Honourable psychopath
    Join Date: Mar 2004
    Location: Where circles begin.
    Posts: 1,071
    Pass GLUT_RGB instead of GLUT_RGBA to glutInitDisplayMode().

    Ex:
    Code:
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH); /* RGB color, double buffered, with a depth buffer */
    M.Eng Computer Engineering Candidate
    B.Sc Computer Science

    Robotics and graphics enthusiast.

  3. #3
    Software engineer
    Join Date: Aug 2005
    Location: Oregon
    Posts: 283
    Hmm, cool. I always assumed that was X.8.8.8. Thanks for the help!

  4. #4
    Supermassive black hole cboard_member
    Join Date: Jul 2005
    Posts: 1,709
    Yeah. I was going to say pass GLUT_RGB, but thought that would be 24-bit. Would've made me look quite the fool. Sure side-stepped that landmine!
    Good class architecture is not like a Swiss Army Knife; it should be more like a well balanced throwing knife.

    - Mike McShaffry

  5. #5
    The Right Honourable psychopath
    Join Date: Mar 2004
    Location: Where circles begin.
    Posts: 1,071
    Eh, actually now that I think about it, GLUT_RGB probably would be 24-bit. I'm not sure if there's a 16-bit flag for GLUT or not. I'll look it up if I have time (at school ATM).
    M.Eng Computer Engineering Candidate
    B.Sc Computer Science

    Robotics and graphics enthusiast.

  6. #6
    Software engineer
    Join Date: Aug 2005
    Location: Oregon
    Posts: 283
    There is the following...

    GLUT_RGB, assuming X.8.8.8, still aligned to 32-bit
    GLUT_RGBA (A.8.8.8), also 32-bit but with alpha

    I doubt there is a flag for 16-bit (1.5.5.5 or 5.6.5), but glutInitDisplayString() is giving me some hope.
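
    For what it's worth, here's a rough sketch of how a 16-bit 5.6.5 request might look through glutInitDisplayString(). The capability string below is an assumption based on GLUT's documented "name=value" syntax, and whether the driver actually grants a 16-bit format depends on the pixel formats it exposes; the glGetIntegerv() calls just report what you really got.

    Code:
    #include <stdio.h>
    #include <GL/glut.h>

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        GLint r, g, b;

        glutInit(&argc, argv);
        /* ask for a 5.6.5 color buffer, double buffering and a depth buffer */
        glutInitDisplayString("red=5 green=6 blue=5 double depth>=16");
        glutCreateWindow("16-bit RGB test");
        glutDisplayFunc(display);

        /* report the bit depths the context actually received */
        glGetIntegerv(GL_RED_BITS, &r);
        glGetIntegerv(GL_GREEN_BITS, &g);
        glGetIntegerv(GL_BLUE_BITS, &b);
        printf("got %d.%d.%d\n", (int)r, (int)g, (int)b);

        glutMainLoop();
        return 0;
    }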

    I have another question related to display modes. How do I query the user's video card to determine if I can use a certain resolution, such as 800x600x32, 640x480x16, and so on? I just hate making blind calls. Do I need to drop into the Win32 API to check on this? Thanks again.
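
    If you do drop to the Win32 API, EnumDisplaySettings() will enumerate every display mode the primary adapter reports, so you can check for 800x600x32, 640x480x16, and so on before asking for them. A rough sketch, plain Win32 and nothing GLUT-specific:

    Code:
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        DWORD i;

        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        /* walk the display modes the primary adapter reports */
        for (i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
        {
            printf("%lux%lu, %lu bpp, %lu Hz\n",
                   (unsigned long)dm.dmPelsWidth, (unsigned long)dm.dmPelsHeight,
                   (unsigned long)dm.dmBitsPerPel, (unsigned long)dm.dmDisplayFrequency);
        }
        return 0;
    }

    If you'd rather stay inside GLUT, the game-mode API (glutGameModeString() followed by glutGameModeGet(GLUT_GAME_MODE_POSSIBLE)) can also tell you whether a specific mode string such as "800x600:32" is available.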
    Last edited by dxfoo; 09-14-2006 at 09:14 AM.
