Thread: Loading a Bitmap resource for OpenGL Texture[x]

  1. #1


    I have been using this function to load Bitmaps for Textures in my OpenGL applications, taken from NeHe's tutorial:
    Code:
AUX_RGBImageRec *LoadBMP(char *Filename)      // Loads A Bitmap Image
{
    FILE *File = NULL;                        // File Handle

    if (!Filename)                            // Make Sure A Filename Was Given
    {
        return NULL;                          // If Not, Return NULL
    }

    File = fopen(Filename, "r");              // Check To See If The File Exists
    if (File)                                 // Does The File Exist?
    {
        fclose(File);                         // Close The Handle
        return auxDIBImageLoad(Filename);     // Load The Bitmap And Return A Pointer
    }

    return NULL;                              // If Load Failed, Return NULL
}
    I'm now dealing with a resource bitmap, though, and I need something to replace TextureImage[0] = LoadBMP("bitmap.bmp");.


    I tried TextureImage[0] = auxDIBImageLoad(MAKEINTRESOURCE(IDB_BITMAP1));, which compiles, but the app crashes,

    and TextureImage[0] = LoadImage(hInstance, MAKEINTRESOURCE(IDB_BITMAP1), IMAGE_BITMAP, 0, 0, LR_LOADFROMFILE);, where the compiler gives me a conversion error.

    What could I use as a replacement?
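    For what it's worth, the LoadImage attempt above is close; here is a hedged sketch of the usual fix, using the same hInstance and IDB_BITMAP1 from the post. LR_LOADFROMFILE tells LoadImage to treat the name as a disk path rather than a resource, so it has to go, and LoadImage returns a generic HANDLE that must be cast to HBITMAP, which is what the conversion error is about:

    ```cpp
    #include <windows.h>

    // Sketch: load IDB_BITMAP1 as a *resource*, not a file.
    // LoadImage returns a generic HANDLE, hence the explicit cast.
    HBITMAP hBmp = (HBITMAP)LoadImage(hInstance,
                                      MAKEINTRESOURCE(IDB_BITMAP1),
                                      IMAGE_BITMAP,
                                      0, 0,                  // keep the resource's own size
                                      LR_CREATEDIBSECTION);  // writable DIB bits
    if (hBmp == NULL)
    {
        // resource missing or wrong type; check GetLastError()
    }
    ```

    Note that this yields an HBITMAP, not an AUX_RGBImageRec, so it pairs with a GetDIBits-style pixel extraction rather than with the glaux loader.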

    Many people seem to be using this code to work with resources:
    Code:
    void LoadGLTextures()
    {
        // Load the bitmap from the resource file
        HBITMAP bitmap = LoadBitmap(GetModuleHandle(NULL),
                                    MAKEINTRESOURCE(IDB_BITMAP2));

        // Set up a 24-bit bitmap structure
        // (works on 8-bit bitmaps also!)
        BITMAPINFO info;
        BITMAPINFOHEADER header;
        header.biSize         = sizeof(BITMAPINFOHEADER);
        header.biWidth        = 256;
        header.biHeight       = 256;
        header.biPlanes       = 1;
        header.biBitCount     = 24;
        header.biCompression  = BI_RGB;
        header.biSizeImage    = 0;
        header.biClrUsed      = 0;
        header.biClrImportant = 0;
        info.bmiHeader = header;
        info.bmiColors[0].rgbRed      = 0;   // the color table is unused for 24-bit BI_RGB
        info.bmiColors[0].rgbGreen    = 0;
        info.bmiColors[0].rgbBlue     = 0;
        info.bmiColors[0].rgbReserved = 0;

        // Store the bitmap data in a buffer
        const int size = 256 * 256 * 3;
        unsigned char data[size];
        HDC hdc = GetDC(g_hWndRender);
        GetDIBits(hdc, bitmap, 0, 256, data, &info, DIB_RGB_COLORS);
        ReleaseDC(g_hWndRender, hdc);

        // Convert from BGR to RGB -- every pixel must be swapped
        // (the if(i >= 3) test in circulating copies of this loop
        // wrongly skips the first three pixels)
        for (int i = 0; i < 256 * 256; i++)
        {
            unsigned char buff = data[i * 3];
            data[i * 3]     = data[i * 3 + 2];
            data[i * 3 + 2] = buff;
        }

        // Create one texture
        glGenTextures(1, &m_texture[0]);

        // Select the texture
        glBindTexture(GL_TEXTURE_2D, m_texture[0]);

        // SetTextureParameters

        // Generate the texture
        glTexImage2D(GL_TEXTURE_2D, 0, 3, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
    }
    But I'm still new to programming, and I'm having trouble figuring out how that code will help me assign the .bmp to TextureImage[], as in the non-resource code:

    Code:
    AUX_RGBImageRec *TextureImage[1]; // Create Storage Space For The Texture
    ...
    TextureImage[0] = LoadBMP("bitmap.bmp");
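    As a side note on the BGR-to-RGB loop in the resource code above: the swap has to touch every pixel, and its logic can be checked in isolation. A minimal standalone sketch (the helper name bgr_to_rgb is mine, not from the thread):

    ```cpp
    #include <assert.h>
    #include <stddef.h>

    // Swap the B and R bytes of every pixel in a 24-bit image buffer.
    // Same job as the conversion loop above, applied unconditionally
    // to all pixel_count pixels.
    void bgr_to_rgb(unsigned char *data, size_t pixel_count)
    {
        for (size_t i = 0; i < pixel_count; i++)
        {
            unsigned char tmp = data[i * 3];     // blue byte
            data[i * 3]       = data[i * 3 + 2]; // red moves to slot 0
            data[i * 3 + 2]   = tmp;             // blue moves to slot 2
        }
    }

    int main(void)
    {
        // Two pixels stored as BGR: (B=1,G=2,R=3) and (B=4,G=5,R=6).
        unsigned char pixels[6] = { 1, 2, 3, 4, 5, 6 };
        bgr_to_rgb(pixels, 2);

        // After the swap they read as RGB: (3,2,1) and (6,5,4).
        assert(pixels[0] == 3 && pixels[2] == 1);
        assert(pixels[3] == 6 && pixels[5] == 4);
        return 0;
    }
    ```

    A version of this loop that guards the swap with if(i >= 3) leaves the first three pixels in BGR order, which shows up as wrong colors at one corner of the texture.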
    Last edited by the dead tree; 08-25-2004 at 03:07 PM.

  2. #2
    Here's the pertinent code I'm using in my app. It's now running, but no texture is being displayed.

    Code:
    int LoadGLTextures()                         // Load Bitmaps And Convert To Textures
    {
        // Load the bitmap from the resource file
        HBITMAP bitmap = LoadBitmap(GetModuleHandle(NULL), MAKEINTRESOURCE(IDB_BITMAP1));

        // Set up a 24-bit bitmap structure
        // (works on 8-bit bitmaps also!)
        BITMAPINFO info;
        BITMAPINFOHEADER header;
        header.biSize         = sizeof(BITMAPINFOHEADER);
        header.biWidth        = 256;
        header.biHeight       = 128;
        header.biPlanes       = 1;
        header.biBitCount     = 24;
        header.biCompression  = BI_RGB;
        header.biSizeImage    = 0;
        header.biClrUsed      = 0;
        header.biClrImportant = 0;
        info.bmiHeader = header;
        info.bmiColors[0].rgbRed      = 0;       // the color table is unused for 24-bit BI_RGB
        info.bmiColors[0].rgbGreen    = 0;
        info.bmiColors[0].rgbBlue     = 0;
        info.bmiColors[0].rgbReserved = 0;

        // Store the bitmap data in a buffer
        // (256 x 128 x 3 bytes, to match the header above)
        const int size = 256 * 128 * 3;
        unsigned char data[size];
        GetDIBits(hDC, bitmap, 0, 128, data, &info, DIB_RGB_COLORS);
        // careful: only ReleaseDC(hWnd, hDC) here if hDC is NOT the DC
        // your GL rendering context was created on

        // Convert from BGR to RGB -- every pixel must be swapped
        // (the if(i >= 3) test wrongly skipped the first three pixels)
        for (int i = 0; i < 256 * 128; i++)
        {
            unsigned char buff = data[i * 3];
            data[i * 3]     = data[i * 3 + 2];
            data[i * 3 + 2] = buff;
        }

        glGenTextures(1, &texture[0]);           // Create The Texture

        // Typical Texture Generation Using Data From The Bitmap
        glBindTexture(GL_TEXTURE_2D, texture[0]);
        // Generate The Texture
        glTexImage2D(GL_TEXTURE_2D, 0, 3, 256, 128, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
        // GL_LINEAR here: a *_MIPMAP_* minification filter needs a full
        // mipmap chain or the texture is incomplete and nothing is drawn,
        // and GL_LINEAR_MIPMAP_LINEAR is not a valid magnification filter
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // Linear Filtering
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); // Linear Filtering

        return true;
    }

  3. #3
    I don't remember if you need to set the 'glTexEnvi' mode or not, but I'm fairly certain you do.

    You also need to set your texture parameters and environment after you bind the texture with 'glBindTexture'; otherwise they're going to be applied to whatever texture is currently bound instead of the one you want them for.

    Code:
    glBindTexture(GL_TEXTURE_2D, texture[0]);
    
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR_MIPMAP_LINEAR); // Linear Filtering
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR); // Linear Filtering (the MAG filter only accepts GL_NEAREST or GL_LINEAR)
    
    
    glTexEnvi(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_MODULATE);  
    
    glTexImage2D(GL_TEXTURE_2D, 0, 3, 256, 128, 0, GL_RGB, GL_UNSIGNED_BYTE, &data);
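    One more caveat, beyond what the reply above covers: with GL_LINEAR_MIPMAP_LINEAR as the minification filter, the texture is incomplete unless every mipmap level has been uploaded, and OpenGL then draws no texture at all, which matches the "nothing displayed" symptom. A sketch of the two usual fixes, using the same sizes and data buffer as the posts above:

    ```cpp
    // Option 1: no mipmaps -- plain linear filtering is always complete.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Option 2: build the whole mipmap chain (replaces the glTexImage2D
    // call); the mipmap MIN filter is then fine, but MAG must still be
    // GL_NEAREST or GL_LINEAR.
    gluBuild2DMipmaps(GL_TEXTURE_2D, 3, 256, 128, GL_RGB, GL_UNSIGNED_BYTE, data);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    ```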

    Last edited by no-one; 08-26-2004 at 10:58 AM.

  4. #4
    Thanks, no-one.

  5. #5
    anonytmouse
    You should call DeleteObject(bitmap) at the end of your function to avoid a resource leak.

    There is another method to do this that involves using a dibsection rather than GetDIBits. You can read an outline here.

    If you can use the GL_BGR_EXT format with glTexImage2D you can skip converting the bitmap from BGR to RGB.
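    A sketch of that last suggestion, with the sizes from post #2 (assumes the EXT_bgra extension, which became core GL_BGR in OpenGL 1.2): the DIB data is uploaded in its native BGR order, so the whole conversion loop can be deleted.

    ```cpp
    // Upload the DIB bytes as-is; the driver performs the BGR -> RGB
    // swizzle, so the manual swap loop is no longer needed.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 128, 0,
                 GL_BGR_EXT, GL_UNSIGNED_BYTE, data);
    ```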
