Thread: How Can I.....(OpenGL)

  1. #31
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
    This is the result of people jumping into graphics programming without even understanding what a buffer is.

    A buffer is the 'virtual screen' in memory. I hope you know that all pixels are represented by DWORDs or unsigned integers (in 32-bit color and in 32-bit code). Since the actual video memory is simply an area of memory-mapped I/O in the computer, you can mimic this 'screen' by creating your own buffer of DWORDs exactly the same size as the video buffer.

    The only thing that makes the 'screen' what you see is the fact that the card scans the area of memory deemed to be the 'screen' DWORD by DWORD, refreshing the whole thing many times per second and stepping through the individual pixels far faster than that. The card translates the RGB values into the correct analog voltages and sends them to the monitor. Some monitors and some cards are purely digital, depending on whether you have a flat-screen array of transistors or an analog CRT.

    The screen may look two-dimensional on the monitor, but in memory it's just one long huge line of numbers. To plot a pixel at the correct location you compute its offset into that line:

    DWORD offset=y*(width*sizeofdatatype)+x*sizeofdatatype;

    With 1-byte pixels (256-color modes) this reduces to offset=y*width+x.

    Note that in DirectX and OpenGL, (width*sizeofdatatype) is replaced by something called buffer pitch. This simply says how many bytes of data exist in one scan line, and it can be larger than width*sizeofdatatype. Pitch exists because video manufacturers figured out that aligning the data on certain boundaries is much faster and more in tune with the architecture of the system as well as their GPU.
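    To make the pitch math concrete, here's a minimal sketch (nothing to do with the DOS code further down) of the same offset calculation for a 32-bit back buffer kept in system memory. The BackBuffer32 name and the pitch value are just things I made up for illustration:

    Code:
    #include <cstdint>
    #include <cstring>
    #include <vector>

    //Minimal sketch: a 32-bit back buffer in system memory.
    //pitchBytes is assumed to be >= width*4; any padding is up to you.
    struct BackBuffer32
    {
        unsigned width;
        unsigned height;
        unsigned pitchBytes;            //bytes in one scan line, may include padding
        std::vector<uint8_t> pixels;

        BackBuffer32(unsigned w, unsigned h, unsigned pitch)
            : width(w), height(h), pitchBytes(pitch), pixels(pitch * h) {}

        void Pixel(unsigned x, unsigned y, uint32_t color)
        {
            //byte offset = y*pitch + x*bytesPerPixel
            size_t offset = size_t(y) * pitchBytes + size_t(x) * 4;
            std::memcpy(&pixels[offset], &color, sizeof(color));
        }
    };

    int main()
    {
        BackBuffer32 buf(640, 480, 640 * 4);   //no padding in this example
        buf.Pixel(160, 100, 0x00FFFFFF);       //white pixel, 0x00RRGGBB
        return 0;
    }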

    So in the old DOS days, here are the sizes (in bytes) of some of the 256-color screens:

    640x480x256 colors - (640*480) = 307,200 bytes
    800x600x256 colors - (800*600) = 480,000 bytes

    To make a buffer for 640x480x256 colors you first need to figure out the data type being used. Since 256-color mode is simply paint-by-palette-number, every pixel can be represented by 1 byte. Each R, G, and B component in the palette can range from 0 to 63 (the VGA DAC uses 6 bits per channel), but only 256 distinct RGB triplets are allowed at any one time.

    Code:
    typedef unsigned char BYTE;

    //640*480 = 307,200 bytes - far more than one 64K real-mode segment can hold
    BYTE far *Buffer=new BYTE[(unsigned long)(640*480)];
    That code will fail in real-mode DOS, however, because a far pointer can only address 64K at a time and 307,200 bytes blows right past that segment limit. I will not explain here the differences between 16-bit programming and 32-bit. Do some research.
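    As a side note since the palette came up: here is a hedged sketch of how one 256-color palette entry gets programmed on the VGA - write the entry index to port 0x3C8, then the three 6-bit (0 to 63) R, G, B values to port 0x3C9. SetPaletteEntry is my own name for it; outportb comes from Borland's <dos.h>:

    Code:
    #include <dos.h>   //outportb (Borland/Turbo C++)

    typedef unsigned char BYTE;

    //Sketch: program one VGA palette entry. r, g and b are 6-bit values (0-63).
    void SetPaletteEntry(BYTE index, BYTE r, BYTE g, BYTE b)
    {
       outportb(0x3C8, index);   //select the palette entry to write
       outportb(0x3C9, r);       //then feed it R, G, B in that order
       outportb(0x3C9, g);
       outportb(0x3C9, b);
    }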

    To plot a pixel in that 'virtual screen' you would do this:

    Code:
    #define XYTOMEM(x,y,pitch)  ((unsigned long)((y)*(pitch)+(x)))
    
    void Pixel(int x,int y,BYTE color)
    {
      Surface[XYTOMEM(x,y,Pitch)]=color;
    }
    Surface is simply a BYTE far pointer that points to either the buffer or the screen, and Pitch is the number of bytes in one scan line.

    This is a simple mode 13h unit. Try it in Turbo C++ and see what you get. It's old, but it might help you understand what the heck a back buffer is. The keyword far is not used anymore, but you still must use it when coding for pure or emulated 16-bit DOS. It informs the compiler that the memory being pointed to does not exist in our current segment - it exists outside of it. To explain that fully I would have to explain assembly language programming, and I'm not going to do that here either.
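    One quick aside on segments, since the class below uses A000:0000 for the screen: a real-mode far pointer is a segment:offset pair, and the physical address it refers to is segment*16 + offset. A throwaway sketch (plain C++, just to show the arithmetic):

    Code:
    #include <cstdio>

    //Real-mode addressing: physical address = segment*16 + offset
    unsigned long PhysicalAddress(unsigned segment, unsigned offset)
    {
       return (unsigned long)segment * 16UL + offset;
    }

    int main()
    {
       //A000:0000 -> 0xA0000, the start of VGA graphics memory
       printf("%05lX\n", PhysicalAddress(0xA000, 0x0000));
       return 0;
    }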

    Code:
    #ifndef CVIDEO
    #define CVIDEO
    
    #include <dos.h>      //REGS, int86, MK_FP
    #include <string.h>   //memcpy
    
    #define TRUE 1
    #define FALSE 0
    
    typedef unsigned char BYTE;
    
    class CVideo
    {
      protected:
    
        //Pointer to actual screen - video memory
        BYTE far *Screen;
    
        //Pointer to back buffer - virtual screen
        BYTE far *Buffer;
    
        //Pointer to current pointer being used
        BYTE far *Surface;
    
        unsigned int Width;
        unsigned int Height;
        unsigned int Pitch;
    
        //Current drawing mode 
        BYTE BufferMode;
    
        public:
          CVideo(void):Screen(0),Buffer(0),Surface(0),Width(0),Height(0),Pitch(0),BufferMode(FALSE) {}
          virtual ~CVideo(void)
          {
            if (Buffer) 
            {
              delete [] Buffer;
              Buffer=0;
            }
          }
    
          BYTE SetMode13h(void)
          {
            REGS regs;
            regs.x.ax=0x13;    //INT 10h with AX=0013h sets mode 13h (320x200x256)
            int86(0x10,&regs,&regs);
          
            Width=320;
            Height=200;
            Pitch=Width;       //pitch and width synonymous in this mode
            Screen=(BYTE far *)MK_FP(0xA000,0);
            Surface=Screen;
            Buffer=new BYTE[64000L];
            if (Buffer)
            {
               return TRUE;
            } else return FALSE;
         }
    
          void Pixel(int x,int y,BYTE color)
          {
            Surface[XYTOMEM(x,y)]=color;
          }
    
          BYTE far *GetBuffer(void) {return Buffer;}
          BYTE far *GetScreen(void) {return Screen;}
          BYTE far *GetSurface(void) {return Surface;}
    
          void Flip(void)
          {
             //No vsync here
            
             //Copy back buffer to screen, one byte at a time
             //(assumes a memory model in which memcpy accepts far pointers)
             memcpy(Screen,Buffer,64000L);
           }
    
           void Flip16(void)
           {
              //Copy 32000 words (64000 bytes) from Buffer to Screen
              asm {
                push ds
                les di,[Screen]
                lds si,[Buffer]
                mov cx,32000d
                rep  movsw
                pop ds
              }
           }
    
           void Flip32(void)
           {
            asm {
              db 66h
              push ds
              
              db 66h
              les di,[Screen]
              db 66h
              lds si,[Buffer]
              db 66h
              mov cx,32000d
              db 67h
              rep  movsw
    
              db 66h
              pop ds
            }
          }
    
          void SetBufferMode(BYTE mode=TRUE)
          {
            if (mode)
            {
               Surface=Buffer;
               BufferMode=TRUE;
            }
             else
            {
               Surface=Screen;
               BufferMode=FALSE;
            }
          }
          
           BYTE GetBufferMode(void) {return BufferMode;}
    
           void CLS(BYTE color)
           {
              BYTE far *ptrSurface=0;
              if (BufferMode)
              {
                 ptrSurface=Buffer;
              } else ptrSurface=Screen;
    
              //Fill 32000 words (64000 bytes) with the color in both AH and AL
              asm {
                les di,[ptrSurface]
                mov cx,32000d
                mov ah,[color]
                mov al,[color]
                rep stosw
              }
           }
     
           //inlined offset calculation: y*320 + x, since (y<<8)+(y<<6) == y*256 + y*64 == y*320
           unsigned int XYTOMEM(int x,int y)
           {
              return (y<<8)+(y<<6)+x;
           }
    
    
    };
    
    #endif
    This will stick the computer in old-school DOS 320x200x256-color mode and will set up the screen pointer, the back buffer, and the current surface pointer.

    You can add functions to the class to draw lines, circles, bitmaps, etc. The only function I'm leery of is Flip32, which is an attempt to use 32-bit instructions in a 16-bit environment. If it works, it should copy 4 pixels at a time to the screen instead of 2 like Flip16. Flip simply copies 1 byte at a time. Theoretically Flip16 is twice as fast as Flip, and Flip32 is twice as fast as Flip16, or four times as fast as Flip. But I'm not positive the code will work. It's been a LONG time since the DOS days.

    Play with this code. Set the screen up by calling SetMode13h. Then clear by calling CLS(BYTE color). This clears the screen if BufferMode is FALSE and clears the back buffer if it is TRUE; the parameter is the color everything is cleared to. Set the buffer mode to TRUE by calling SetBufferMode(TRUE). Draw some pixels. Run it. Nothing shows up. This is because you are drawing to a back buffer - a portion of memory that looks just like the screen, but isn't. Now call Flip() after you draw and everything should show up. This is what OpenGL and Direct3D are doing for you.

    Drawing to the front buffer, in most cases, is not needed.

    This code should work with that class:

    Code:
    #include "CVideo.h"
    #include <conio.h>
    #include <stdio.h>
    
    CVideo Display;
    
    int main(void)
    {
       Display.SetMode13h();
       Display.SetBufferMode(TRUE);
       Display.CLS(0);
    
       Display.Pixel(160,100,15);
    
       Display.Flip();
    
       getch();
       return (0);
    }
    This should display a white dot in the center of the screen.
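    And if you want to see the back buffer actually earn its keep, here's a hedged sketch of an animation loop built on the same class - clear, draw, flip, repeat until a key is pressed. The moving-dot logic is just my own illustration, not something in the class:

    Code:
    #include "CVideo.h"
    #include <conio.h>

    CVideo Display;

    int main(void)
    {
       Display.SetMode13h();
       Display.SetBufferMode(TRUE);

       int x=0;
       while (!kbhit())              //run until a key is pressed
       {
          Display.CLS(0);            //wipe the back buffer
          Display.Pixel(x,100,15);   //draw this frame's dot
          Display.Flip();            //show the finished frame
          x=(x+1)%320;               //move the dot for the next frame
       }

       getch();                      //eat the keypress
       return 0;
    }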

    Sorry for such a long post, but the code to show you buffers in DirectX is even uglier than what I just posted...and none of it would show you how it works - just how to do it in DirectX.

    This code will show you EXACTLY what is going on. Download Turbo C++ 1.0 from www.inprise.com and mess with it. Then go back to OpenGL and please - rethink what you are doing.

    If the code doesn't work, let me know what's wrong with it. I wrote it sitting here so I'm sure there are some errors.
    Last edited by VirtualAce; 05-28-2005 at 04:06 AM.

  2. #32
    Absent Minded Programmer
    Join Date
    May 2005
    Posts
    968
    Yah, exactly, jumping into OpenGL before gaining a solid background in memory and buffers and things like that...

    There's a lesson to be learned here... I'll worry about it later

    So what I'm doing is this....

    1. Drawing to the backbuffer.
    2. Bringing things I want from the backbuffer to the "real" screen.
    3. Taking away from the "real" screen, and throwing back into memory what I don't need.

    I believe I understand the concept now. I was never taught; self-learning always has its flaws - you only learn the things you are directly interested in. Everyone is acting like I should magically know these things, but I've only been programming graphics for 2 months, and doing C++ for 8 months....

    Hm, I ought to pick up a good book... I'm so cheap though

  3. #33
    ---
    Join Date
    May 2004
    Posts
    1,379
    You don't 'take away' from the real screen.
    Every time you send the back buffer to the front buffer, the front buffer is overwritten. Think of it like a cartoon: it is made up of frames, and when played the pictures come to life one frame at a time. All you are doing is drawing one frame at a time to the back buffer and then displaying it.

  4. #34
    Absent Minded Programmer
    Join Date
    May 2005
    Posts
    968
    hmm, can anyone contact me via IM or Email?

    [email protected]
    or
    Shamino55 on AOL
