Thread: Nothing is being rendered using OpenGL but glClear() works perfectly!

  1. #1
    Registered User
Join Date: Jan 2019
Posts: 3

Question Nothing is being rendered using OpenGL but glClear() works perfectly!

Hi! I am developing a simple convex hull algorithm with a visual representation of the solution, which I want to render using GLFW3 and OpenGL. The window is created, and glClear() fills it with exactly the color I want, but nothing else is rendered... not even a simple quad.

    main.cpp
    Code:
    int main(){
        const char *title = "testing";
        GLFWwindow* window = initWindow(640, 480, title);

        while(!glfwWindowShouldClose(window)){
            glClearColor(0.658824, 0.658824, 0.658824, 0.658824);
            glClear(GL_COLOR_BUFFER_BIT);
            glOrtho(0, 680, 0, 480, 0, 680);

            glColor3f(1.0, 0.0, 0.0);
            glBegin(GL_TRIANGLES);
                drawPoint(300, 300, 10);
            glEnd();

            glFlush();
            glfwSwapBuffers(window);
            glfwPollEvents();
        }

        return 0;
    }
    initWindow
    Code:
    GLFWwindow* initWindow(int screen_width, int screen_height, const char* title){
        GLFWwindow* window;
        glfwSetErrorCallback(error_callback);

        if(!glfwInit())
            exit(EXIT_FAILURE);

        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 0);

        window = glfwCreateWindow(screen_width, screen_height, title, NULL, NULL);

        glfwSetKeyCallback(window, key_callback);
        glfwMakeContextCurrent(window);
        glfwSwapInterval(1);

        return window;
    }
    drawPoint
    Code:
    void drawPoint(int a_x, int a_y, int size=DEFAULT_POINT_SIZE){
        //set the position of the point
        int x = a_x - size/2;
        int y = a_y - size/2;

        //draw first triangle
        glVertex2f(x, y);
        glVertex2f(x, y+size+1);
        glVertex2f(x+size+1, y);

        //draw second triangle
        glVertex2f(x, y+size+1);
        glVertex2f(x+size+1, y+size+1);
        glVertex2f(x+size+1, y);
    }
    Any ideas? Thanks in advance!

    EDIT: I probably should mention that I am using a Linux machine.
    Last edited by 64humans; 01-23-2019 at 08:04 AM.

  2. #2
    Guest
    Does it work without glOrtho, using vertex coordinates near the -1.0 to 1.0 range? If you plan to do more OpenGL work in the future, I recommend learning the modern OpenGL 3+ pipeline.
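
    To make that test concrete, here is a minimal sketch of the idea (it reuses the poster's initWindow() helper, so the GLFW header, callbacks, and build setup are assumed to be the same as above): draw one triangle with the default identity projection, so coordinates are interpreted directly in the -1.0 to 1.0 clip-space range.

    ```cpp
    // Sketch: render with the default projection (no glOrtho), so vertex
    // coordinates are in normalized device coordinates (-1..1).
    // Assumes the poster's initWindow() helper from the thread above.
    #include <GLFW/glfw3.h>

    GLFWwindow* initWindow(int w, int h, const char* title); // poster's helper

    int main(){
        GLFWwindow* window = initWindow(640, 480, "NDC test");
        while(!glfwWindowShouldClose(window)){
            glClearColor(0.658824f, 0.658824f, 0.658824f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);

            glColor3f(1.0f, 0.0f, 0.0f);   // red
            glBegin(GL_TRIANGLES);         // one triangle in NDC
                glVertex2f(-0.5f, -0.5f);
                glVertex2f( 0.5f, -0.5f);
                glVertex2f( 0.0f,  0.5f);
            glEnd();

            glfwSwapBuffers(window);
            glfwPollEvents();
        }
        return 0;
    }
    ```

    If this shows a red triangle, the projection setup is the problem: glOrtho multiplies the current matrix rather than replacing it, so calling it every frame without first doing glMatrixMode(GL_PROJECTION); glLoadIdentity(); compounds the matrix each iteration.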
