glVertexAttrib3fARB(): not working on NVidia hardware

This is a discussion on glVertexAttrib3fARB(): not working on NVidia hardware within the Game Programming forums, part of the General Programming Boards category; A while back, it was reported to me that my engine's default lighting shaders didn't work correctly on GeForce cards. ...

  1. #1
    The Right Honourable psychopath's Avatar
    Join Date
    Mar 2004
    Location
    Where circles begin.
    Posts
    1,070

    glVertexAttrib3fARB(): not working on NVidia hardware

    A while back, it was reported to me that my engine's default lighting shaders didn't work correctly on GeForce cards. I couldn't do anything about it at the time, because I had an ATI card.

    Well, recently I got myself a GeForce 7600 GS, and immediately saw the issue.

    I've narrowed it down to glVertexAttrib3fARB(). If I avoid this function, my shaders run perfectly.

    -There's no info on Google about this (that I could find).
    -GLSLValidate says my shaders are standard.
    -It works on ATI cards.

    The above three points are very frustrating.

    To pass a vertex attribute to the vertex shader, I use this code (sorry, it's C++/CLI, but it's the exact same idea; and no, the problem isn't related to the CLI-ness of the code, because I have the same issue with the native equivalent):
    Code:
    void CGlsl::Attribute3f(int prog, String ^name, float f0, float f1, float f2){
    	// Look up the attribute's location; this returns -1 if the attribute
    	// is inactive or was optimized out of the linked program.
    	int loc = Gl::glGetAttribLocationARB(prog, name);
    	if (loc < 0)
    		return;
    	Gl::glVertexAttrib3fARB(loc, f0, f1, f2);
    }
    I really need this working. Shaders are no fun if they're broken.

    Thanks.
    Memorial University of Newfoundland
    Computer Science

    Mac and OpenGL evangelist.

  2. #2
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    I don't know what the equivalent assembly or HLSL function for that is. Perhaps if you gave me more information, I could suggest an assembly workaround.

  3. #3
    User
    Join Date
    Jan 2006
    Location
    Canada
    Posts
    498
    Hmm.... saw this on DaniWeb:
    http://www.daniweb.com/techtalkforums/post244235.html

    Too bad. Why does NVidia have these problems? Aren't they in competition with ATI? If every game that uses glVertexAttrib3fARB() gets polygon soup on NVidia cards, then ATI's already winning.

  4. #4
    The Right Honourable psychopath's Avatar
    Join Date
    Mar 2004
    Location
    Where circles begin.
    Posts
    1,070
    Well, at least I'm not the only one.

    I worked around the problem by passing the attribute as a 3D texture coordinate in one of my unused texture slots. It's incorrectly named and a little funny to look at, but data is data. The shader itself still isn't working correctly, but at least the vertices are where they're supposed to be.
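    For anyone else stuck on this, the shader side of that workaround looks roughly like this — a sketch assuming the data is sent through texture unit 5 (an arbitrary choice; any unit the engine doesn't otherwise use works) with a host-side call like glMultiTexCoord3fARB(GL_TEXTURE5_ARB, t.x, t.y, t.z):
    Code:

```glsl
// Vertex shader: read the smuggled "attribute" from an unused texcoord set.
// Host side (sketch): glMultiTexCoord3fARB(GL_TEXTURE5_ARB, t.x, t.y, t.z);
varying vec3 tangent;  // hypothetical name for the per-vertex data

void main()
{
    tangent     = gl_MultiTexCoord5.xyz;  // data arrives as texcoord set 5
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```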
    Memorial University of Newfoundland
    Computer Science

    Mac and OpenGL evangelist.


