Thread: glVertexAttrib3fARB(): not working on NVidia hardware

  1. #1
    The Right Honourable psychopath's Avatar
    Join Date
    Mar 2004
    Location
    Where circles begin.
    Posts
    1,071

    glVertexAttrib3fARB(): not working on NVidia hardware

    A while back, it was reported to me that my engine's default lighting shaders didn't work correctly on GeForce cards. I couldn't do anything about it at the time, because I only had an ATI card.

    Well, recently I got myself a GeForce 7600 GS, and immediately saw the issue.

    I've narrowed it down to glVertexAttrib3fARB(). If I avoid this function, my shaders run perfectly.

    - There's no info on Google about this (that I could find).
    - GLSLValidate says my shaders are standard.
    - It works on ATI cards.

    The above three points are very frustrating.

    To pass a vertex attribute to the vertex shader, I use this code (sorry, it's C++/CLI, but it's the exact same idea; and no, the problem isn't related to the CLI-ness of the code, because I have the same issue with the native equivalent):
    Code:
    void CGlsl::Attribute3f(int prog, String ^name, float f0, float f1, float f2){
    	// Query the attribute location after the program object has been linked.
    	int loc = Gl::glGetAttribLocationARB(prog, name);
    	if (loc < 0)
    		return;	// attribute not found (inactive or optimized out)
    	Gl::glVertexAttrib3fARB(loc, f0, f1, f2);
    }
    I really need this working. Shaders are no fun if they're broken.

    Thanks.
    M.Eng Computer Engineering Candidate
    B.Sc Computer Science

    Robotics and graphics enthusiast.

  2. #2
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
    I don't know what the equivalent assembly or HLSL function for that is. Perhaps if you gave me more information, I could give you an assembly workaround.

  3. #3
    User
    Join Date
    Jan 2006
    Location
    Canada
    Posts
    499
    Hmm.... saw this on DaniWeb:
    http://www.daniweb.com/techtalkforums/post244235.html

    Too bad. Why does NVidia have these problems? Aren't they in competition with ATI? If every game that uses glVertexAttrib3fARB() gets polygon soup on NVidia cards, then ATI's already winning.

  4. #4
    The Right Honourable psychopath's Avatar
    Join Date
    Mar 2004
    Location
    Where circles begin.
    Posts
    1,071
    Well, at least I'm not the only one.

    I worked around the problem by passing the attribute as a 3D texture coord in one of my unused texture slots. It's incorrectly named and a little funny to look at, but data is data. The shader still isn't working correctly, but at least the vertices are where they're supposed to be.
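    Roughly what the workaround looks like; the texture unit and the shader-side name here are just examples, not my actual code:
    Code:
    // CPU side: smuggle the per-vertex data through an unused texcoord set.
    glMultiTexCoord3fARB(GL_TEXTURE5_ARB, f0, f1, f2);
    glVertex3f(x, y, z);

    // Vertex shader side: read it back out of the matching built-in:
    //   vec3 myAttrib = gl_MultiTexCoord5.xyz;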
    M.Eng Computer Engineering Candidate
    B.Sc Computer Science

    Robotics and graphics enthusiast.

