Thread: Wait for DX11?

  1. #16
    Hail to the king, baby. Akkernight's Avatar
    Join Date
    Oct 2008
    Location
    Faroe Islands
    Posts
    717
    Why doesn't someone make a DirectX thing that's multi-platform >.< and I mean fully, as in, not emulated or anything :P
I've never seen any good stuff made using OpenGL :P DirectX is the only reason I still keep my Windows :P
Currently researching OpenGL

  2. #17
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    I've never seen any good stuff made using OpenGL
    Quake?
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  3. #18
    Woof, woof! zacs7's Avatar
    Join Date
    Mar 2007
    Location
    Australia
    Posts
    3,459
> I've never seen any good stuff made using OpenGL :P DirectX is the only reason I still keep my Windows :P
    That's because it has large commercial backing and was really pushed when it came out. There are lots of OpenGL games.

Recently, Quake 4 and Doom 3.

And id's new game, whatever that's called. The Unreal Engine also used to have (or still has) an OpenGL renderer.

  4. #19
    The Right Honourable psychopath's Avatar
    Join Date
    Mar 2004
    Location
    Where circles begin.
    Posts
    1,071
Everything you can do with D3D, you can do in OpenGL. However, most games are written for Windows only, and telling gamers that your next big title has a DX10 renderer makes for better marketing (owing to the commercial backing zacs mentioned).

And personally, though I don't know a great deal about DX10, I would say many of the effects appearing on DX10 cards/games could be achieved with a DX9 renderer on the same hardware, though perhaps with a little more effort on the programmer's part.
    M.Eng Computer Engineering Candidate
    B.Sc Computer Science

    Robotics and graphics enthusiast.

  5. #20
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
Yeah, I've got the GDDR3 version. I've always had *600 cards: 5600, 6600TDH (Turbo Force Edition, haha), 7600GT, and now an 8600GT. So I guess you could call me a poor man :-).
Not only do I use mid-range cards, I upgrade every second generation: GF4 Ti4200, 6600, 8600. The 9600 is the first exception. I only spent $35 on my 8600GT, so I could justify another $90 for a 9600 a year later.

And personally, though I don't know a great deal about DX10, I would say many of the effects appearing on DX10 cards/games could be achieved with a DX9 renderer on the same hardware, though perhaps with a little more effort on the programmer's part.
    The Crysis "very high" setting hoax (they claimed that "very high" is only doable in DX10, then someone found a way to enable it on XP, and it looked exactly like "very high" in Vista with DX10)

  6. #21
    Woof, woof! zacs7's Avatar
    Join Date
    Mar 2007
    Location
    Australia
    Posts
    3,459
    Quote Originally Posted by cyberfish View Post
Not only do I use mid-range cards, I upgrade every second generation: GF4 Ti4200, 6600, 8600. The 9600 is the first exception. I only spent $35 on my 8600GT, so I could justify another $90 for a 9600 a year later.
The problem is that hardware prices don't seem to scale in a linear fashion :'(. I spent $186 AUD on this card because I got it when they came out -- but couldn't afford / didn't need an 8800GT.

I tend to avoid ATi because it's so darn hard to keep track of their card names / versions
    Last edited by zacs7; 11-18-2008 at 10:23 PM.

  7. #22
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
That is true. The 8600 was a disappointment, especially when compared with the success of the 7600. The 9600 is great, though -- roughly on the same level as the 8800 GT (faster than the GS, slower than the GTS and GTX). If you can wait a few years, I will try out a GTX 260 and report back.

  8. #23
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
The only thing you cannot do in DX9 that you can in DX10 is use geometry shaders. You cannot generate vertices in DX9 shaders or on DX9 hardware.
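
For illustration, here's a minimal sketch (hypothetical and untested, not from the thread) of exactly that DX10-only feature: a geometry shader that expands each incoming point into a triangle, i.e. vertices created on the GPU. The gs_4_0 profile it compiles against has no DX9 / shader model 3 counterpart.
Code:
// Minimal sketch (hypothetical, untested): a geometry shader that turns
// each point into a triangle -- vertices generated on the GPU, which is
// what DX9 shaders cannot do. Link against d3dcompiler.lib.
#include <d3dcompiler.h>
#include <cstdio>

static const char kGS[] =
    "struct GSOut { float4 pos : SV_POSITION; };                       \n"
    "[maxvertexcount(3)]                                               \n"
    "void main(point float4 pt[1] : SV_POSITION,                       \n"
    "          inout TriangleStream<GSOut> tris)                       \n"
    "{                                                                 \n"
    "    GSOut v;                                                      \n"
    "    v.pos = pt[0] + float4(-0.01f, -0.01f, 0, 0); tris.Append(v); \n"
    "    v.pos = pt[0] + float4( 0.01f, -0.01f, 0, 0); tris.Append(v); \n"
    "    v.pos = pt[0] + float4( 0.00f,  0.02f, 0, 0); tris.Append(v); \n"
    "}                                                                 \n";

int main()
{
    ID3DBlob *code = 0, *errors = 0;
    // "gs_4_0" is a Direct3D 10 shader profile; no gs_* profile exists
    // for the DX9 (shader model 3) targets -- hence "DX10 only".
    HRESULT hr = D3DCompile(kGS, sizeof(kGS) - 1, "gs.hlsl", 0, 0,
                            "main", "gs_4_0", 0, 0, &code, &errors);
    std::printf("compile %s\n", SUCCEEDED(hr) ? "succeeded" : "failed");
    if (errors) errors->Release();
    if (code) code->Release();
    return 0;
}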

DX10 was not widely accepted or developed for, mainly because it required Windows Vista. If Windows Vista had had a better reception, you would have seen DX9 vanish overnight. However, because Vista was plagued with issues, many gamers simply did not move to it and thus did not have DX10 compatibility. Most game companies did not actively develop for DX10 because of this; if they did, they risked alienating a good portion of their customer base. Now you do see DX10 games more often, but most still support DX9, so they probably have some DX9 shaders in them.

I would expect that as Vista is adopted more and more, you will see DX9 games start to dwindle. Windows 7 was supposed to be released in 2009 because Vista flopped so badly, but who knows. Early previews are saying that Windows 7 feels and looks like Vista does. Perhaps the difference will be in the quality and speed of the kernel.

DirectX 11 introduces the new concept of compute shaders. Supposedly these allow collisions, physics, culling, and other software algorithms to be offloaded to the video card. All 9-series and later NVIDIA cards now support PhysX natively, so I'm sure we will see these compute shaders being used for just that.
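
To make that concrete, here's a rough sketch (hypothetical names, untested, not from the thread) of an offloaded physics step in D3D11: a compute shader advancing particle positions, dispatched with one GPU thread per particle. Device and resource setup is assumed to have happened elsewhere.
Code:
// Rough sketch (hypothetical, untested): a trivial "physics" step run as
// a DX11 compute shader. Assumes ctx, cs, and particlesUAV were created
// during device setup (omitted for brevity).
#include <d3d11.h>

// HLSL side, compiled against the cs_5_0 profile:
//   RWStructuredBuffer<float4> Particles : register(u0);
//   [numthreads(64, 1, 1)]
//   void main(uint3 id : SV_DispatchThreadID)
//   {
//       Particles[id.x].xyz += float3(0.0f, -0.1f, 0.0f); // gravity step
//   }

void RunPhysicsStep(ID3D11DeviceContext *ctx, ID3D11ComputeShader *cs,
                    ID3D11UnorderedAccessView *particlesUAV,
                    unsigned particleCount)
{
    ctx->CSSetShader(cs, 0, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &particlesUAV, 0);
    // One thread group covers 64 particles, so round the count up.
    ctx->Dispatch((particleCount + 63) / 64, 1, 1);
}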

I think most of the recent talk of max settings on this or that game is just pure hype. A game is only going to look so good, and as long as your card supports VS/PS 3.0 you can get some very nice visuals. Most of the visual quality in games for now seems to come from shaders and not from the number of vertices. When they begin to up the vertex count and make more and more detailed, fully deformable mesh models, then the new cards will really shine.
However, most of the current 'bloom' and post-process effects in Crysis can be done just as well on DX9 hardware as on DX10. Floating-point textures were introduced in PS 3.0, which really helps effects like bloom, shadow mapping, and real-time cubic environment mapping. Geometry instancing cannot be done on hardware in DX9, but I'm not sure if Crysis uses this or not. It would be smart to use it for trees and foliage, but I just don't know enough about the game to know what they are doing. I've seen demos of DX10 where thousands of objects were drawn with very few calls due to geometry shaders.
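
For what it's worth, the "thousands of objects with very few calls" trick boils down to one instanced draw call in D3D10/11. A sketch (hypothetical names, mesh and shader setup assumed, not from the thread):
Code:
// Sketch (hypothetical, untested): drawing a whole forest in one call via
// hardware instancing in D3D11. Assumes the tree mesh (vertex/index
// buffers, input layout, shaders) is already bound on ctx, and that the
// vertex shader reads each tree's transform from a per-instance stream
// or via SV_InstanceID.
#include <d3d11.h>

void DrawForest(ID3D11DeviceContext *ctx,
                unsigned indicesPerTree, unsigned treeCount)
{
    // One call submits treeCount copies of the mesh; the GPU repeats the
    // index buffer per instance instead of the CPU issuing one draw each.
    ctx->DrawIndexedInstanced(indicesPerTree, treeCount, 0, 0, 0);
}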
    Last edited by VirtualAce; 11-19-2008 at 01:26 AM.

  9. #24
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
I tend to avoid ATi because it's so darn hard to keep track of their card names / versions
I tend to avoid ATI because they don't make decent Linux drivers (at least they didn't, last time I checked).

  10. #25
    Woof, woof! zacs7's Avatar
    Join Date
    Mar 2007
    Location
    Australia
    Posts
    3,459
> I tend to avoid ATI because they don't make decent Linux drivers (at least they didn't, last time I checked).
That also; the last ATI card I had was a Radeon 9250 or whatever they're called, and it scared me away from ATI. Although it was better than my FX5200 -- which I think is the best card ever made (that or the MX440).

To cut a long story short, I broke my 9250, got an FX5200, then a 5600 :-)

I can't wait until 5 years from now, when Crysis and the like are considered "crappy" and my mobile phone can run them.

  11. #26
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Quote Originally Posted by psychopath View Post
Everything you can do with D3D, you can do in OpenGL. However, most games are written for Windows only, and telling gamers that your next big title has a DX10 renderer makes for better marketing (owing to the commercial backing zacs mentioned).
Hmmm. I don't know about that, or at least not as efficiently.
DirectX has been outpacing OpenGL for some time, and the newest OpenGL standard was considered a failure by many. So unless the OpenGL group can get their wits together and fix the issues with OpenGL, it's not going anywhere, I would think.

    Quote Originally Posted by Bubba View Post
I would expect that as Vista is adopted more and more, you will see DX9 games start to dwindle. Windows 7 was supposed to be released in 2009 because Vista flopped so badly, but who knows. Early previews are saying that Windows 7 feels and looks like Vista does. Perhaps the difference will be in the quality and speed of the kernel.
They say it has the same look and feel, yes, but also more stability and performance.
Basically, it will be a streamlined version of Vista--what Vista should have been--i.e., Vista 2.0.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  12. #27
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,195
    Quote Originally Posted by cyberfish View Post
I tend to avoid ATI because they don't make decent Linux drivers (at least they didn't, last time I checked).
    I tend to avoid ATI because they make crappy hardware. They cut corners on hardware development and then try to fix the problems in software.

  13. #28
    Woof, woof! zacs7's Avatar
    Join Date
    Mar 2007
    Location
    Australia
    Posts
    3,459
> I don't know about that, or at least not as efficiently.
It depends on the application. Recent tests show they're almost identical in speed; it's DX that used to lag behind.

    Why bash OpenGL? What issues?! It's a specification!

Is this another bash-the-implementation-because-it's-not-C++ topic?
    Last edited by zacs7; 11-19-2008 at 04:38 AM.

  14. #29
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Quote Originally Posted by zacs7 View Post
> I don't know about that, or at least not as efficiently.
It depends on the application.
Perhaps. But I have heard claims that the OpenGL API is a mess right now. The newest spec was supposed to fix it, but did not.

    Why bash OpenGL? What issues?! It's a specification!
It's a specification, and that specification fails to describe certain features that DirectX has.
I'm not exactly bashing it. OpenGL simply needs to pick up the pace and fix the "issues" or mess that developers are moaning about.

Is this another bash-the-implementation-because-it's-not-C++ topic?
    No.
I have no allegiance to either OpenGL or Direct3D, because I never do graphics.
However, the API that evolves the quickest is the one I like most, regardless of whether it's an open standard or a closed one.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  15. #30
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by abachler View Post
    I tend to avoid ATI because they make crappy hardware. They cut corners on hardware development and then try to fix the problems in software.
In my experience, this applies to all of the graphics HW vendors. These chips are so complex that if you were to test them THOROUGHLY before release, there would be no new chips on the market, because the time taken is too long. When I worked for one of the graphics hardware manufacturers, we'd get one frame of Quake 2 (I think) per day or so out of the full chip simulation -- running the half a dozen most popular games for hundreds or thousands of frames would take years! Different vendors probably cut more or fewer corners depending on how they are doing versus their competition. But there certainly are cases where you have to fix a chip defect in software, because it would take another 12 weeks to get the chip fixed (from the point in time when the problem is identified and fixed in the hardware design).

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.
