Why doesn't someone make a DirectX thing that's multi-platform >.< and I mean fully, as in, not emulated or anything :P
I've never seen any good stuff made using OpenGL :P DirectX is the only reason I still keep my windows :P
Currently researching OpenGL
> I've never seen any good stuff made using OpenGL

Quake?
All the buzzt!
CornedBee
"There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
- Flon's Law
> I've never seen any good stuff made using OpenGL :P DirectX is the only reason I still keep my windows :P
That's because it has large commercial backing and was really pushed when it came out. There are lots of OpenGL games.
Recently, Quake 4 and Doom 3.
And id's new game, whatever that's called. The Unreal Engine also used to (or still does) have an OpenGL renderer.
Everything you can do with D3D, you can do in OpenGL. However, most games are written for Windows only, and telling gamers that your next big title has a DX10 renderer makes for better marketing (attributable to the commercial backing zacs mentioned).
And personally, though I don't know a great deal about DX10, I would say many of the effects appearing on DX10 cards/games could be achieved with a DX9 renderer on the same hardware, although perhaps with a little more effort on the programmer's part.
M.Eng Computer Engineering Candidate
B.Sc Computer Science
Robotics and graphics enthusiast.
> Not only do I use mid-end cards, I upgrade every second generation. GF4 Ti4200, 6600, 8600. 9600 is the first exception. I only spent $35 on my 8600GT, so I could justify another $90 for a 9600 a year later.

Yeah, I've got the GDDR3 version. I've always had *600 cards: 5600, 6600TDH (Turbo Force Edition haha), 7600GT and now an 8600GT. So I guess you could call me a poor man :-).
> And personally, though I don't know a great deal about DX10, I would say many of the effects appearing on DX10 cards/games could be achieved with a DX9 renderer on the same hardware, although perhaps with a little more effort on the programmer's part.

The Crysis "very high" setting hoax (they claimed that "very high" is only doable in DX10, then someone found a way to enable it on XP, and it looked exactly like "very high" in Vista with DX10).
The problem is that hardware doesn't seem to increase in a linear fashion :'(. I spent $186 AUD on this card, because I got it when they came out -- but couldn't afford / didn't need an 8800GT.
I tend to avoid ATi because it's so darn hard to keep track of their card names / versions
Last edited by zacs7; 11-18-2008 at 10:23 PM.
That is true. The 8600 was a disappointment, especially when compared with the success of the 7600. The 9600 is great, though, roughly on the same level as the 8800 GT (faster than the GS, slower than the GTS and GTX). If you can wait a few years, I will try out the GTX 260 and report back.
The only thing you cannot do in DX9 that you can in DX10 is geometry shaders. You cannot generate vertices in DX9 shaders or on DX9 hardware.
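For anyone curious what that means concretely, here is a rough CPU-side sketch in plain Python (not real shader code, and the 1-to-4 subdivision is just an illustrative choice): a geometry-shader-like stage runs once per primitive and may emit more vertices than it received, whereas a DX9 vertex shader maps one vertex in to exactly one vertex out.

```python
# Conceptual sketch of a geometry-shader-style stage: one primitive in,
# possibly several primitives out. A DX9 vertex shader cannot do this --
# it transforms exactly one vertex per invocation.

def midpoint(a, b):
    """Midpoint of two 3D points."""
    return tuple((a[i] + b[i]) / 2 for i in range(3))

def geometry_stage(triangle):
    """Emit four triangles for one input triangle (1:4 amplification),
    like a simple tessellating geometry shader would."""
    v0, v1, v2 = triangle
    m01, m12, m20 = midpoint(v0, v1), midpoint(v1, v2), midpoint(v2, v0)
    return [
        (v0, m01, m20),
        (m01, v1, m12),
        (m20, m12, v2),
        (m01, m12, m20),
    ]

tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
out = geometry_stage(tri)
print(len(out))  # 4 -- the vertex count grew inside the pipeline
```

On DX9 hardware you would have to generate those extra vertices on the CPU and re-upload them; DX10's geometry shader does the amplification on the card.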
DX10 was not widely accepted or developed for, mainly because it required Windows Vista. If Windows Vista had had a better reception, you would have seen DX9 vanish overnight. However, because Vista was plagued with issues, many gamers simply did not move to it and thus did not have DX10 compatibility. Most game companies did not actively develop for DX10 because of this; if they did, they risked alienating a good portion of their customer base. Now you do see DX10 games more often, but most still support DX9, so they probably have some DX9 shaders in them.
I would expect that as Vista is adopted more and more, you will see DX9 games start to dwindle. Windows 7 was supposed to be released in 2009 because Vista flopped so badly, but who knows. Early previews are saying that Windows 7 feels and looks like Vista does. Perhaps the difference will be in the quality and speed of the kernel.
DirectX 11 introduces the new concept of compute shaders. Supposedly these allow collisions, physics, culling, and other software algorithms to be offloaded to the video card. All 9-series and later NVidia cards now support PhysX natively, so I'm sure we will see these compute shaders being used for just that.
I think most of the recent talk of max settings on this or that game is just pure hype. A game is only going to look so good and so long as your card supports VS/PS 3.0 you can get some very nice visuals. Most of the visual quality of games for now seems to be in shaders and not in the number of vertices. When they begin to up the vertex count and make more and more detailed fully deformable mesh models then the new cards will really shine.
However most of the current 'bloom' and post process effects in Crysis can be done just as well on DX9 hardware as it can on DX10. Floating point textures were introduced in PS 3.0 which really helps effects like bloom, shadow mapping, and real-time cubic environment mapping. Geometry instancing cannot be done on hardware in DX9 but I'm not sure if Crysis uses this or not. It would be smart to use it for trees and foliage but I just don't know enough about the game to know what they are doing. I've seen demos of DX10 where thousands of objects were drawn with very few calls due to geometry shaders.
Last edited by VirtualAce; 11-19-2008 at 01:26 AM.
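The bloom effect mentioned above boils down to three steps: a bright-pass that keeps only the brightest pixels, a blur, and an additive blend back over the frame. A toy sketch on a tiny grayscale "framebuffer" in plain Python (a real engine does this per colour channel on the GPU, typically with a separable Gaussian rather than the box blur used here, and the floating-point render targets mentioned above keep the bright values from clipping between passes):

```python
# Toy bloom post-process on a grayscale image: lists of floats in [0, 1].

def bright_pass(img, threshold=0.8):
    """Keep only pixels above the threshold; everything else goes black."""
    return [[p if p > threshold else 0.0 for p in row] for row in img]

def box_blur(img):
    """3x3 box blur with edge clamping -- a stand-in for the separable
    Gaussian blur a game would actually use."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += img[yy][xx]
            out[y][x] = acc / 9.0
    return out

def bloom(img):
    """Additively blend the blurred bright areas back over the image."""
    blurred = box_blur(bright_pass(img))
    return [[min(p + b, 1.0) for p, b in zip(r1, r2)]
            for r1, r2 in zip(img, blurred)]

frame = [[0.2, 0.2, 0.2],
         [0.2, 1.0, 0.2],   # one bright pixel in the middle
         [0.2, 0.2, 0.2]]
result = bloom(frame)       # the bright pixel now "glows" onto neighbours
```

Nothing here needs DX10: it is plain texture sampling and arithmetic, which is why the same look is achievable on DX9-class hardware.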
> I tend to avoid ATi because it's so darn hard to count with their card names / versions

I tend to avoid ATI because they don't make decent Linux drivers (or so last time I checked).
> I tend to avoid ATI because they don't make decent Linux drivers (or so last time I checked).
That also. The last ATI card I had was a Radeon 9250 or whatever they're called, and that scared me away from ATI -- although it was better than my FX5200, which I think is the best card ever made (that or the MX440).
To cut a long story short, I broke my 9250 got a FX5200 then a 5600 :-)
I can't wait until 5 years from now when Crysis and alike are considered "crappy" and my mobile phone can run them.
Hmmm. I don't know about that, or at least not as efficient.
DirectX has been outpacing OpenGL for some time, and the newest OpenGL standard was considered a failure by many. So unless the OpenGL group can get their wits together and fix the issues with OpenGL, it's not going anywhere, I would think.
They say it has the same look and feel, yes, but also more stability and performance.
Basically, it will be a streamlined version of Vista--what Vista should have been--ie, Vista 2.0.
> I don't know about that, or at least not as efficient.
It depends on the application. Recent tests show they're almost identical in speed. It's DX that used to lag behind.
Why bash OpenGL? What issues?! It's a specification!
Is this another bash-the-implementation-because-it's-not-C++ topic?
Last edited by zacs7; 11-19-2008 at 04:38 AM.
Perhaps. But I have heard quotes that OpenGL code is a mess right now. The newest spec was supposed to fix it, but did not.
> Why bash OpenGL? What issues?! It's a specification!

It's a specification, and that specification lacks certain features that DirectX has.
Not exactly bashing it. OpenGL simply needs to pick up the pace and fix these "issues" or mess that developers are moaning about.
> Is this another bash-the-implementation-because-it's-not-C++ topics?

No.
I have no allegiance to either OpenGL or Direct3D, because I never do graphics.
However, the API that evolves the quickest is the one I like most, regardless of whether it's an open or closed standard.
In my experience, this applies to all of the graphics HW vendors: these chips are so complex that if you were to test them THOROUGHLY before release, there would be no new chips on the market, because the time taken is simply too long. When I worked for one of the graphics hardware manufacturers, we'd get one frame of Quake 2 (I think) per day or so out of the full chip simulation -- running the half a dozen most popular games for hundreds or thousands of frames would take years! Different vendors probably cut more or fewer corners depending on how they are doing versus their competition. But there certainly are cases where you have to fix a chip defect in software, because it would take another 12 weeks to get the chip fixed (from the point in time when the problem is identified and fixed in the hardware design).
--
Mats
Compilers can produce warnings - make the compiler programmers happy: Use them!
Please don't PM me for help - and no, I don't do help over instant messengers.
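Mats's "would take years" claim above is easy to sanity-check. A quick back-of-the-envelope calculation, assuming one simulated frame per day and a thousand frames per game (both numbers taken loosely from the post; the real figures would vary):

```python
# Back-of-the-envelope check of the full-chip simulation claim above.
frames_per_day = 1           # "one frame of Quake 2 per day or so"
games = 6                    # "half a dozen most popular games"
frames_per_game = 1000       # "hundreds or thousands of frames"

total_frames = games * frames_per_game
days = total_frames / frames_per_day
years = days / 365
print(f"{total_frames} frames ~ {years:.1f} years of simulation")
```

At roughly 16 years of simulation time for one test pass, it is clear why vendors ship with software workarounds instead of exhaustively testing silicon.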