OpenGL is a collection of a core plus two levels of extensions: the ARB extensions and the vendor extensions. Every new version of OpenGL typically takes some tried and proven ARB extensions and makes them part of the core. In turn, ARB extensions are created by taking vendor extensions that have become popular and giving them a uniform interface. This way, functionality slowly wanders from the outside (the vendor extensions, on which you simply cannot rely) towards the inside (the core, on which you can). In other words, an OpenGL driver claiming support for a specific OpenGL version MUST implement all core functionality of that version, may additionally support any number of ARB extensions, and can of course offer whatever vendor extensions it likes.
Programs work with this by dynamically detecting the availability of such extensions. They can then provide the new, shiny features only if support is available, and fall back on less shiny methods of achieving their goal otherwise.
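In practice, detection is just a search of the space-separated string returned by glGetString(GL_EXTENSIONS). One classic pitfall: a naive strstr() reports "GL_EXT_texture" as present when the list only contains "GL_EXT_texture3D". A sketch of a safe check (the helper name has_extension is mine, not part of any API):

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in
 * `extlist` -- the kind of string glGetString(GL_EXTENSIONS) returns.
 * A plain strstr() is not enough: it would report "GL_EXT_texture" as
 * supported when the list only contains "GL_EXT_texture3D". */
static int has_extension(const char *extlist, const char *name)
{
    size_t len = strlen(name);
    const char *p = extlist;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == extlist) || (p[-1] == ' ');
        int ends_token   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_token && ends_token)
            return 1;
        p += len;   /* partial match -- keep searching past it */
    }
    return 0;
}
```

In a real program the list comes from the driver after a context is current; here it is just a C string, which keeps the logic easy to test.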
Or they fail to start. Or they foolishly don't do proper detection and crash when trying to use the unavailable functionality. That's most likely the source of the backward-compatibility issues you're experiencing, Mario: programs that simply don't bother to support drivers that lack the features they need.
DirectX follows a different philosophy. There are no extensions; there is just DX. You either support it or you don't. To gain access to brand new features, you cannot split your program into three code paths for three incompatible vendor-specific implementations of a feature, as you might do in OpenGL. Instead you don't use the feature until Microsoft releases the next DirectX version, which supports it in a uniform way. (Of course you could wait in OpenGL until it becomes an ARB extension, but who wants to wait that long?) It's definitely easier and more portable this way. And if MS decides to release the new DX version only for their new OS, you're screwed. I'm not making any judgement here, just telling it as it is.
Now the thing is that MS always supplies software emulation for all features of DirectX. There is no such thing as an unsupported feature, there's just a feature not supported by the hardware. Which to me, recalling the frame rates I got when pixel shaders were the shiny new feature my TNT2 (I think) didn't support, is pretty much the same as unsupported, except that it doesn't crash your app. It only makes it run like a slug with asthma.
OpenGL gives the choice about feature support to the program. The program can support older cards by emulating new techniques or falling back on old ones. It can also choose not to support a card/driver at all. The card/driver, in turn, can support whatever it wants through the extension rules, at the risk of having some programs not run. If driver development stops, the card is dead to new OpenGL apps that have no fallback.
DirectX can somehow query the hardware capabilities, although I can't remember how. But in principle, it imposes requirements on the card/driver. This interface is the stuff you have to support, or be non-compliant and software-emulated. The program is, mostly, supposed not to notice. Which I consider misguided, but that's just me.
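If memory serves, in Direct3D 9 the query is IDirect3DDevice9::GetDeviceCaps(), which fills a D3DCAPS9 structure full of flags and version numbers you can test. A rough sketch of the pattern, with the struct and version macro faked down to the one field being checked so it compiles without the SDK (FakeCaps and PS_VERSION are my stand-ins, not DirectX names):

```c
/* Faked-down stand-ins for the D3DCAPS9 struct and the shader version
 * macro from the DirectX 9 SDK -- only the pixel shader field is kept
 * so this compiles without the SDK. */
typedef struct {
    unsigned long PixelShaderVersion;
} FakeCaps;

#define PS_VERSION(major, minor) \
    (0xFFFF0000UL | ((unsigned long)(major) << 8) | (unsigned long)(minor))

/* The typical pattern: query the caps once, then branch on them.
 * With real DirectX, the struct is filled in by the driver via
 * IDirect3DDevice9::GetDeviceCaps(). */
static int supports_ps20(const FakeCaps *caps)
{
    return caps->PixelShaderVersion >= PS_VERSION(2, 0);
}
```

A program would then pick its shader path from that result; the card itself never gets to say "unsupported", only "do it in software".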
It should be pointed out that the Mesa project offers something similar to what DirectX does. Mesa is a software implementation of OpenGL for the X Window System, but it's also the gateway to the DRI, the hardware rendering support. Set X to use Mesa (instead of a vendor OpenGL implementation such as ATI's or NVidia's) and at your disposal are the asthmatic-slug versions of all the features Mesa supports, whether your card supports them or not. Then you plug in a 3d driver like the open-source radeon driver and Mesa will automatically use the hardware as much as possible.
The downside? Open-source 3d drivers are always many years behind the times, because they mostly have to guess at how the 3d cards work. Only very recently has support for the ATI r300 chips (that would be Radeon 9800 and friends) entered the Mesa source, and it's far from stable.
Edit: I just realized I made a mistake about the extension levels. There are actually three levels of extensions: ARB, EXT and vendor.
Vendor extensions are vendor-specific; only the inventor is ever expected to support one.
When several vendors implement similar functionality, they are encouraged to work together and create an EXT version of the functionality, which is a uniform interface.
EXT extensions can be officially accepted by the Architecture Review Board, making them ARB extensions. These are scheduled for inclusion in the core and are very reliable in terms of availability.