My last system (now my debugging console) had a dual-head NVidia graphics card. It was beautiful. The software ran flawlessly: I was able to set all my apps to launch on the desired monitor, and all those apps could launch their dialogs on any desired monitor. The NVidia tools are amazing. I loved NPerfMon and a variety of other free development tools from NVidia.
Cut to today. A while back I built a new box and somehow got talked into an ATI Radeon for the PCI-E slot, as all my other cards were AGP. I suppose I wasn't really thinking, and I didn't foresee any problems down the road. Now I'm using ATI's HYDRAVISION and I'd rather scratch my own eyes out. The options are far more limited, and they break: Custom Application Options reset themselves, and the HYDRAVISION interface crashes every time I apply my changes. ATI PerfStudio is ugly and (though I've barely used it) seems unwieldy.
So realistically, how does ATI compete with NVidia so prominently? If the only comparison of quality were purely at the user/gamer level, one would think developers would jump on board with NVidia and tip the scales.
Maybe I just haven't found and turned up the Goodness Slider yet...
[Screenshot: HYDRAVISION's "Set Application Behaviour" dialog, with a slider running from "Scratch Your Eyes Out" to "NVidiaesque"]