Most PC games are built on top of top-notch licensed graphics engines or very capable in-house systems.
Why is it, then, that games only one or two years old have such a hard time changing the screen resolution to anything other than what the auto-detect feature picks? Is it that hard to change screen resolution? Is it that hard to recompute the aspect ratio (width / height) for the projection matrix?
This really has me wondering whether all these 'super' code bases are really all that 'super'. Something like screen resolution should live in an ini, xml, or some other human-readable file, and the engine should be able to adjust to whatever resolution it finds there. That lets players run the game on future hardware with different resolutions. And it's not exactly rocket science.
From a Direct3D standpoint, I cannot fathom why it would be difficult to set the D3DPRESENT_PARAMETERS width and height members to user-specified values. Nor can I fathom why they cannot set up the projection matrix correctly for 16:9 and/or 4:3 aspect ratios. It's simple math that any engine should be able to do. Even my crappy little D3D framework can support any resolution the card can.
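For illustration, here is roughly all it takes on the D3D9 side. The struct below is a cut-down stand-in for the real D3DPRESENT_PARAMETERS from d3d9.h (which has more fields) so the sketch stays self-contained; the helper function is hypothetical:

```cpp
#include <cstdint>

// Stand-in for the members of D3DPRESENT_PARAMETERS that matter here;
// the real struct lives in d3d9.h and has additional fields.
struct PresentParams {
    uint32_t BackBufferWidth  = 0;
    uint32_t BackBufferHeight = 0;
    bool     Windowed         = true;
};

// Plug user-specified values straight into the present parameters --
// this is the extent of the work before handing them to the device
// (CreateDevice on startup, or Reset when the user changes resolution).
PresentParams makePresentParams(uint32_t userWidth, uint32_t userHeight,
                                bool windowed) {
    PresentParams pp;
    pp.BackBufferWidth  = userWidth;
    pp.BackBufferHeight = userHeight;
    pp.Windowed         = windowed;
    return pp;
}
```

Two assignments. There is no secret difficulty hiding in there.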
Are devs really hard-coding resolutions into their games? Surely not. It seems like a huge oversight to me, and my confidence in the abilities of the gurus coding these 'engines' has slipped just a little.
It seems like a very basic, hard-and-fast rule: screen resolution should never be fixed, and a game should never be built to fit only certain resolutions.