I'm sorry guys but I can't keep out of this one.
I myself have never owned a recent Nvidia card, so let me talk about ATI for a sec.
I've had a 9600 Pro, a 9800 non-Pro and a 9800 Pro, and I currently own an X800 GTO 256MB, which, for as little as $120 CAD (that's Canadian money), can run Oblivion at 1024x768 with everything maxed except HDR (not supported)/Bloom, grass and shadows, at 4xAA 16x HQ AF. That gives you between 25 and 75 fps. The image quality on this card is unbelievable, and I might add that the image quality of any ATI DX9 card is better than that of any Nvidia card, period.
As for the drivers, if you know how to use them, you'll know how to appreciate ATI cards. I personally use the official Catalyst 6.7 WITHOUT CCC; I use ATI Tray Tools instead, which gives you a lot of goodies like overclocking, monitoring, and loads of tweaks, among which are Triple Buffering, Geometry Instancing, etc. It includes everything CCC gives you with none of the drawbacks, i.e. ATT is not a resource hog. Also, ATI releases a new driver set every month, so it's easy to find one that suits you, although some people do run into problems, especially those with a Crossfire setup.
However, I wouldn't recommend my card if you can afford a bit more, because what you really want if you go the ATI route is a 16-pipe card. You can get a cheap x8x0xt/xt pe if you search a bit. Also, their current flagship is the X1900 XTX, which is clocked a bit higher than the XT (25MHz on the GPU, 100MHz on the memory).
Last but not least, the X1K series all have a programmable memory controller, which means that, through driver releases, the dev team can reprogram it, greatly increasing its performance over the lifespan of the card. And this isn't some BS feature either; just read the release notes from this month's release:
ATI Technologies Inc.
Performance Improvement
As with most Catalyst® releases, performance has increased in various situations.
Improved shader compiler and transform engine optimizations have led to many significant performance gains in OpenGL applications across the entire Radeon® X1000 series of products. These include:
* Doom 3 performance improves as much as 6.5%
* Quake 4 (v1.2 or higher) performance improves as much as 18%
* Chronicles of Riddick performance improves as much as 20%. Average improvements of 5-10% are commonly seen
* Prey performance improves as much as 16%
Optimizations made to ATI's graphics memory manager have led to significant gains in DirectX applications that make heavy use of graphics memory on 256MB ATI graphics cards:
* 3DMark06 performance improves as much as 22% on single ATI Radeon® X1000 series cards with 256MB of graphics memory at 1280x1024 and higher when anti-aliasing is enabled. The largest gains are found on the ATI Radeon® X1800 series and ATI Radeon® X1900 series of products
* Call of Duty 2 performance improves as much as 30% on single ATI Radeon® X550 cards or higher with 256MB of video memory at 1280x1024 and higher when anti-aliasing is enabled. The largest gains are found on the ATI Radeon® X1800 series and ATI Radeon® X1900 series of products
What I can say about Nvidia is that they make good cards too, and they're clearly the best there is as far as OpenGL goes, but I still prefer ATI.
/end of rant