Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Extelleron
Originally posted by: Noema
It's not that the CPU matters less at higher resolutions with lots of AA; it's that the video card will be bottlenecked long before the CPU can be a factor.
I think that at that resolution, with SLI'ed 8800GTXs, the CPU might well be the bottleneck. But we are talking about an E6700! It's not like you are going to see a slideshow here.
'Bottleneck' in this case means that you'll get 150 max FPS instead of, say, 190, or whatever... In other words, you probably wouldn't even notice if your E6700 were running at 3.6 GHz unless you were really, really, really picky about max FPS and/or synthetic benchmarks.
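To make the "you wouldn't notice the overclock" point concrete: to a first approximation, frame rate is capped by whichever of the CPU or GPU takes longer per frame, so a faster CPU changes nothing while the GPU is the slower side. Here's a rough back-of-envelope sketch of that idea; every per-frame time in it is invented for illustration, not a benchmark of an E6700 or 8800GTX.

# Back-of-envelope bottleneck model -- all per-frame times are made up
# for illustration, not measurements of any real CPU or GPU.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate is capped by whichever side takes longer to produce a frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

gpu_ms = 12.0        # hypothetical GPU time per frame at high res with AA cranked up
stock_cpu_ms = 6.7   # hypothetical CPU time per frame at stock clocks
oc_cpu_ms = 5.0      # hypothetical CPU time per frame after a hefty overclock

print(f"stock CPU: {fps(stock_cpu_ms, gpu_ms):.0f} FPS")  # ~83 FPS, GPU-limited
print(f"OC'd CPU:  {fps(oc_cpu_ms, gpu_ms):.0f} FPS")     # still ~83 FPS, GPU-limited

In this made-up GPU-bound case the FPS counter doesn't move at all when the CPU gets faster; the overclock would only show up once the GPU time per frame drops below the CPU's.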
EDIT: And to be perfectly frank, I think that SLI'ed 8800GTXes are an utter waste at that resolution.
For now, anything more than an 8800GTS is a waste @ 1680x1050.
Boo on that comment. :thumbsdown:
If you think that, you're not pushing the image quality (IQ) settings high enough.
Tell me what game I can't play at 4xAA/16xAF @ 1680x1050 with max settings on an 8800GTS (especially overclocked). Perhaps Oblivion might give you trouble.
I run my GTS @ 600/1900, and I haven't found a scenario where I can't play a game at max settings. Medieval II gets a bit choppy at times, but it's more than acceptable.
Now when Crysis and UT2007 are released, that's going to be another story. But by then I'll have upgraded to something better anyway.
Everything is going to change when games like Crysis, UT2007, and Alan Wake come around. For one, we're going to see a whole new league of GPU requirements, and we're also going to see more stress on the CPU than ever before. If you're worried about TODAY's performance, however, a good CPU like an E6600 or E6700 paired with an 8800GTS is fine for ANYTHING at 1680x1050, unless you insist on running 16xQ AA.