Originally posted by: Barkotron
Originally posted by: Finny
Feh. As much as I hate to say it, Joker really is becoming the Rollo of ATi at this point, regardless of his past allegiances.
First off, I won't dismiss possible driver updates. We've seen what they've done in the past (ATi's OpenGL improvements, nVidia's FEAR gains, etc.), so I won't count them out in the future, especially in a title as popular as Oblivion. Sure, ATi can improve its performance here too, but hey, all the better for both companies.
Also, I agree with what Munky said: if you're shelling out 500 dollars, you're not going to play on anything lower than the highest settings. It seems companies are using the Doom 3 precedent as an excuse to poorly optimize a game, i.e. shipping maximum settings that current-gen videocards can't (technically) fully handle (I believe it was said that 512MB of video memory was required for Ultra High in Doom 3, although folks seemed to run it just fine on 256MB parts). As good as Oblivion looks, there's no excuse, especially on a HIGH END CARD, for it to drop below 30 AT ALL, even on max settings. Sure, dual-card setups may have taken over the ultra-high end, but that doesn't mean single-card users should be stuck with sub-par graphical settings.
However, even so, I don't think this review should be disregarded. Numbers from various other websites have varied as well, and since X-bit has proven plenty reliable in the past, I think it can be trusted again. Furthermore, I don't believe nVidia supporters should be castigated just for liking the results here.
1. Nobody's dismissing possible driver updates. I, however, am dismissing the idea that possible future driver updates should be used as an excuse for a poor showing on either side. Well, unless there's a very clear driver bug, as there have been in the past with SLI/Crossfire setups, or the fear.exe renaming issue in earlier Cats, etc.
2. Why is there a right to expect a minimum of 30FPS just because someone's spent a lot of money on a videocard? Have you actually seen the amount of stuff that goes on outdoors in Oblivion? It looks incredible, and screenshots just don't do it justice. Frankly, I'm impressed that frame rates are as high as they are.
3. X-bit seems to be as reliable as any other site out there. Personally, I have no problem with the numbers they're putting out - I'm just pointing out that those Oblivion scores specifically are not numbers "nVidia fans" should be shouting about. The ATI card has an 80% better minimum FPS @ 12x10 and nearly 60% better @ 16x12. That's a big difference at any level, but when the difference is between 27FPS and 15FPS, or 22FPS and 14FPS, it's the difference between "just about playable" and "severely affecting gameplay". Those scores are a hammering, and no meaningless drivel about "nanosecond frame rate drops" can hide that.
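(For anyone checking the math on those percentages, using the minimum-FPS figures above: (27 - 15) / 15 = 0.80, i.e. 80% higher at 12x10, and (22 - 14) / 14 = 0.57, i.e. nearly 60% higher at 16x12.)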
Sure, driver updates can bring big improvements, but until they've been tested, using possible future driver enhancements as an excuse is pure spin.