Originally posted by: nRollo
Originally posted by: taltamir
Oh wow, it took me a while to get what you're talking about... they start the graph at 0.8 instead of 0, where 1 = the performance of the ATI card. So if the ATI card gets a 1 and the GTX 280 gets a 2.5, it looks like the ATI has 0.2 and the NV card has 1.7 on Call of Juarez, for example, which makes the bar 8.5x bigger instead of the true 2.5x.
And they used upcoming drivers for themselves and outdated AMD drivers. The Catalyst 8.5 drivers claim a 12% performance increase in that specific game; I don't know whether 8.4 changed anything for it too, but they used 8.3.
Not THAT outdated (considering 8 = year and 3 = month...), but still shady.
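To make the axis-truncation math in the quote concrete, here's a minimal sketch. The 1.0 and 2.5 scores and the 0.8 axis floor are taken straight from the quote above; nothing else is assumed.

```python
# Axis-truncation math from the quote: scores are normalized so the
# 3870X2 = 1.0, but the graph's y-axis starts at 0.8 instead of 0.

axis_floor = 0.8
ati_score = 1.0   # 3870X2 baseline
nv_score = 2.5    # GTX 280 on Call of Juarez, per the quote

true_ratio = nv_score / ati_score                               # 2.5x
bar_ratio = (nv_score - axis_floor) / (ati_score - axis_floor)  # 8.5x

print(f"true performance ratio:    {true_ratio:.1f}x")
print(f"apparent bar-height ratio: {bar_ratio:.1f}x")
```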
Apparently I'm the only person in this thread who understood that graph the first time he/she looked at it.
It's a comparison relative to a 3870X2; it's very straightforward, very easy to read.
It's not saying "ATi got .2", it's saying "the 3870X2 got framerate X. The GTX 260 was Y% higher, the GTX 280 was Z% higher".
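As a sketch of that reading: pick a baseline framerate and some bar heights (both made up here, since the slide's actual numbers aren't in this thread) and the percentages fall out directly.

```python
# Reading the normalized graph as described above: the 3870X2's
# framerate is the 1.0 baseline, and each NVIDIA bar is a multiple
# of it. All numbers below are hypothetical, for illustration only.

baseline_fps = 40.0   # hypothetical 3870X2 result ("framerate X")
bars = [("GTX 260", 1.8), ("GTX 280", 2.5)]  # hypothetical bar heights

for name, score in bars:
    fps = baseline_fps * score
    pct_higher = (score - 1.0) * 100
    print(f"{name}: {fps:.0f} fps ({pct_higher:.0f}% higher than the 3870X2)")
```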
I have seen marketing graphs that make 10-20% look like huge differences. These are huge differences.
The only thing that can be construed in the slightest as misleading here is that we don't know what went into the benchmarks. By this I mean NVIDIA presumably used areas of the game that showed their products in a favorable light. This could happen inadvertently with ANY review, and it's also possible there are areas in the game that would show their products in an even more favorable light, or that these benches are representative of the average difference in these games.
What I've just written is, to my knowledge, the only logical way that graph can be interpreted.