Originally posted by: galperi1
It's as simple as that. It seems that Nvidia has inherited the worst attributes of a dying 3dfx company
i don't remember 3dfx cheating on benchmarks
Originally posted by: SexyK
Originally posted by: oldfart
Originally posted by: SexyK
you have to open your eyes to the fact that they are not an objective third party in this argument, and there are plenty of motivations for them to release this other than for the sanctity of the benchmark. Whether or not nVidia was being dishonest, this is a play in the market by FM.
And nVidia is an objective party in this? Who has more to lose? Please list what those "plenty of motivations" would be.
Please go back and read my posts. I never said that nVidia was an objective third party, I only said that relying on the information provided by Futuremark is a mistake.
I also already listed Futuremark's reasons for attacking nVidia, but here's a recap:
1) FM tells nVidia that they have to pay hundreds of thousands of dollars to join the "beta program" for 3DMark03
2) nVidia decides that 3DMark03 isn't a realistic DX9 benchmark (a fact that many in the industry agree on)
3) As fallout from these decisions, the validity of Futuremark's most important product is brought into question
4) A further result is that Futuremark takes a direct loss of hundreds of thousands of dollars when nVidia refuses to join the program.
nVidia's decision severely impacted the reputation and bottom line of FM. If you think Futuremark wasn't extremely upset at nVidia after that turn of events, you're mistaken.
Originally posted by: ElFenix
i clearly explained what a cheat is and what isn't. you can simply ignore that and spout off inane crap if you like.
Originally posted by: Shamrock
hmm,
I find it funny that Anandtech didn't use 3DMark...why is that? Because he doesn't have the confidence in the program itself?
Also, every one of you guys that owns a Pentium 4...praise them for their "SSE2" OPTIMIZATION extensions while the Athlon gets a close enough score with RAW horsepower. Before you scoff at Nvidia for cheating, should you not also take a look at Intel for optimizing with software? If you feel differently, then you're a hypocrite. Intel "optimizes" to include the SSE2 extensions, while AMD does not...let's DISABLE the SSE2 instructions, and see how fast the P4 goes then. I've seen the SSE2 instructions disabled on Adobe Photoshop 7, and it was slower than the Macintosh! While the Athlon cruised.
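For what it's worth, "optimizing for SSE2" on the CPU side usually just means runtime feature detection with a scalar fallback. Below is a minimal, hypothetical C++ sketch of that pattern, assuming GCC or Clang (__builtin_cpu_supports and the target("sse2") attribute are compiler features); the add_arrays workload is made up for illustration and is not taken from Photoshop or any benchmark.

#include <emmintrin.h>  // SSE2 intrinsics (double-precision SIMD)
#include <cstddef>

// Plain C++ path: what a CPU without SSE2 (e.g. an Athlon XP) would run.
static void add_arrays_scalar(const double* a, const double* b, double* out, std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// SSE2 path: two doubles per 128-bit register. The target attribute lets this
// function compile even when the rest of the build isn't forced to -msse2.
__attribute__((target("sse2")))
static void add_arrays_sse2(const double* a, const double* b, double* out, std::size_t n)
{
    std::size_t i = 0;
    for (; i + 2 <= n; i += 2) {
        __m128d va = _mm_loadu_pd(a + i);
        __m128d vb = _mm_loadu_pd(b + i);
        _mm_storeu_pd(out + i, _mm_add_pd(va, vb));
    }
    for (; i < n; ++i)  // leftover element, if n is odd
        out[i] = a[i] + b[i];
}

// Dispatch at runtime based on what the CPU actually reports via CPUID.
void add_arrays(const double* a, const double* b, double* out, std::size_t n)
{
    if (__builtin_cpu_supports("sse2"))
        add_arrays_sse2(a, b, out, n);   // vectorized path (Pentium 4 class)
    else
        add_arrays_scalar(a, b, out, n); // "raw horsepower" path
}

On a Pentium 4 the dispatcher takes the SIMD path; on an Athlon XP, which lacks SSE2, the same binary falls back to the scalar loop, which is roughly the comparison the post above is describing.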
Originally posted by: ElFenix
Originally posted by: SexyK
2) nVidia decides that 3DMark03 isn't a realistic DX9 benchmark (a fact that many in the industry agree on)
name 3.
"3DMark03, a DirectX 9 Benchmark?" This is one question that we really must ask ourselves. Game 1 is a very simple DX7 test that is not representative of any current games. Game Tests 2 and 3 are DX8.1 but use Pixel Shader 1.4, which is not used by any games we are aware of and will not be to our knowledge. Game 4 is a hybrid of DX8/DX9. It is these four tests that determine the overall score. Only one game test in this benchmark DirectX 9 and then only partially.
Is 3DMark03 really a good indication of what ?Real World? gaming is? Is 3DMark03 really "forward looking"?
With all that we have seen so far it does not appear to actually give us any indication how video cards are going to compare in any real games. It produces an overall 3DMark which is taken from unbalanced game tests. Furthermore as we have seen directly above in the benchmarks, drivers can be optimized to run the game tests faster. The problem is this is just a benchmark and not based on any real world gaming engines. Therefore while you may get a higher 3DMark number you will not see ANY increase in performance in any games that are out there.
One issue we have been discussing as of late with amoungst ourselves and with ATI and NVIDIA, is the implementation of real world game benchmarking versus synthetic benchmarking. We do feel that synthetic benchmarks do have their place if used properly. Actual games or gaming engines should remain the primary focus though. When it comes right down to it we would rather see improvements and gains in real world games rather than in synthetic benchmarks.
In closing, Kyle has informed me that Hard|OCP will not be using the overall 3DMark03 score to evaluate video cards.
And here's where the real discussion begins. Even NVIDIA is clearly distancing itself from the new release and believes that this benchmark has no relation to reality. For one thing, the choice of game tests, beginning with the rarely played flight-combat scenario (Wings of Fury), is nothing close to actual practice. It goes further with tests GT2 and GT3. The 3D engine that forms the foundation of both tests is the same, and both use pixel shaders based on version 1.4. Here is an excerpt from NVIDIA's statement:
These tests use ps1.4 for all the pixel shaders in the scenes. Fallback versions of the pixel shaders are provided in ps1.1 for hardware that doesn't support ps1.4. Conspicuously absent from these scenes, however, are any ps1.3 pixel shaders. Current DirectX 8.0 (DX8) games, such as Tiger Woods and Unreal Tournament 2003, all use ps1.1 and ps1.3 pixel shaders. Few, if any, are using ps1.4.
These shaders not only ensure poor results for PS 1.1 cards compared to those that support PS 1.4 (they need more passes for the same effect); they are also hardly used in actual 3D games. The Xbox can't run PS 1.4 code either. Even more serious is that 3DMark03's test runs use different shader code for different cards. This makes comparisons between different 3D chips close to impossible. In 3DMark 2001 SE, this was only the case with a special PS 1.4 test. Now, however, all tests, including GT4 (Mother Nature), are effectively no longer comparable to one another.
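To make the fallback mechanism NVIDIA's statement describes more concrete, here is a minimal C++/DirectX 8 sketch, not 3DMark03's actual code, of how an application typically picks a ps.1.4 or ps.1.1 shader path from the device caps. The g_ps14_tokens/g_ps11_tokens arrays are placeholders for assembled shader bytecode.

#include <d3d8.h>

// Placeholder token streams: assembled "ps.1.4 ..." and "ps.1.1 ..." shader
// bytecode would normally come from D3DXAssembleShader or an offline build step.
extern const DWORD g_ps14_tokens[];  // single-pass ps.1.4 version of the effect
extern const DWORD g_ps11_tokens[];  // ps.1.1 fallback (needs extra passes)

DWORD CreateEffectPixelShader(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);

    DWORD handle = 0;
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4)) {
        // Hardware advertises ps.1.4 (Radeon 8500 class): one-pass path.
        device->CreatePixelShader(g_ps14_tokens, &handle);
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) {
        // ps.1.1-only hardware (GeForce3/4 class): fallback shader; the
        // renderer makes up the difference with additional passes.
        device->CreatePixelShader(g_ps11_tokens, &handle);
    }
    return handle;
}

The argument in the thread is not about this mechanism itself but about the choice of targets: a ps.1.4-first benchmark penalizes hardware that only exposes ps.1.1/1.3, because the fallback path has to do extra passes.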
Originally posted by: Pabster
Everybody is cheating!
Seriously, this shouldn't be a big shock. ATi has done it before (and may still be doing it); now it's nVidia's turn.
Originally posted by: Dean
Tim Sweeney basically says it all. ATI's so-called cheat is a legitimate optimization and Nvidia's so-called cheat is just that...
A Cheat
Originally posted by: Shamrock
hmm,
I find it funny that Anandtech didn't use 3DMark...why is that? Because he doesn't have the confidence in the program itself?
Also, every one of you guys that owns a Pentium 4...praise them for their "SSE2" OPTIMIZATION extensions while the Athlon gets a close enough score with RAW horsepower. Before you scoff at Nvidia for cheating, should you not also take a look at Intel for optimizing with software? If you feel differently, then you're a hypocrite. Intel "optimizes" to include the SSE2 extensions, while AMD does not...let's DISABLE the SSE2 instructions, and see how fast the P4 goes then. I've seen the SSE2 instructions disabled on Adobe Photoshop 7, and it was slower than the Macintosh! While the Athlon cruised.
and I also have a question...about these new drivers, the 44.03
Does this "optimization" affect ALL cards, or just the GFFX 5900 series? Because I have the new drivers on my GF3 Ti200, and I find NO FLAW in ANY game I play, and that's over 15 games on my hard drive. I have not seen a glitch, nor a bug, nor a slowdown, nor crashes from these new drivers...however, I HAVE seen performance gains in ALL my games, with no problems whatsoever. So...
3DMark can stick their little benchmark, I play games, and that's what counts...REAL world benchmarks...I tend to like the benchmarks Anandtech had. Oh yeah, they were also impressive...I will be buying a GFFX 5900.
Another thing... Why did Anandtech get a bug where ATI cards have that window shining through the head of Sam in Splinter Cell? Is ATI cheating because of that? Hmm...maybe they are manipulating their own drivers...because Splinter Cell IS a DX9 game (it says it on my box).
Originally posted by: SexyK
Maybe I'm reading it wrong, but Sweeney's piece seems to say that there is way too much grey area in the world of 3D graphics to make black-and-white distinctions between cheating and optimizing.
Only for some of the slighter cheating like (say) using 32-bit RGBA color (8 bits per channel) when the game author expected 16 bits of alpha.
Originally posted by: mikable
This thread is as worthless as mammaries on a wild male pig. :disgust:
An optimization would make a game or demo run faster in general: even if the camera is changed, and even if it isn't a specific pre-recorded timedemo. The optimization is valid (not cheating) if it provides identical image quality, or image quality identical within the bounds of the OpenGL and DX9 API specs. ATI's Quack cheats were not valid because they did not provide equivalent image quality.
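As a rough illustration of that equivalence test, here is a minimal C++ sketch under stated assumptions: the Frame struct, the capture step, and the tolerance value are all made up for illustration and are not any vendor's or reviewer's actual tool. The idea is to capture the same scene with the optimization active and inactive, or with the camera nudged off the canned benchmark path, and compare the framebuffers per channel.

#include <cstdint>
#include <cstddef>
#include <cstdlib>
#include <vector>

// Hypothetical captured framebuffer: width * height pixels, 4 bytes each (RGBA).
struct Frame {
    int width = 0;
    int height = 0;
    std::vector<std::uint8_t> rgba;
};

// Returns true if two frames are visually equivalent within `tolerance`
// (per channel, on a 0-255 scale), allowing only tiny rounding differences.
// A driver that silently drops work the author asked for (skipped clears,
// static clip planes, reduced precision) will typically fail this comparison
// once the camera or workload deviates from the pre-recorded benchmark path.
bool FramesEquivalent(const Frame& a, const Frame& b, int tolerance = 2)
{
    if (a.width != b.width || a.height != b.height)
        return false;

    for (std::size_t i = 0; i < a.rgba.size(); ++i) {
        if (std::abs(int(a.rgba[i]) - int(b.rgba[i])) > tolerance)
            return false;
    }
    return true;
}

A legitimate optimization passes this kind of comparison no matter where the camera points; a clip-plane or quality-reducing cheat tends to pass only along the canned timedemo path.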
I used to hold ATi's cheating firmly against them, my last card of theirs being a VGA Wonder XL (yes, they cheated long before the Quake fiasco). nVidia cheated with this driver. Period.
ATI cheated with the "quake"-detecting driver. Period.
ATI may be running around in the grey area, but Nvidia seems to be running deep into the brown, if you know what I mean.