Futuremark confirms NVIDIA "cheating" with latest driver...releases audit report

Page 4

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,425
8,388
126
Originally posted by: galperi1

It's as simple as that. It seems that Nvidia has inherited the worst attributes of a dying 3dfx company

i don't remember 3dfx cheating on benchmarks
 

CBone

Senior member
Dec 4, 2000
402
0
0
Originally posted by: SexyK
Originally posted by: oldfart
Originally posted by: SexyK
you have to open your eyes to the fact that they are not an objective third party in this argument, and there are plenty of motivations for them to release this other than for the sanctity of the benchmark. Whether or not nVidia was being dishonest, this is a play in the market by FM.
And nVidia is an objective party in this? Who has more to lose? Please list what those "plenty of motivations" would be.

Please go back and read my posts. I never said that nVidia was an objective third party, I only said that relying on the information provided by Futuremark is a mistake.

I also already listed Futuremark's reasons for attacking nVidia, but here's a recap:

1) FM tells nVidia that they have to pay hundreds of thousands of dollars to join the "beta program" for 3dmark03
2) nVidia decides that 3dmark03 isn't a realistic DX9 benchmark (a fact that many in the industry agree on)
3) As fallout from these decisions, the validity of Futuremark's most important product is brought into question
4) A further result is that Futuremark takes a direct loss of hundreds of thousands of dollars when nVidia refuses to join the program.

nVidia's decision severely impacted the reputation and bottom line of FM. If you think Futuremark wasn't extremely upset with nVidia after that turn of events, you're mistaken.


1) Man, WTF are you talking about? It's not like nvidia was being extorted. They were a happy member of the beta program for years until their card got smacked around in this bench. There are many different levels to the beta program, as you can see here: Beta Program. There is no reason nvidia can't afford fees that Beyond3D can. They could afford the "hundreds of thousands of dollars" up until now. Until you actually look at the beta program data that is readily available and get a clue as to what you are talking about, you should STFU.

2) When some actual DX9 games come out, then we can compare. Seems to me most of the industry is all over 3DMark. Even nvidia, who happened to trot out 3DMark in all its flawed glory when they seemed to have an advantage. nvidia has plenty of reason to worry when they are losing OEM ground to ATI. For instance, Dell's high-end and low-end products have ATI plastered all over the place.

3) There was always a question of the validity of 3dmark. That's not new. To the world outside of our niche, 3dmarks are king. Valid or not, they are the first thing people look at to compare new video cards. nvidia knows this and tried to spread FUD when their card came up short.

4) nvidia was already a member, they did not refuse to join, they left and did not re-up, spreading FUD in their wake.

Don't you think Futuremark pretty much has nvidia by the nuts here? If nvidia could send them a cease and desist order, don't you think they would? Word has hit the street. nvidia cheats.
 

Shamrock

Golden Member
Oct 11, 1999
1,439
560
136
Originally posted by: ElFenix
Originally posted by: Shamrock
hmm,

I find it funny that Anandtech didn't use 3dMark...why is that? Because he doesn't have the confidence in the program itself?

Also, every one of you guys that owns a Pentium 4 praises it for its "SSE2" OPTIMIZATION extensions while the Athlon gets a close-enough score with RAW horsepower. Before you scoff at Nvidia for cheating, should you not also take a look at Intel for optimizing with software? If you feel differently, then you're a hypocrite. Intel "optimizes" to include the SSE2 extensions, while AMD does not...let's DISABLE the SSE2 instructions and see how fast the P4 goes then. I've seen the SSE2 instructions disabled on Adobe Photoshop 7, and it was slower than the Macintosh! While the Athlon cruised.
i clearly explained what a cheat is and what isn't. you can simply ignore that and spout off inane crap if you like.

your explanation of a cheat is your own opinion, not the gospel. I may have a differing opinion...

you also totally disregarded the fact that Anand didn't use 3dMark...and no one has answered it yet
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Originally posted by: ElFenix
Originally posted by: SexyK
2) nVidia decides that 3dmark03 isn't a realistic DX9 benchmark (a fact that many in the industry agree on)

name 3.

HardOCP
"3DMark03, a DirectX 9 Benchmark?" This is one question that we really must ask ourselves. Game 1 is a very simple DX7 test that is not representative of any current games. Game Tests 2 and 3 are DX8.1 but use Pixel Shader 1.4, which is not used by any games we are aware of and will not be to our knowledge. Game 4 is a hybrid of DX8/DX9. It is these four tests that determine the overall score. Only one game test in this benchmark is DirectX 9, and then only partially.

Is 3DMark03 really a good indication of what "Real World" gaming is? Is 3DMark03 really "forward looking"?

With all that we have seen so far, it does not appear to actually give us any indication of how video cards are going to compare in any real games. It produces an overall 3DMark score which is taken from unbalanced game tests. Furthermore, as we have seen directly above in the benchmarks, drivers can be optimized to run the game tests faster. The problem is that this is just a benchmark and not based on any real-world gaming engines. Therefore, while you may get a higher 3DMark number, you will not see ANY increase in performance in any games that are out there.

One issue we have been discussing as of late amongst ourselves and with ATI and NVIDIA is the implementation of real-world game benchmarking versus synthetic benchmarking. We do feel that synthetic benchmarks have their place if used properly. Actual games or gaming engines should remain the primary focus, though. When it comes right down to it, we would rather see improvements and gains in real-world games than in synthetic benchmarks.

In closing, Kyle has informed me that Hard|OCP will not be using the overall 3DMark03 score to evaluate video cards.

Tom's Hardware
And here's where the real discussion begins. Even NVIDIA is clearly distancing itself from the new release and believes that this benchmark test has no relation to reality. For one thing, the choice of game tests, beginning with the rarely played flight scenes test (Wings of Fury), is nothing close to actual practice. It goes further with the tests GT2 and GT3. The 3D engine that makes up the foundation of the tests is the same, and both tests use pixel shaders based on version v1.4. Here is an excerpt from NVIDIA's statement:

These tests use ps1.4 for all the pixel shaders in the scenes. Fallback versions of the pixel shaders are provided in ps1.1 for hardware that doesn't support ps1.4. Conspicuously absent from these scenes, however, is any ps1.3 pixel shaders. Current DirectX 8.0 (DX8) games, such as Tiger Woods and Unreal Tournament 2003, all use ps1.1 and ps1.3 pixel shaders. Few, if any, are using ps1.4.

These shaders not only ensure bad results for PS 1.1 cards compared to those that support PS 1.4 (they need more passes for the effect), they are also hardly used in actual 3D games. The Xbox can't run PS 1.4 code either. Even more serious is that 3DMark03 test runs use different shader code for different cards. This makes comparisons between different 3D chips close to impossible. In 3DMark 2001 SE, this was only the case with a special PS1.4 test. Now, however, all tests, including GT4 (Mother Nature), are no longer comparable to one another.

Still digging up a third link. Keep in mind, though, that Anand doesn't use 3dmark03 in his video card reviews.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Shamrock
Originally posted by: ElFenix
Originally posted by: Shamrock

I find it funny that Anandtech didn't use 3dMark...why is that? Because he doesn't have the confidence in the program itself?

Also, every one of you guys that owns a Pentium 4 praises it for its "SSE2" OPTIMIZATION extensions while the Athlon gets a close-enough score with RAW horsepower. Before you scoff at Nvidia for cheating, should you not also take a look at Intel for optimizing with software? If you feel differently, then you're a hypocrite. Intel "optimizes" to include the SSE2 extensions, while AMD does not...let's DISABLE the SSE2 instructions and see how fast the P4 goes then. I've seen the SSE2 instructions disabled on Adobe Photoshop 7, and it was slower than the Macintosh! While the Athlon cruised.
i clearly explained what a cheat is and what isn't. you can simply ignore that and spout off inane crap if you like.

your explanation of a cheat is your own opinion, not the gospel. I may have a differing opinion...

you also totally disregarded the fact that Anand didn't use 3dMark...and no one has answered it yet
I still fail to see what relevance Anand using 3Dmark or not has to do with nVidia cheating on a benchmark. Does Anand or whoever else not using or liking the benchmark somehow make it ok that nVidia cheated? No, it doesn't. It has nothing at all to do with it.

SSE optimizations vs. nvidia cheating is a ridiculous comparison. And BTW, Futuremark calls it a cheat and proved it. Their opinion has a little more weight than yours.
 

tbates757

Golden Member
Oct 5, 2002
1,235
0
0
Originally posted by: Pabster
Everybody is cheating!

Seriously, this shouldn't be a big shock. ATi's done it before (and may still be doing it); now it's nVidia's turn.

We shouldn't give companies a pass for unethical behaviour, no matter whether it has been done before or not.
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
0
Tim Sweeney basically says it all. ATI's so-called cheat is a legitimate optimization and Nvidia's so-called cheat is just that...

A Cheat
 

NFS4

No Lifer
Oct 9, 1999
72,647
27
91
Originally posted by: Dean
Tim Sweeney basically says it all. ATI's so-called cheat is a legitimate optimization and Nvidia's so-called cheat is just that...

A Cheat

OWNED??
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Maybe I'm reading it wrong, but Sweeney's piece seems to say that there is way too much grey area in the world of 3D graphics to make black-and-white distinctions between cheating and optimizing.
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: Shamrock
hmm,

I find it funny that Anandtech didn't use 3dMark...why is that? Because he doesn't have the confidence in the program itself?

Also, every one of you guys that owns a Pentium 4 praises it for its "SSE2" OPTIMIZATION extensions while the Athlon gets a close-enough score with RAW horsepower. Before you scoff at Nvidia for cheating, should you not also take a look at Intel for optimizing with software? If you feel differently, then you're a hypocrite. Intel "optimizes" to include the SSE2 extensions, while AMD does not...let's DISABLE the SSE2 instructions and see how fast the P4 goes then. I've seen the SSE2 instructions disabled on Adobe Photoshop 7, and it was slower than the Macintosh! While the Athlon cruised.

and I also have a question...about these new drivers, the 44.03

Does this "optimization" affect ALL cards, or just the GFFX 5900 series? Because I have the new drivers on my GF3 Ti200, and I find NO FLAW in ANY game I play, and that's over 15 games on my hard drive. I have not seen a glitch, nor a bug, nor slowdown, nor crashes from these new drivers...however, I HAVE seen performance gains in ALL my games, with no problems whatsoever. So...

3dMark can stick their little benchmark. I play games, and that's what counts...REAL world benchmarks...I tend to like the benchmarks Anandtech had. Oh yeah, they were also impressive...I will be buying a GFFX 5900

Another thing....why did Anandtech get a bug where ATI cards have that window shining through the head of Sam on Splinter Cell? Is ATI cheating because of that? hmm...maybe they are manipulating their own drivers...because Splinter Cell IS a DX9 game (it says so on my box)


Terrible comparison, involving the P4 and SSE2 enhancements in this....It truly shows you have no reading comprehension or any common sense....

this wasn't an Intel/AMD flame war, yet you try to take it there..............

Also your Adobe comparison is crap as well...That may have been more true back in the days of the early P4 Willamette, but thanks to AMD's crappy PR rating the lead isn't what it once was.....now they are well over 1GHz behind in speed, and I don't think we need to get into CPU architectures because I don't think you are smart enough to follow that as well!!!!
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
0
Originally posted by: SexyK
Maybe I'm reading it wrong, but Sweeney's piece seems to say that there is way too much grey area in the world of 3D graphics to make black-and-white distinctions between cheating and optimizing.


I've been jumping all over the net today checking on the discussions regarding this matter. From what I read, it appears ATI's method of "optimizing" does not affect the quality of the end result. Nvidia's "optimizing" method runs quite a bit deeper and gives lower accuracy and quality in the end result. ATI may be running around in the grey area, but Nvidia seems to be running deep into the brown, if you know what I mean.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Originally posted by: SexyK
Maybe I'm reading it wrong, but Sweeney's piece seems to say that there is way too much grey area in the world of 3D graphics to make black-and-white distinctions between cheating and optimizing.
Only for some of the slighter cheating like (say) using 32-bit RGBA color (8 bits per channel) when the game author expected 16 bits of alpha.

Some of nvidia's cheats are clearly over that line, such as hard clipping the sky in a way that only works for one canned camera angle and that breaks as soon as the camera is unlocked.
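To make that line concrete, here is a toy sketch (hypothetical code, nothing from any actual driver) of why a hard-coded clip plane is a cheat rather than an optimization: it only holds up while the benchmark's scripted camera never looks where the skipped geometry would be, and breaks the moment the camera is freed. The scene objects and the `SKY_CLIP_Y` threshold are invented for illustration.

```python
# Hypothetical illustration of the "canned camera" cheat described above.
# The fake driver skips everything above a fixed height, which is safe
# only because the benchmark's pre-recorded camera path never looks up.

SKY_CLIP_Y = 50.0  # tuned to the benchmark's scripted camera path

def render(objects, camera_looks_up):
    """Return (objects drawn under the hard-coded clip, artifacts seen)."""
    drawn = [o for o in objects if o["y"] < SKY_CLIP_Y]
    # With the canned camera the clipped sky is never on screen, so the
    # shortcut is invisible; a free camera exposes the missing geometry.
    visible_artifacts = camera_looks_up and len(drawn) < len(objects)
    return drawn, visible_artifacts

scene = [
    {"name": "terrain", "y": 0.0},
    {"name": "troll",   "y": 12.0},
    {"name": "sky",     "y": 80.0},
]

# Scripted timedemo: the shortcut goes unnoticed and the score goes up.
_, broken = render(scene, camera_looks_up=False)
assert not broken

# Camera unlocked: the sky is simply gone.
_, broken = render(scene, camera_looks_up=True)
assert broken
```

The same filtering that looks like a speedup under the scripted camera becomes a visible rendering error as soon as the assumption behind it stops holding, which is exactly what Futuremark's audit demonstrated by moving the camera off the canned rails.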

An optimization would make a game or demo run faster in general: if the camera is changed, and if it isn't a specific pre-recorded timedemo. The optimization is valid (not cheating) if it provides identical image quality, or identical within the bounds of the OpenGL and DX9 API specs. ATI's Quack cheats were not valid because they did not provide equivalent image quality.

SSE2 optimizations are valid (not cheating) because they produce identical results to the non-SSE2 code path, just more quickly. For example, the non-SSE code path might have a loop that does 100,000 multiplications one at a time, while the SSE2 code path uses SSE2 instructions to do the same 100,000 multiplications as 25,000 batches of four. This is somewhat similar to how AMD justifies their PR rating ("more work per clock cycle").
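The scalar-versus-batched distinction above can be sketched in a few lines of pure Python (an analogy for SSE2's packed instructions, not real SIMD code): the batched path counts as a valid optimization precisely because it returns the same answers as the one-at-a-time loop.

```python
def multiply_scalar(a, b):
    # One multiplication at a time, like a plain non-SSE loop.
    return [x * y for x, y in zip(a, b)]

def multiply_batched(a, b, width=4):
    # Process four elements per step, mimicking one SSE2 packed
    # instruction handling a 4-wide batch: 25,000 steps for 100,000
    # multiplications instead of 100,000 separate operations.
    out = []
    for i in range(0, len(a), width):
        out.extend(x * y for x, y in zip(a[i:i + width], b[i:i + width]))
    return out

a = [1.5, 2.0, 3.25, 4.0] * 25_000  # 100,000 elements
b = [2.0, 0.5, 4.0, 1.25] * 25_000

# Identical results, just computed in batches: that is what makes this an
# optimization rather than a cheat.
assert multiply_scalar(a, b) == multiply_batched(a, b)
```

If the batched path returned even slightly different values, it would fall on the other side of the line the post draws, the same way a driver shortcut that changes image quality does.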
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
An optimization would make a game or demo run faster in general: if the camera is changed, and if it isn't a specific pre-recorded timedemo. The optimization is valid (not cheating) if it provides identical image quality, or identical within the bounds of the OpenGL and DX9 API specs. ATI's Quack cheats were not valid because they did not provide equivalent image quality.

Bingo.

ATI's "optimization" of 3DMark is valid because, had there been a game based on the Game 4 engine, it would run faster with identical IQ to what the final score lets on and what the developers intended.
nVidia's "optimization" of 3DMark is INvalid because, had there been games based on these engines, they would all run slower and with poorer image quality than what the card's score lets on and what the developer intended.

I don't see why this is so hard to grasp.

nVidia cheated with this driver. Period.
ATI cheated with the "Quake"-detecting driver. Period.

ATI partially redeemed itself by eventually providing drivers that surpassed the cheating drivers' performance without cheating. Will nVidia do the same? I hope they do.



 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
nVidia cheated with this driver. Period.
ATI cheated with the "Quake"-detecting driver. Period.
I used to hold ATi's cheating firmly against them, my last card of theirs being a VGA Wonder XL (yes, they cheated long before the Quake fiasco).

Now nVidia has "leveled the playing field" for me, as I must now consider both companies on lower rungs ethically. Now I can't boycott both, since I'd have no video output! Besides, cheating on a single benchmark, in my view, is not reason enough to ban either company for life.

So bottom line: I'll now consider buying ATi video boards again.
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
ATI may be running around in the grey area, but Nvidia seems to be running deep into the brown, if you know what I mean.

Hah.. that cracked me up. Well done. Anyway, I agree with what you are saying. I'm not trying to say that nVidia didn't do something they shouldn't have. All I'm saying is I think it's prudent to wait a bit and give them a chance to explain before jumping on them like this. They have a pretty good track record in my book, and I'm not going to throw that all away because one (somewhat dubious) company has made an accusation against them. I guess you guys are willing to do that, but I'm not. I've got machines with ATi cards, I've got machines with nVidia cards, I don't really feel any passionate ties to either of them, but I do know that nVidia has been a pioneer in the industry, and has always offered great products and great support. I'm not going to write them off without giving them a chance to clear their name. You can say whatever you want, but in my opinion, they deserve a chance, and I'll give it to them whether you want me to or not.
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
I think it's prudent to wait a bit and give them a chance to explain before jumping on them like this

Precisely. I waited for their official response and here is the relevant quote:

We don't know what they did but it looks like they have intentionally tried to create a scenario that makes our products look bad.

Yes, the official response is that big bad Futuremark made a boo-boo on poor baby nVidia. I think nVidia and its fanboys ought to be cringing in embarrassment after this quote. nVidia obviously knows what Futuremark did, because they cheated their way around it.

This thread is as worthless as mammaries on a wild male pig.

They're only useless if you don't know what to do with 'em
 