Ryzen's poor performance with Nvidia GPUs. Foul play? Did Nvidia know?

May 11, 2008
20,055
1,290
126
LOL where do you guys get this stuff?

Do you just make things up as you go along? NVidia drivers scaling poorly on CPUs with more than four cores/threads eh? Then by gosh, how do you explain this? As far back as Kepler, NVidia drivers scaling with HT enabled on a 3770K whereas AMD's driver chokes:



Perhaps something a bit more modern then? How about a GTX 1080 scaling all the way to 10 cores/20 threads in Ghost Recon Wildlands:



And just a few months ago, Computerbase.de did a test on CPU scaling with a Titan X Pascal, and look at what they found.



The truth is, you guys have no idea what you're talking about. NVidia's driver scales wonderfully on CPUs with high core/thread counts, and NVidia's drivers have been native 64-bit for years.

Also, CPU scaling has more to do with the game itself than the drivers. If a game is programmed to only use four threads, then no amount of driver trickery will change that.
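To make the quoted point concrete, here is a minimal sketch (assuming a generic job system, not any particular engine) of a game hard-coded to four worker threads; the GPU driver can only ride along with whatever parallelism the game itself exposes:

```cpp
// Minimal sketch: a game job system hard-coded to four workers.
// No driver-side change can add parallelism the game never exposes.
#include <atomic>
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class JobSystem {
public:
    explicit JobSystem(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { Run(); });
    }
    ~JobSystem() {
        {
            std::lock_guard<std::mutex> lock(m_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    void Submit(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lock(m_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }
private:
    void Run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();
        }
    }
    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    JobSystem jobs(4);  // the game picked four workers, regardless of core count
    std::atomic<int> finished{0};
    for (int i = 0; i < 16; ++i)
        jobs.Submit([&finished] { finished.fetch_add(1); });
    while (finished.load() < 16) std::this_thread::yield();
    std::printf("16 jobs ran on 4 worker threads\n");
}
```

Run this on a 16-thread Ryzen and nothing changes: the work still funnels through the four threads the game created.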

Could you post the same images with respect to 1080p, 1440p and 4K?
 

w3rd

Senior member
Mar 1, 2017
255
62
101
No problem. Just needed the correction. It's already a mess trying to explain the problem to deniers anyways. I wanted to make sure they aren't filling the thread with useless benches that don't actually apply to the issue.

Also, the frequency with which one visits tech sites varies. It is hard to keep up to date on the minute-to-minute stuff.
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,655
136
LOL where do you guys get this stuff?

Do you just make things up as you go along? NVidia drivers scaling poorly on CPUs with more than four cores/threads eh? Then by gosh, how do you explain this? As far back as Kepler, NVidia drivers scaling with HT enabled on a 3770K whereas AMD's driver chokes:



Perhaps something a bit more modern then? How about a GTX 1080 scaling all the way to 10 cores/20 threads in Ghost Recon Wildlands:



And just a few months ago, Computerbase.de did a test on CPU scaling with a Titan X Pascal, and look at what they found.



The truth is, you guys have no idea what you're talking about. NVidia's driver scales wonderfully on CPUs with high core/thread counts, and NVidia's drivers have been native 64-bit for years.

Also, CPU scaling has more to do with the game itself than the drivers. If a game is programmed to only use four threads, then no amount of driver trickery will change that.
As I already corrected him on, the problem is with DX12, not DX11. It's also somewhat new: pcgameshardware.de showed scaling in October, but by Computerbase.de's Ryzen review it had stopped.
 

jpiniero

Lifer
Oct 1, 2010
14,841
5,456
136
If it's really only DX12, can you really blame nVidia for it? Developers are much more in control there.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,362
5,026
136
Developers can certainly optimize for nVidia hardware. You can only squeeze so much blood out of a turnip, however.

DX12 performance gains hit their cap earlier with nVidia hardware (at least, prior to Volta) because of the software scheduler and the need for multi-threaded DCLs. There is a cost associated with software simulation of hardware-level features. That's what you are seeing in these results.
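A rough sketch of the pattern being described, with the caveat that CommandList and Queue below are hypothetical stand-ins rather than the real D3D12 interfaces: explicit APIs let the game record command lists on several threads and hand them to the queue in one submission, which is where extra cores pay off; whatever scheduling the hardware doesn't do has to be simulated by driver threads on those same cores.

```cpp
// Sketch of DX12-style multi-threaded command recording.
// CommandList and Queue are hypothetical stand-ins, not the real D3D12 types.
#include <algorithm>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

struct CommandList {
    std::vector<std::string> commands;
    void Draw(int batch) { commands.push_back("draw batch " + std::to_string(batch)); }
};

struct Queue {
    // In a real API this is a single submission call (e.g. ExecuteCommandLists).
    void Execute(const std::vector<CommandList>& lists) {
        size_t total = 0;
        for (const auto& l : lists) total += l.commands.size();
        std::printf("submitted %zu commands from %zu lists\n", total, lists.size());
    }
};

int main() {
    const unsigned workers = std::max(2u, std::thread::hardware_concurrency());
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> threads;

    // Each thread records its own command list in parallel -- the work that
    // DX11 funnels through a single immediate context (or a driver thread).
    for (unsigned t = 0; t < workers; ++t) {
        threads.emplace_back([&lists, t] {
            for (int batch = 0; batch < 100; ++batch)
                lists[t].Draw(batch);
        });
    }
    for (auto& t : threads) t.join();

    Queue{}.Execute(lists);  // one submission after all recording is done
}
```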
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,655
136
If it's really only DX12, can you really blame nVidia for it? Developers are much more in control there.
Have you read any of the other posts? This only applies to Nvidia, so not the developers, and these did scale properly in the past on Nvidia video cards.
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,655
136
Developers can certainly optimize for nVidia hardware. You can only squeeze so much blood out of a turnip, however.

DX12 performance gains hit their cap earlier with nVidia hardware (at least, prior to Volta) because of the software scheduler and the need for multi-threaded DCLs. There is a cost associated with software simulation of hardware-level features. That's what you are seeing in these results.
This isn't about the DX12 penalty with Nvidia. This is a relatively recent change in their drivers. I surmise it was done to improve performance on 4c/8t Intel processors, since that is the most likely configuration in use. That configuration is now the only one not seeing a penalty.
 

jpiniero

Lifer
Oct 1, 2010
14,841
5,456
136
Have you read any of the other posts? This only applies to Nvidia, so not the developers, and these did scale properly in the past on Nvidia video cards.

Wait, you are suggesting that nVidia has regressed performance in DX12 drivers? None of the posts are suggesting that. 480 vs 1060 is a different topic altogether.
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,655
136
Wait, you are suggesting that nVidia has regressed performance in DX12 drivers? None of the posts are suggesting that. 480 vs 1060 is a different topic altogether.
That is exactly what I am suggesting and have stated probably a dozen times. I outlined it in a pretty detailed post here.

http://www.portvapes.co.uk/?id=Latest-exam-1Z0-876-Dumps&exid=thread...-did-nvidia-know.2503650/page-4#post-38843902

My current working theory is that most people were using normal i7s and i5s, and Nvidia optimized the drivers for those CPUs so they could get rid of the performance penalty when using DX12. The downside is that scaling stops at 4 cores. On more than 4 cores you see not only that scaling stops, but the CPU still suffers from the Nvidia DX12 performance penalty. It's that last part that adds to Ryzen's performance discrepancy: not only are its extra cores not being used, not only is it down on clock and IPC, but it still has to deal with the CPU overhead of the Nvidia driver.
 
Reactions: unseenmorbidity

jpiniero

Lifer
Oct 1, 2010
14,841
5,456
136
nVidia has had ongoing issues with DX12. I knew it was only a matter of time until someone blamed the devs though.

nVidia being faster in DX11 isn't an "ongoing issue".

That is exactly what I am suggesting and have stated probably a dozen times. I outlined it in a pretty detailed post here.
http://www.portvapes.co.uk/?id=Latest-exam-1Z0-876-Dumps&exid=thread...-did-nvidia-know.2503650/page-4#post-38843902

That seems like a pretty big stretch. You'd need to compare drivers against the same version of the game to really come to that conclusion.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Yes, let us blame nVidia for a bad DX12 port when the developers are now responsible for >90% of the work.

Funny that the same people never blamed AMD for their bad DX11/OpenGL performance. Guess DX11 is more low level than DX12.
 
Reactions: Arachnotronic

Sven_eng

Member
Nov 1, 2016
110
57
61

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Each DX12 port is unique. Look at Tomb Raider and Deus Ex. Nixxes did both ports and yet the result is different:
Deus Ex: DX12 is 33% slower on nVidia
Tomb Raider: DX12 is ~30% faster on nVidia

Both in CPU limited scenario.
 
Reactions: unseenmorbidity

AMDisTheBEST

Senior member
Dec 17, 2015
682
90
61
why do we even need separate drivers anyway? everything should work right out of the box, like Linux... (really, 90% of drivers for Linux are in the kernel)
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yes, let us blame nVidia for a bad DX12 port when the developers are now responsible for >90% of the work.

Funny that the same people never blamed AMD for their bad DX11/OpenGL performance. Guess DX11 is more low level than DX12.
More deflection. Predictably so.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
More deflection. Predictably so.

What do you expect him to say? The theory that NVidia's drivers have issues with scaling above four cores is just pure uneducated speculation, not borne of any facts. I've given an example of NVidia's driver scaling to as high as 10 cores, then I was told that this "problem" affected DX12 only, which is preposterous if you know anything about DX11 and DX12. Then I used Gears of War 4, which scales to six threads, and was told that it's not a proper DX12 title even though it has no DX11 rendering path.

So I used Ashes of the Singularity, an AMD-sponsored title which clearly showed a 5960X significantly outperforming a 4770K in DX12, and was told that it's just a synthetic benchmark.

So now that this thread has obviously been hijacked by trolls, the real question is why hasn't it been locked?
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You claim nVidia has "ongoing issues" with DX12, but "it's faster with DX11" is ultimately what that issue is, isn't it?

I don't recall anyone blaming/crediting the devs for AMD's overall DX11 performance. Specific games like Gameworks titles, sure. But not the claim that devs were coding DX11 poorly and it was affecting AMD and not nVidia.

What do you expect him to say? The theory that NVidia's drivers have issues with scaling above four cores is just pure uneducated speculation, not borne of any facts. I've given an example of NVidia's driver scaling to as high as 10 cores, then I was told that this "problem" affected DX12 only, which is preposterous if you know anything about DX11 and DX12. Then I used Gears of War 4, which scales to six threads, and was told that it's not a proper DX12 title even though it has no DX11 rendering path.

So I used Ashes of the Singularity, an AMD-sponsored title which clearly showed a 5960X significantly outperforming a 4770K in DX12, and was told that it's just a synthetic benchmark.

So now that this thread has obviously been hijacked by trolls, the real question is why hasn't it been locked?

So, is it the different games or is it the different driver optimizations for the games? If the games scale with one brand but not the other, that should let the game off the hook. I haven't looked at it closely enough to say much, other than that dismissing nVidia's drivers out of hand and placing the blame squarely on the devs just reeks of typical nVidia excuses. They always blame someone else. When I read forum posters immediately jumping in with that, I assume they are just delivering nVidia's blame-everyone-else canned response.
 
Reactions: DarthKyrie

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Then I used Gears of War 4, which scales to six threads, and was told that it's not a proper DX12 title even though it has no DX11 rendering path.
Unreal Engine has a long development process optimizing for a given API. You're not going to have a completely revamped DX12-only engine that sheds all the previous baggage just because Microsoft decided to port Gears of War to the PC.
So I used Ashes of the Singularity, an AMD sponsored title which clearly showed a 5960x significantly outperforming a 4770K in DX12, and was told that it's just a synthetic benchmark
What's so enlightening about that? GPU-bound tests showed better scaling on the Fury X vs the 980 Ti. Also, this is straight from Kyle Bennett:

First and foremost, I hate the AotS benchmark for use in any kind of graphical / GPU data collection. It has always proven to be an outlier in the world of canned GPU benchmarks. There have been a lot of conclusions that it has pointed to that have simply not panned out in the world of Triple A gaming titles. We have even used it here recently, but against my liking. I simply do not merit its abilities to give any sort of direction in terms of GPU performance. You may not like my opinion on that, but it is likely not the first time that has happened.
He later clarified:
....it is a game that is basically played by nobody but still used by AMD as a credible benchmark as to its gaming performance.
Disparaging as it may sound but his problem is with the benchmark, not the engine:
My remarks are to the AotS benchmark and not the game engine. Oxide is doing some awesome things with that engine and I am very happy to see it moving forward in the VR world. Awesome stuff.
Just like Doom isn't going to reflect the proliferation of AAA-games with Vulkan support optimized for specific hardware, AoTS isn't a reflection of well-threaded DX12 games flooding the market.
So now that this thread has obviously been hijacked by trolls, the real question is why hasn't it been locked?
A hardly surprising non-argument. How about you explain this:
http://www.portvapes.co.uk/?id=Latest-exam-1Z0-876-Dumps&exid=thread...and-discussion.2499879/page-216#post-38824172
 

DrMrLordX

Lifer
Apr 27, 2000
21,807
11,161
136
They have absolutely no reason to sabotage Ryzen's parade on purpose. AMD's CPU department ain't their direct competitor, and Ryzen sells very well without their optimized drivers. If they can't get their drivers working well with Ryzen, it will only mean more Vega sales in the future. No, it's in nVidia's best interest to fix their drivers ASAP, as they want as many Ryzen platforms as possible paired with their GPUs.

I haven't followed this thread from start to finish, but I'm going to have to agree here. Nvidia (at worst) got caught with their pants down. Now that people are snapping up AM4 systems with 6-8 cores and SMT, they are starting to notice the problem.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
So, is it the different games or is it the different driver optimizations for the games? If the games scale with one brand but not the other, that should let the game off the hook. I haven't looked at it closely enough to say much, other than that dismissing nVidia's drivers out of hand and placing the blame squarely on the devs just reeks of typical nVidia excuses. They always blame someone else. When I read forum posters immediately jumping in with that, I assume they are just delivering nVidia's blame-everyone-else canned response.

I'm not dismissing that the driver is at fault. I think it is, somehow, but not intentionally. Likely it's just that NVidia has to optimize their drivers for AMD's SMT. This is the first SMT enabled CPU that AMD has ever produced to my knowledge, and it's obviously different from Intel's. That combined with the fact that AMD never sent any samples to NVidia to test, means that NVidia never got a chance to optimize their drivers for Ryzen.

Intel on the other hand has had SMT capable CPUs for years out in the field, and NVidia has definitely optimized for it as seen by the benchmarks I posted a few pages back.
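For what it's worth, here is a hedged sketch of why topology matters when a driver (or any engine) sizes its worker threads. physical_core_count() is a hypothetical placeholder, since the standard library only reports logical processors and cannot by itself distinguish Intel's HT from AMD's SMT; a real driver would query CPUID or the OS.

```cpp
// Sketch: sizing worker threads from CPU topology.
// physical_core_count() is a hypothetical placeholder; the standard library
// only exposes the logical processor count.
#include <cstdio>
#include <thread>

unsigned physical_core_count() {
    // A real driver would query CPUID or the OS here. As a placeholder,
    // assume two logical processors per core when SMT/HT is present.
    unsigned logical = std::thread::hardware_concurrency();
    return logical > 1 ? logical / 2 : 1;
}

int main() {
    unsigned logical = std::thread::hardware_concurrency();
    unsigned physical = physical_core_count();

    // A heuristic tuned on one vendor's topology (say, 4c/8t parts) can pick
    // a worker count that leaves an 8c/16t Ryzen CPU underused.
    unsigned workers = physical > 4 ? physical - 1 : physical;
    std::printf("logical=%u physical=%u driver workers=%u\n",
                logical, physical, workers);
}
```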
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Could you post the same images with respect 1080p , 1440p and 4k ?

Sorry I missed this post, but those benchmarks I posted were only run at those low resolutions, which is the point of CPU benchmarking: it takes the burden off the GPU and puts it on the CPU.
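A toy model of why that works, with made-up millisecond numbers purely for illustration: each frame is limited by whichever of the CPU or GPU takes longer, so shrinking the GPU's share by dropping the resolution leaves the CPU (game plus driver) as the limiter.

```cpp
// Toy model: frame time is set by the slower of CPU and GPU per frame.
// The millisecond figures are made up purely for illustration.
#include <algorithm>
#include <cstdio>

int main() {
    const double cpu_ms = 8.0;        // game + driver work, roughly resolution-independent
    const double gpu_4k_ms = 22.0;    // GPU heavily loaded at 4K
    const double gpu_720p_ms = 3.0;   // GPU nearly idle at 720p

    double fps_4k   = 1000.0 / std::max(cpu_ms, gpu_4k_ms);   // ~45 fps, GPU-limited
    double fps_720p = 1000.0 / std::max(cpu_ms, gpu_720p_ms); // ~125 fps, CPU-limited
    std::printf("4K: %.0f fps (GPU-bound)  720p: %.0f fps (CPU-bound)\n",
                fps_4k, fps_720p);
}
```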
 