No problem. Just needed the correction. It's already a mess trying to explain the problem to deniers anyway. I wanted to make sure they aren't filling the thread with useless benches that don't actually apply to the issue. Thanks.
Actually, it has to do with both the game and the drivers. While drivers can't magically make a game scale beyond what it's programmed to do, a game can't fix poor drivers either.
He was wrong; the problem is specific to DX12 on Nvidia video cards. I posted an example earlier today of previous scaling, and of the fact that it stopped somewhat recently, in probably one of the best examples of well-threaded gameplay.
Citing one game unfortunately does not prove your case. Anyone that knows anything about DX12 will tell you that the game's programming and optimization matter more than ever when it comes to how it will perform on any given hardware. BF1 obviously doesn't have a good DX12 implementation, because it loses significant performance with DX12 on NVidia hardware, and can even lose performance on AMD hardware depending on the configuration.
At any rate, check out Gears of War 4. It scales to six threads on NVidia hardware, and is DX12-only, unlike BF1.
I know it doesn't fit your agenda, but you're wrong. Driver quality can bring a game's performance to its knees, to the point of being unplayable purely due to the drivers. Now you go ahead and repeat yourself again like it'll change anything.

I think you need to re-read what I wrote. I never said that drivers don't matter; I said that CPU scaling has more to do with the game's programming than with the drivers. Both matter, yes, but the game programming matters more. Otherwise why would we have a game like Ghost Recon scale on a deca-core CPU, while something like Far Cry 4 only uses a quad core effectively?
Let me summarise:
Topweasel: nvidia used to scale fine with more cores, but with current drivers they don't scale above four cores anymore.
Carfax83: you are wrong, look at these old benchmarks, nvidia drivers do scale with more cores.
No one is disputing that Nvidia does poorly in DX12. It's the cost of the arch. But that isn't what is being pointed out to you.
I'll watch those videos once you in particular can articulate why games that scale fine in DX11 on Nvidia, and scale fine with AMD in DX12, now don't scale with Nvidia in DX12. Also please explain why that scaling stopped happening recently. Nothing you have stated actually describes the issue, and I'd rather not waste my time on an "it's not us, it's them" video if it isn't actually going to cover that issue.

And I said it multiple times: you have to understand what DX12 is. nVidia's DX11 driver "scales" over multiple threads. With DX12 it is the job of the application to mimic nearly everything the DX11 driver used to do. Watch the video I linked. nVidia is explaining it.
I would not use Gears of War 4 to argue that DX12 and NVIDIA do not have issues. Only the AotS engine at the moment is built from the ground up with DX12 in mind.
Let's not forget about the UWP version of Quantum Break, which came with DX12, and which Remedy themselves said was difficult to implement in DX12. But the results are very interesting nonetheless: an R9 390 is almost 50% faster than the GTX 970 in DX12, and even with improved DX12 support the GTX 1060 still suffers a 15% drop compared to DX11, while the RX 480 performs identically in both APIs.
I noted four games, four games that scaled with more than four cores in DX11 and stopped scaling in DX12. I used BF1 as an example because I also showed evidence that it used to scale. I'll check out the Gears test later.
LOL, my agenda? This is like the pot calling the kettle black. I don't even know why I'm talking to you to be honest, because you're intentionally misrepresenting what I'm saying, and your reasons are suspect given your ardent nature when it comes to AMD. I never claimed at any point that driver quality doesn't matter as it relates to CPU scaling. I even provided evidence that it does matter on the previous page with Watch Dogs.
I'm only saying that it matters less than the actual game programming when it comes to core/thread usage. But, typical of you, you have now shifted the goalposts by bringing up overall driver quality, which had nothing to do with what we were discussing originally.
Like I said, typical.
So a DX12-only game shouldn't be used, but an AMD-sponsored DX11/DX12 hybrid game should, because "reasons." And don't even bring up Quantum Break. That game is pathetic in terms of optimization.
But I'll humor you this time, just because I'm interested to see what excuses you will conjure up next. Here is DX12 scaling on Ashes of the Singularity featuring a quad-core and an octa-core CPU. As you can see, the octa-core CPU is significantly faster than the quad-core CPU in this game, which implies that the game and the driver are able to utilize more than four threads in both DX11 and DX12.
Do you agree or disagree?
I caught what you said the first two times you posted it. But carry on defending nVidia and blaming the devs.
In one DX12 game, NVIDIA's drivers don't cause any issue, and that's because this "AMD sponsored" game has had optimizations done for both AMD and NVIDIA. In most of the other games that support DX12, NVIDIA seems to have issues.
Quantum Break DX12 ran flawlessly on AMD cards, and as I said before, the R9 390 is almost 50% faster than the GTX 970. Remedy talked at GDC about how difficult it was to implement the advanced rendering features of the game in DX12; that's why it is relevant, because we know that its first UWP iteration was built with DX12 in mind.
Gears of War 4 doesn't count because the tight integration of new APIs like DX12 and Vulkan isn't possible in engines like UE4; developers who make their engines from scratch have a better chance of implementing DX12 properly.
Oh, and Kyle over at [H], where you pulled that graph from, considers AotS more of a synthetic benchmark than an actual game benchmark, and I agree with him on this. So yeah, AotS results don't say much after all about the efficiency of NVIDIA's drivers.
Speaking of AotS, Kyle did some testing with the heavy load preset as well, which stresses the GPU. This is what you would want under normal gaming scenarios. The results are quite revealing:
I don't know if I would say most. NVidia's DX12 performance has definitely been improving lately, and in DX12-only games NVidia seems to have an edge over AMD.
This is you talking out of your ass for sure. Unless you're a graphics programmer who works for Epic, you have no idea what is and isn't possible in UE4. That said, no current DX12 game is pure DX12 from the ground up, as all the engines are still hybridized to some degree. This will change eventually, however.
Compared to the RX 480, the GTX 1060 fares worse in DX12 and the GTX 970 fares much worse.

Quantum Break DX12 on AMD runs slower than Quantum Break DX11 on NVidia. That's all I need to say.
The "excuse" I was waiting for. Glad to see you didn't disappoint.
So you went from pedestaling Ashes of the Singularity to criticizing it once you found out it didn't support your argument about NVidia driver scaling in DX12.
Anyway, the only revealing thing about those benchmarks is how terrible AMD's DX11 driver is in comparison to NVidia's. If AMD's DX11 driver is still CPU bottlenecked to that degree even at 1440p and crazy settings, then that's very revealing indeed.