RussianSensation
Elite Member
- Sep 5, 2003
DX12 is not going to fix AMD's performance of this game at 4k and 8X AA.
Since when do we need 8xMSAA at 4K? I remember that once you got your 1440p monitor, the first thing you said was that you could play games with 0-2xMSAA. Now we need 4-8xMSAA at 4K? Talk about shifting goalposts. Also, what are the chances that a gamer with a 4K monitor is using an RX 480/GTX 1060-level card? In that case, it's automatically an NV-built system with a 980 Ti/SLI, GTX 1070/SLI, GTX 1080/SLI, or Titan XP/SLI. As far as 1080p and 1440p performance is concerned, anything at the RX 480/GTX 1060 level is more than capable for this style of game.
This is simply an Nvidia favored game right now.
Yes, but with 2 major caveats:
1. In the real world, where most PC gamers have CPUs well below an i5-6600 or i7-4770K, the game's performance will dip well below 50 fps even on a GTX 1080, even at 1080p. In other words, the extra GPU performance, whether on the AMD or NV side, will not always show up because of the CPU bottleneck. How many gamers are still using an i3/i5 from the Sandy Bridge/Ivy Bridge/Haswell era? I bet a lot.
2. Even with sub-60 fps performance, we are talking about a turn-based board game. Once again, the usual suspects are making a BIG deal out of one of the few AAA games of 2016 that absolutely does not require 60 fps to be enjoyable, unless you are the most hardcore Civ player out there. In that case, you had better be ready to upgrade to an i7-7700K and hope it hits 5-5.1GHz, because that sounds like what the game will need to maintain a smooth 60 fps even on a GTX 1080.
I'm sure drivers and game optimizations can help AMD out, but DX12 performance at 1440p and 4K is going to remain largely in line with DX11 performance even a year from now, unless Firaxis does some deep AotS-type optimizations, in which case both camps would benefit.
How do you know this? You aren't a programmer, so I am stunned that you are making such claims with such confidence. The fact of the matter is that unless a PC gamer has an i5-6600K/i7-6700K at 4.7-4.8GHz, there should be almost no discussion about "AMD or NV winning" in this title, because real-world benchmarks under CPU-limited scenarios show sub-50 fps performance even on a GTX 1070 paired with a still-high-end i7-4770K or i5-6600. To say that NV is winning in this title is a Pyrrhic victory, because the performance, given the CPU bottleneck and the level of graphics, leaves a sour taste in the mouth. Thankfully, this is a title that runs well even at a locked 30 fps. The game's engine is completely broken for a strategy game, as can be seen from this:
That's some horrible, horrible CPU optimization for a 2016 strategy title that millions will purchase. Not only is the game CPU-bottlenecked across threads because there is no efficient multi-threading in place, but even the single main CPU core isn't efficiently utilized: the i7-5960X is only loaded to 70% on that one core.
Here is Total War: Warhammer, DX11 vs. DX12 [benchmark charts]:
Night-and-day difference. DX11 is simply outdated for modern strategy games, and AotS and Total War: Warhammer are two obvious examples proving it. DX11 is literally holding back modern GPUs by gimping modern AMD/Intel CPUs. As a result, we end up with CPU and GPU under-utilization, which is insane for a 2016 title that was years in the making and is targeting tens of millions of PCs bought in the last 5-6 years. The ONLY things that save Firaxis' face are the hope that DX12 will fix these issues, and the fact that in a turn-based strategy game almost no one cares about the FPS as long as it's not choppy. Otherwise, this is some shoddy level of optimization.
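To put the draw-call argument in rough numbers, here is a toy model of the CPU-side submission cost. To be clear, this is not real D3D code, and the call counts and per-call costs are invented for illustration, not measured from this game:

```python
# Toy model of the CPU-side cost of submitting draw calls per frame.
# All numbers below are assumptions for illustration only.
DRAW_CALLS = 15000      # draw calls per frame (assumed)
COST_DX11_MS = 0.0015   # ms per call on the single DX11 render thread (assumed)
COST_DX12_MS = 0.0008   # ms per call with DX12's leaner validation (assumed)
CORES = 4               # threads recording DX12 command lists in parallel

# DX11: every draw call is validated and submitted on one render thread,
# so the cost stacks up serially regardless of how many cores you have.
dx11_cpu_ms = DRAW_CALLS * COST_DX11_MS

# DX12: command-list recording is spread across cores; the final serial
# submission step is ignored here for simplicity.
dx12_cpu_ms = DRAW_CALLS * COST_DX12_MS / CORES

print(f"DX11 CPU frame cost: {dx11_cpu_ms:.1f} ms (~{1000 / dx11_cpu_ms:.0f} fps cap)")
print(f"DX12 CPU frame cost: {dx12_cpu_ms:.1f} ms (~{1000 / dx12_cpu_ms:.0f} fps cap)")
```

With these made-up numbers, the single-threaded path alone caps the frame rate in the mid-40s no matter how fast the GPU is, which is exactly the kind of CPU wall the benchmarks above show.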
It's still too early to judge this game until we have more clarity on the DX12 patch. Mantle in Civ BE, and DX12 in AotS and Total War: Warhammer, made those games' DX11 performance irrelevant for GPUs that could take advantage of the newer API.
The amount of anti-DX12 posting in the last 1-2 years is mind-boggling. I would much rather developers abandon DX9-DX11 entirely, and as a result obsolete my Hawaii and Pascal rigs, than continue to see 2016-2017 AAA games gimped by DX11. This game clearly shows that DX11 neuters modern i7s and wastes GTX 1070/1080 SLI and Titan XP levels of GPU performance, due to DX11's draw-call bottlenecks and its inability to support future async-capable GPU architectures. I get why the developer had to target DX11, since so many PC gamers still use older GPUs and CPUs, but come on, it's 2016; at least provide a proper DX12/Vulkan code path for those of us with modern systems that can take advantage of it, especially after seeing how Mantle benefited Civ BE.
Not only that, but at some point I really want to be able to take advantage of a 6-10 core CPU, and as long as the serial, legacy-by-design DX11 remains in place, we simply cannot move toward efficient CPU multi-core utilization. I cannot wait for AMD/NV to go all-in on async compute and DX12 with their 2018 architectures, so that DX9-11 GPUs get obsoleted as fast as possible and developers move away from those ancient game engines. Raja is right: even though GPU hardware continues to get faster every year under Moore's Law, the software inefficiency is simply out of this world, and it has got to change. I am hoping that by the time the PS5/XB2 launch in 2020, DX9-11 is buried completely, since by then we should be able to buy an 8-core CPU for $350, and as GPUs become even more complex there will be even more under-utilized CUDA cores/Stream Processors that could be used for parallel compute tasks.
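The multi-core point can be made with a back-of-the-envelope Amdahl's-law sketch. The serial fractions below are assumptions for illustration (the 70% figure is loosely inspired by the single-core load discussed earlier, not a measurement):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Ideal speedup when (1 - serial_fraction) of the work parallelizes."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Assumed serial fractions: ~70% for a DX11-style engine with one render
# thread, vs ~20% for a DX12-style engine that records work on many threads.
for cores in (4, 8, 16):
    print(f"{cores:2d} cores: DX11-like {amdahl_speedup(0.70, cores):.2f}x, "
          f"DX12-like {amdahl_speedup(0.20, cores):.2f}x")
```

Under these assumptions, a 70%-serial engine tops out around 1.4x no matter how many cores you throw at it, which is why a 6-10 core CPU buys you almost nothing until the serial API goes away.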