They are not even 11.2 today as they claim. And yes, they will partly support DX11.3, just like any DX11.0 card using fallback. However, they won't fully support DX11.3, nor will they support DX12 in its full mode, only in the API reduction mode.
There is absolutely NOTHING in what you link that points to anything different. So let's not make up stuff that isn't there.
I've done my research as always but as usual you haven't.
"Intel has graciously agreed to make the source for their compelling Asteroids demo available to all developers in the DX12 Early Access program. Oh, and the screenshot above? That was a screenshot from an Intel Haswell graphics DX12 machine running UE4.4’s Landscape Mountains demo."
http://blogs.msdn.com/b/directx/archive/2014/10/01/directx-12-and-windows-10.aspx
According to you, Haswell doesn't even support DX12. Oops.
I am inclined to believe AMD, NV, Intel and MS over your unsubstantiated opinion. All four have committed to having their DX11 cards support DX11.3 and most of the DX12 feature set, going back to Fermi, Kepler, GCN 1.0/1.1/1.2 and Haswell GPUs. Even if those cards don't support the "full DX12" feature set, you haven't actually told us why this matters in terms of IQ or performance for those of us running overclocked i5/i7s that would barely benefit from DX12's Mantle-like CPU overhead reduction.
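To spell out what "supporting DX12" means for those existing GPUs: the DX12 API itself targets existing feature level 11_0 hardware, and the extra "full DX12" hardware features are optional caps layered on top. Here is a minimal sketch of that distinction, assuming the final DX12 headers look like the Early Access ones (purely illustrative, not something from MS's blog):

```cpp
// Minimal sketch: creating a D3D12 device on existing 11_0-class hardware
// (Fermi/Kepler/GCN/Haswell). Succeeding here is what "supports DX12" means
// for these GPUs; the extra "full DX12" hardware features are optional caps
// queried separately, not a prerequisite for using the API.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

bool SupportsD3D12Api()
{
    ComPtr<ID3D12Device> device;
    // nullptr = default adapter; 11_0 is the lowest feature level DX12 accepts.
    return SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                       IID_PPV_ARGS(&device)));
}
```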
Most importantly, since we would need Windows 10 for those next-gen DX feature sets, and considering neither you nor anyone else in this thread can provide any proof of the exact importance of DX11.2, 11.3 and 12 for the next wave of games starting with Dying Light, Project CARS, The Division, The Witcher 3, etc., your entire point of singling out the DX11 spec feature support you outlined for Kepler, Haswell and GCN is irrelevant to the discussion of Witcher 3. Unless you or anyone else here can prove that tiled resources, or any other DX11.2/11.3 feature, will be implemented in Witcher 3, and that its inclusion will have an adverse performance/IQ effect on all non-Maxwell GPUs (or alternatively a major IQ/performance boost for Maxwell), the point you are trying to make is moot.
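For perspective, whether a given card exposes tiled resources is just a runtime capability query, and any engine that uses the feature has to ship a fallback for hardware that reports no support. A rough sketch of that check, assuming a D3D11.2 runtime and an ID3D11Device created elsewhere (illustrative only, nothing to do with Witcher 3's actual code):

```cpp
// Rough sketch: querying the optional tiled resources cap added in D3D11.2.
// Engines check this at startup and fall back to their own software
// streaming path when the tier is NOT_SUPPORTED, which is why the feature's
// mere existence says nothing about IQ or performance in a particular game.
#include <d3d11_2.h>

bool HasHardwareTiledResources(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                           &opts, sizeof(opts))))
        return false; // pre-11.2 runtime: the query itself is unavailable

    return opts.TiledResourcesTier != D3D11_TILED_RESOURCES_NOT_SUPPORTED;
}
```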
More ironic is that for any serious Witcher fan, the performance of a 290/290X/780/780 Ti/970/980 is irrelevant: if these cards don't perform well enough, they will simply upgrade to the GM200/300 series by May 2015. As for anyone else who purchased these cards months or a year before Witcher 3 comes out, I guess you haven't learned a thing from Half-Life 2.
Next you are going to tell us that buying a DX11.3/12 GPU right now is more future-proof for BF5 in 2016, right?
The only performance advantage that GameWorks gives NVIDIA is that it allows them earlier access to the game code, so they can get a head start on polishing their drivers.
That's not how GW works.
NV provides specific in-house game code for free to game developers so that they can more easily implement cutting-edge graphical features and effects without dedicating their own resources to targeting the 5-10% of the PC market that might have the latest GPU architectures to take advantage of those features.
"As part of their ongoing drive to improve the state of computer graphics, NVIDIA has a dedicated team of over 300 engineers whose primary focus is the creation of tools and technologies to make the lives of game developers better. What's truly interesting about GameWorks is that these libraries are free for any developers that want to use them. The reason for creating GameWorks and basically giving it away is quite simple: NVIDIA needs to entice developers (and perhaps more importantly, publishers) into including these new technologies, as it helps to drive sales of their GPUs among other things"
http://www.anandtech.com/show/8546/nvidia-gameworks-more-effects-with-less-effort
The GW effects libraries are NV-specific SDK code, optimized for NV hardware. It's not just about NV having early access to the game's overall code; it's about NV providing their OWN code, optimized for NV graphics cards, to be inserted into the game.
"300 visual effects engineers from NVIDIA who will be
dispatched to developers across the globe offering library of SDKs, technology and algorithms and finally developer tools. Part of this strategy includes three new SDKs – Flex, GI Works and Flame Works.
‘We’ve dispatched our engineers to work onsite with top game developers and add effects, fix bugs, tweak performance, and train developers in open standards and work hand-in-hand with our game laboratory.’ NVIDIA
Read more:
http://wccftech.com/nvidia-gamework...unified-gpu-physx-demonstrated/#ixzz3OEt7U0tz"
The purpose of NV's GW is to push next-generation graphical effects, primarily to sell NV graphics cards, not just to gain early access to developer code. Since the GW SDK libraries are NV's own code, rather than the developer writing the code for tessellation, new God rays, realistic water or physics effects, GW titles are naturally going to heavily favour NV cards most of the time. Consequently, NV gains an automatic advantage, since NV-designed and NV-optimized code is inserted into the game! This is why so many articles called GW's business practice into question as highly controversial. If GW only gave NV access to the game engine's code and nothing more, it wouldn't be that controversial.
Well, if you insist on ultra settings, then yes. If you just want it playable, then plenty of cards today will be fine.
Name any time in the history of DX when the first generation of cards supporting a new DX version could play that generation's games at Ultra settings, maxed out. It never happened. By the time true DX9, DX10 and DX11 games came out, we needed 2nd- or 3rd-generation GPUs supporting that level of DX. History has proved this for every single DX generation over the last 15 years. Since DX12's main differentiation is the lower-level, Mantle-like API, its primary advantage over DX11.3 is lower CPU overhead, not advanced graphical features.
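To illustrate where that overhead reduction comes from, here is a bare-bones sketch of the DX12 submission model (assuming an ID3D12Device created as in the earlier snippet and a direct command queue created from it; this is not code from any shipping engine): the application records command lists itself, potentially on many threads, and submits them in one call, instead of the driver validating state behind every individual D3D11 draw.

```cpp
// Bare-bones sketch of D3D12's submission model: the app owns command
// recording (cheap and thread-local) and hands whole lists to the queue at
// once. This is the "Mantle-like" CPU-overhead win; it adds no new visuals.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void SubmitEmptyFrame(ID3D12Device* device, ID3D12CommandQueue* queue)
{
    ComPtr<ID3D12CommandAllocator> alloc;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&alloc));

    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              alloc.Get(), nullptr, IID_PPV_ARGS(&list));

    // ... record state changes and draws here; other threads can record
    //     their own lists in parallel without serializing on the driver ...

    list->Close();
    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists); // one submission for many draws
}
```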
Further, by the time MS releases Windows 10 and developers actually start making DX11.3/12 games from the ground up, all current cards like the 7970 GHz/680/770/280X/290/290X/960/970/980 will be obsolete. Since 2016, 2017 and 2018 games will be even more demanding regardless of DX11.3/12, most of us will upgrade to something far faster in the $400-500 range to play those future titles. I would bet the same people who are worried that their card won't support DX12 will be upgrading to Pascal/Volta anyway. For those who bought GM204 and think that DX12 is some kind of future-proofing feature, that's nice -- see you in 2016 when 14nm/16nm FinFET Pascal/GCN 3.0 cards match 980 SLI at $550.
Considering $650 780/$1000 Titan performance can now be had in a $250 R9 290, and $700 780 Ti performance in a $350 970, it's remarkable that experienced PC builders still try to push the virtues of DX future-proofing. If you think you'll be playing 2016-2017 DX12 games on a 970/980 at Ultra, I don't know what to tell you...
Since adoption of a new Windows version takes years, it will be 2-3 years from today before developers start making DX12 games. Chances are 2016 games will still be DX11, because development on them will have started in late 2014 or mid-2015. It won't make sense for developers to spend time on a DX12 code path until adoption of W10 and DX12 GPUs picks up. As for DX11.2 and DX11.3, DX8.1, 9.0b/c and DX10.1 all turned out to be extremely niche. Don't forget that next-generation DX12 game engines are basically non-existent right now, as adoption of Unreal Engine 4.4 is MIA.
--
It's funny how this whole DX11-12 discussion started because the OP conjectured that the Witcher 3 specs had something to do with tiled resources, when the real explanation could be as simple as those specs being just as meaningless as 90% of the minimum/recommended PC specs released by other developers.
For example, AC Unity had minimum CPU requirements of:
"Intel Core i5-2500K @ 3.3 GHz or AMD Phenom II x4 940 @ 3.0 GHz or AMD FX-8350 @ 4.0 GHz"
Total BS, as the X4 940 and the 2500K were miles apart in the final game. Despite AC Unity recommending an Intel Core i7-3770 @ 3.4 GHz or AMD FX-8350 @ 4.0 GHz or better, the i5 2500K from the minimum requirements was actually faster in the game. Are we supposed to take developers' CPU recommendations as gospel now, when 90% of them have gotten it wrong before? :whistle:
Recommended GPU requirements are just as bad since they never tell us what settings and FPS the developer is talking about.
AC Unity recommended:
"NVIDIA GeForce GTX 780 or AMD Radeon R9 290X (3 GB VRAM)"
But in reality one needed 780
SLI or faster to hit 60 fps with 4xMSAA in Unity.
Lately, developers' minimum and recommended CPU/GPU specs have been so far off the mark that, after looking at Lords of the Fallen's requirements and then the final game, I don't pay attention to them at all. Just wait for day-1 reviews and upgrade as necessary. This strategy never fails.