[WCCFtech] AMD and NVIDIA DX12 big picture mode


Headfoot

Diamond Member
Feb 28, 2008
I don't think we'll see DLC with such updates. I just think we'll see no updates, unless it's some kind of extremely popular game people continue to play in droves, like Dota, CS, TF, etc., or something like MMOs.

I'm not saying the DX12 updates for new hardware will be in the DLC itself; rather, the fact that the game has a continuing revenue stream for 2+ years thanks to DLC will let them make a business case for supporting new hardware in an already-released game for as long as they keep releasing DLC for it. I'm sure the hardware updates would go out to all users, funded by the DLC purchasers.

I agree, though, that updates will probably be limited. I see these categories as the only ones likely to get regular updates:
1) AAA with extended DLC campaign after release, for as long as they are releasing DLC (~1.5 to 2.5 years of updates)
2) F2P and subscription based constant evolution games (LoL, Dota2, TF, MMOs, World of Tanks, etc.)
3) Games on unmodified off-the-shelf engines (Unreal, CryEngine, Unity)

I agree a very real possibility is that, for mixed-mode games, they simply tell you to fall back to DX11 mode when new hardware won't run the DX12 path.
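At the engine level that fallback is pretty mechanical; a minimal sketch of the idea, using only the stock device-creation calls (a real engine would add adapter enumeration, debug layers, and proper error reporting):

Code:
// Minimal sketch of a "try DX12, fall back to DX11" startup path.
#include <windows.h>
#include <d3d12.h>
#include <d3d11.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

bool InitRenderer()
{
    // Ask for a DX12 device at the lowest feature level the engine supports.
    ComPtr<ID3D12Device> device12;
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                    IID_PPV_ARGS(&device12))))
        return true;  // run the DX12 path

    // DX12 unavailable (old OS, old driver, unsupported GPU): fall back to DX11.
    ComPtr<ID3D11Device> device11;
    D3D_FEATURE_LEVEL obtained = {};
    if (SUCCEEDED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                    nullptr, 0, D3D11_SDK_VERSION,
                                    &device11, &obtained, nullptr)))
        return true;  // run the DX11 path

    return false;     // no usable GPU at all
}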
 

Spjut

Senior member
Apr 9, 2011
Has there actually been any talk from AMD, Nvidia, Microsoft or someone else about DX12 breaking compatibility with newer architectures?

Just because it was a problem for AMD with Mantle doesn't mean Microsoft couldn't avoid the same problems with DX12. I'd assume DX12 is better off since it was made to work on AMD, Nvidia, and Intel alike.
 

EarthwormJim

Diamond Member
Oct 15, 2003
Spjut said:
Has there actually been any talk from AMD, Nvidia, Microsoft or someone else about DX12 breaking compatibility with newer architectures? [...]

You can't lose compatibility; what you might lose are the bare-metal optimizations.

So worst case, you have roughly DX11 performance plus the lower CPU overhead the API brings automatically.

It's still a hardware-agnostic API; it just also allows developers, if they want to, to get much closer to the bare metal.
 

ShintaiDK

Lifer
Apr 22, 2012
Spjut said:
Has there actually been any talk from AMD, Nvidia, Microsoft or someone else about DX12 breaking compatibility with newer architectures? [...]

There has been talk about it. It seems they all expect a DX11 path to handle it. The problem is what happens if a DX11 path doesn't exist, because as long as there is a DX11 path there are limitations. We know DICE wants a DX12-only game, for example, and I'm sure others do as well, especially those where the console versions are the vast majority of the income.
 

Headfoot

Diamond Member
Feb 28, 2008
I wonder if AMD/nVidia/Intel could implement a generic emulated hardware target as a minimum fallback. For example, AMD could have an "AMD_Minimum" target which would accept API calls written for other AMD architectures (GCN 1.0, 1.1, 1.2, and beyond) and route them through a thick driver that sorts them out and forms the hardware commands specifically for the device present. Sort of like re-introducing the abstraction of DX11, but at the driver level and directly compatible with DX12 code in DX12-only games. Basically a DX12 interpreter/cross-compiler sort of idea, controlled by the IHVs, which have much more of an interest in continuing compatibility than the game devs do. It would just require that game devs add AMD_Minimum (in this example) as a target when developing the game, which is less onerous than having to add each architecture as it comes out after release.
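Purely to illustrate the idea, a toy sketch of that kind of driver-side translation layer; every name in it (GenericTarget, GpuBackend, Gcn12Backend) is made up, and nothing like it exists in the public DX12 API:

Code:
// Hypothetical sketch of an IHV-side "generic target" translation layer.
// None of these types exist in D3D12; they only illustrate the idea of a
// thick driver re-introducing abstraction underneath DX12-only titles.
#include <cstdint>
#include <memory>

struct DrawCall { uint32_t vertexCount; uint32_t instanceCount; };

// Per-architecture backends the driver ships with (GCN 1.0/1.1/1.2, ...).
class GpuBackend {
public:
    virtual ~GpuBackend() = default;
    virtual void emitDraw(const DrawCall& dc) = 0;   // form real hardware commands
};

class Gcn12Backend : public GpuBackend {
public:
    void emitDraw(const DrawCall& dc) override { (void)dc; /* build GCN 1.2 packets here */ }
};

// The "AMD_Minimum" idea: the game targets this one generic interface and the
// driver routes every call to whatever silicon is actually installed.
class GenericTarget {
public:
    explicit GenericTarget(std::unique_ptr<GpuBackend> hw) : hw_(std::move(hw)) {}
    void draw(const DrawCall& dc) { hw_->emitDraw(dc); }  // extra indirection = some overhead
private:
    std::unique_ptr<GpuBackend> hw_;
};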

I'm sure performance would suffer versus fully architecture-specific DX12 (how much, I'm not sure), but it would be better than having the game not run at all, and seeing how the DX12 calls are formed would allow for more intelligent driver code than DX11 calls do. You'd probably preserve at least some portion of the reduced overhead and allow for better optimization than DX11. I'd guess it ends up somewhere between pure DX12 mode and DX11 in terms of overhead.

I don't know enough about the maximum abstraction level permitted by DX12 currently. We know it goes very low. Is FL_11 mode the only way to get more abstraction? Perhaps this sort of thing is already possible within the current API design using DX12 FL_11 code?
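For what it's worth, the feature-level ladder itself is queryable through the standard API, so a title can at least ask where the device sits; a minimal sketch using the stock CheckFeatureSupport call (whether FL_11 actually buys any extra abstraction is the open question):

Code:
// Sketch: ask a DX12 device which feature level it tops out at.
// FL 11_0/11_1 are the "more abstract" targets a DX12-only title could require.
#include <windows.h>
#include <d3d12.h>
#pragma comment(lib, "d3d12.lib")

D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS data = {};
    data.NumFeatureLevels = static_cast<UINT>(sizeof(levels) / sizeof(levels[0]));
    data.pFeatureLevelsRequested = levels;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &data, sizeof(data))))
        return data.MaxSupportedFeatureLevel;
    return D3D_FEATURE_LEVEL_11_0;   // conservative default
}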
 

EarthwormJim

Diamond Member
Oct 15, 2003
Headfoot said:
I wonder if AMD/nVidia/Intel could implement a generic emulated hardware target as a minimum fallback. [...]

I honestly don't think this will be necessary. AMD at least is pretty committed to their GCN architecture. We're basically seeing a very CPU-like direction with graphics cores.

You don't lose x86 functionality with each new CPU Intel releases. If they need new instructions, they add those while maintaining complete backwards compatibility.

Even if AMD or Nvidia did drastically change their architectures, you wouldn't need to do what you're describing in software; hardware could do it with low overhead.
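The x86 analogy in a nutshell: new instructions are opt-in via CPUID, so binaries built for the old baseline keep running on new CPUs. A minimal sketch (MSVC intrinsics, SSE4.2 as the example feature; real dispatch code checks more than one flag):

Code:
// Sketch of runtime feature detection: new x86 instructions are additive,
// and software opts in only after checking CPUID.
#include <intrin.h>
#include <cstdio>

bool CpuHasSse42()
{
    int info[4] = {};            // EAX, EBX, ECX, EDX
    __cpuid(info, 1);            // leaf 1: basic feature flags
    return (info[2] & (1 << 20)) != 0;   // ECX bit 20 = SSE4.2
}

int main()
{
    // Dispatch once at startup; the baseline path runs on every x86-64 CPU.
    std::printf(CpuHasSse42() ? "using SSE4.2 path\n" : "using baseline path\n");
    return 0;
}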
 

ShintaiDK

Lifer
Apr 22, 2012
EarthwormJim said:
I honestly don't think this will be necessary. AMD at least is pretty committed to their GCN architecture. [...]

BF4/Thief under Mantle and GCN 1.2 ring a bell? That's the problem when you go close to the metal. Btw, in terms of x86 you've got an entire decoder block to handle it.
 

dacostafilipe

Senior member
Oct 10, 2013
Spjut said:
Has there actually been any talk from AMD, Nvidia, Microsoft or someone else about DX12 breaking compatibility with newer architectures? [...]

You don't break compatibility; you just lose the optimisations. Same problem as with DX11 and other APIs.

But now you need to wait for the game devs to release an update, and that can be a problem depending on the publisher.
 

EarthwormJim

Diamond Member
Oct 15, 2003
ShintaiDK said:
BF4/Thief under Mantle and GCN 1.2 ring a bell? That's the problem when you go close to the metal. Btw, in terms of x86 you've got an entire decoder block to handle it.

You mean the driver that is required for Mantle to work wasn't updated for GCN 1.2 because DirectX 12 is right around the corner?

Mantle is not DX12; don't automatically expect the same issues to pop up that come with using a third-party API not baked into the OS.
 

ShintaiDK

Lifer
Apr 22, 2012
EarthwormJim said:
You mean the driver that is required for Mantle to work wasn't updated for GCN 1.2 because DirectX 12 is right around the corner? [...]

You should check when the R9 285 was released.

And why would a driver fix it? It worked fine in newer Mantle games with GCN1.2 support.
 

TheELF

Diamond Member
Dec 22, 2012
Headfoot said:
I wonder if AMD/nVidia/Intel could implement a generic emulated hardware target as a minimum fallback.

Console devs already do this themselves: they write games with the least amount of special (uarch-specific) code so they are easy to port.

DX12 will provide a number of "tricks" that until now only worked on consoles, and devs will start using those in console games since they will now run on PC as well.

Anything that only runs well on a limited range of hardware, the devs will not even touch unless they get sponsored.

If things worked differently, we would have been knee-deep in Mantle games from month one, but a limited customer base equals limited interest from devs.
 

EarthwormJim

Diamond Member
Oct 15, 2003
ShintaiDK said:
You should check when the R9 285 was released.

And why would a driver fix it? It worked fine in newer Mantle games with GCN1.2 support.

That's not what AMD said, and not what Anandtech said in the Fury X review.

Of particular note, the Mantle driver has not been optimized at all for GCN 1.2, which includes not just R9 Fury X, but R9 285, R9 380, and the Carrizo APU as well. Mantle titles will probably still work on these products – and for the record we can’t get Civilization: Beyond Earth to play nicely with the R9 285 via Mantle – but performance is another matter.

That's the main issue with a third-party API: support will be half-assed. DirectX 12 isn't a third-party API, so support will not be half-assed.
 

Madpacket

Platinum Member
Nov 15, 2005
It's a little unfair to criticize Mantle at this point; Mantle bootstrapped DX12. I recall conversations last year, when there were murmurs that AMD was helping Microsoft with DX12, and people here were like "yeah right, sure they are." Now it turns out AMD effectively designed it for them (which makes sense when you think about the GCN-based Xbox One), so it also makes sense that, as we got closer to the release of Windows 10, AMD ceased development of Mantle. It would have been a waste of resources to continue the Glide-like approach to optimization.

Anyway, DX12 is here with Windows 10 and has a huge audience (how many users have upgraded now, 100 million?). Big blockbuster games are being released very soon (not "soon" as in "eight Bulldozer cores will be used soon, you will see!"). This isn't a "you wait, you'll see" thing; it's simply that within 6 months we'll have a good handful of DX12 games, with many more in development on the way.

The foundations have been built, mass deployment has happened, the game engines have been prepped, now we sit back and wait for the titles to roll in.

What's accelerating this pace is that the combined sales of the PS4 and Xbox One have made it such that many newer blockbuster games are dropping ports to the older consoles altogether, as it's no longer profitable to waste development time on them.

This means many games can rely on one or two code paths (DX11/DX12) rather than three or more, which should expedite development and make engine designers' jobs much easier going forward.

GCN owners should be happy. AMD played the long game and is about to be rewarded for it. By winning over the consoles (where primary game and engine development happens), we have a Trojan horse scenario playing out where the AMD-developed DX12 will benefit GCN immensely, because asynchronous compute is a major part of it.

I'm sure Nvidia saw this coming and will be ready with Pascal, but it'll be interesting to watch them waste money and resources optimizing their existing cards for each DX12 title released, while AMD should be able to sit back a little and stretch its legs. Per-game optimization is a waste of resources.
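For anyone wondering what asynchronous compute actually means at the API level: DX12 exposes separate queue types, so an engine can feed compute work alongside its graphics work. A minimal sketch using the standard queue-creation call; whether the two streams genuinely overlap on the GPU is entirely up to the hardware and driver:

Code:
// Sketch: create a dedicated compute queue next to the usual graphics queue.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

bool CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& gfxQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;    // graphics + compute + copy
    if (FAILED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&gfxQueue))))
        return false;

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;   // compute/copy only; can run async
    return SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue)));
}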
 

Keysplayr

Elite Member
Jan 16, 2003
358.50. Well, would you look at that: a fairly large improvement on the DX12 graph.
All is not lost for Nvidia in DX12 after all. Everybody please put away your zombie-apocalypse go bags. False alarm.
 

NTMBK

Lifer
Nov 14, 2011
Nice improvement! Good to see NVidia getting their drivers into shape, hopefully they should have the problems ironed out by the time we get actual DX12 games.
 

AtenRa

Lifer
Feb 2, 2009
I want to see how the NV GPUs will perform with slower CPUs in DX12, and power measurements for all systems.

Edit:

And they didn't use MSAA on the Epic preset.
 

railven

Diamond Member
Mar 25, 2010
NTMBK said:
Nice improvement! Good to see NVidia getting their drivers into shape, hopefully they should have the problems ironed out by the time we get actual DX12 games.

Of course, I already sold my GTX 980 Ti.

Kidding aside, I would love to know what NV did to get those gains. From everything I've read (and been told), Maxwell 2 can't do AC.

EDIT: I wonder if any of this applies to Fable? Has anyone tested there? And clicking through to the DX11 performance, holy crap does AMD crater there. What is going on? They showed in Battlefront that they can do DX11 competitively, yet in their sponsored game they trail by a HUGE margin.

EDIT #2: Actually, even AMD got a good perf gain with their driver update; I didn't realize NV wasn't THAT far behind. However, in DX11 AMD saw some regression at Medium and no gain at Epic settings, while NV saw gains in both DX11 and DX12.

AtenRa said:
I want to see how the NV GPUs will perform with slower CPUs in DX12, and power measurements for all systems.

Edit:

And they didn't use MSAA on the Epic preset.

In before "computerhardware.de is a shill site."

Pretty sure it's obvious why they didn't: it would crater their performance, from what I remember.
 

railven

Diamond Member
Mar 25, 2010
IQ manipulation, are you assuming the gains come from doing AC?

I didn't assume anything; it's an open question. If it's purely IQ manipulation, does NV need AC? If they did something with the AC thingy-majiggy, what did they do?

I'm not NV; I'd like to know how they overcame something I was led to believe couldn't be done.

If it's pure IQ manipulation, someone is gonna do a side-by-side and catch them.
 

piesquared

Golden Member
Oct 16, 2006
I must be missing something, because it's clear as day that AMD gained as much as NV did from the newer drivers. Sorry to burst the focus group's bubble, but there's no NV driver magic here, as usual? Just propaganda?
 