DX12 and developer freedom

greatnoob

Senior member
Jan 6, 2014
968
395
136
DirectX 12 is one of the greatest generational leaps in graphics API history. How can developer and consumer adoption be accelerated?

The move away from the fixed-function vertex and pixel pipeline led to the birth of OpenGL 2 and DirectX 9, which was massive in the sense that it opened up a range of possibilities in both the creative and technical development of games. On the hardware side, we saw a massive change from fixed-function pixel and vertex units to what our current graphics hardware still relies on: massive amounts of parallel compute/shader units.

It seems like DX12 and the hardware built for it (GCN and upcoming architectures) are the equivalent of the DX9-era hardware and API changes. Over the span of 16 years, graphics APIs have been opening up considerably. Developers now have significantly more (almost total) control over the hardware; comparing OpenGL 1.0 and its incredibly simplistic design to Vulkan shows this change. As time goes on, drivers have gotten thinner and are now almost just a means of relaying information to and from the kernel/graphics card(s). This is great for talented indie developers and huge studios, although it doesn't seem practical for people like myself with limited time and knowledge of the API. Which leads me to its usage.

I think DX12 can see a massive surge in developer and customer adoption if development environments like Unity and Unreal Engine adopted it as a backend. This brings with it compatibility issues: how do you translate what were once DX11 calls into DX12 calls? The answer is the same for almost every developer: you include an OPTIONAL "glue" layer that translates Unity's and UE4's DX11 graphics SDK to DX12, with some DX11 parts deprecated in order to push for DX12-like functionality. Although it sounds redundant, this allows game developers to transition to DX12 while letting Unity and UE4 engineers make back-end optimisations to their engines using DX12. This may lead to a performance regression in the short run, but significantly better performance and efficiency in the long run.

Adding to that: there will almost certainly be DX11-like wrappers released for DX12 that some developers may choose to use. This lets them carry the knowledge they gained developing with DX11 over to a DX12 rendering pipeline, with the added possibility of using unmasked DX12-specific features and optimisations.
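
To make the wrapper idea concrete, here's a minimal sketch. The types and names are purely illustrative stand-ins, not the real D3D11/D3D12 interfaces: the engine keeps its DX11-shaped immediate-context calls while the layer records into a DX12-style command list underneath and owns the submission decision that DX11 drivers used to hide.

```cpp
#include <cstdint>
#include <vector>

// Stand-ins for the DX12 pieces a real wrapper would drive
// (ID3D12GraphicsCommandList, ID3D12CommandQueue, ...).
struct Dx12CommandList {
    std::vector<uint32_t> recorded;                      // pretend-encoded draw commands
    void drawInstanced(uint32_t vtx, uint32_t inst) { recorded.push_back(vtx + inst); }
};
struct Dx12Queue {
    // In real DX12 this would be Close() followed by ExecuteCommandLists().
    void execute(Dx12CommandList& cl) { cl.recorded.clear(); }
};

// DX11-style facade: one implicit context where draws "just happen".
class ImmediateContextShim {
    Dx12CommandList list_;
    Dx12Queue&      queue_;
public:
    explicit ImmediateContextShim(Dx12Queue& q) : queue_(q) {}
    // Mirrors the shape of ID3D11DeviceContext::Draw, but records instead of submitting.
    void Draw(uint32_t vertexCount, uint32_t /*startVertex*/) {
        list_.drawInstanced(vertexCount, 1);
    }
    // The wrapper now owns the batching/submission decision; this is where
    // the long-run engine-side optimisations would come from.
    void Flush() { queue_.execute(list_); }
};

int main() {
    Dx12Queue queue;
    ImmediateContextShim ctx(queue); // engine code keeps its DX11-shaped calls
    ctx.Draw(36, 0);                 // e.g. one cube
    ctx.Flush();                     // explicit submission, hidden from the engine
}
```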

An evolving high-level API like DX will eventually have to start exposing more and more of its internals, and that's exactly what we've been seeing over the years through every iteration of DX. Hacky driver tweaks from manufacturers to boost game performance are all but abolished with DX12, and the API can finally be used the way it was meant to be, without any "ifs" or "buts". Now IHVs are forced to adapt to what developers are doing. Developers write the software, and hardware vendors create the hardware that best fits the software. Good examples of this are Imagination Technologies' TBDR (tile-based deferred renderer) and AMD's discard accelerator, which adapt the hardware to work with unoptimised software via hardware optimisations rather than gimmicky software tricks (driver updates for specific games).

I see a group of misinformed users here who think the occasional small regression in DX12 performance is a bad sign, forgetting how the shift to DX9 and DX9 hardware was necessary to move the graphics industry along, despite that change making graphics cards less efficient (fixed-function units vs. a CPU-like instruction processor) and increasing developer workload and costs significantly.

So, a couple of discussion questions so I can gauge people's thoughts on DX12:

What are everybody's thoughts on increasing adoption rates by moving game engines to DX12, and on DX12 giving developers more control over what they can do? Personally, I'm completely in favour of it, even though I won't be using or learning DX12 anytime soon (might get some reading done on Vulkan though).

Should IHVs be blamed for crappy DX12 performance, or should the framework be blamed for forcing features onto ill-suited cards/architectures?

TBDR and "discard accelerator" are some great ways of optimising all sorts of funky rendering techniques via hardware instead of IHVs giving devs a checklist of what they can and cannott do on their hardware, should this be the way of the future considering DX12's thinner drivers will make it more difficult to optimise crappy code or should the current bloated DX11 drivers be the guideline with DX12 too?
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Adoption rates will increase regardless, since IHVs are only shipping DX12-compatible hardware and it's a Microsoft standard ...

For the second question, it's a little bit of both, including developers. Potentially subpar DX12 performance could come from a mismatch between hardware and API design, but fast paths do exist for IHVs other than AMD, so let's not fixate on the limitations of other hardware and instead trust the developers to deliver performance on all hardware ...

The so-called "discard accelerator" is just another way of saying that AMD will be gaining some ground on Nvidia in terms of triangle throughput, since the former is rasterizer-limited ...
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
Small regression isn't bad but shouldn't be necessary if done right.
To an extent, true. Unless you're bordering on your desired framerate, or minimums take a dive, a small regression isn't likely to make a playable game suddenly unplayable.

My sentiments on overclocking as well. Best-case scenario, it's good for a settings bump or for that extra push if you're on the edge of 60 fps. It will not make an unplayable game playable.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
The whole point of DX12 is to have an alternative to the old, fat APIs (which continue to evolve as an alternative to the low-level APIs).
Once the drivers have matured, performance is in developers' hands, as it should be.

One of the bigger advantages is that a developer can be quite sure that if the code doesn't work, it's his own fault. (No more having to sidestep driver problems, only to find out later that the trick doesn't work anymore, and vice versa.)

I'm quite excited about SM6, as it will bring some fantastic new features that some developers have been requesting for over 10 years.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The whole point of DX12 is to have an alternative to the old, fat APIs (which continue to evolve as an alternative to the low-level APIs).
Once the drivers have matured, performance is in developers' hands, as it should be.

One of the bigger advantages is that a developer can be quite sure that if the code doesn't work, it's his own fault. (No more having to sidestep driver problems, only to find out later that the trick doesn't work anymore, and vice versa.)

I'm quite excited about SM6, as it will bring some fantastic new features that some developers have been requesting for over 10 years.

That's not totally true. We know async compute didn't work on nVidia, even though it was exposed by the drivers, because of a lack of hardware support. There could be other things too. It does prove, though, that not all issues with a game are automatically the dev's fault.
 

book_ed

Member
Apr 8, 2016
29
0
6
DX12 and Vulkan are good for those wanting to make complex games, think Star Citizen big/complex. They sure help in others, such as BF, where Mantle helped, more so on lower-end systems, but until a developer moves completely away from DX11, and perhaps even ignores the console market, I reckon great leaps in image quality and game complexity can't truly be made. We should regularly see improvements like those in AoS for AMD cards, and perhaps even for lower-end processors, but these still happen in an environment designed to also work on a less efficient path, hence what we have today.

I see no reason why DX11 should disappear in the short run, and perhaps even in the long run for relatively simple games.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
DX12 and Vulkan are good for those wanting to make complex games, think Star Citizen big/complex. They sure help in others, such as BF, where Mantle helped, more so on lower-end systems, but until a developer moves completely away from DX11, and perhaps even ignores the console market, I reckon great leaps in image quality and game complexity can't truly be made. We should regularly see improvements like those in AoS for AMD cards, and perhaps even for lower-end processors, but these still happen in an environment designed to also work on a less efficient path, hence what we have today.

I see no reason why DX11 should disappear in the short run, and perhaps even in the long run for relatively simple games.

Weren't they supposed to continue on with DX11.3 for the devs who didn't want to write low level code?
 
Feb 19, 2009
10,457
10
76
Not every game is suited to DX12 being better than DX11, but games that are very CPU intensive really should be made with DX12.

Total War, for example, bogs down badly even on beastly rigs due to CPU limitations. Try playing Attila or Rome 2 with big unit sizes and gg your FPS. In general, RTS and open-world games with lots of scene complexity really need these next-gen APIs.

DX11 will be there for folks who don't need that kind of performance peak/ceiling and don't want to work harder/closer to the metal to reach it.
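
To put some shape on "CPU intensive": the headline DX12 win is that command lists can be recorded on many threads, where DX11's immediate context serialised submission on one. A rough sketch of the idea below; the types are stand-ins, not the real D3D12 interfaces.

```cpp
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

struct CommandList { std::vector<uint32_t> cmds; };

// Each worker records the draws for its slice of the scene independently.
// In real DX12: one ID3D12GraphicsCommandList (plus allocator) per thread.
void recordSlice(CommandList& out, uint32_t first, uint32_t count) {
    for (uint32_t i = 0; i < count; ++i)
        out.cmds.push_back(first + i);   // pretend "draw object (first + i)"
}

int main() {
    const uint32_t objects = 10000, workers = 4;
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> threads;
    for (uint32_t w = 0; w < workers; ++w)
        threads.emplace_back(recordSlice, std::ref(lists[w]),
                             w * (objects / workers), objects / workers);
    for (auto& t : threads) t.join();
    // Real DX12 would now hand all the lists to the GPU in a single
    // ExecuteCommandLists call; DX11 had no equivalent of this parallelism.
}
```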
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Not every game is suited to DX12 being better than DX11, but games that are very CPU intensive really should be made with DX12.

Total War, for example, bogs down badly even on beastly rigs due to CPU limitations. Try playing Attila or Rome 2 with big unit sizes and gg your FPS. In general, RTS and open-world games with lots of scene complexity really need these next-gen APIs.

DX11 will be there for folks who don't need that kind of performance peak/ceiling and don't want to work harder/closer to the metal to reach it.

No it doesn't, unless you play with your FX CPU perhaps.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Total War, for example, bogs down badly even on beastly rigs due to CPU limitations.
But is the problem the API? AFAIK the previous Total War games used a single thread to resolve the combat simulation.

Same with the ArmA series. If you have a ton of AI units fighting each other, the fps will be bad. Press escape (which pauses the simulation) and the fps is nice again, even though everything is still rendered.
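
You can reproduce that pattern without any graphics API at all. A toy sketch (illustrative names only, nothing from either game's code) of a loop where a slow single-threaded simulation step drags frame times down even though the render cost never changes, which is exactly what the escape-pause experiment shows:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Pretend workloads: a heavy single-threaded combat/AI step vs. cheap rendering.
void simulateCombat() { std::this_thread::sleep_for(std::chrono::milliseconds(40)); }
void renderFrame()    { std::this_thread::sleep_for(std::chrono::milliseconds(5)); }

int main() {
    const bool paused = false;          // pressing Escape in-game flips this
    for (int frame = 0; frame < 3; ++frame) {
        const auto t0 = std::chrono::steady_clock::now();
        if (!paused) simulateCombat();  // dominates the frame; no API change fixes this
        renderFrame();                  // runs either way, and is not the bottleneck
        const auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
            std::chrono::steady_clock::now() - t0).count();
        std::printf("frame %d: %lld ms\n", frame, static_cast<long long>(ms));
    }
}
```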
 

greatnoob

Senior member
Jan 6, 2014
968
395
136
Not every game is suited to DX12 being better than DX11, but games that are very CPU intensive really should be made with DX12.

Total War, for example, bogs down badly even on beastly rigs due to CPU limitations. Try playing Attila or Rome 2 with big unit sizes and gg your FPS. In general, RTS and open-world games with lots of scene complexity really need these next-gen APIs.

DX11 will be there for folks who don't need that kind of performance peak/ceiling and don't want to work harder/closer to the metal to reach it.

DX11 should and will be deprecated, just like prior versions of DX were once DX9 was released. As an example, moving away from fixed-function hardware made DX8.1 completely irrelevant; even though DX9 had a bigger performance hit on the cards of the day, it was still used. Like I stated in my original post, a DX wrapper is the better option and would let devs get the best of both worlds. It'd be similar to the DX11 API in terms of design, but it would just be wrapping DX12's feature set.

The good thing about DX11 at the moment is the years of general driver optimisations Nvidia and AMD have built up, though that isn't a good excuse to use a (soon-to-be) deprecated API. A wrapper is the better alternative for indie developers, unless they specifically want to support older cards, which leaves DX9 as the only option on the table.
 

book_ed

Member
Apr 8, 2016
29
0
6
Weren't they supposed to continue on with DX11.3 for the devs who didn't want to write low level code?

Yes, even AMD said Mantle isn't for everyone, and that's quite right. You don't need roads that F1 cars can race on when all you have is light traffic.

But is the problem the API? AFAIK the previous Total War games used a single thread to resolve the combat simulation.

Same with the ArmA series. If you have a ton of AI units fighting each other, the fps will be bad. Press escape (which pauses the simulation) and the fps is nice again, even though everything is still rendered.

The issue may be due to multiple bottlenecks. In ArmA you can test it yourself with the provided 2D/3D editor, without any AI. Performance tanks once you increase object complexity and draw distance, and it's more severe in cities and towns. I'd guess 32-bit limitations, probably some API overhead here and there, plus poor occlusion and multithreading, all play a part. And yes, you get poor performance while the CPU and GPU don't do much. Since it's the same engine, the same happens in DayZ, ArmA 2, Take On Helicopters, etc.

Not sure how it is in Total War, but you can playtest different scenarios in ArmA quite easily, and perhaps even analyse the draw calls if you have the know-how.

Beyond the API matter, let's not forget about GPU physics and AI, already demonstrated by nVIDIA and AMD. Neither really took off, although they hold far greater potential, even compared to DX12 or Vulkan. If an old HD 4850 can run 3000 AIs with great complexity and advanced graphics at HD resolution, what could modern-day GPUs do with physics and AI plus a great API?

The technology is there and doable; too bad Crysis didn't sell that well. Gaming would be a whole lot different from the same copy-paste games year after year.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Stop talking about stuff you have no idea about man.

I am a big fan of this series and have hundreds of hours in many TW games

I have 599 hours in Rome II and 456 hours in Attila.

I know exactly what I'm talking about.

Here, from the developer's own words:

https://www.youtube.com/watch?v=G0z5cmzZqB8

Yet they didn't need DX12 for it. They made DX11 better.

http://wiki.totalwar.com/w/Total_War_WARHAMMER_Specs

Or are you saying the developers....are wrong?
 
Feb 19, 2009
10,457
10
76
It's not exclusive.

You can improve DX11, and you can go further still with DX12. For games that are CPU intensive, DX12 can lead to a huge improvement.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
We already know what CPU combo will give 60+ FPS with ultra settings, if you trust the developers.

And you post multiple graphs, including from pclab, which you usually don't trust. So which is it?

38 FPS with an i3 sets the bar high. But FX CPUs are an utter joke in the game, we know.

From your original statement, it sounded like the game would crawl even on top PCs.
 

kraatus77

Senior member
Aug 26, 2015
266
59
101
If DX12 isn't needed, then why are so many companies spending time and money on it? Even the freaking Nvidia worked on it for 3-4 years. Or are you saying they lied again? Which isn't shocking, as they lack high-tier support in a lot of features.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
If DX12 isn't needed, then why are so many companies spending time and money on it? Even the freaking Nvidia worked on it for 3-4 years. Or are you saying they lied again? Which isn't shocking, as they lack high-tier support in a lot of features.

By your standard, nobody should work on Linux or OpenGL.

And again, MS likes DX12 on the Xbox due to the weak console CPU. Even the PS4 got a high-level API beside its low-level one.

And feature-wise, there is DX11.3. So from an IHV point of view, DX12 just means less work. Its cost moved from IHVs to developers.

Will there be future cases where DX12 shines? Quite possibly, but it requires a developer that will fork out the extra money and time for it, including extended support. It's certainly not going to be the norm by any means. And current DX12 games show this, one flop and failure after another.
 
Feb 19, 2009
10,457
10
76
From your original statement, it sounded like the game would crawl even on top PCs.

I don't know about you, but I prefer min fps to be as high as possible.

DX12 can do that. But whether it will do it for TW:WH, we will have to wait and see. Until then, you are free to keep on spouting that DX12 won't help large-scale strategy games that are known to be CPU limited all you like...
 

kraatus77

Senior member
Aug 26, 2015
266
59
101
By your standard, nobody should work on Linux or OpenGL.

And again, MS likes DX12 on the Xbox due to the weak console CPU. Even the PS4 got a high-level API beside its low-level one.

And feature-wise, there is DX11.3. So from an IHV point of view, DX12 just means less work. Its cost moved from IHVs to developers.

Will there be future cases where DX12 shines? Quite possibly, but it requires a developer that will fork out the extra money and time for it, including extended support. It's certainly not going to be the norm by any means. And current DX12 games show this, one flop and failure after another.
No card supports DX11.3 AFAIK, but they do support DX12. Why is that?


Nothing becomes the norm in one day. And as you said, MS likes DX12, those devs like DX12, and the developers who actually make the games are praising DX12 instead of crying about the extra workload/money you talk about.

I could've believed you, but going by your past predictions, everything you say turns out 180 degrees wrong.

So keep dreaming while we watch the industry change over the next 1-2 years. Hopefully Nvidia can make something better than GCN that won't require GW to perform better. Oh wait, they're already doing it with Pascal, with 64 CC/SM and FP16, something GCN already has.
 