Hitman DirectX 12 Benchmarks (update: ComputerBase)

Page 8
Feb 19, 2009
Robert Hallock:

https://www.reddit.com/r/Amd/comments/49u358/early_hitman_dx12_benchmarks_390_is_15_faster/d0vilqy

DX12 DirectFlip was not previously enabled in 16.x drivers. Now it is enabled as of 16.3 (released yesterday). DirectFlip is what allows a DX game to claim exclusive full screen, disable vsync, and more.

It's worth addressing. So we did. For example, Ashes of the Singularity can now use this behavior on Radeon. Any DX12 game designed to work in full screen mode, like gamers are accustomed to with DX11, can do the same.

In response to the 60hz vsync lock on Fury in Hitman DX12:

This is not a driver bug.

Surprising that they work with IO Interactive and didn't sort this out, since it's Fiji-related. Maybe it will get a day-one patch, since the game goes live tomorrow.
 

3DVagabond

Lifer
Aug 10, 2009
Let's not forget that this is a Gaming Evolved title, so it will make AMD GPUs look good.

Actually GE titles tend to run just fine on nVidia. It's just that they aren't gimped on AMD like the Gameworks titles are.
 

3DVagabond

Lifer
Aug 10, 2009
That, and modern great-looking games vs. a 2006 remake on UE3 with DX12 tacked on.

What do you guys think will be the norm moving forward? Is it common to have 2006 games with DX12 bolted on, or actual modern games? -_-

That's all the money nVidia gave them would buy. They weren't going to kill a new game they need to make money on for nVidia's sake. They have little to lose on a game they made bank on a decade ago.
 

Samwell

Senior member
May 10, 2015
Actually GE titles tend to run just fine on nVidia. It's just that they aren't gimped on AMD like the Gameworks titles are.

Not as bad as some GameWorks titles, but some clearly favor AMD, with lower performance on Nvidia. It always depends on how much work the developer wants to spend on the other brand's architecture. It makes no more sense to talk about general game performance in AMD-sponsored games than in Nvidia-sponsored ones.
It's funny that many people here exclude GameWorks titles, but as soon as AMD comes into play everything is fine.
 

3DVagabond

Lifer
Aug 10, 2009
Not as bad as some GameWorks titles, but some clearly favor AMD, with lower performance on Nvidia. It always depends on how much work the developer wants to spend on the other brand's architecture. It makes no more sense to talk about general game performance in AMD-sponsored games than in Nvidia-sponsored ones.
It's funny that many people here exclude GameWorks titles, but as soon as AMD comes into play everything is fine.

What games are you referring to?
 
Feb 19, 2009
Not as bad as some GameWorks titles, but clearly favoring AMD, with lower performance on Nvidia. It makes no more sense to talk about general game performance in AMD-sponsored games than in Nvidia-sponsored ones.
It's funny that many people here exclude GameWorks titles, but as soon as AMD comes into play everything is fine.

It's more that AMD GE titles have excellent performance all round for the visual quality, but AMD GPUs have an edge.

A great example: Alien Isolation. That game looks amazing and runs heaps better than Metro (a similar indoor game); we're talking 100+ fps even on modest hardware.

Another example: Battlefront. Looks amazing, runs very fast on everything, just faster on GCN.

And in the same genre, Project Cars vs Dirt Rally (or even neutral titles like F1 2015)...

If you need more examples, compare Far Cry 4 at release vs Far Cry Primal.

With GW: [benchmark chart]

Without GW: [benchmark chart]
Now compare to Primal: https://www.youtube.com/watch?v=cMUbHdQBSiQ

Runs great from day one with excellent performance all round, mid-range GPUs are getting amazing FPS.

GW titles tend to have poor performance overall, except that it's much worse on AMD & Kepler.

So I see it as AMD sponsorship = All gamers run it well, AMD runs it better.

NV GW sponsorship = Maxwell owners can run it okay, if they disable GW features. Everyone else can suffer.
 

Samwell

Senior member
May 10, 2015
What games are you referring to?

Dirt Showdown ran badly on Nvidia in the beginning. Tomb Raider's TressFX also needed time to run OK on Nvidia. Dragon Age: Inquisition clearly preferred AMD, as Hitman is doing now.
You don't need DX12 to see that Hitman is clearly biased towards AMD. If it were only DX12 where AMD excels, fine, but even in DX11 at 1920 a Fury non-X is nearly as fast as an overclocked 980 Ti. In a normal DX11 game that won't happen.
 
Feb 19, 2009
Dirt Showdown ran badly on Nvidia in the beginning. Tomb Raider's TressFX also needed time to run OK on Nvidia. Dragon Age: Inquisition clearly preferred AMD, as Hitman is doing now.
You don't need DX12 to see that Hitman is clearly biased towards AMD. If it were only DX12 where AMD excels, fine, but even in DX11 at 1920 a Fury non-X is nearly as fast as an overclocked 980 Ti. In a normal DX11 game that won't happen.

It would be comparable if AMD's features were hidden, obfuscated and closed source, then you can blame them for crippling NVIDIA performance. Except it's not, it's open source, NV is freely able to get access and optimize as they see fit.

I mean, NV even released "Game Ready" drivers for the Hitman & Ashes betas, because AMD sponsorship doesn't prevent them from accessing the game, nor does any encryption of features delay NV's ability to optimize.

So the lower NV performance is due to NV not optimizing the game, because nothing is hidden from them, they have no excuses.

This is where NV's GW approach paints a target on their backs: they obfuscate the GW libraries and do not provide source code (not even to devs, certainly not to AMD), so it looks like they have something to hide. They open themselves to accusations of playing dirty or crippling AMD. If they went open source, nobody could accuse them of such tactics; responsibility for optimization would then fall solely on the devs (who can view/change the library code) and AMD.
 

Samwell

Senior member
May 10, 2015
I mean, NV even released "Game Ready" drivers for the Hitman & Ashes betas, because AMD sponsorship doesn't prevent them from accessing the game, nor does any encryption of features delay NV's ability to optimize.

So the lower NV performance is due to NV not optimizing the game, because nothing is hidden from them, they have no excuses.

By this logic, Gears of War is also just a driver problem, because AMD didn't optimize for it; even without HBAO it runs badly on AMD. That's not my view. The GoW developers simply did poor work, and the Hitman developers simply optimized their game for AMD. Nvidia can't do much about it on their own, just as AMD can't in GoW. You don't need insight into GameWorks when the game runs badly even without using GameWorks; that is just the developer's fault.
 
Feb 19, 2009
By this logic, Gears of War is also just a driver problem, because AMD didn't optimize for it; even without HBAO it runs badly on AMD. That's not my view. The GoW developers simply did poor work, and the Hitman developers simply optimized their game for AMD. Nvidia can't do much about it on their own, just as AMD can't in GoW.

This would make sense if the Hitman devs didn't work with NV, didn't test on their hardware, and didn't even tell NV they planned to release, to give them a heads-up to prepare drivers, etc.

See how senseless it is when you apply it to GoWU, which is guilty of all the above?

They even shoved NV's HBAO+ GameWorks into the game and didn't even label it as NV tech or GW or HBAO+, as is normally done. Shady.

If that's the best comparison you can come up with... come on, don't bother.
 

Mahigan

Senior member
Aug 22, 2015
By this logic, Gears of War is also just a driver problem, because AMD didn't optimize for it; even without HBAO it runs badly on AMD. That's not my view. The GoW developers simply did poor work, and the Hitman developers simply optimized their game for AMD. Nvidia can't do much about it on their own, just as AMD can't in GoW.
AMD never received notice that the game was in QA or that it would be released. So the developers never worked with AMD or tested on AMD hardware. The developers only tested and performed QA with NVIDIA. They even provided NVIDIA an advanced build of the game prior to release so that NVIDIA could optimize their drivers for the XBox spring showcase.

All 100% true.

So with GoW, we have the devs to blame.
 

Samwell

Senior member
May 10, 2015
This would make sense if the Hitman devs didn't work with NV, didn't test on their hardware, and didn't even tell NV they planned to release, to give them a heads-up to prepare drivers, etc.

See how senseless it is when you apply it to GoWU, which is guilty of all the above?

They even shoved NV's HBAO+ GameWorks into the game and didn't even label it as NV tech or GW or HBAO+, as is normally done. Shady.

If that's the best comparison you can come up with... come on, don't bother.

Many games implement HBAO and it's not a problem, like you see in the Division. There are gameworks effects which clearly cripple amd performance, but HBAO is not one of them. You can't push everything on gameworks. But yes, GoW is a bad example.

Still, there are other games using just HBAO which clearly run better on Nvidia, while others perform normally. Either I see it as AMD's own problem, as much as Nvidia's in Hitman, or in both cases it's on the devs for optimizing for one architecture.

Developing is always a matter of money. I don't believe in real sabotage by developers, but in time/money constraints. If a developer has to incorporate a few GameWorks effects, that costs time he could otherwise use to optimize for AMD. As he has a contract with NV, he will at least optimize for Nvidia, because otherwise NV kicks their ass, but AMD comes last.

The same is now happening in AMD games. The developer needs to optimize for AMD, maybe implement some of their tech, and once all that is done, maybe he finds time to optimize for Nvidia. Optimization for one is always a loss for the other.
 

3DVagabond

Lifer
Aug 10, 2009
Dirt Showdown ran badly on Nvidia in the beginning. Tomb Raider's TressFX also needed time to run OK on Nvidia. Dragon Age: Inquisition clearly preferred AMD, as Hitman is doing now.
You don't need DX12 to see that Hitman is clearly biased towards AMD. If it were only DX12 where AMD excels, fine, but even in DX11 at 1920 a Fury non-X is nearly as fast as an overclocked 980 Ti. In a normal DX11 game that won't happen.

Dirt Showdown, released 2012 = compute-heavy forward lighting. nVidia can't handle that very well.

Tomb Raider: [benchmark chart]

DAI: [benchmark chart]
So one game, really. And you left off all of the other GE games that run better on nVidia.
 

Mahigan

Senior member
Aug 22, 2015
I'd keep the developers out of it and focus on the publishers. The publishers (and maybe untalented devs) might like gameworks but talented devs like to build their own engines and titles (John Carmack, Johan Andersson, Casper of Unity and Dan Baker just to name a few).

Some devs only build engines and then strap on NVApi when they license out their engines to newbie devs (think Unreal Engine here).

As for HBAO+, the feature is closed source, so the shader code itself cannot be optimized or replaced by AMD/Intel. That's the issue with GameWorks: these features use NV code optimized for NV hardware, which evidently benefits NVIDIA. So what are we supposed to have? AMD and NVIDIA competing over closed-source features by trying to sway developers their way, or an open-GPU format where either IHV's features have their source code exposed for optimization by competing IHVs? I think the latter makes more sense, as the former will only cause further headaches for gamers waiting on optimized drivers and suffering bad performance on launch day.

I want to end with this statement: adding open-standard features exposed by the API, or open graphics tweaks such as TressFX, is not biased. If a developer adds asynchronous compute + graphics to his/her game, they're not biased towards AMD or GCN; they are following the DX12 open standards and feature list. The same is true of ROVs and CR.

Gameworks, however, is biased due to its closed nature and its biased optimizations.

My two cents.
 
Feb 19, 2009
Developing is always a matter of money. I don't believe in real sabotage by developers, but in time/money constraints. If a developer has to incorporate a few GameWorks effects, that costs time he could otherwise use to optimize for AMD. As he has a contract with NV, he will at least optimize for Nvidia, because otherwise NV kicks their ass, but AMD comes last.

The same is now happening in AMD games. The developer needs to optimize for AMD, maybe implement some of their tech, and once all that is done, maybe he finds time to optimize for Nvidia. Optimization for one is always a loss for the other.

This is something we can all agree on, there will be priorities on optimizations.

AMD enables NVIDIA and devs to optimize their features due to its open nature approach.

NV GW is the opposite, only NV is responsible for optimizing those features.

And so you can imagine the scenario where NV sponsors games: the devs have less time to optimize for AMD, and once the game is released, AMD gets hold of it and has to deal with library obfuscation to try to optimize. Not ideal by any measure.

On the other hand, in the opposite scenario, AMD sponsors game development, NVIDIA gets access to alphas and betas (as shown by their "Game Ready" drivers for alphas/betas of AMD-sponsored titles!), they also get access to AMD's features directly via their website, and so the task of game optimization is much easier.

If you want to accuse AMD of crippling NVIDIA, you can use this argument: AMD tech is in all the major consoles.

The reason is as we agree, game development is about time/$ management and consoles will always receive the bulk of the optimization. Hopefully we can agree on that? If so, the logical outcome is the PC port favors console-like architecture and that happens to be GCN.

This is why the 390 is simply so much faster than the 970 lately, even in NV sponsored titles, because at their roots, these modern games are optimized for console GCN.

Example, Rise of the Tomb Raider (release review): [benchmark chart]
Far Cry Primal: https://www.youtube.com/watch?v=cMUbHdQBSiQ

The Division: https://www.youtube.com/watch?v=Jne8VWuE2a4

The same applies to PvZ2, Need for Speed, and now Hitman. Pretty much all modern AAA titles so far, except for GoW (you can argue how AAA it is when it's a poor remake locked to the Windows Store).

So definitely, AMD is playing the long strategic game (whether it eventually pays off for them remains to be seen, since their market share is terrible). Does it mean they are gimping NV? Well, yes, in a way, because they sneak their tech in as the focal point for all the major studios to optimize their games against. I can see a case for that line of argument, similar to AMD shoving Mantle at everyone for free to get Vulkan/DX12. But it's more of an indirect consequence. You can't accuse AMD of gimping NV directly in games they sponsor, because their devs are able to give NV access and their features are open source.
 

Flapdrol1337

Golden Member
May 21, 2014
The reason is as we agree, game development is about time/$ management and consoles will always receive the bulk of the optimization. Hopefully we can agree on that? If so, the logical outcome is the PC port favors console-like architecture and that happens to be GCN.

This is why the 390 is simply so much faster than the 970 lately, even in NV sponsored titles, because at their roots, these modern games are optimized for console GCN.

I wouldn't even call it optimisation in some cases; more like designed for it. Levels are huge flat surfaces with fancy lighting, because that's what runs well on consoles with weaksauce CPUs + GCN.
 

Vaporizer

Member
Apr 4, 2015
This is something we can all agree on, there will be priorities on optimizations.

Really nice post. I completely agree with your detailed analysis.
 
Feb 19, 2009
They updated with AMD's 16.3 beta driver, which fixes the Fiji DX12 vsync lock at 60.

DX12

1080p: [benchmark chart]

1440p: [benchmark chart]

1440p 21:9: [benchmark chart]

4K: [benchmark chart]

DX11

1080p: [benchmark chart]

1440p: [benchmark chart]

4K: [benchmark chart]
That Sapphire R290 1Ghz... damn!
 

iiiankiii

Senior member
Apr 4, 2008
Man, the trend is too strong. GCN is taking over. If you like AAA games, I'm going to recommend AMD at every price point below the GTX 980 Ti. Hands down. It's like Kepler all over again. Pascal needs to drop very soon.
 

jantjeuh

Member
May 4, 2015
Looks like the only cards worth considering until Pascal/Polaris drop are the 390 8GB and the 980 Ti. The rest are all redundant.
 

Erenhardt

Diamond Member
Dec 1, 2012
Also, there is a huge difference between their 290 and 390. It's the exact same chip; one is clocked 10MHz higher and has 6000MHz memory rather than 5200MHz. Is their 290 Tri-X downclocking or what?
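For a rough sense of what that memory-clock difference is worth, here is a quick bandwidth sketch. Both cards use Hawaii's 512-bit GDDR5 bus; the 5200 MT/s figure for the 290 is the one from the post above (the reference 290 actually shipped closer to 5000 MT/s), so treat the exact numbers as that poster's assumption:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfers per second)
def mem_bandwidth_gbs(bus_width_bits: int, effective_mtps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and effective data rate (MT/s)."""
    return bus_width_bits / 8 * effective_mtps * 1e6 / 1e9

r290 = mem_bandwidth_gbs(512, 5200)  # 332.8 GB/s at the post's 5200 MT/s figure
r390 = mem_bandwidth_gbs(512, 6000)  # 384.0 GB/s

print(f"290: {r290:.1f} GB/s, 390: {r390:.1f} GB/s, gain: {r390 / r290 - 1:.1%}")
```

On these numbers the 390 has roughly 15% more memory bandwidth from the memory clock alone, which would explain a visible benchmark gap even with a near-identical core clock.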
 