Hitman DirectX-12 Benchmarks [update: Computerbase]


casiofx

Senior member
Mar 24, 2015
369
36
61
Ahaha, a 980 Ti @ 1380 MHz is only 6-7% faster than a 390X (290X) @ 1070 MHz, depending on resolution

I did not expect this kind of result

Are we going to see a pattern where Hawaii first fought GK110, then GM204, and finally GM200 in its final days? Now that would be something else.
If it's true, Hawaii would be the new crazy legendary card: the card that takes on three top-of-the-line cards (one each year) from the competitor.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
You didn't know AMD released a driver for GOW? Not too bad, since they didn't have access to it and weren't told of the game's release.

60% performance boost! Now everything runs at the 60 fps cap in that UWP app.

Even GameWorks can't gimp GCN for long these days; pretty weak. They should stop wasting all that $ on a useless program.

I know - that's the reason why it is only 50%:
http://www.overclock3d.net/reviews/...rmance_retest_with_amd_crimson_16_3_drivers/5
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91

And doesn't it strike you as odd that no other games have that kind of massive performance drop-off? Even much better-looking games?

Funny how Nvidia fanboys only want to point out performance issues while AMD users point out performance gains. Guess we know which group prefers performance increases, and which is used to watching its cards drop in performance.
 
Feb 19, 2009
10,457
10
76
And doesn't it strike you as odd that no other games have that kind of massive performance drop-off? Even much better-looking games?

Funny how Nvidia fanboys only want to point out performance issues while AMD users point out performance gains. Guess we know which group prefers performance increases, and which is used to watching its cards drop in performance.

That, and modern great-looking games vs a 2006 remake on UE3 with DX12 tacked on.

What do you guys think will be the norm moving forward: 2006 games with DX12 bolted on, or actual modern games? -_-
 
Feb 19, 2009
10,457
10
76
But hey, let us use only AMD sponsored games to determine how nVidia and AMD will perform under DX12.

I look forward to NV sponsoring proper modern DX12 games.

I hope they don't make a habit of tacking DX12 into ancient games and think somehow that is good for the gaming scene.

ps. We know Fiji has issues with GoW Ultimate and AMD is fixing it; 1080p and 1440p are already fixed (oh, and FCAT + DX12 don't mix, btw). Other AMD GPUs run it fine.
 
Last edited:

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
But hey, let us use only AMD sponsored games to determine how nVidia and AMD will perform under DX12.

Ironic, since you are trying to use a DX9-era game engine, sponsored by Nvidia and using GameWorks, as your single source against multiple recent DX11 and DX12 titles.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I look forward to NV sponsoring proper modern DX12 games.

I hope they don't make a habit of tacking DX12 into ancient games and think somehow that is good for the gaming scene.

"Modern DX12 games"? I dont know what this is. Hitman looks worse than the Division and doesnt use CR for advanced shadows. Anno2207 looks much better than Ashes and doesnt need DX12 to get to 60FPS.

I wait for a modern DX12 game which looks better than DX11 games and where DX12 isnt just used to promote AMD cards. And when you dont think DX12 is properly used in Gears of War why should we take Hitman or Ashes seriously? Both games are playable with DX11...

BTW: Gears of War isnt even so far away from Hitman. :\
 
Feb 19, 2009
10,457
10
76
BTW: Gears of War isn't even that far away from Hitman. :\

You must be blind to the hundreds of AI-simulated NPCs with high model detail, when GoW has only a few units on screen at a time.

A better API allows for higher fidelity or massively increased scene complexity; asking for both is beyond our current hardware. Quantity is a quality of its own.

Do you honestly think GoW Ultimate is representative of games in the DX12/Vulkan era? That it's normal practice for devs to take 2006 source code on an obsolete DX9 engine and then bolt on DX12/Vulkan? Answer that one if you can.

And no, I don't think two AMD-sponsored games are representative either. But when is NV going to push their own MODERN DX12-sponsored games so we can compare?
 

casiofx

Senior member
Mar 24, 2015
369
36
61
I don't know. A stock GTX 980 is beating the Fury X in 4K, too. But I guess you as an AMD user don't care about 4K.
So here is another look at AMD's DX12 performance:

http://www.pcper.com/reviews/Graphics-Cards/PresentMon-Frame-Time-Performance-Data-DX12-UWP-Games

But hey, let us use only AMD sponsored games to determine how nVidia and AMD will perform under DX12.
This is the Hitman benchmark thread.

Why are you quoting benchmarks or analysis from other games? You are going off topic.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
You must be blind to the hundreds of AI-simulated NPCs with high model detail, when GoW has only a few units on screen at a time.

A better API allows for higher fidelity or massively increased scene complexity; asking for both is beyond our current hardware. Quantity is a quality of its own.

Do you honestly think GoW Ultimate is representative of games in the DX12/Vulkan era? That it's normal practice for devs to take 2006 source code on an obsolete DX9 engine and then bolt on DX12/Vulkan? Answer that one if you can.

And no, I don't think two AMD-sponsored games are representative either. But when is NV going to push their own MODERN DX12-sponsored games so we can compare?
It is safe to say that certain board members have their own opinions. These members are empirically wrong and thus aren't worth our time at this point.

What we've stated is true and is being proven. Large numbers of users, across many other forums, have taken note.

Mindshare-wise, there is a flip occurring as it pertains to DX12/Vulkan and AMD GCN vs NVIDIA Maxwell, which is set to affect the upcoming Pascal vs Polaris sales.

These users have lost. We were right. We are right. We can now sit back and enjoy the show.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
And no, I don't think two AMD-sponsored games are representative either. But when is NV going to push their own MODERN DX12-sponsored games so we can compare?

Gears of War is a modern DX12 game. It uses Multi-Engine and Multi-Threaded rendering. But I guess both features are only "modern" when AMD is winning, right?

This is the Hitman benchmark thread.

Why are you quoting benchmarks or analysis from other games? You are going off topic.

People don't just talk about Hitman. They are talking about how bad nVidia is with DX12 in general. You should read the whole thread. If this thread is only about Hitman, then such topics shouldn't be allowed.

But you are right. It is wrong to talk about DX12 in general here.
 
Last edited:

Mahigan

Senior member
Aug 22, 2015
573
0
0
Gears of War is a modern DX12 game. It uses Multi-Engine and Multi-Threaded rendering. But I guess both features are only "modern" when AMD is winning, right?
Gears of War does not use multi-threaded rendering. Look up the CPU charts: it does not scale past two cores with HT.

GoW also does not make use of multi-engine support. It's basically a DX9 game running on the DX12 API.
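For anyone unsure what "multi-engine" actually refers to here: in D3D12 it means creating separate command queues (direct/graphics, compute, copy) and submitting work to them independently, which is what allows compute to overlap graphics ("async compute"). A minimal sketch of just the queue setup, assuming a valid ID3D12Device pointer, with fencing and error handling left out:

```cpp
// Sketch only: create a graphics (direct) queue and a separate compute queue.
// Work on the compute queue can overlap the graphics queue; the two are
// synchronized explicitly with ID3D12Fence (not shown).
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;    // graphics + compute + copy
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;   // compute + copy only
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));
}
```

An engine that only ever touches a single direct queue is using DX12 much the way a DX11 engine would, which is the point being made about GoW.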

I spoke to Dan Baker about this and he told me that barring any complete re-write of the UE3, the engine itself could not be representative of DX12.

You're clinging to false hope. I'm sorry, friend, but even though these forums allow you to speak freely, no developer and nobody at Beyond3D or any GPU/API guru is going to side with you.

Take care,

Mahigan.
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
It is funny that Fottemberg from the SemiAccurate forum wrote, in April of last year if I remember correctly, that there would be a storm across the forums about DirectX 12 and Nvidia GPUs. Looks like he had a very good source for this information. It was also he who said that the DirectX 12.1 feature level was added only because Nvidia specifically asked Microsoft for it. What he implied is that it was more of a "political" decision than a "technical" one.

I do not believe he is wrong on this topic either.

End of off-topic.
 
Last edited:

flynnsk

Member
Sep 24, 2005
98
0
0
It is safe to say that certain board members have their own opinions. These members are empirically wrong and thus aren't worth our time at this point.

What we've stated is true and is being proven. Large numbers of users, across many other forums, have taken note.

Mindshare-wise, there is a flip occurring as it pertains to DX12/Vulkan and AMD GCN vs NVIDIA Maxwell, which is set to affect the upcoming Pascal vs Polaris sales.

These users have lost. We were right. We are right. We can now sit back and enjoy the show.

It's best to just ignore Snot'n; if you look back through their history you will see they've been kicked off numerous forums for instigating flame wars with outright lies, deceit, and borderline shill tactics (OCN, Beyond3D, TechReport). When some people go off the deep end, they go ALL the way..

With regard to DX12 and beyond, it's apparent to anyone NOT beholden to a specific vendor that AMD has played the long game and stepped up in both game and driver development (remind me again: how many drivers has AMD/ATI issued that caused hardware failures or had to be "recalled"?). NV thought it was being smart by giving up on the console wars and moving to the mobile market, where they have all but FAILED (Tegra, anyone?). This allowed AMD/ATI (RTG) to implement a long-term strategy which, though seriously impacted by 20nm being sidelined, is now coming to fruition. Don't worry NV, you will always have your shillings and nTards (patent pending) to gladly and gleefully buy your next promises, forgetting the failures to deliver on past ones... /wink
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Some of this is getting a bit over the top.

Isn't the empirical evidence showing a modest boost for one very specific AMD architecture, which has already been replaced, and whose replacement isn't really showing the same benefits?
(Definitely not on Fiji, and there are some real issues with Tonga too.)

That's nice for people with those cards and all, but anything on 28nm will be entirely obsolete in a few months' time. The 14nm cards will inevitably be fairly different.
 
Feb 19, 2009
10,457
10
76
Gears of War does not use multi-threaded rendering. Look up the CPU charts: it does not scale past two cores with HT.

GoW also does not make use of multi-engine support. It's basically a DX9 game running on the DX12 API.

How can anyone possibly even claim that 2006 UE3 source code with DX12 tacked on is a modern game representative of DX12?

I mean, we've seen Ashes fully loading 8 cores with excellent multi-threaded rendering. We've seen multi-engine async compute being used by these games too.
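To put "multi-threaded rendering" in concrete terms: in DX12 it usually means recording several command lists on worker threads in parallel and then submitting them in one batch. A rough sketch, assuming per-thread allocators and command lists are already created; the actual recording, fencing and error handling are omitted:

```cpp
// Sketch: record one ID3D12GraphicsCommandList per worker thread in parallel,
// then submit all of them to the queue in a single batch.
#include <d3d12.h>
#include <thread>
#include <vector>

void RecordAndSubmit(ID3D12CommandQueue* queue,
                     const std::vector<ID3D12GraphicsCommandList*>& lists,
                     const std::vector<ID3D12CommandAllocator*>& allocators)
{
    std::vector<std::thread> workers;
    for (size_t i = 0; i < lists.size(); ++i) {
        workers.emplace_back([&, i] {
            lists[i]->Reset(allocators[i], nullptr);   // reuse the list this frame
            // ... record this thread's slice of the scene's draw calls ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One batched submission for everything the workers recorded.
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```

This is roughly what lets a title spread submission work across all its cores, whereas a straight port of older engine code tends to leave it on one or two threads.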

Really, I do hope NV pushes DX12 games more so the gaming scene can all move forward together. But I wouldn't want to see more old remakes built on 2006 game code.
 
Feb 19, 2009
10,457
10
76
Isn't the empirical evidence showing a modest boost for one very specific AMD architecture, which has already been replaced, and whose replacement isn't really showing the same benefits?

390/X isn't replaced yet. Waiting on Polaris.

What it does show is DX12 requires architecture specific optimization. Ashes did it for Tahiti, Hawaii, Tonga and Fiji, but Hitman (currently) seems to be only Tahiti and Hawaii.

However, pcgameshardware.de did not say performance was horrible on Tonga or Fiji, only that the game would not go above 60 fps due to DX12 vsync issues, so they can't get accurate benchmarks. So that one is a different issue entirely.
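On the "architecture specific optimization" point: the crudest form of this is simply detecting which GPU you're running on and picking a code path. A hypothetical sketch using the DXGI adapter description and the standard PCI vendor IDs; PreferAsyncCompute and the decision it makes are purely illustrative, and real engines key off specific device IDs and architectures rather than just the vendor:

```cpp
// Sketch: pick a render path based on the adapter's PCI vendor ID.
// PreferAsyncCompute() is a hypothetical helper; the choice it returns is only
// an example of vendor-specific tuning, not a recommendation.
#include <dxgi.h>

bool PreferAsyncCompute(IDXGIAdapter1* adapter)
{
    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    const UINT kAMD    = 0x1002;   // standard PCI vendor IDs
    const UINT kNVIDIA = 0x10DE;
    const UINT kIntel  = 0x8086;

    if (desc.VendorId == kAMD)    return true;   // GCN tends to benefit in these titles
    if (desc.VendorId == kNVIDIA) return false;  // per the results discussed in this thread
    if (desc.VendorId == kIntel)  return false;
    return false;
}
```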
 

flynnsk

Member
Sep 24, 2005
98
0
0
It is funny that Fottemberg from the SemiAccurate forum wrote, in April of last year if I remember correctly, that there would be a storm across the forums about DirectX 12 and Nvidia GPUs. Looks like he had a very good source for this information. It was also he who said that the DirectX 12.1 feature level was added only because Nvidia specifically asked Microsoft for it. What he implied is that it was more of a "political" decision than a "technical" one.

I do not believe he is wrong on this topic either.

End of off-topic.

Continuing the slightly OT discussion: Nvidia fans have recently been clamoring about Just Cause 3 adding CR (Conservative Rasterization). What they seem to ignore, however, is that this was added with Intel's work, and Intel supports Tier 3 of Conservative Rasterization whereas NV only supports Tier 1. In effect NV is like DX 12.05, whereas Intel has the most complete DX12 part to date (12.1). This could actually hamper NV more than it benefits them, whereas Intel iGPUs have the potential to see significant gains; Intel gains MUCH more from this than NV (1/2-pixel coverage vs Intel's 1/256 pixel; NV also lacks post-snap degenerate triangles and has no inner input coverage). For greater detail see https://msdn.microsoft.com/en-us/library/windows/desktop/dn903791(v=vs.85).aspx

From MSDN (primary benefits bolded in the original):

NV ->
• Tier 1 enforces a maximum 1/2 pixel uncertainty region and does not support post-snap degenerates. This is good for tiled rendering, a texture atlas, light map generation and sub-pixel shadow maps.

Intel ->
• Tier 2 reduces the maximum uncertainty region to 1/256 and requires post-snap degenerates not be culled. This tier is helpful for CPU-based algorithm acceleration (such as voxelization).
• Tier 3 maintains a maximum 1/256 uncertainty region and adds support for inner input coverage. Inner input coverage adds the new value SV_InnerCoverage to High Level Shading Language (HLSL). This is a 32-bit scalar integer that can be specified on input to a pixel shader, and represents the underestimated Conservative Rasterization information (that is, whether a pixel is guaranteed-to-be-fully covered). This tier is helpful for occlusion culling.

In effect NV only gains minimal perf from texture atlases (think lookups), light maps and shadow maps (notice NV has been PUSHING shadows a lot in the latest GW titles..!), whereas they do NOT get the more substantial performance benefits from occlusion culling or efficient use of voxels.
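For reference, an application can query which CR tier a device actually exposes through D3D12's feature-support check. A small sketch, assuming a valid ID3D12Device pointer; the tier descriptions in the output strings just summarize the MSDN text quoted above:

```cpp
// Sketch: query the Conservative Rasterization tier the device/driver exposes.
#include <d3d12.h>
#include <cstdio>

void PrintConservativeRasterTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &options, sizeof(options))))
        return;

    switch (options.ConservativeRasterizationTier) {
    case D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED:
        std::puts("CR not supported"); break;
    case D3D12_CONSERVATIVE_RASTERIZATION_TIER_1:
        std::puts("Tier 1: 1/2-pixel uncertainty region"); break;
    case D3D12_CONSERVATIVE_RASTERIZATION_TIER_2:
        std::puts("Tier 2: 1/256 uncertainty, post-snap degenerates kept"); break;
    case D3D12_CONSERVATIVE_RASTERIZATION_TIER_3:
        std::puts("Tier 3: adds SV_InnerCoverage input in HLSL"); break;
    default: break;
    }
}
```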

Edit video:

https://youtu.be/zL0oSY_YmDY?t=137
 
Last edited:

Mahigan

Senior member
Aug 22, 2015
573
0
0
How can anyone possibly even claim that 2006 UE3 source code with DX12 tacked on is a modern game representative of DX12?

I mean, we've seen Ashes fully loading 8 cores with excellent multi-threaded rendering. We've seen multi-engine async compute being used by these games too.

Really, I do hope NV pushes DX12 games more so the gaming scene can all move forward together. But I wouldn't want to see more old remakes built on 2006 game code.
It suits an agenda. I get accused of having an agenda all the time. I do have an agenda: free and fair competition, support for open standards and open source, ethical conduct, and respect for reason and empiricism.

My agenda is the truth, and I'm pretty open about it.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
In effect NV only gains minimal perf from texture atlases (think lookups), light maps and shadow maps (notice NV has been PUSHING shadows a lot in the latest GW titles..!), whereas they do NOT get the more substantial performance benefits from occlusion culling or efficient use of voxels.

Also, only 2nd-gen Maxwell has many of those features, which makes it easy to obsolete their older cards with new GameWorks "features".
 