[Guru3d] Hitman (2016) DirectX 12 updated benchmarks review


Maverick177

Senior member
Mar 11, 2016
411
70
91
Well, NV has a massive user base; anything that benefits the GeForce user base long term will only cause NV to make less money. Cash grab, yo.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I doubt Maxwell will be gimped so much as compute-limited. Even in AMD-biased games, Maxwell seems to fall in line with GCN according to raw compute (the 280X closing in on the 970 due to having similar compute capabilities).

I believe games right now are utilizing Maxwell fairly well. Right now, getting more brute force in there is probably the main focus behind Pascal. We'll see how that works out. Without ACEs, it will probably be difficult to maintain peak efficiency in all situations, though if they are able to effectively utilize the chip without them, they could very well simply out-brute Polaris now, then work on ACEs and other such additions later.

One other thing is the bad PR Nvidia has been gaining as of late. Enthusiasts are already fully aware of Nvidia GPUs losing steam against their competitors, not to mention the memory issue in the 970, and now poor drivers; it would be dumb to recommend an Nvidia card at all unless CUDA were a necessity. I'm not sure if Nvidia can hold back again this time and come away unscathed.

nVidia cards don't typically have issues with AMD games because AMD doesn't do anything to purposely gimp nVidia's cards. DX12 is looking different I think because of the console effect. Not because games are being sponsored by AMD.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
It's not like that at all.

NV's architectures were made to excel in the present. Because the present matters more to people when they decide which GPU to buy, they look at current benchmarks and see that NV is on top. They don't think about how the GPUs will stack up a year or two down the road.

Thus, NV's architectures give them the dominant position, and it's the winning design. It also promotes more frequent upgrades, which generates more revenue for NV. It's a good strategy.

That's two sides of the same coin. NV knows what's here now, while AMD can tell themselves that their designs are what games are going to want in the future. There's advantages to both.
 
Feb 19, 2009
10,457
10
76
That's two sides of the same coin. NV knows what's here now, while AMD can tell themselves that their designs are what games are going to want in the future. There's advantages to both.

Not for company profits, NV's approach is clearly superior for that.

AMD's approach is what resulted in my answer to guskline when he asked me whether I will upgrade to Polaris 10 from my R290X: No I won't, because the 290X is still kicking ass and I only paid $250 for it brand new.

That isn't good for AMD's bottom line, it's only good for me.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Not for company profits, NV's approach is clearly superior for that.

AMD's approach is what resulted in my answer to guskline when he asked me whether I will upgrade to Polaris 10 from my R290X: No I won't, because the 290X is still kicking ass and I only paid $250 for it brand new.

That isn't good for AMD's bottom line, it's only good for me.

Also, we have to remember that AMD does not have the resources to constantly replace their lineup of GPUs generation after generation, technology-wise. This is mainly due to their financials, and potentially engineering resources too. nVIDIA, on the other hand, can do this unless their financials take a huge hit Q after Q.

They've been milking the GCN architecture for a very long time now with many rebrands, and they are prolonging the life of these video cards by other methods, e.g. APIs that suit their architecture more, console wins, etc. I mean, if AMD had driver optimizations on the same level as nVIDIA's or better, would they really have pushed Mantle as hard as they did? I don't think so.
 
Feb 19, 2009
10,457
10
76
Also, we have to remember that AMD does not have the resources to constantly replace their lineup of GPUs generation after generation, technology-wise. This is mainly due to their financials, and potentially engineering resources too. nVIDIA, on the other hand, can do this unless their financials take a huge hit Q after Q.

They've been milking the GCN architecture for a very long time now with many rebrands, and they are prolonging the life of these video cards by other methods, e.g. APIs that suit their architecture more, console wins, etc. I mean, if AMD had driver optimizations on the same level as nVIDIA's or better, would they really have pushed Mantle as hard as they did? I don't think so.

Yes, AMD has fewer R&D dollars, so they have to make them count. It's vital they come up with a uarch that's flexible and future-proof enough to keep going strong for the next decade; it means they can ship cheaper iterations of the uarch rather than entire revolutions.

The Mantle/Vulkan/DX12 thing... if you were in control and your GPU architectures made it into all the major consoles, you would make a similar push with a next-gen API to carry that advantage over to the PC as well. It's a good strategy.

We tend to forget, but when AMD announced Mantle, NV was not happy. They knew what would be coming down the road, and so they started/ramped up the GameWorks initiative to fight back by gimping AMD as much as possible.

Moving forward, games which NV doesn't sponsor will come GCN-optimized. In many ways this is good for AMD, as they will benefit from the GCN optimizations done for consoles unless NV has the final say. The onus is on NV to sponsor the PC port to ensure it runs optimized on their architectures!
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Not for company profits, NV's approach is clearly superior for that.

AMD's approach is what resulted in my answer to guskline when he asked me whether I will upgrade to Polaris 10 from my R290X: No I won't, because the 290X is still kicking ass and I only paid $250 for it brand new.

That isn't good for AMD's bottom line, it's only good for me.

Tell me though. If there was a clear upgrade path from both vendors which would you go with?

And this isn't meant to be a loaded question. I just don't want to make it too leading.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Also, we have to remember that AMD does not have the resources to constantly replace their lineup of GPUs generation after generation, technology-wise. This is mainly due to their financials, and potentially engineering resources too. nVIDIA, on the other hand, can do this unless their financials take a huge hit Q after Q.

They've been milking the GCN architecture for a very long time now with many rebrands, and they are prolonging the life of these video cards by other methods, e.g. APIs that suit their architecture more, console wins, etc. I mean, if AMD had driver optimizations on the same level as nVIDIA's or better, would they really have pushed Mantle as hard as they did? I don't think so.

DX11 had to go. It's very old compared to how often DX is typically upgraded. MSFT got complacent due to lack of competition. AMD gave them a kick to get them moving.
 
Feb 19, 2009
10,457
10
76
Tell me though. If there was a clear upgrade path from both vendors which would you go with?

And this isn't meant to be a loaded question. I just don't want to make it too leading.

No hesitation: AMD. Because I prefer my gaming dollar to last as long as possible, and AMD has proven GCN keeps on keeping on.

But the majority aren't as informed, may not hold that view, and/or don't keep their GPUs for that long.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
No hesitation: AMD. Because I prefer my gaming dollar to last as long as possible, and AMD has proven GCN keeps on keeping on.

But the majority aren't as informed, may not hold that view, and/or don't keep their GPUs for that long.

Well, that's AMD's job to change.
 

flopper

Senior member
Dec 16, 2005
739
19
76
Also, we have to remember that AMD does not have the resources to constantly replace their lineup of GPUs generation after generation, technology-wise. This is mainly due to their financials, and potentially engineering resources too. nVIDIA, on the other hand, can do this unless their financials take a huge hit Q after Q.

They've been milking the GCN architecture for a very long time now with many rebrands, and they are prolonging the life of these video cards by other methods, e.g. APIs that suit their architecture more, console wins, etc. I mean, if AMD had driver optimizations on the same level as nVIDIA's or better, would they really have pushed Mantle as hard as they did? I don't think so.

Wrong.
Game developers like DICE's Johan Andersson had asked for this for years; AMD simply built the framework they asked for.
AMD created a revolution for gamers: DX12 and Vulkan, with Mantle as the base code, are the new evolution of PC gaming, all running on AMD GCN tech, and we already know it runs better on AMD hardware with DX12.

Why optimize for a broken API like DX11?
Do you repair your cellphone, or just buy a new one?

Mantle from AMD created a new era for PC gamers.
They listened to what game developers wanted and gave them that.
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
Also, we have to remember that AMD does not have the resources to constantly replace their lineup of GPUs generation after generation, technology-wise. This is mainly due to their financials, and potentially engineering resources too. nVIDIA, on the other hand, can do this unless their financials take a huge hit Q after Q.

They've been milking the GCN architecture for a very long time now with many rebrands, and they are prolonging the life of these video cards by other methods, e.g. APIs that suit their architecture more, console wins, etc. I mean, if AMD had driver optimizations on the same level as nVIDIA's or better, would they really have pushed Mantle as hard as they did? I don't think so.
If by milking you mean:
having the consoles 100%;
having most of the mobile space with the Adreno design;
making Mantle, which is now DX12/Vulkan;
constantly improving a design that has been proven to work and is aging well;

then yeah, they do. After all, a card that is almost 4 years old is competing with a card that is a year old...
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Wrong.
Game developers like DICE's Johan Andersson had asked for this for years; AMD simply built the framework they asked for.
AMD created a revolution for gamers: DX12 and Vulkan, with Mantle as the base code, are the new evolution of PC gaming, all running on AMD GCN tech, and we already know it runs better on AMD hardware with DX12.

Why optimize for a broken API like DX11?
Do you repair your cellphone, or just buy a new one?

Mantle from AMD created a new era for PC gamers.
They listened to what game developers wanted and gave them that.

I can't take your post seriously when you're making statements like that, I'm afraid. Your analogy doesn't make sense either, because I can repair the cellphone if I deem it cheaper and less of an investment than buying a new phone.
 

Rhael

Junior Member
May 3, 2016
4
0
6
Not for company profits, NV's approach is clearly superior for that.

Not for our profits. I couldn't care less about their profits. We should always buy according to what they offer us, not what's best for them. I don't like being forced to spend $500 every 2 years just because it's good for some company.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
No hesitation: AMD. Because I prefer my gaming dollar to last as long as possible, and AMD has proven GCN keeps on keeping on.

But the majority aren't as informed, may not hold that view, and/or don't keep their GPUs for that long.

I strongly believe this is the case. There is a healthy number of users who upgrade frequently, perhaps every other year. They aren't buying top-of-the-line GPUs, but they went from a GTX 670 to a GTX 970 and will likely go to a GTX 1070, because each upgrade was roughly two years apart and came with a good performance increase.

The more lucrative buyers are going from a GTX 680 to a 780, to perhaps a 780 Ti, to a 980, and probably a 980 Ti.

I think AMD missed the bus with their uarch and their release schedule. Yes, it's a win/win for consumers, but that is slowly what is killing AMD in marketshare and overall profits.

Look at the AMD Enthusiast:
Bought the HD 7970 and OC'd it, skipped the 7970 GHz; bought the 290X and OC'd it, skipped the 390X; and will possibly buy the Fury X.
The jumps between flagships were minimal in the same time frame in which NV released their own flagships.

On the two-year cycle, you had the 7870 having to hold out until the 280X/285/380X, which was less of an upgrade versus someone going from a 670 to a 970; and if that buyer was wise, they jumped on bargain-sale 290/290Xs and will probably stay there, skipping the 390/390X and possibly Polaris 10.

AMD is as much fighting NV as they are fighting themselves.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
https://twitter.com/Battlefield/status/727241279734726656

I wonder, could BF5 be the first true Vulkan game? It seems like both companies are advertising it.

I doubt it will be a Vulkan showcase; Vulkan still seems pretty fresh and needs some work from the API and driver perspective.

I do see it going DX12 at launch with a Vulkan patch later for Win7/8, but I doubt we'll see it on Linux. Also, I don't see them removing DX11 support in favor of Vulkan only, due to the lack of support for older cards.

That said, DICE has always pushed the latest tech hard; they were the reason for most of my upgrades, due to needing a newer video card to play. So I'd love to see them continue that and force Vulkan/DX12 only, but I don't see them doing it, due to lowered sales.
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
I doubt it will be a Vulkan showcase; Vulkan still seems pretty fresh and needs some work from the API and driver perspective.

I do see it going DX12 at launch with a Vulkan patch later for Win7/8, but I doubt we'll see it on Linux. Also, I don't see them removing DX11 support in favor of Vulkan only, due to the lack of support for older cards.

That said, DICE has always pushed the latest tech hard; they were the reason for most of my upgrades, due to needing a newer video card to play. So I'd love to see them continue that and force Vulkan/DX12 only, but I don't see them doing it, due to lowered sales.
J. Andersson was saying they are trying very aggressively to remove DX11 by the summer, and that statement was from early April, so who knows (he also said that bringing Vulkan games to other systems is proving to be difficult because they need to rewrite for the old WDDM system).
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
J. Andersson was saying they are trying very aggressively to remove DX11 by the summer, and that statement was from early April, so who knows (he also said that bringing Vulkan games to other systems is proving to be difficult because they need to rewrite for the old WDDM system).

Oh, I know he posted about wanting DX12 only by the end of this year, but that will cut out a lot of hardware... granted, much of that hardware can't run the game well anyway.

Honestly, though, I'd hope they go pure DX12, as I know they can create very optimized games, and having access to DX12 without having to drag DX11 along would allow them to design the engine the way they've wanted to since 2011 (? can't find the article now), which is why they helped to create Mantle.

I'm just trying not to get my hopes up for DX12-only, as I've been disappointed with the last few major AAA releases this year.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Not for company profits, NV's approach is clearly superior for that.

It really wasn't, because while looking to the future, AMD still man-handled the present. Hawaii was no slouch in DX11; Tahiti wasn't either. The main issue would have been games like MMOs that exposed DX11 overhead issues.

For AMD, it's the bad image and bad PR that gave them trouble, not looking to the future. DX12 is just giving their customers better value.
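The "DX11 overhead" point about MMOs can be sketched with a toy back-of-the-envelope model (not real API code; all of the numbers below are invented for illustration, not measured figures): the CPU time the driver burns per draw call caps how many draws fit in a frame, which is exactly what the thinner Mantle/DX12/Vulkan submission model attacks.

```python
# Toy model: per-draw-call CPU overhead limits draw calls per frame.
# The overhead figures are hypothetical, chosen only to show the shape
# of the problem, not to represent any real driver.

def max_draws_per_frame(per_draw_overhead_us, frame_budget_ms=16.0, other_cpu_ms=8.0):
    """Draw calls that fit in the CPU time left after game logic."""
    budget_us = (frame_budget_ms - other_cpu_ms) * 1000.0
    return int(budget_us // per_draw_overhead_us)

# A "thick" driver doing heavy validation/state tracking per draw,
# vs. a "thin" one where the app manages state itself.
thick_api = max_draws_per_frame(per_draw_overhead_us=40.0)  # 200 draws
thin_api = max_draws_per_frame(per_draw_overhead_us=4.0)    # 2000 draws

print(thick_api, thin_api)
```

With a tenth of the per-draw cost, ten times as many draws fit in the same frame; an MMO drawing hundreds of distinct characters hits the thick-API ceiling first, which is the overhead wall the poster is referring to.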
 

renderstate

Senior member
Apr 23, 2016
237
0
0
This future vs. present philosophy is simply not substantiated by facts.

Both companies have innovated a lot over the years, with new features, new APIs, new rendering algorithms, etc.

AMD has had a large influence over the last round of APIs, but don't forget NVIDIA invented compute (on GPUs) from scratch (this is just one example out of many possible).

The truth is that sometimes ideas stick and sometimes they fail, and both companies have had (and will continue to have for the foreseeable future...) alternating fortunes.


 
Feb 19, 2009
10,457
10
76
This future vs. present philosophy is simply not substantiated by facts.

Sure, according to you.

Let's see what people involved in designing GCN have to say.

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?print=1

Cerny is convinced that in the coming years, developers will want to use the GPU for more than pushing graphics -- and believes he has determined a flexible and powerful solution to giving that to them.

"The vision is using the GPU for graphics and compute simultaneously," he said. "Our belief is that by the middle of the PlayStation 4 console lifetime, asynchronous compute is a very large and important part of games technology."

Cerny envisions "a dozen programs running simultaneously on that GPU" -- using it to "perform physics computations, to perform collision calculations, to do ray tracing for audio."

"The time frame when we were designing these features was 2009, 2010. And the timeframe in which people will use these features fully is 2015? 2017?" said Cerny.

"Our overall approach was to put in a very large number of controls about how to mix compute and graphics, and let the development community figure out which ones they want to use when they get around to the point where they're doing a lot of asynchronous compute."

Cerny expects developers to run middleware -- such as physics, for example -- on the GPU. Using the system he describes above, you can run at peak efficiency, he said.

"If you look at the portion of the GPU available to compute throughout the frame, it varies dramatically from instant to instant. For example, something like opaque shadow map rendering doesn't even use a pixel shader, it’s entirely done by vertex shaders and the rasterization hardware -- so graphics aren't using most of the 1.8 teraflops of ALU available in the CUs. Times like that during the game frame are an opportunity to say, 'Okay, all that compute you wanted to do, turn it up to 11 now.'"

Sounds great -- but how do you handle doing that? "There are some very simple controls where on the graphics side, from the graphics command buffer, you can crank up or down the compute," Cerny said. "The question becomes, looking at each phase of rendering and the load it places on the various GPU units, what amount and style of compute can be run efficiently during that phase?"

You can go back to Raja Koduri's GCN speech at the launch of Tahiti back in 2011 as well if you want evidence that AMD designed it with a future with heavy compute usage in games in mind.

Now, compute or advanced APIs aside... at the very basic level, designing for the future can be exemplified by this: 680 2GB vs 7970 3GB. We said the 2GB was going to break that card 2 years down the road, and it's true. No future-proofing at all was designed into that Kepler 680. Today, the 280X (7970) and its equivalent, the 380X, are often ahead by a huge margin.

So don't say there's no future-proofing. It's demonstrably false.
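Cerny's point that graphics leaves ALU idle in some phases (shadow-map rendering uses no pixel shaders) and that async compute can soak up that slack can be sketched as a toy utilization model. This is not API code; the phase durations and occupancy fractions are made up purely to illustrate the idea.

```python
# Toy model of one GPU frame: each phase of rendering occupies some
# fraction of the ALU; asynchronous compute (physics, audio ray tracing,
# etc.) fills whatever fraction graphics leaves free in each phase.
# All numbers are invented for illustration.

# (phase name, duration in ms, fraction of ALU graphics occupies)
frame_phases = [
    ("shadow maps", 3.0, 0.25),  # vertex/rasterizer bound, ALU mostly idle
    ("g-buffer",    5.0, 0.80),
    ("lighting",    4.0, 0.95),
    ("post",        2.0, 0.60),
]

def alu_utilization(phases, async_compute):
    """Time-weighted average ALU utilization over the frame."""
    total = sum(duration for _, duration, _ in phases)
    busy = sum(duration * occupancy for _, duration, occupancy in phases)
    if async_compute:
        # Idealized best case: compute work exactly fills the idle ALU
        # fraction of every phase (real scheduling is never this perfect).
        busy = total
    return busy / total

without_async = alu_utilization(frame_phases, async_compute=False)
with_async = alu_utilization(frame_phases, async_compute=True)
print(f"{without_async:.2f} -> {with_async:.2f}")
```

Under these made-up numbers, graphics alone keeps the ALU roughly 70% busy, with the shadow-map phase wasting the most; the idealized async case recovers the rest, which is the "turn compute up to 11" opportunity Cerny describes.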
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Tell me though. If there was a clear upgrade path from both vendors which would you go with?

And this isn't meant to be a loaded question. I just don't want to make it too leading.

AMD in a heartbeat. It's not even a question. Between the money I saved going with a freesync screen over its gsync equivalent (even though I got a refurb xr341ck that cuts that gap to "only" $300) and an expectation that I can get another year out of AMD cards, I can step up a bracket for roughly the same expected outlay. That's going to make a much bigger difference than any theoretical advantage NV holds and is why I bought a freesync screen.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
AMD in a heartbeat. It's not even a question. Between the money I saved going with a freesync screen over its gsync equivalent (even though I got a refurb xr341ck that cuts that gap to "only" $300) and an expectation that I can get another year out of AMD cards, I can step up a bracket for roughly the same expected outlay. That's going to make a much bigger difference than any theoretical advantage NV holds and is why I bought a freesync screen.

This is how AMD's bottom line can be helped. All they have to do is deliver the goods. That's not always easy, of course.

The thing I keep in mind is that AMD's performance relative to nVidia's typically improves. Sometimes dramatically. Like now with DX12.
 