[bitsandchips]: Pascal to not have improved Async Compute over Maxwell


thesmokingman

Platinum Member
May 6, 2010
2,307
231
106
Because one is a statement from the guy responsible for coding the engine.

The other is a back-track from their marketing PR, who tried to distance themselves from NVIDIA with absurd claims that they didn't use NV tech and that NV didn't sponsor them...

Meanwhile there are NV logos all over their tracks, NV's marketing features the game, and the guys coding the game are presenting at conferences showing evidence to the contrary.



Doesn't take a genius to figure out who the lying scumbags are when their exec has blatantly lied about communication with AMD and was forced to back-track publicly.

Let's not rehash this issue again anytime soon, because I already beat this horse the last time someone tried to go revisionist.


They didn't take money, but there are ads for Nvidia all over their game... ok, if you say so.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The fanboy wars are real. But while we have multiple documented cases of Nvidia gimping performance, there is no such thing on AMD's side. And Nvidia even gimps their own cards, like the GTX 970 (and then calls it good design), so it's insane to see anyone defending them; I just can't comprehend it.

Can you link me to these documented facts? I'm quite interested because I've stayed away from this whole GW debacle. But to actually say something, one must understand it.

From my minimal knowledge, normally a "GW" title adds one or two visual features, e.g. HairWorks or VXAO etc. Now these will run better on nVIDIA GPUs (well, duh?) but won't be optimised for competing products. Also, most of the time these features can be turned off (need citation - correct me if I am wrong), or there is an alternative option that produces the same or a similar effect, e.g. SSAO. Also, GW features aren't the core of the engine itself but more like a small extension that could be bypassed if the developer wants to produce similar visual effects(?).

Now unless the game cannot have these turned off and the features are forced to run on the GPU (or are just purely built from the ground up to work better with a particular architecture), are they really gimping the performance of their competition, or are they just providing more features/options for their customers?

I've always assumed that most games (and most of them are ported from the console) just have a few tacked-on GW visual features that, if turned off, will just result in the game being in a vanilla state. Is this a correct assumption?
 

thesmokingman

Platinum Member
May 6, 2010
2,307
231
106
Can you link me to these documented facts? I'm quite interested because I've stayed away from this whole GW debacle. But to actually say something, one must understand it.

From my minimal knowledge, normally a "GW" title adds one or two visual features, e.g. HairWorks or VXAO etc. Now these will run better on nVIDIA GPUs (well, duh?) but won't be optimised for competing products. Also, most of the time these features can be turned off (need citation - correct me if I am wrong), or there is an alternative option that produces the same or a similar effect, e.g. SSAO.

Now unless the game cannot have these turned off and the features are forced to run on the GPU (or are just purely built from the ground up to work better with a particular architecture), are they really gimping the performance of their competition, or are they just providing more features/options for their customers?

I've always assumed that most games (and most of them are ported from the console) just have a few tacked-on GW visual features that, if turned off, will just result in the game being in a vanilla state. Is this a correct assumption?



Proof? You want a smoking gun? That ain't happening, you realize that, right? It's like asking a cheater if he cheated...

What you can do is compare Far Cry 4, which was the poster child for GW, with Far Cry Primal. Primal is a re-skinned FC4 but, get this, without the GW crap.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Proof? You want a smoking gun? That ain't happening, you realize that, right? It's like asking a cheater if he cheated...

What you can do is compare Far Cry 4, which was the poster child for GW, with Far Cry Primal. Primal is a re-skinned FC4 but, get this, without the GW crap.

I don't quite understand the point of your post, but posters have said that it's well documented as fact that GW gimps performance (for using GW visual effects?), so I wanted to know where I could find that.

On the topic of FC4 vs Primal, what's the main difference in regard to GW?
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Keep pushing your agenda.

Sorry to disappoint you, but I don't have an agenda because, number one, the US Air Force pays me, not NVIDIA. Number two, I enjoy both companies' cards. Number three, at the end of the day those who are extremely biased will probably never change their opinions and I couldn't care less at this point. I'm just setting the record straight. That's all I'm going to say on that.
 

thesmokingman

Platinum Member
May 6, 2010
2,307
231
106
I don't quite understand the point of your post, but posters have said that it's well documented as fact that GW gimps performance (for using GW visual effects?), so I wanted to know where I could find that.

On the topic of FC4 vs Primal, what's the main difference in regard to GW?


FC4 = full of GW

FCP = no GW
 
Feb 19, 2009
10,457
10
76
@Cookie Monster
The problem with those optional features is that they are enabled at Ultra settings.

I.e. in Fallout 4, Ultra would enable GameWorks and so it tanked AMD performance (until AMD released new drivers and the game was patched later).

The latest example is The Division: on sites that ran with GW off, AMD has the performance advantage and the 390 is much faster than the 970. With GW on, AMD performance tanks. There's more data in that game's benchmark thread on this forum if you don't believe me.

The problem is that review sites' benchmarks typically run at "Ultra" and produce their data that way, with these features enabled.

If you want FC4 vs FCP, here's the thread: http://forums.anandtech.com/showthread.php?t=2467067
 

C@mM!

Member
Mar 30, 2016
54
0
36
I think we can all agree that many GameWorks features are either 'pie in the sky' (HFTS), badly optimised (HairWorks) or bordering on deceitful (over-tessellation), which, whilst hurting AMD cards the most, often hurts Nvidia cards as well, to the point that competing AMD GPUOpen middleware implementations run better on Nvidia cards than the equivalent GameWorks middleware.

This isn't helpful to anyone: the PC platform, developers, distributors, and most of all us as consumers.

That being said, it is somewhat hilarious how AMD has pincered the industry by pretty much dictating Mantle as Vulkan (donate Mantle, voilà) & DX12 (Microsoft scrambling to keep up). This is okay at the moment whilst AMD is the underdog, but AMD's long-term strategy on this could push Nvidia out of the market.
 
Feb 19, 2009
10,457
10
76
That being said, it is somewhat hilarious how AMD has pincered the industry by pretty much dictating Mantle as Vulkan (donate Mantle, voilà) & DX12 (Microsoft scrambling to keep up). This is okay at the moment whilst AMD is the underdog, but AMD's long-term strategy on this could push Nvidia out of the market.

No way. NV has huge revenues, cash reserves and commanding HPC dominance, and they are branching out to new markets.

AMD is on the losing end regardless of their Mantle > Vulkan/DX12 gamble. NV may be at a slight disadvantage, but they will fix it via GameWorks $$ until Volta arrives.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
@Cookie Monster
The problem with those optional features is that they are enabled at Ultra settings.

I.e. in Fallout 4, Ultra would enable GameWorks and so it tanked AMD performance (until AMD released new drivers and the game was patched later).

The latest example is The Division: on sites that ran with GW off, AMD has the performance advantage and the 390 is much faster than the 970. With GW on, AMD performance tanks. There's more data in that game's benchmark thread on this forum if you don't believe me.

The problem is that review sites' benchmarks typically run at "Ultra" and produce their data that way, with these features enabled.

If you want FC4 vs FCP, here's the thread: http://forums.anandtech.com/showthread.php?t=2467067

Ok, I understand now. I'm not surprised that it hurts AMD performance, since those GW features would be optimised for nVIDIA hardware.

So this is my general thought on this issue. Unless these features cannot be turned off and there is no alternative option, I can't see what the problem here is, nor do I understand why the IHV is getting the flak.

nVIDIA users can use GW features to enhance image quality. Good for them. AMD users could try, but it will result in lower performance (no surprise there). But they can run the game with the vanilla settings without seeing a decrease in performance, right, albeit without the enhanced visuals (whether or not it looks better depends on the user, so YMMV)?

Shouldn't this be more of a developer fault than anything else? If the argument is about inaccurate benchmarks, shouldn't that be more on the reviewers themselves than the IHV? If apples-to-apples comparisons need to be made, they could simply bench with no GW features enabled.

This is what I get from the gist of it. Plus, I just read about the difference in Primal vs FC4, and it looks like Primal strips not just GW features but the user's ability to change other graphical settings as well. This is more of a loss to the PC gamer than GW vs non-GW in my eyes.
 

C@mM!

Member
Mar 30, 2016
54
0
36
No way. NV has huge revenues, cash reserves and commanding HPC dominance, and they are branching out to new markets.

AMD is on the losing end regardless of their Mantle > Vulkan/DX12 gamble. NV may be at a slight disadvantage, but they will fix it via GameWorks $$ until Volta arrives.

It all comes down to consoles. AMD won't be moved out of the console market for a long time, unless Intel decides to stop fucking around with their GPUs, and with consoles looking to bring in re-releases with better hardware (whilst maintaining backward compatibility), that will have the effect of solidifying AMD's gains.
 

Adored

Senior member
Mar 24, 2016
256
1
16
That being said, it is somewhat hilarious how AMD has pincered the industry by pretty much dictating Mantle as Vulkan (donate Mantle, voilà) & DX12 (Microsoft scrambling to keep up). This is okay at the moment whilst AMD is the underdog, but AMD's long-term strategy on this could push Nvidia out of the market.

Yep when Koduri said Radeon would power 90% of the world's pixels he wasn't lying. I don't think they can be stopped unless Intel buys Nvidia.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I think we can all agree that many GameWorks features are either 'pie in the sky' (HFTS), badly optimised (HairWorks) or bordering on deceitful (over-tessellation), which, whilst hurting AMD cards the most, often hurts Nvidia cards as well, to the point that competing AMD GPUOpen middleware implementations run better on Nvidia cards than the equivalent GameWorks middleware.

This isn't helpful to anyone: the PC platform, developers, distributors, and most of all us as consumers.

That being said, it is somewhat hilarious how AMD has pincered the industry by pretty much dictating Mantle as Vulkan (donate Mantle, voilà) & DX12 (Microsoft scrambling to keep up). This is okay at the moment whilst AMD is the underdog, but AMD's long-term strategy on this could push Nvidia out of the market.

More complex and better-looking visuals (this varies between users) would result in lower performance, I'd expect. Running terribly on AMD hardware is also expected, but expecting it not to... is that a realistic expectation to have in the first place?
 
Feb 19, 2009
10,457
10
76
Ok, I understand now. I'm not surprised that it hurts AMD performance, since those GW features would be optimised for nVIDIA hardware.

So this is my general thought on this issue. Unless these features cannot be turned off and there is no alternative option, I can't see what the problem here is, nor do I understand why the IHV is getting the flak.

nVIDIA users can use GW features to enhance image quality. Good for them. AMD users could try, but it will result in lower performance (no surprise there). But they can run the game with the vanilla settings without seeing a decrease in performance, right, albeit without the enhanced visuals (whether or not it looks better depends on the user, so YMMV)?

Shouldn't this be more of a developer fault than anything else? If the argument is about inaccurate benchmarks, shouldn't that be more on the reviewers themselves than the IHV? If apples-to-apples comparisons need to be made, they could simply bench with no GW features enabled.

This is what I get from the gist of it. Plus, I just read about the difference in Primal vs FC4, and it looks like Primal strips not just GW features but the user's ability to change other graphical settings as well. This is more of a loss to the PC gamer than GW vs non-GW in my eyes.

Because review sites don't like to run so many different settings, they bench on Ultra, get their data, and then summarize it in charts. We get the performance comparison of GPUs from this data, tainted by AMD GPUs running GameWorks.

Technically, it is the developer's fault; these "optional" settings should truly be optional and disabled by default unless turned on. But that's not how it is; the reality is that GimpWorks does exactly that to AMD in benchmarks.

FCP has plenty of graphical options; read through that thread again if you missed the point. Visually it's a superior game. Performance-wise it runs much better. Ubisoft added their own light shafts and effects, choosing not to use GameWorks, resulting in optimized features that run better on all hardware.

As for expecting features to run well on all hardware, yes, it's realistic and normal. Example for you: TressFX 3.0/PureHair in Rise of the Tomb Raider ran great on all GPUs, with a very small performance hit compared to NV's HairWorks in The Witcher 3. That's AMD's open approach benefiting gamers and developers. Regardless of how you feel about AMD or NV, you have to admit GPUOpen is a benefit for gamers and developers in general due to the open and collaborative approach, with an MIT license allowing them full control, without a EULA clause that says their license can be revoked at any time for any reason (like the GameWorks license on NV's recent GitHub "Open Source").
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Because review sites don't like to run so many different settings, they bench on Ultra, get their data, and then summarize it in charts. We get the performance comparison of GPUs from this data, tainted by AMD GPUs running GameWorks.

Technically, it is the developer's fault; these "optional" settings should truly be optional and disabled by default unless turned on. But that's not how it is; the reality is that GimpWorks does exactly that to AMD in benchmarks.

FCP has plenty of graphical options; read through that thread again if you missed the point. Visually it's a superior game. Performance-wise it runs much better. Ubisoft added their own light shafts and effects, choosing not to use GameWorks, resulting in optimized features that run better on all hardware.

What do you mean by "truly" optional? Isn't it? I'd assume users would actually adjust settings to their tastes when playing games.

Didn't FC4 also have its own effects in addition to those GW features?

I'm referring to this (although it's HardOCP, they had a nice list of the graphics options that changed):

Far Cry 4 had an "Enhanced Godrays" mode that created realistic tessellated Godrays. Far Cry Primal does not have this option; instead it just has the Volumetric Fog Godrays.

In Far Cry 4 there was an option to enable HBAO+ Ambient Occlusion for better shadowing and depth. Well, this game has fallen back to Ubisoft's own in-house Ambient Occlusion and has in fact not even given us an option to change the quality of this feature; there is no Ambient Occlusion setting at all!

Far Cry 4 had the option to enable a simulated fur on creatures, this feature is gone in Far Cry Primal.

In Far Cry 4 there was an option to enable a much better looking Soft Shadow image quality. Well, this is gone in Far Cry Primal as well.

Finally, the AA options have been stripped down so that all that is offered is FXAA or SMAA; there are no MSAA or SSAA options.

I'd hope that Primal would run better, because FC4 is the older game.
 

kondziowy

Senior member
Feb 19, 2016
212
188
116
There is also the tessellated water under the city in Crysis 2, and the over-tessellated hair in The Witcher 3, pushed beyond any visible difference to hurt AMD. It backfired on Kepler users too, until multiple threads like this were created on Nvidia's forums and Nvidia decided to lower the tessellation factor to improve Kepler performance. Can it get more obvious than this? Nvidia has been playing the tessellation card for years and doesn't even care about their own customers. Before that patch, Radeon performance with a lowered tessellation scalar was better than Kepler's, which is hilarious considering it was a GameWorks feature closed in a black box (The Witcher 3's developer says it can't be optimised).

Fake GTX 970 specs? Sure, no problem, "it's a feature," Jen-Hsun said on the Nvidia blog. And look at those Ultra god rays vs Low in Fallout 4, eating 1/3 (33%!) of the overall fps. The whole GameWorks program will remain a joke until it is fully opened.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
Take a look at yourself first, bud. Note what I said: it's from their lead engineer; they say PhysX running on the CPU at 600 Hz is the problem for AMD.

Nowhere do they or I say it runs on the GPU otherwise.

Do you realize that on the console they would never be able to poll PhysX at that rate, 20x the 30 frames per second? It's just wasteful and it hurts AMD on the PC, given nobody runs the game anywhere close to 600 FPS.

Do you know why this only hurts AMD? Think multi-threaded DX11 drivers for NV; it's not a problem if PhysX chokes the main game engine thread.

The game does actually run its physics simulation at 600 Hz on console.

It's hard for us to know for sure, but it's more likely the game suffers on AMD due to a high draw call count or something similar.
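For anyone unfamiliar with what polling physics at 600 Hz actually means, here is a minimal, hypothetical fixed-timestep loop in C++ (illustrative only, not the game's actual code; stepPhysics and renderFrame are made-up placeholders). It shows why the physics tick count is decoupled from the frame rate: at 30 fps the accumulator schedules 20 sub-steps per rendered frame, and that CPU cost is paid regardless of how fast the renderer is.

```cpp
#include <chrono>

// Hypothetical fixed-timestep loop (not taken from any shipping engine).
// With PHYSICS_HZ = 600, a frame rendered at 30 fps accumulates enough
// time for 600 / 30 = 20 physics sub-steps, all run on the CPU before
// the next frame is drawn; the simulation rate is independent of the
// render rate.
constexpr double PHYSICS_HZ = 600.0;
constexpr double FIXED_DT   = 1.0 / PHYSICS_HZ;

void stepPhysics(double /*dt*/) { /* placeholder for the physics tick */ }
void renderFrame()              { /* placeholder for drawing a frame  */ }

void gameLoop() {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    double accumulator = 0.0;

    while (true) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Drain the accumulator in fixed 1/600 s slices; the slower the
        // renderer, the more sub-steps pile up per rendered frame.
        while (accumulator >= FIXED_DT) {
            stepPhysics(FIXED_DT);
            accumulator -= FIXED_DT;
        }
        renderFrame();
    }
}
```

How much of that fixed CPU budget ends up competing with the driver's submission work on each vendor is the point being argued above.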
 

Kippa

Senior member
Dec 12, 2011
392
1
81
I am not an expert and I do a little 3D modelling. You'd be surprised how good a very low polygon model can look when it has excellent textures wrapped around it. You don't have to have uber-high polygon counts to make good models; you need a good modeller and texture artist to create a good model. I think you can get excellent visuals with a low poly count and still look awesome. I think that Alien: Isolation is a good example of this.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Namely because most people understood that with HBM1, it wasn't possible to cram more on, and the bandwidth gain minimised the impact of 4GB at 1440p+.

And people bitched about the 970 due to 512MB of that 4GB buffer dropping to a 64-bit bus due to the way Nvidia sliced the die.

Um, no. I don't think you know how vram works. In fact I know you don't, because if you did, you wouldn't say what you just said. If you need more than 4GB of vram, please explain to me how HBM is going to increase the bandwidth of your system RAM when your textures start spilling over to it.

It won't. The arguments were foolish back then, and yours is completely nonsensical now.
 

C@mM!

Member
Mar 30, 2016
54
0
36
Um, no. I don't think you know how vram works. In fact I know you don't, because if you did, you wouldn't say what you just said. If you need more than 4GB of vram, please explain to me how HBM is going to increase the bandwidth of your system RAM when your textures start spilling over to it.

It won't. The arguments were foolish back then, and yours is completely nonsensical now.

As your learned self should be aware:

HDD -> System RAM -> vRAM.

Now obviously, the best place for information to be cached is in vRAM. However, you can make up some of the loss from having to stream out of system RAM if you can feed your pipeline faster, thus clearing your vRAM cache for the next batch. Add in that many games that take advantage of more than 4GB of vRAM are really just cramming as much as they can into the texture cache, rather than it being actively used, and it's not a huge deal at this point in time.
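As an illustration of the tiering being described, here is a minimal sketch in C++ (all names are hypothetical, not any engine's real streaming code) of why the cost of a texture request depends on which tier it currently lives in: already-resident VRAM is cheapest, system RAM adds a PCIe upload, and disk adds the full read plus upload.

```cpp
#include <string>

// Minimal sketch of the HDD -> system RAM -> VRAM tiers discussed above.
// All types and functions here are made up for illustration.
enum class Residency { OnDisk, InSystemRam, InVram };

struct Texture {
    std::string path;
    Residency   where = Residency::OnDisk;
};

void readFromDisk(Texture& t) { t.where = Residency::InSystemRam; } // slowest hop
void uploadToVram(Texture& t) { t.where = Residency::InVram; }      // PCIe copy
void bindForSampling(const Texture&) { /* GPU samples it directly */ }

// Each frame the renderer asks for a texture; the cheaper its current
// tier, the less latency before it can actually be sampled.
void requestTexture(Texture& t) {
    if (t.where == Residency::OnDisk)      readFromDisk(t);  // HDD -> RAM
    if (t.where == Residency::InSystemRam) uploadToVram(t);  // RAM -> VRAM
    bindForSampling(t);                                      // now in VRAM
}
```

Whether a faster VRAM tier can actually hide the cost of the slower hops is exactly what the replies below contest.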

However, if you actually read my original reply in context, it was about why consumers somewhat forgave AMD for only having 4GB of RAM on the Fury, and the answer is, bluntly, the performance gains of the new tech and its current implementation limitations.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
There is also the tessellated water under the city in Crysis 2, and the over-tessellated hair in The Witcher 3, pushed beyond any visible difference to hurt AMD. It backfired on Kepler users too, until multiple threads like this were created on Nvidia's forums and Nvidia decided to lower the tessellation factor to improve Kepler performance. Can it get more obvious than this? Nvidia has been playing the tessellation card for years and doesn't even care about their own customers. Before that patch, Radeon performance with a lowered tessellation scalar was better than Kepler's, which is hilarious considering it was a GameWorks feature closed in a black box (The Witcher 3's developer says it can't be optimised).

Fake GTX 970 specs? Sure, no problem, "it's a feature," Jen-Hsun said on the Nvidia blog. And look at those Ultra god rays vs Low in Fallout 4, eating 1/3 (33%!) of the overall fps. The whole GameWorks program will remain a joke until it is fully opened.

Well, AMD played the tessellation card first if I recall, until nVIDIA really pushed it. However, isn't Crysis 2 more of a developer problem than an nVIDIA one, because the developers are the ones who own and develop the game?

I'm not even sure why I'm replying to this incoherent post, but I have to point out that things seem very personal!
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
As your learned self should be aware:

HDD -> System RAM -> vRAM.

Now obviously, the best place for information to be cached is in vRAM. However, you can make up some of the loss from having to stream out of system RAM if you can feed your pipeline faster, thus clearing your vRAM cache for the next batch. Add in that many games that take advantage of more than 4GB of vRAM are really just cramming as much as they can into the texture cache, rather than it being actively used, and it's not a huge deal at this point in time.

However, if you actually read my original reply in context, it was about why consumers somewhat forgave AMD for only having 4GB of RAM on the Fury, and the answer is, bluntly, the performance gains of the new tech and its current implementation limitations.

Wrong, that has nothing to do with why people forgave AMD. Why? Because what you say makes no sense. How textures get filtered through the rest of the subsystem is the same for NVidia as it is for AMD, meaning your performance tanks either way if you NEED more than 4GB of vram and don't have it.

Why don't you just say "I'm an ADF member and it's ok when they mess up," because you aren't fooling anyone but yourself at this point.
 

C@mM!

Member
Mar 30, 2016
54
0
36
Wrong, that has nothing to do with why people forgave AMD. Why? Because what you say makes no sense. How textures get filtered through the rest of the subsystem is the same for NVidia as it is for AMD, meaning your performance tanks either way if you NEED more than 4GB of vram and don't have it.

Why don't you just say "I'm an ADF member and it's ok when they mess up," because you aren't fooling anyone but yourself at this point.

Lel, I own a Titan X. Before that, a 970; before that, a Titan; before that, 3x 680s. I have to go back to a 5870 for AMD.

So ah, try again.

But hey, if you don't believe me, you only need to see the Fury's benchmarks @ 4k. They mostly don't choke.
 