Does GameWorks influence AMD cards' game performance?


Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I don't even know why you bother posting your opinion when you openly admit to being a fanboy. Either way, GameWorks isn't used on consoles, and consoles don't use the same drivers as PCs. This is an irrelevant post. The fact that you're trying to make Nvidia into a victim against all evidence is kinda sad.

Would you rather he post and keep his favoritism undisclosed? Deny it at every turn when questioned about it, yet keep posting the way he does? Probably not. I think it's better to be open about one's position. Everybody has bias, to one degree or another. Some quite severe. You should be so lucky to have a forum full of people who are honest about it.
 
Feb 19, 2009
10,457
10
76
Another example: it took NV a while to get optimized drivers for TressFX in Tomb Raider, I believe. Didn't NV say they got the final game code right before launch? Pretty sure there was a huge disadvantage (though I think it has since disappeared).

This is the fundamental difference that some of you guys need to appreciate.

AMD's GE titles may run poorly on NV at launch; they certainly did so in Dirt with Global Illumination and in Tomb Raider with TressFX. The reason is obvious: these were new features using DX11 compute. What makes this different from the GameWorks libraries? NV got access to the final game code before launch. It's all open source; they are free to look into the code and optimize their drivers.

Note how quickly they improved performance; it must have been a month, tops. It also didn't require the GE developers to release an NV performance patch. It was all on NV to take advantage of the open-source code and do it themselves.

Here's the other thing: AMD attends GDC and presents its features, and ways to optimize for them, to ALL developers, including NV. Their source code is freely available on their website. NV pulled theirs down at the start of GW.

With AMD GE, the program is there to give them assurance that the game will run well on AMD GPUs at launch. That's it; no ulterior motive. The difference with GameWorks is that at launch the game may not even be optimized, because the developers cannot optimize GW code without permission from NV. If they do get to optimize it, they cannot share that work with AMD, so it only benefits one vendor.

Thus, GW has only one purpose: to run better on NV than on AMD. It achieves that goal in every single GW title released so far. It takes 3-4 months, and requires the developers to release an official AMD performance patch, for the situation to improve for AMD GPUs.

Re: Dragon Age 2. That was a special case: BioWare admitted they stuffed up, with texture & performance bugs for NV GPUs. Two weeks later, they released a patch that fixed the issue.

It is interesting to note that in big neutral games, AMD & NV performance is very close. In AMD GE games, performance is also very close; certainly all the recent GE titles (Alien Isolation, Civ BE, Dirt Rally) ran excellently on all hardware at release. So neutral devs & GE devs seem able to optimize for both vendors, while GW devs seem to optimize only for NV. The facts speak for themselves.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
I had no problem with Mantle.

That's not how Mantle works at all, and you know it. Also, the reason you didn't have a problem with Mantle is that it didn't take off. If it had, you'd be crying foul.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
Do you all seriously believe the majority of code is not reused from the x86-64 consoles? Wow, OK, whatever floats your boats.

On Maxwell sales: I'm only counting the GeForce 900 series, of which the much-higher-volume GTX 960 only started selling in 2015. If I'm wrong about GM204 sales, so be it.

Of course Nvidia is gonna choose to be a profitable company. Guess what? AMD is gonna do the same too.

http://arstechnica.com/gadgets/2015...cheaper-solution-will-refocus-on-performance/

But hey, all companies should sell large-die, high-performance-class products at low, low prices and shoot themselves in the foot, right?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
That's not how Mantle works at all, and you know it. Also, the reason you didn't have a problem with Mantle is that it didn't take off. If it had, you'd be crying foul.

Its major job, to me, was creating enough awareness that standards might be forged. I was behind this from the start and agreed with Huddy and Repi.
 

Alatar

Member
Aug 3, 2013
167
1
81
Does anyone not push the power control slider all the way to the right anyway?

Reviewers and people who run out-of-the-box settings.

My main Titan has a 700W limit, so I don't, but a lot of people do.

Either way, I dunno what's causing the crazy power use on AMD in pCARS, but that's the problem, at least for my Tahiti.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
I was very vocal about the wrong-spec disappointment at Rage and desired to hear that it would not happen again. If you can't trust the face-value specs of one's SKUs, one may lose precious credibility. Jen-Hsun Huang offered that he understood why some were disappointed and voiced that it will not happen again. It's what I desired to hear from the leader of nVidia.



Why? Why should a company that desires to offer more have to wait for others? Or support and Q/A their hardware?

Well, Nvidia can do what they want. The developer should know better. If a feature they introduce cripples the crap out of the competition, maybe optimize for the other card too, or don't use it? It looks bad on the developer. It makes them look lazy. It makes it look like they're taking a bribe from Nvidia at the cost of their CUSTOMERS, the gamers. You know the saying (true or not): perception is what matters. To me, that's how I perceive developers when they incorporate GameWorks into their title.

Every GameWorks title from the past year has proven it cripples AMD. No doubt about it. Like I said, Nvidia can do what they want. But the blame is on the developers for allowing crap like this to happen.
 
Feb 19, 2009
10,457
10
76
Mantle did not exist to gimp NV, because it didn't even run on NV in the first place. It also didn't prevent devs from optimizing the DX11 path for NV hardware. As we can see, in every game with Mantle thus far, NV's performance on DX11 is very competitive.

Mantle vs. GW isn't comparable unless you like to draw a really, really long bow.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Maybe optimize for the other card too, or don't use it?

imho,

It's nVidia's work and investment -- why should they incur the costs of support and Q/A on every piece of silicon and IHV drivers? The idea is to do no harm but simply add value for nVidia customers: fidelity improvements, gaming-experience improvements, out-of-the-box performance.

If nVidia did actual harm, calculated and intentional, developers, or at least some developers, would be very vocal and expose them as frauds.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
All I see is negativity. PC gaming is pretty strong, due in part to Nvidia, but according to many here they are only evil and can't do anything good.

How exactly does NV make PC gaming stronger? If Intel made GPUs as good as NV/AMD and we had a duopoly of Intel/AMD or Intel/NV in GPUs, it wouldn't change much. What makes PC gaming stronger is the developer focusing on making the game specifically for the PC, which hardly any of them do anymore. Taking a console port and adding GW/GE features isn't proof that the developer is dedicated to PC gaming. Rockstar reworking GTA V shows they actually did care. There are plenty of AMD GE / NV GW titles which are 97% console ports through and through, and no amount of TXAA or PhysX can hide it.

I had no problem with Mantle.

This comparison would only be valid if AMD had inserted the Mantle API into AAA games, NV cards had been forced to run the Mantle API and nothing else, and NV had no way to optimize its driver for it. As it stands, Mantle was a completely separate path to run a game and had nothing to do with NV. GW impacts all gamers: not just AMD/Intel users, but also NV users with older cards, which have been crippled ever since GW started.

I have no problem with IHV's desiring to offer gaming experience value for their customer base and most of the graphical effects are fidelity settings that may go beyond what the developer intended. I have no problem if a company desires to risk and invest.

So, in your opinion, what makes TWIMTBP different from GW, and why would you advocate for the latter over the former? The statement you made doesn't address this point. You can invest in new graphical features without alienating 85% of the GPUs in the world.

Think about it: when NV pushed tessellation, was tessellation open source or was it a GW SDK? You might disagree, but if NV had allowed AMD/Intel GPUs to run PhysX, PhysX would have exploded in games by now. GW and PhysX are basically the exact same thing: vendor lock-in that tries to make your products look better than the competitors' at the expense of everyone else in the market who doesn't own an NV product. Since you only buy NV GPUs, this concept is often hard for you to understand. Also, since Intel is MIA in the performance GPU segment today and doesn't have its own Intel GameWorks program, everything is fine and dandy in the NV GPU owner's land... for now.
 
Feb 19, 2009
10,457
10
76
@SirPauly
"Maybe, optimize for the other card, too, or don't use it?"

NV optimizing GW for AMD? Lol, that would never happen. I think he means the developers. If they use a GW feature and optimize it, they can't optimize it for AMD; it's against the contract, as shown by RS. Devs cannot share GW optimizations with AMD, period.

There must be a time limit on the clause, because so far every single GW title released (I am not making this stuff up; absurd as it sounds, it's true!) has received an AMD performance patch 3-4 months post-launch that fixes the problems.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
imho,

It's nVidia's work and investment -- why should they incur the costs of support and Q/A on every piece of silicon and IHV drivers? The idea is to do no harm but simply add value for nVidia customers: fidelity improvements, gaming-experience improvements, out-of-the-box performance.

If nVidia did actual harm, calculated and intentional, developers, or at least some developers, would be very vocal and expose them as frauds.

Spoken like a PR press release. Yeah, the idea is not to do harm. The result, however, is the exact opposite.

Like I said, Nvidia can do whatever they want. Developers should make the decision to optimize for AMD if they choose to use GameWorks. It's the developers' responsibility to make sure the game runs well on as many GPUs as possible.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
imho,

It's nVidia's work and investment -- why should they incur the costs of support and Q/A on every piece of silicon and IHV drivers?

I would believe what you just wrote if you were a 16-year-old just joining PC gaming who didn't know any better.

Do you not remember how NV worked with Crytek to push SM3.0? Do you not remember how NV pushed tessellation in the Batman and HAWX games? But all of those next-gen features were completely open source.

Imagine a world where, starting from GeForce 2, every single next-generation feature, from compute shaders to global illumination to SM3.0 to tessellation, was inserted into AAA games as part of NV's proprietary GW SDK. PC gaming would not be what it is today, and you know it.

I never whined about it when I gamed on ATI hardware for nearly a decade.

ATI never inserted into any game proprietary code that could not be optimized by NV. How are you even making this point? If ATI had done this during the 9700/9800 Pro through X1950 XTX generations, NV would be a crippled firm today, because ATI beat NV in every single one of those generations in flagship performance, without any proprietary game-code tricks.

When did you game on ATI hardware for nearly a decade? From the time I joined the forum, you've always had NV cards. You even purchased GeForce 5, the worst generation of all time, which was impossible to buy for anyone but hardcore NV fans or Linux users. It was worse in every way compared to ATI's cards unless you played ONLY OpenGL games and flight sims all day. MSAA/AF and DX9 performance was atrocious, but you still bought GeForce 5. The awful trilinear texture filtering, absolutely abysmal 2D IQ, and inferior MSAA IQ alone made GeForce 5 irrelevant even before performance was brought into the equation. Since GeForce 5's MSAA IQ was so much worse, you needed to go one level higher to match ATI's IQ, which meant the already inferior performance took an even bigger dive. This is going off-topic, but your example doesn't even make sense, since ATI never engaged in anti-competitive practices whereupon they would team up with a developer and send closed, locked, proprietary source code to be inserted into the title!
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
The GameWorks middleware PhysX and Apex are multi-platform at this time, imho.

Those have been around since before GameWorks even existed, and the code used on consoles isn't the same as what's used on PC. Again, irrelevant.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Would you rather he post and keep his favoritism undisclosed? Deny it at every turn when questioned about it, yet keep posting the way he does? Probably not. I think it's better to be open about one's position. Everybody has bias, to one degree or another. Some quite severe. You should be so lucky to have a forum full of people who are honest about it.

That's really hypocritical coming from you...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The GameWorks middleware PhysX and Apex are multi-platform at this time, imho.

PhysX can be run on a CPU, but you cannot run PhysX on an Intel, AMD, Matrox, or PowerVR GPU. PhysX is vendor-locked to NV GPUs only. NV probably knows AMD's cards would run PhysX much, much faster, because GCN is superior in compute and has more shaders, which are directly tied to PhysX performance. If AMD got hold of PhysX and optimized for it, the chance of a 1,664-CUDA-core 970 outperforming a 2,816-shader 290X is probably close to 0%. By the very act of acquiring Ageia, NV destroyed any chance of next-generation advanced physics effects for all PC gamers. Since that acquisition, PhysX has remained DOA except for the Batman games and the BL franchise. Almost a decade has passed, and so far we've hardly seen any progress with PhysX, thanks to its locked, proprietary nature that alienates 85% of the gaming market: Intel/AMD users.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
When did you game on ATI hardware for nearly a decade? From the time I joined the forum, you've always had NV cards. You even purchased GeForce 5, the worst generation of all time, which was impossible to buy for anyone but NV fans. It was worse in every way compared to AMD's cards unless you played ONLY OpenGL games and flight sims all day. MSAA/AF and DX9 performance was atrocious, but you still bought GeForce 5.

2000-2008! I have been a member of Rage3D since 2001. I did buy a 5900 for $199 as a side-grade system, for its super-sampling, mixed-mode flexibility, and an image-sharpening feature, since my main gaming 9800 Pro and XT systems sadly didn't have those. I have always desired SSAA flexibility, ever since 1999.
 
Feb 19, 2009
10,457
10
76
@Techhog

Let's not get personal, no need for it. Debate the merits of the topic instead.

GW is, IMO, money well spent by NV's marketing department; it helps keep pushing the same mantra: "AMD drivers suck; see, new games come out and their performance blows."

I read the Steam forum for Project Cars; many NV users there bash AMD's poor performance in that game with "lol AMD drivers, it's why I never buy their sh*t".

The masses don't research WHY GW games run so badly on AMD. They only see the result, or hear it from their friends, and to them it's another example of AMD failing. It reinforces their already-held beliefs. It makes them continue to want to pay more for NV GPUs.

It simply works very well for NV.

It's big bucks too: NV reportedly paid Ubisoft $2M to help them market AC.

"Unfortunately for both Ubisoft and nVidia, that situation didn’t develop well when Derek Perez, then Director of Public Relations at nVidia stated that "nVidia never paid for and will not pay for anything with Ubi[soft]. That is a completely false claim."

The case of Inconvenient Truth came out when Ubisoft's own Michael Beadle stated that "there was a [co-marketing] money amount, but that [transaction] was already done. That had nothing to do with development team or with Assassin's Creed." Back then, we were a part of a heated off-the-record conversation where I was told that Roy Taylor, then lead man for the TWIMTBP program, contacted Ubisoft and threatened to pull marketing support for all Ubisoft titles, and the sum in question ranged at around two million dollars".

http://www.vrworld.com/2009/11/04/batmangate-amd-vs-nvidia-vs-eidos-fight-analyzed/

There was also that alleged $3M spent on Crysis 2's over-tessellation...

Big bucks provide huge returns for NV in marketing & mindshare at the expense of AMD. As I've said several times, GW alone has a big chance of burying AMD if AMD does not retaliate in kind.

Imagine this scenario: Fiji XT with HBM soundly beats Titan X by 10-20%, but in GW titles it's 25% slower. As NV piles on the cash to buy devs into GW, more and more GW games get used in benchmarks, and suddenly Fiji XT is slower than Titan X. It won't ever matter that AMD can make better hardware, because GW cripples it. AMD should know and understand: GameWorks is a massive threat to their existence in the dGPU market.
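To make the arithmetic behind that scenario concrete, here is a minimal sketch in Python (all numbers are hypothetical, taking the midpoint of the +10-20% figure and the -25% figure above; this is not benchmark data) of how a review suite's average flips as the GameWorks share of the game list grows:

# Hypothetical illustration: how a review suite's average verdict changes
# as the share of GameWorks titles in the suite grows.
# neutral_rel and gw_rel are assumed relative-performance figures,
# not measured results.
def suite_average(gw_share, neutral_rel=1.15, gw_rel=0.75):
    """Fiji XT performance relative to Titan X (1.0 = parity),
    weighted by the fraction of GW titles in the benchmark suite."""
    return gw_share * gw_rel + (1 - gw_share) * neutral_rel

for share in (0.0, 0.25, 0.50, 0.75):
    print(f"GW share {share:.0%}: Fiji XT at {suite_average(share):.2f}x Titan X")
# 0% -> 1.15x (clear win); 50% -> 0.95x (now "slower"); 75% -> 0.85x

With roughly half the suite being GW titles, the on-average faster card already loses the headline average, which is exactly the mindshare effect described above.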
 

casiofx

Senior member
Mar 24, 2015
369
36
61
GameWorks is already penalizing consumers in general, so I don't mind if AMD does the same.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
No, it isn't. PhysX can be run on a CPU, but you cannot run PhysX on an Intel, AMD, Matrox, or PowerVR GPU. PhysX is vendor-locked to NV GPUs only. NV probably knows AMD's cards would run PhysX much, much faster, because GCN is superior in compute and has more shaders, which are directly tied to PhysX performance. If AMD got hold of PhysX and optimized for it, the chance of a 1,664-CUDA-core 970 outperforming a 2,816-shader 290X is probably close to 0%.


PhysX and Apex are multi-platform -- the reason AMD doesn't support GPU PhysX is that they don't desire to support CUDA. With Apex, nVidia is slowly moving to compute.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
People here should stop assuming that GameWorks cripples other IHVs unless we test each sample to see how poorly it runs...

Correlation DOES NOT imply causation!
 