AMD Announces their GameWorks Equivalent

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
The problem is we have only two vendors for graphics, and both are effectively proprietary to their respective vendors: GameWorks works only on Nvidia, and this new one will only work on AMD (because Nvidia sure as hell won't support it, just like FreeSync). So at the end of the day we are still stuck with two proprietary formats for basically everything, and it seems to be getting worse as more and more things are created as single-vendor solutions.

Intel has good reason to support and improve this open-source library. There is no risk, since the MIT license stops AMD from trying to take it away later.

Nvidia does not write the game engines used in most games. The Unity, Unreal, etc. developers have a strong incentive to offer a single, cross-vendor library that works on 100% of cards. Nvidia might or might not offer them much assistance, but if you are making millions off of your game engine you don't necessarily need their help. Also, Nvidia needs to benchmark well with these engines, so the developers do have leverage.

Professors with imaging-related research projects and their armies of grad students (for example, that simulation of ancient London).

Students and hobbyists looking to contribute to something interesting.

Open source and no licensing hell make contributing much more attractive.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
So another spectacular fizzer like Mantle coming up?

You mean the one which formed the basis of Vulkan??

If this fizzles like that it means Nvidia will be using it too.

So are you saying Gameworks is doomed or something??
 

DeathReborn

Platinum Member
Oct 11, 2005
2,759
755
136
Ironic, considering the best looking & optimized game of 2015 - SW:BF - is an open source title; and the entire track record of AMD GE vs. GWs shows your post is just a figment of your imagination. Most objective gamers want their games to work well because we don't want to be locked into a specific CPU/GPU vendor for life.

Care to provide links to the source code and documentation proving that I won't be sued for toying with anyone's IP/trademarks?
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
"As you read this AMD will have officially announced GPUOpen, a platform that delivers open source tools, graphics effects, libraries and SDKs. GPUOpen is AMD’s initiative to offer developers support with a robust set of tools and resources to extract the most out of GPUs for both gaming and compute applications. It enables game developers to create more beautiful, complex and immersive game worlds. And facilitates the employment of the powerful parallel engines inside modern GPUs for computation. All under a cohesive and easily accessible open source umbrella."


This is why I have respect for AMD. Always trying to push the industry forward in an open source fashion.

http://wccftech.com/amds-answer-to-...-source-tools-graphics-effects-and-libraries/

My view is AMD uses open source so it isn't responsible for the finished product.
 
Feb 19, 2009
10,457
10
76
This is false.



Source

Sorry, but that's pure BS. The first PR message from that studio was "Our software engineer has identified the problem for AMD: the physics is polling at 600 Hz and that destroys CPU performance on AMD." But obviously not for NV.

You can choose to go back and read their official forum posts yourself, or you can peddle more lies.
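
To put that 600 Hz figure in perspective: with a standard fixed-timestep loop, a 600 Hz physics tick runs ten simulation steps for every frame of a 60 fps game, i.e. roughly ten times the CPU work of a typical 60 Hz tick. A minimal sketch of that kind of loop (my own illustration with hypothetical numbers, not that studio's code):

```cpp
// Minimal fixed-timestep loop sketch -- hypothetical, not that studio's
// code. The accumulator decouples the simulation rate from the frame
// rate; at HZ = 600 the physics step runs ~10x per frame of a 60 fps
// game, i.e. roughly 10x the CPU work of a 60 Hz tick.
#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const double HZ = 600.0;              // the disputed tick rate
    const double dt = 1.0 / HZ;           // fixed physics timestep, seconds
    double accumulator = 0.0;
    long steps = 0;

    auto prev = clock::now();
    const auto end = prev + std::chrono::seconds(1);
    while (clock::now() < end) {          // stand-in for the render loop
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - prev).count();
        prev = now;
        while (accumulator >= dt) {       // run physics in fixed steps
            // stepPhysics(dt) would run the simulation here
            accumulator -= dt;
            ++steps;
        }
        // renderFrame() would draw here
    }
    printf("physics steps in one second at %.0f Hz: %ld\n", HZ, steps);
}
```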
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
My view is AMD uses open source so it isn't responsible for the finished product.

...Why should a graphics card vendor be responsible for the development of a video game?

Unless, of course, they try and shoehorn closed-source, black-box, and inefficient code into it.


Open source is always better than closed source, anyhow.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
...Why should a graphics card vendor be responsible for the development of a video game?

Unless, of course, they try and shoehorn closed-source, black-box, and inefficient code into it.


Open source is always better than closed source, anyhow.

But it's not just video games that AMD uses open source for, is it? Point is, if it's crap or runs poorly, AMD will just pass the buck... as always.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
So another spectacular fizzer like Mantle coming up?

This post is the definition of trolling, from the post content to even the user image tag.

We could also talk about the lack of G-Sync monitors and how G-Sync failed against FreeSync despite Nvidia having 80% of the GPU market. How is it possible, then, that FreeSync has more options?

If you twist the story/goalposts, everyone looks bad. But I doubt you care either way about the facts; you've drawn your line in the sand pretty clearly.
But it's not just video games that AMD uses open source for, is it? Point is, if it's crap or runs poorly, AMD will just pass the buck... as always.

When Gameworks runs poorly, we pass the buck onto game developers...
It's the same thing, so really I don't see the difference...
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
But it's not just video games that AMD uses open source for, is it? Point is, if it's crap or runs poorly, AMD will just pass the buck... as always.

If it runs slow, and you're a developer, fix it and share the fix. Can't do that with closed source.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Sorry, but that's pure BS. The first PR message from that studio was "Our software engineer has identified the problem for AMD: the physics is polling at 600 Hz and that destroys CPU performance on AMD." But obviously not for NV.

You can choose to go back and read their official forum posts yourself, or you can peddle more lies.

Believe what you want, I tend to believe developers over, well... you. Sorry. :\
 

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
You know "open source" is just their way of saying they can't do it, so can someone else do it for free please?

Yah, because gimpworks is amazing software....

But it's not just video games that AMD uses open source for, is it? Point is, if it's crap or runs poorly, AMD will just pass the buck... as always.

None of these Nvidia fanboys are making any sense...
 

flash-gordon

Member
May 3, 2014
123
34
101
Better software from a hardware maker is always respect for the consumer's money. That's what makes our GPUs last or truly shine.

The people who complain about an initiative like this are the ones playing gimped games and hoping Pascal fixes their performance.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
Better software from a hardware maker is always respect for the consumer's money. That's what makes our GPUs last or truly shine.

The people who complain about an initiative like this are the ones playing gimped games and hoping Pascal fixes their performance.

For GE titles, I'm going to be OK with R9 290 CF. For GW titles, I'm holding off on buying them till AI/Pascal, so that I have enough GPU horsepower to play them at decent quality. I won't be surprised if I wait two gens of GPUs to play some of those games in 4K, as they're so poorly optimized.

The new Batman doesn't support CF/SLI, so it may be a while before I touch that. But at least it has GW!
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I can't max out this game! *Whine*

My graphics card can play such-n-such title at 4k, 60 FPS! It must be a consolized piece of **** not worth my time, because it's obviously not making use of my video card! *Whine*
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You will see a lot of 2016 DX12 games on the AMD Gaming Evolved initiative.

If you have a list you can share, I'd like to see it. The only games I've seen touted as DX12 by AMD (or partners of AMD) are Ashes, Deus Ex, and possibly the PC version of the new Tomb Raider.

If AMD can secure a big list of titles, they can definitely change the tide. Otherwise, I'd expect Nvidia moneyhatting to keep up.

EDIT:

I can't max out this game! *Whine*

My graphics card can play such-n-such title at 4k, 60 FPS! It must be a consolized piece of **** not worth my time, because it's obviously not making use of my video card! *Whine*

This is the other side of the coin. Can't win with gamers.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
If you have a list you can share, I'd like to see it. The only games I've seen touted as DX12 by AMD (or partners of AMD) are Ashes, Deus Ex, and possibly the PC version of the new Tomb Raider.

If AMD can secure a big list of titles, they can definitely change the tide. Otherwise, I'd expect Nvidia moneyhatting to keep up.

Also HITMAN, and I'm sure BF5 (coming holiday 2016) will be fully optimized for AMD hardware, although not a pure GE title.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,548
2,547
146
Good news that it is open source.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Also HITMAN, and I'm sure BF5 (coming holiday 2016) will be fully optimized for AMD hardware, although not a pure GE title.
Deus Ex is probably going to be DX12 and will have open source hair tech from AMD, optimized by the developer for all GPUs, contrary to TW3, where it took three months of patches to fix the horrendous performance impact of failworks.
 

Good_fella

Member
Feb 12, 2015
113
0
0
Deus Ex is probably going to be DX12 and will have open source hair tech from AMD, optimized by the developer for all GPUs, contrary to TW3, where it took three months of patches to fix the horrendous performance impact of failworks.

Strange thing, TressFX also took drivers and patches to fix the mess, but it ended up locked down.

I think Lichdom: Battlemage is using TressFX 2.0. I wouldn't be shocked if AMD used dirty tricks such as TressFX 1 for Nvidia, TressFX 3 for AMD in Deus Ex.

Also, what's the point of AOFX aka HDAO? Enabled, HDAO looks the same as turned off but kills performance... Nvidia's, of course.



http://www.hardocp.com/image.html?image=MTM1NTUxNzk3MlNtdHptSllFZVlfN180X2wuZ2lm
http://www.hardocp.com/image.html?image=MTM1NTUxNzk3MlNtdHptSllFZVlfN181X2wuZ2lm
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
If this fizzles like that it means Nvidia will be using it too.
How does that help AMD?

It's all quite unbelievable. Here they are, bleeding millions every quarter, yet they continue to give away proprietary advantages on a silver platter to competitors.

Here's a hint, AMD: you don't make money by giving things away for free to competitors. That's why you're on the verge of bankruptcy.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
We could also talk about the lack of G-Sync monitors and how G-Sync failed against FreeSync despite Nvidia having 80% of the GPU market.
LOL, wut?

Nvidia gets royalties for every G-Sync monitor.
AMD gets absolutely nothing from FreeSync.

Nvidia is profitable and has increasing market share quarter after quarter.
AMD constantly loses money and market share, and is on the verge of bankruptcy.

By what metric do you consider AMD's situation to be "successful"?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That is not true and just wrong.
TressFX was a mess at the release of Tomb Raider. It needed a few patches to fix all of the problems.
AMD's CHSS in GTA5 was broken at release. I don't know if Rocksteady has fixed it.

Having access to the source code doesn't help much there...

So you know better than he does?
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Strange thing, TressFX also took drivers and patches to fix the mess, but it ended up locked down.

I think Lichdom: Battlemage is using TressFX 2.0. I wouldn't be shocked if AMD used dirty tricks such as TressFX 1 for Nvidia, TressFX 3 for AMD in Deus Ex.

They use TressFX 2.2.
TressFX has two algorithms for translucency. The original PPLL data structure is good on every piece of hardware, but it has high memory usage. The mutex-based solution is better, but it may have problems on some hardware. It works very well on GCN, but doesn't really fit any other architecture; that's why Microsoft doesn't recommend any mutex-based approach in D3D11. It doesn't guarantee a good result.
TressFX 3.0 can be implemented with both approaches. It is possible to use the mutex-based solution on GCN and PPLL on any other architecture. The results will be the same; the only difference is the performance. PPLL is much slower and uses much more memory.
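
To make the PPLL idea concrete, here is a rough CPU sketch of the append step (my own illustration, not TressFX source; the names are made up). Threads stand in for fragment-shader invocations: each allocates a node from a shared pool with one atomic, then atomically swaps it in as the new head of its pixel's list; a later pass would sort each list by depth and blend:

```cpp
// Hypothetical CPU sketch of a per-pixel linked list (PPLL) append --
// an illustration of the idea, not TressFX source. Threads stand in
// for GPU fragment-shader invocations: each allocates a node from a
// shared pool with one atomic, then atomically swaps it in as the new
// list head for its pixel.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

struct Node {
    float depth;     // fragment depth
    unsigned color;  // packed RGBA (placeholder values below)
    int next;        // index of the next node in this pixel's list, -1 = end
};

int main() {
    const int numPixels = 4, poolSize = 64;
    std::vector<Node> pool(poolSize);
    std::atomic<int> counter{0};                     // next free node index
    std::vector<std::atomic<int>> heads(numPixels);  // list head per pixel
    for (auto& h : heads) h.store(-1);

    // One call = one translucent fragment hitting a pixel.
    auto emit = [&](int pixel, float depth, unsigned color) {
        int n = counter.fetch_add(1);                // allocate a node
        if (n >= poolSize) return;                   // pool exhausted: drop
        pool[n].depth = depth;
        pool[n].color = color;
        // Atomically publish this node as the new head of the pixel's list.
        pool[n].next = heads[pixel].exchange(n);
    };

    std::vector<std::thread> threads;
    for (int p = 0; p < numPixels; ++p)
        threads.emplace_back([&, p] {
            emit(p, 0.7f, 0xff0000ffu);              // two overlapping
            emit(p, 0.3f, 0x00ff00ffu);              // fragments per pixel
        });
    for (auto& t : threads) t.join();

    for (int p = 0; p < numPixels; ++p) {            // walk each pixel's list
        printf("pixel %d:", p);
        for (int n = heads[p].load(); n != -1; n = pool[n].next)
            printf(" (depth %.1f, color %08x)", pool[n].depth, pool[n].color);
        printf("\n");
    }
}
```

The node pool is where PPLL's memory cost comes from: it has to be sized for the worst-case number of overlapping translucent fragments in a frame, which is the high memory usage described above.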
 