Performance Review : Rise of the Tomb Raider

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Status
Not open for further replies.

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
According to all AMD crazies, it is NEVER AMD's fault. NEVER. They are always the victim of a huge conspiracy. When Nvidia performs noticeably badly, though, nothing at all is wrong.

Weird how the goal posts move in every single situation.

Leave the hyperbole and call outs at the door

Moderator Subyman
 
Last edited by a moderator:

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
It's quite sad though; all the games I look forward to end up shafting me since I use AMD. Performance is utter crap, especially in Arkham Knight, which was supposedly fixed but gives me nothing but stutters and audio cutouts. Now Tomb Raider's performance is garbage, but luckily I never bought it; I'll hold out a while. AMD needs a definitive win from their new architecture, or else Nvidia will have a full monopoly in the graphics card business.

Don't blame AMD. Say BMW bribed all the tire manufacturers so that every tire sucks except on a BMW: do you blame the tire manufacturers, BMW, or the other car companies?

This is just another reason to never buy NV.
How to solve this:

Stop buying tires from bribed manufacturers and stop buying BMWs. I think you get what I mean.
 
Feb 19, 2009
10,457
10
76
The thing is, they need to contact the devs way earlier. AMD knew that both Witcher 3 and Fallout 4 were GOTY material and completely ignored them. Why? Why not be more proactive and work with the devs so that end users are happy? I don't think AMD has blamed any of these devs at all. Don't you think AMD would actually speak up if their performance was artificially gimped somehow? After all, they know far more than any reviewers or end users.

You didn't follow the news. AMD did speak out. They even blasted NV for gimping their performance in Witcher 3 a few months before release, when HairWorks was added.

CDPR at the time said they couldn't fix it since they didn't have access to the source code... but three months after release, they finally updated Witcher 3 with a more optimized HairWorks.

We can weigh this excuse, that AMD didn't work with devs, against the other infamous example: Project Cars. The dev on Steam lied, saying AMD hadn't contacted them during development. Once AMD made all the email exchanges public, the dev had to recant and admit they had been in constant communication.

Why would a developer behave that way? Is it a grudge against AMD, or was the studio offered some kind of incentive for splashing NV logos everywhere in that game?

Now, think about why neutral studios, the ones which aren't sponsored, manage to release optimized titles that run great on both vendors' hardware. In recent times I have not found a neutral title that runs gimped on AMD... does that mean AMD is constantly working with all the other studios? No. Devs who aren't bribed and actually care about their creation optimize it, because, ask any software engineer, optimization is their life and pride.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
http://www.computerbase.de/2016-01/rise-of-the-tomb-raider-benchmarks/2/

Another pointless abuse of tessellation for mostly no visual gain. I bet it's running x64 mode.

So for AMD users, forcing it to x16 will bring performance up a lot while looking the same.

Since NV couldn't get HairWorks in, it's just easier to ramp up Tessellation to the max for everything else. lol

Have to admit, stunts like this make me hate NV more.

I think it's time AMD started to "cheat" in drivers, like the setting you mention above, if it helps. If NV plays dirty, just play dirty as well.
 
Feb 19, 2009
10,457
10
76
According to all AMD crazies, it is NEVER AMD's fault. NEVER. They are always the victim of a huge conspiracy. When Nvidia performs noticeably badly, though, nothing at all is wrong.

Weird how the goal posts move in every single situation.

So it's AMD's fault that tessellation usage is so wasteful?

http://www.computerbase.de/2016-01/rise-of-the-tomb-raider-benchmarks/2/

20-25% performance loss for zero visual difference as noted by the reviewer.

Let's reverse that: whose fault do you think it is? Crystal Dynamics, for such horrendously unoptimized usage of tessellation (which would have crippled consoles even harder, so no, it's a "PC special feature"), or could it possibly be NV's fault for sponsoring these guys, giving them an incentive to do such things?

What about HairWorks in Witcher 3: is it AMD's fault that the default x64 looks visually identical to x16 but performs much worse? Why would x64 be the default, forcing CDPR to invest in getting access to the source and modifying it before they could finally optimize it?

What about God Rays in Fallout 4: why does Ultra look identical to High, or even Medium, but perform worse? Whose fault is that?

Someone at NV created this tech and set the defaults to a very unoptimized level. It must be AMD's fault, right? And you are calling others crazy... nice one.
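To put rough numbers on the x64-vs-x16 comparison (a back-of-envelope sketch of my own, assuming uniform integer tessellation factors, where the tessellator's triangle output per patch grows roughly with the square of the factor):

```python
# Rough illustration, not the game's actual code: with a uniform integer
# tessellation factor f, each patch edge is subdivided into f segments,
# so the triangle output per patch is on the order of f squared.
def approx_triangles(factor: int) -> int:
    """Order-of-magnitude triangle count for one patch at a given factor."""
    return factor * factor

for f in (8, 16, 64):
    print(f"factor {f:>2}: ~{approx_triangles(f):>4} triangles per patch")

# x64 pushes roughly 16x the geometry of x16 for the same silhouette.
ratio = approx_triangles(64) // approx_triangles(16)
print(f"x64 / x16 workload ratio: ~{ratio}x")
```

That quadratic growth is why x64 costs so much more than x16, while the extra triangles are already too small to change the final image.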
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Why did this end up a Gameworks title when the last TR was a Gaming Evolved title?

Eidos is looking for money. That's the reason why they did the deal with Microsoft.

nVidia paid more. Last time it was AMD. I guess nVidia has learnt their lesson.
 
Last edited:
Feb 19, 2009
10,457
10
76
Eidos is looking for money. That's the reason why they did the deal with Microsoft.

nVidia paid more. Last time it was AMD. I guess nVidia learned their lesson.

And despite receiving help from AMD to implement and improve TressFX 3.0, they realized AMD's open-source tech doesn't obligate them to anything... so they take the $$ and switch sides to NV.

They promise VXAO, which never made it in, but it's good marketing exposure to showcase the new bling.

Dial tessellation up to the extreme so the game runs much better on NV, meet their obligations, and there you have it: sellout developers.

Because of that, they aren't getting my money.
 

thilanliyan

Lifer
Jun 21, 2005
11,912
2,130
126
They are insane if they think Nvidia is going to let their dominance slip away. We probably won't even see a strong push for DX12 titles until NV unveils/launches Pascal.

Why are people okay with that? Something like 75-80% of dGPU buyers still give their money to nV, so nV won't change its methods. If the bolded is indeed what happens, and people actually care about getting DX12 games ASAP, they should really vote with their wallets... performance is comparable between the two IHVs (unlike the CPU side), so it's unlikely you are really losing anything by going with one or the other, IMO. The market really needs to be closer to 50/50... I certainly wouldn't be okay with DX12 games being delayed just to satisfy nV.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Seems like everyone here has more knowledge than the general public. How much money was given to Eidos by Nvidia?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
So it's AMD's fault that tessellation usage is so wasteful?

http://www.computerbase.de/2016-01/rise-of-the-tomb-raider-benchmarks/2/

20-25% performance loss for zero visual difference as noted by the reviewer.

Let's reverse that: whose fault do you think it is? Crystal Dynamics, for such horrendously unoptimized usage of tessellation (which would have crippled consoles even harder, so no, it's a "PC special feature"), or could it possibly be NV's fault for sponsoring these guys, giving them an incentive to do such things?

What about HairWorks in Witcher 3: is it AMD's fault that the default x64 looks visually identical to x16 but performs much worse? Why would x64 be the default, forcing CDPR to invest in getting access to the source and modifying it before they could finally optimize it?

What about God Rays in Fallout 4: why does Ultra look identical to High, or even Medium, but perform worse? Whose fault is that?

Someone at NV created this tech and set the defaults to a very unoptimized level. It must be AMD's fault, right? And you are calling others crazy... nice one.

EVERY TIME. Not once is it ever AMD's fault. EVERY TIME it's Nvidia's fault or the dev's fault. EVERY TIME.

Every AMD hardcore fan is chicken little in training.

Again, tone down the language and discuss the tech instead of attacking people

Moderator Subyman
 
Last edited by a moderator:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
You didn't follow the news. AMD did speak out. They even blasted NV for gimping their performance in Witcher 3 a few months before release, when HairWorks was added.

CDPR at the time said they couldn't fix it since they didn't have access to the source code... but three months after release, they finally updated Witcher 3 with a more optimized HairWorks.

We can weigh this excuse, that AMD didn't work with devs, against the other infamous example: Project Cars. The dev on Steam lied, saying AMD hadn't contacted them during development. Once AMD made all the email exchanges public, the dev had to recant and admit they had been in constant communication.

Why would a developer behave that way? Is it a grudge against AMD, or was the studio offered some kind of incentive for splashing NV logos everywhere in that game?

Now, think about why neutral studios, the ones which aren't sponsored, manage to release optimized titles that run great on both vendors' hardware. In recent times I have not found a neutral title that runs gimped on AMD... does that mean AMD is constantly working with all the other studios? No. Devs who aren't bribed and actually care about their creation optimize it, because, ask any software engineer, optimization is their life and pride.

Well, if AMD was aware that turning on HairWorks was going to be suboptimal for them, did they ask CDPR to implement an alternative solution for their user base? They must be more proactive, and I am sure CDPR has no evil conspiracy against AMD at all.

The second part you talked about is optimization. Given the complex nature of different PC configs, I doubt it is ever possible to optimize a game fully. I know some devs like Blizzard/DICE optimize very well, but many studios lack that monetary support; they are constantly bullied by publishers to release before deadline, ultimately resulting in an unoptimized game. I think that's the reason they are turning to NV: they can use an existing library instead of creating one from scratch. So the ultimate blame for an unoptimized game actually falls on both the publishers and the devs, because you can't expect NV to thoroughly test its features on AMD cards.
 

tg2708

Senior member
May 23, 2013
687
20
81
With all these GameWorks shenanigans it seems as if Nvidia is scared of AMD. Look at it: GameWorks runs like crap on most Nvidia cards and even worse on AMD. Why does a company go to the lengths of killing its own cards and burning its own customers just to see another suffer? GameWorks might add a little more spark or eye candy, but when your own designs suffer tremendously, it's time to go back to the drawing board.
All I'm saying is this, and while I don't follow the graphics card community closely: if both AMD and Nvidia were given the same space and tools to work with, I think the former would come out the victor. I'm brand agnostic, so if another three games come out that I want to play and they run well on Nvidia, I will switch back, but I will never discount AMD's skill and capabilities, because they are doing a pretty good job at leveraging an uneven playing field. It just sucks that I can't enjoy the games I want to play.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
EVERY TIME. Not once is it ever AMD's fault. EVERY TIME it's Nvidia's fault or the dev's fault. EVERY TIME.

Every AMD hardcore fan is chicken little in training.
That's the weakest and lamest response yet...
Address the issues raised or :whistle:
 

caswow

Senior member
Sep 18, 2013
525
136
116
EVERY TIME. Not once is it ever AMD's fault. EVERY TIME it's Nvidia's fault or the dev's fault. EVERY TIME.

Every AMD hardcore fan is chicken little in training.

Don't you think EVERY TIME Nvidia is involved with a game, it's EVERY TIME that it runs badly on AMD AND Kepler cards? I have a Kepler card, and no matter what, next time it won't be an Nvidia card. It doesn't matter if the AMD card is worse; it won't be an Nvidia card again.

Nobody ever mentions this, but EVERYTHING Nvidia implements is only USABLE with their strongest card. EVERYTHING. PhysX or GimpWorks, none of it is usable with a midrange Nvidia card, YET it has to be a selling point because AMD doesn't have it. It's NOT USABLE WITH MIDRANGE CARDS, so it's NO selling point to so many people. Even on a 970 you can't use PhysX or GimpWorks effects without crippling your performance to unplayable fps.
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
11,912
2,130
126
If AMD were less terrible with their drivers and not broke maybe they could compete?

I'm using Xfire 290s and have used AMD since the 6950 I had several years ago, then a 7950, and I have never had any major issues with games. And I've run many AMD and nVidia cards before that. The only major issue I remember was with an All-in-Wonder card from the late '90s, IIRC. People aren't still trying to live in the '90s, are they?
 

tg2708

Senior member
May 23, 2013
687
20
81
I can see certain cards having a performance loss in certain games due to driver immaturity or new tech the card just can't cope with. BUT the biggest loss in performance comes from a feature or bit of code that, in my eyes, adds little to visual fidelity while tanking performance into the ground. The name GameWorks is synonymous with bad performance, not only here but throughout the Internet, and those who don't see the negative implications for us as gamers are just blind fanboys.
 

TXBPNT

Junior Member
Sep 19, 2015
19
0
36
Nvidia fans, while protecting GameWorks, don't realize (or deny) that their cards are also gimped. I was a GTX 780 owner; in recent GameWorks games like Fallout 4, a 280X not only outperforms a 780 but also kicks the 780 Ti's ass. It's like Nvidia gave me the middle finger and told me to upgrade. Hell no, Nvidia; that's why I bought a 390X.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Yeah, Kepler got ignored so badly at one point that Nvidia even released a driver to fix things for the 700 series, but I'm not sure it did much good. It makes me pretty negative on buying a future Nvidia card.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Nvidia fans, while protecting GameWorks, don't realize (or deny) that their cards are also gimped. I was a GTX 780 owner; in recent GameWorks games like Fallout 4, a 280X not only outperforms a 780 but also kicks the 780 Ti's ass. It's like Nvidia gave me the middle finger and told me to upgrade. Hell no, Nvidia; that's why I bought a 390X.
The latest beta patch seems to have broken the Nvidia optimisations.

At launch I was actually surprised how well Kepler did; a 770 is nearly 35% faster than a 960.
http://www.guru3d.com/articles_pages/fallout_4_pc_graphics_performance_benchmark_review,7.html
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
http://www.computerbase.de/2016-01/rise-of-the-tomb-raider-benchmarks/2/





Another pointless abuse of tessellation for mostly no visual gain. I bet it's running x64 mode.

So for AMD users, forcing it to x16 will bring performance up a lot while looking the same.

Since NV couldn't get HairWorks in, it's just easier to ramp up Tessellation to the max for everything else. lol

This is sad:



So consoles are running compute in async mode to speed up the scene massively (according to Crystal Dynamics' own statements from last year)... but they tried using DX12 and found no improvements... on NV GPUs, maybe? -_-

Looks like we won't be getting that DX12 patch.

Have to admit, stunts like this make me hate NV more.

I'm dangerously close to quitting PC gaming...
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
Don't you think EVERY TIME Nvidia is involved with a game, it's EVERY TIME that it runs badly on AMD AND Kepler cards? I have a Kepler card, and no matter what, next time it won't be an Nvidia card. It doesn't matter if the AMD card is worse; it won't be an Nvidia card again.

Exactly. NV doesn't only cripple AMD; it cripples PC gaming in general with GimpWorks. They even cripple their own last-gen cards. In the long run it will drive people away to consoles, hurting NV too. It's a stupid short-term management decision. They should be working with AMD to get every inch out of PC hardware, to show how much consoles suck and attract more PC gamers. But no, NV would rather cripple AMD and alienate its own customers (Kepler owners).

Review sites should start stepping up and complaining about GimpShit.
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
I'm dangerously close to quitting PC gaming...


People play games on the PC? All kidding aside, it's too much craziness dealing with drivers, patches, updates, etc. just to play a game. I feel like we're stuck in the mid '90s again, and that's bad for gamers and the industry.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
The thing is, they need to contact the devs way earlier. AMD knew that both Witcher 3 and Fallout 4 were GOTY material and completely ignored them. Why? Why not be more proactive and work with the devs so that end users are happy? I don't think AMD has blamed any of these devs at all. Don't you think AMD would actually speak up if their performance was artificially gimped somehow? After all, they know far more than any reviewers or end users.
Actually, even the devs are blaming Nvidia now:

AMD's Richard Huddy
We've been working with CD Projekt Red (Witcher 3) from the beginning. We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal.
http://arstechnica.com/gaming/2015/...s-completely-sabotaged-witcher-3-performance/


Witcher 3 Dev says Nvidia HairWorks unoptimizable for AMD GPUs

Here is a statement made by CD Projekt's Marcin Momot, claiming that Nvidia's HairWorks code cannot be optimized to perform well on AMD GPUs:

Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.
http://www.overclock3d.net/articles...nvidia_hairworks_unoptimizable_for_amd_gpus/1


The problem, as always, appears to be Nvidia's steadfast refusal to open up GameWorks so that AMD and developers can more easily optimize TWIMTBP titles for Radeon hardware.
 