[PCGH.de] Fallout 4 Benchmark


Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Isn't it getting old blaming GameWorks every time AMD does poorly in a game? Seems to be the case more and more every day. What was it yesterday? Oh yeah, Anno 2205... surely it can't be that AMD drivers are just lagging in optimization or that the game needs more patches, nope, no way, it MUST be that evil GW!


Actually, it can be both. The world is not binary. Also, Bethesda is known for releasing buggy, unoptimised games. But that doesn't change the fact that GW titles do tend to perform like crap on AMD hardware. Does that exclude poor AMD driver optimisations? No, but since GW is a black box, AMD can't do much about these issues: they have to reverse engineer things, which is why performance will improve over time.

It always cracks me up when some NV owners try to justify GW. I use a 980 and I don't want it at all. Just as I can gladly admit that FreeSync is superior to G-Sync and that NV is greedy in not going open source (and Intel should be applauded for choosing the right platform).

GW isn't a win for anyone, and it's time for some people to own up to that basic fact.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
You realize this is evidence for the exact opposite of what you're trying to say, right? You're trying to say the performance deficit is down to AMD's lack of driver optimization.

Well, AMD doesn't do drivers on the PS4 and Xbox One. What is common between all 3 platforms? The code in the game.

I'm sure driver updates will bring both sides up as time goes on, as well as updates from Bethesda.

All 3 have AMD hardware in common as well. But if you read my post, I said the game will likely need patches plus AMD driver optimizations before performance picks up. It's just easier to blame NVIDIA and GW for AMD's problems than to own up to AMD's own failures.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
You think AMD wrote the Xbone API or the PS4 API? Really?

AMD doesn't write drivers on console. MS and Sony take a much firmer and more direct hand in the microcode and other driver-level interfacing code on the consoles. It is literally impossible for the problem to be 100% down to AMD's drivers, since they didn't write any in 2 of the 3 cases of bad performance.

The problem is quite obviously with the codebase itself in great part. I don't doubt AMD could do more in their PC driver. I'm sure they will as time goes on. I highly doubt it's the primary mover here, since the evidence does not suggest it. It's a Bethesda game. It's buggy at launch. This is how it goes.

It's possible there is an underlying weakness in the AMD hardware vis-à-vis god rays, as was the case in Far Cry 4, but Far Cry 4 also showed the GCN 1.2-based cards doing a lot better than earlier GCN revisions. In the first OP graph, though, the Fury barely leads the 390X, which does not support that conclusion. Since Tonga and Fiji are a lot stronger at tessellation than GCN 1.0 and 1.1, GCN 1.2 should be punching above its class if weak tessellation hardware were the problem, but I'm not seeing that trend in the graphs here so far.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
I haven't tested it myself but word is fps shot up by 40%.


Ooh, found it.

Typical GameWorks. Not sure why people are always so happy about NVIDIA performance in GameWorks titles... Needing a 980 Ti to play at max settings at 1080p, especially when enabling one setting can remove up to 30% of your performance? I just don't get it.

It hurts both sides...
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
Per usual, crippled performance on AMD cards for no visual upgrade on Ultra.

So what's the performance like with settings that don't change how the game looks but give a huge boost to performance?
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
You think AMD wrote the Xbone API or the PS4 API? Really?

AMD doesn't write drivers on console. It is literally impossible for the problem to be 100% down to AMD's drivers, since they didn't write any in 2 of the 3 cases of bad performance.

The problem is quite obviously with the codebase itself in great part. I don't doubt AMD could do more in their PC driver. I'm sure they will as time goes on. I highly doubt it's the primary mover here, since the evidence does not suggest it. It's a Bethesda game. It's buggy at launch. This is how it goes.

So explain why it runs well on NVIDIA hardware but poorly on all AMD platforms. Don't use the GW cop-out either, because this isn't a GW title, and even if it were, that still wouldn't be a valid excuse.

Don't AMD fans always tell us that games made on consoles with AMD hardware will translate into better performance and compatibility with AMD PC hardware? So what happened lately? This is two new games now where AMD hardware does poorly. The GW excuses are not going to be much comfort for AMD customers who buy their hardware and find it doesn't run well in new games, or who have to wait a long time for AMD drivers to catch up.

For what it's worth, I do think the game will probably need code optimizations for AMD hardware, but AMD will likely need to do a lot of work on the driver end as well. It just seems NVIDIA did their part early and came out ahead in a major AAA game, and now their market share will get even bigger while AMD hardware owners are twiddling their thumbs waiting on patches plus optimal drivers.

Ultimately it's a fundamental failure of AMD not reaching out to developers to get ahead of these performance issues before the game is released. It behooved them to do so with a highly anticipated AAA title like this one, and clearly they failed.

Typical GameWorks. Not sure why people are always so happy about NVIDIA performance in GameWorks titles... Needing a 980 Ti to play at max settings at 1080p, especially when enabling one setting can remove up to 30% of your performance? I just don't get it.

It hurts both sides...

Simple: don't run it on Ultra if your hardware can't handle it.
 
Last edited:

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
Yah, this looks rather... Crowy.

Though, wasn't Skyrim a similar situation? AMD performed not-so-great initially, but then went neck and neck with Nvidia (even a bit better in frame times at the high end) once the drivers were hunky-dory?

No big surprise, though; ol' Howard made it known that they were "good friends" with Nvidia, but nary a mention of AMD.


More importantly, has the damn 64Hz bug been fixed yet? I hate having to run Skyrim with iFPSClamp.
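
(For anyone who hasn't messed with it: the usual workaround, as far as I know, is the clamp line in Skyrim.ini under [General], something like

[General]
iFPSClamp=60

which ties the engine's internal timestep to a fixed rate so the Havok physics stop going haywire above ~64 fps. The trade-off is slow-motion whenever your actual framerate drops below the clamp value, so pick a number your rig can hold.)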
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
So explain why it runs well on NVIDIA hardware but poorly on all AMD platforms. Don't use the GW cop-out either, because this isn't a GW title, and even if it were, that still wouldn't be a valid excuse.

Either there is some weakness in GCN itself, or it's the game code (including GW code). You can say "GW is a cop-out" all day, but that doesn't make it true. Black-box code is black-box code. I suspect it is both the hardware and the code. I very much doubt it's the drivers alone.

If you've got evidence that the god rays are not black-box, NVIDIA-provided code, I would be very interested to see it. Other posters are saying they are. I haven't seen evidence either way except for this performance data, which aligns with every other instance of black-box NVIDIA code. Thus I am leaning that way, given it's the only evidence at present. I am not partisan and quite open to contrary evidence if it exists. I've got no dog in the race; I just want to know what's actually going on.

The other option is that Bethesda coded the god rays (and other effects like them) and they run poorly. Which I think is very likely, because Bethesda games are buggy at launch. And this is still a Gamebryo game, and their Gamebryo releases have always been the buggiest of their entire library.

Please recall that the last Gamebryo game (Skyrim) launched using x87 and not even SSE. They couldn't even be bothered to make a few small changes to use widely available hardware for PC last time. The author of the original Skyrim performance mod claims they didn't even set the compiler flags to use SSE... http://www.kotaku.com.au/2011/12/mod-boosts-skyrim-performance-by-up-to-40-per-cent-could-be-faster/
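
(For context, and this is just my own illustration of what "set the compiler flags" would have meant, not anything Bethesda has confirmed: on the compilers of that era, getting SSE/SSE2 floating-point code out of a 32-bit build was roughly a one-flag change,

MSVC: add /arch:SSE2 to the compile options
GCC:  add -msse2 -mfpmath=sse

with older 32-bit MSVC builds otherwise falling back to x87 floating point by default. Which would be consistent with a third-party patch of the hot math routines clawing back that kind of performance.)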
 
Last edited:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Actually, it can be both. The world is not binary. Also, Bethesda is known for releasing buggy, unoptimised games. But that doesn't change the fact that GW titles do tend to perform like crap on AMD hardware. Does that exclude poor AMD driver optimisations? No, but since GW is a black box, AMD can't do much about these issues: they have to reverse engineer things, which is why performance will improve over time.

It always cracks me up when some NV owners try to justify GW. I use a 980 and I don't want it at all. Just as I can gladly admit that FreeSync is superior to G-Sync and that NV is greedy in not going open source (and Intel should be applauded for choosing the right platform).

GW isn't a win for anyone, and it's time for some people to own up to that basic fact.

So just explain to me: if GW is a black box, how can AMD increase performance via drivers? Because it has happened multiple times in the past, e.g. The Witcher 3 and Project CARS, so that argument doesn't seem valid at all. Skyrim wasn't a TWIMTBP title, and how long did AMD go without CF profiles for it? As long as AMD customers are happy with so little effort on their part, AMD will never bother to release game-ready drivers in time.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Either there is some weakness in GCN itself, or it's the game code (including GW code). You can say "GW is a cop-out" all day, but that doesn't make it true. Black-box code is black-box code. I suspect it is both the hardware and the code. I very much doubt it's the drivers alone.

If you've got evidence that the god rays are not black-box, NVIDIA-provided code, I would be very interested to see it. Other posters are saying they are. I haven't seen evidence either way except for this performance data, which aligns with every other instance of black-box NVIDIA code. Thus I am leaning that way, given it's the only evidence at present. I am not partisan and quite open to contrary evidence if it exists. I've got no dog in the race; I just want to know what's actually going on.

The other option is that Bethesda coded the god rays (and other effects like them) and they run poorly. Which I think is very likely, because Bethesda games are buggy at launch. And this is still a Gamebryo game, and their Gamebryo releases have always been the buggiest of their entire library.

Please recall that the last Gamebryo game (Skyrim) launched using x87 and not even SSE2. They couldn't even be bothered to set the right compiler flags for PC last time.


My proof is that NVIDIA does not advertise the game as a GW title, which it would if that were the case. Those who say it is a GW title are the ones who need to prove it. And as I said, even if it were a GW title, that's not a valid excuse for AMD doing so poorly in a highly anticipated AAA title. It's a sign of weakness within AMD, as it's not the first time they've failed to do early developer outreach to ensure they have top performance at release. Given AMD's financial situation, I'm not surprised, and I think we'll see more of this happen in the future.
 
Last edited:

iiiankiii

Senior member
Apr 4, 2008
759
47
91
AMD needs to get off their asses and get game ready drivers on launch day. This game is too big not to focus attention on it.

Having said that, GameWorks sucks. Nothing comes out of GameWorks except tanked performance while offering piss-poor graphical fidelity in return. Developers already have a hard enough time optimizing and fixing bugs in their games; introducing GameWorks worsens the problem with little positive return.
 

Samwell

Senior member
May 10, 2015
225
47
101
Doesn't seem to be a GameWorks problem:


-27% for the GTX 980 and -30% for the R9 390X from the GameWorks effects. So the problem is something other than the effects.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
AMD needs to get off their asses and get game ready drivers on launch day. This game is too big not to focus attention on it.

Having said that, GameWorks sucks. Nothing comes out of GameWorks except tanked performance while offering piss-poor graphical fidelity in return. Developers already have a hard enough time optimizing and fixing bugs in their games; introducing GameWorks worsens the problem with little positive return.

I sort of like the GameWorks additions to FFXIV.

Had to play on the HD 5870 (effin' mother-in-law) and you really do notice the little differences. (The reason I used DX9 is the performance cost; DX11 @ 1080p hits that old 5870 like a rock!)
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Doesn't seem to be a GameWorks problem:


-27% for the GTX 980 and -30% for the R9 390X from the GameWorks effects. So the problem is something other than the effects.

So GameWorks is a 30% fps hit? I mean, that's my point. Everyone gets so excited. I swear, if the 980 Ti had 30 fps and the Fury X had 15 fps at 720p, fanboys would still be talking about how great NVIDIA or AMD is (if the situations were reversed).
The overall ratio of image quality to GPU horsepower just doesn't seem to matter at all.

No one seems to care that it takes a 980 Ti to hit 60 fps at 1080p, and for what? This game doesn't even look great.

PC gamers are so stuck on NVIDIA vs. AMD that we're completely ignoring the low performance from every GPU, as well as the ridiculous framerate hits for features offering little to no IQ benefit...
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
From NVIDIA's pictures (in their guide), turning the god rays setting to Low is a huge performance saver and it actually looks better. Ultra is actually a DOWNGRADE because it makes the light shafts sharper than they should be.

Compared to other games, Low is still pretty bad, but since the devs never bothered creating their own implementation, we're stuck with it or no light shafts at all.
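
(Side note, and take the exact syntax with a grain of salt because I'm going from memory and haven't re-verified it: there's an in-game console tweak floating around for this, something along the lines of

gr quality 0

which is supposed to drop the god ray quality on the fly. If that turns out to be wrong, the Low preset in the launcher gets you most of the benefit anyway.)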





AMD needs to get off their asses and get game ready drivers on launch day. This game is too big not to focus attention on it.

"Game ready" drivers aren't a must. You've just been conditioned to think they are. Without the NVIDIA settings, performance on AMD hardware would look perfectly normal. Sure, release a driver for marginal gains later, but it's not essential for launch, except as marketing.
 
Last edited:

tg2708

Senior member
May 23, 2013
687
20
81
So GameWorks is a 30% fps hit? I mean, that's my point. Everyone gets so excited. I swear, if the 980 Ti had 30 fps and the Fury X had 15 fps at 720p, fanboys would still be talking about how great NVIDIA or AMD is (if the situations were reversed).
The overall ratio of image quality to GPU horsepower just doesn't seem to matter at all.

No one seems to care that it takes a 980 Ti to hit 60 fps at 1080p, and for what? This game doesn't even look great.

PC gamers are so stuck on NVIDIA vs. AMD that we're completely ignoring the low performance from every GPU, as well as the ridiculous framerate hits for features offering little to no IQ benefit...

Amen. I wasn't paying much attention to the graphs, but if a 980 Ti is needed for 60 fps at 1080p to max the game, then it's poorly optimized. 1080p should be a cakewalk for such a powerful GPU. I still want the game, but I'll need to hold off. I will be boycotting AMD tomorrow. Who's up?
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I always thought of Ultra as some bullplop setting designed to make people want to upgrade and for taking screenshots.

I'm not averse to lowering a setting or two to get a smooth 60 FPS (God knows I have to in some games that are CPU limited; yo, Guild Wars 2, have you fixed your animations yet?).

The "Ultra" setting to me is that "but can it run Crysis" scenario of brute force > optimization. Now, if a game runs like horrendous crap at High or lower settings, then it's time to get the pitchforks.

From what little I learned of the god rays, put them on High or lower and enjoy a huge performance gain. The only people going to try to run it on Ultra have OC'd 980 Tis or CFX Fury Xs. Let them worry about it.

TL;DR:
"Ultra" settings to me have always been brute force > optimization. Only e-peen jocks brag about it. Crysis Syndrome.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
You pretty much only need a GTX 980 to hit 60 FPS at 1440p with Ultra settings.

Did you notice there is a chart directly from NVIDIA showing the 980 Ti getting below 60 fps on Ultra at 1080p?
 

vissarix

Senior member
Jun 12, 2015
297
96
101
Meehhh... not impressed... mark my words, guys... 3 years from now the Fury X will catch up to the GTX 980... it just needs some driver optimisations :sneaky:
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Of all the things that NVIDIA could have done with their proclaimed support for feature level 12_1, which is great for volumetric rendering, what we get is tessellated volumes with transparent textures?! *facepalm*

I'd be fine with GameWorks being used more often if NVIDIA actually tried to push the boundaries of computer graphics with it...
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Meehhh... not impressed... mark my words, guys... 3 years from now the Fury X will catch up to the GTX 980... it just needs some driver optimisations :sneaky:

Ain't gonna happen. One or two patches and the balance between the 960 and the Fury X will be restored. Just enable tessellation of thin air, tessellate the radiation, and if that doesn't help, tessellate space itself. The guys at AMD will not know what hit them. Also, the next gen will fly off the shelves so that people can reach 30 fps at max settings.
 