Mass Effect: Andromeda Benchmarks [PCGH]


Piroko

Senior member
Jan 10, 2013
905
79
91
Piroko, could you translate this paragraph for me? The online translators aren't really specific enough and I'd like to know what it says concerning the temporal anti-aliasing.
Sure.

"There are two different AA settings in the new ME: FXAA and temporal. Both are post processing AAs, but temporal is preferable. FXAA doesn't work on vegetation and will cause heavy flickering on it. Temporal fixes that by applying a temporal component (sic). Though it does produce a softened image which is more visible the lower your resolution is. We didn't notice it negatively at all at 4k."
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Thanks. It's a shame I can't read German since the German game reviewers have such excellent technical analysis. The only English-speaking game analysts that come close are Digital Foundry. They just put up their PC vs console video in fact, and we can see that tessellation is applied to quite a few objects (including the characters), so its use is very pervasive throughout the game:

 

Piroko

Senior member
Jan 10, 2013
905
79
91
Thanks. It's a shame I can't read German since the German game reviewers have such excellent technical analysis.
Oh I don't consider them to be perfect (or else I would have never started posting on AT). They have the same issues with rushed reviews and short embargo times, though they generally call the short review time out rather than doing a review with underhanded commentary resulting in forced follow-up clarifications. The last example of this was the Ryzen review, where there was hardly any deep analysis in the German reviews (imho).
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The last example of this was the Ryzen review, where there was hardly any deep analysis in the German reviews (imho).

I kind of agree with you on the lack of deep analysis. But very few websites overall offer that sort of thing to be honest, regardless of what language they speak. Where the German websites excel, though, is benchmarking. To me they are very thorough and will cover a wider assortment of hardware and games compared to, say, AnandTech.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
This game has the highest RAM usage I've ever seen in a game, typically sitting just under 12.5GB and going over 12.5GB a few times as well. Not that it matters, because RAM should definitely be used as much as possible.
 
Reactions: ZGR

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Thanks. It's a shame I can't read German since the German game reviewers have such excellent technical analysis. The only English-speaking game analysts that come close are Digital Foundry. They just put up their PC vs console video in fact, and we can see that tessellation is applied to quite a few objects (including the characters), so its use is very pervasive throughout the game:


I didn't notice tessellation being applied to characters, but the linked video does indeed provide a very clear example of it being applied to the ground layer at 2m49s.

It also applies it to some (but not all) of the various rock formations in the game, but here it looks subtle enough that I doubt anyone would notice the difference between 8x and 16x.

On that note it would be interesting to see some comparisons of up-close geometry to see the difference between 8x (AMD optimised) and 16x tessellation. The computerbase.de examples (8x, 16x) make it quite clear that the difference on distant geometry is all but unnoticeable, but they don't tell us anything about up-close geometry.
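
For a rough sense of why 8x vs 16x matters for performance even when it's hard to spot: uniformly subdividing a triangular patch with N segments per edge produces on the order of N^2 triangles, so halving the factor cuts the generated geometry per patch by roughly 4x. A back-of-the-envelope sketch (not the exact output of the D3D11 tessellator, which depends on partitioning mode and per-edge factors):

def approx_triangles_per_patch(factor: int) -> int:
    # Uniform subdivision of a triangle with 'factor' segments per edge yields
    # roughly factor^2 small triangles; real tessellator output varies with
    # partitioning mode and per-edge factors, so treat this as an estimate.
    return factor * factor

for factor in (8, 16, 64):
    print(f"{factor:2d}x -> ~{approx_triangles_per_patch(factor)} triangles per patch")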
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
With my 7600 non-K and Strix OC 1070 I have no issues maintaining a locked 60FPS so far, with only slight dips in the Tempest (ME2 had a similar ship issue). Autodetected Ultra with only shadows and lighting on High. Using the stock heatsink, temps hit 60°C or a bit less, a 25-degree rise from normal idling/desktop work. Will get some screens up.
 

Blockheadfan

Member
Feb 23, 2017
33
55
61
Any company that intentionally reduces tessellation factors is cheating, period. If your hardware can't handle high tessellation well, stop calling it a DX11 or DX12 card.

At first I was thinking to myself, that's an odd logical leap to make from an openly disclosed optimisation that is clearly already an acknowledgement of lower tessellation performance to "...cheating, period..." and "...stop calling it a DX11 or DX12 card." Then I looked at your name and post history.

There is nothing dishonest about this, which is a prerequisite for calling anything "cheating". Also, would you care to link me the minimum tessellation performance specified by DX11 and DX12? Don't worry, I know you can't. In summary, nothing you've said is substantiated.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Some pics, click twice for full res, although these are original .pngs not .bmps:

Haven't seen GPU usage hit 99% (closer to 90%); CPU usage occasionally cracks 90%, and no huge FPS drops or spikes so far. Although if you want 120FPS you'd want an i7. For 60FPS an i5 is sufficient.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
At first I was thinking to myself, that's an odd logical leap to make from an openly disclosed optimisation that is clearly already an acknowledgement of lower tessellation performance to "...cheating, period..." and "...stop calling it a DX11 or DX12 card." Then I looked at your name and post history.

There is nothing dishonest about this, which is a prerequisite for calling anything "cheating". Also, would you care to link me the minimum tessellation performance specified by DX11 and DX12? Don't worry, I know you can't. In summary, nothing you've said is substantiated.

Hm, have you sent the memo to Microsoft and Khronos that the Tessellation Factors are not specified?
DX11 - 1 to 64: https://msdn.microsoft.com/en-us/library/windows/desktop/ff476340(v=vs.85).aspx#Tessellator_Stage
OpenGL - must be at least 64: https://www.khronos.org/opengl/wiki/Tessellation#Tessellation_levels

Changing the factor behind the back of the developer is cheating.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126

This is false. The max tessellation level must be at least 64, but there is no min level, which means you can go as low as you want for the effective level (with 1 effectively being no tessellation).

You can in fact also use levels below 1 for both DX12 and OpenGL, which in both cases results in the tessellation patch being culled.
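
To put that in concrete terms, here is a loose model of per-patch factor handling (plain Python as a conceptual sketch, not real API or driver code; the exact clamp and cull thresholds depend on the API and partitioning mode): the hardware has to accept factors up to at least 64, invalid or very small factors drop the patch, and everything in between is simply the effective subdivision level.

import math

MAX_SUPPORTED_FACTOR = 64.0  # D3D11 and OpenGL both require support up to at least 64

def effective_factor(requested):
    # Loose per-patch model; exact rules depend on the API and partitioning mode.
    if math.isnan(requested) or requested <= 0.0:
        return None  # patch is culled entirely
    return min(max(requested, 1.0), MAX_SUPPORTED_FACTOR)

print(effective_factor(16.0))  # 16.0 - tessellated at the requested level
print(effective_factor(1.0))   # 1.0  - effectively untessellated
print(effective_factor(-1.0))  # None - patch discarded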

Then there's of course the even bigger issue that you completely misread Blockheadfan's post, seeing as he never said that tessellation factors weren't defined; he said that tessellation performance wasn't defined.
 
Last edited:
Reactions: Krteq

Blockheadfan

Member
Feb 23, 2017
33
55
61
Hm, have you sent the memo to Microsoft and Khronos that the Tessellation Factors are not specified?
DX11 - 1 to 64: https://msdn.microsoft.com/en-us/library/windows/desktop/ff476340(v=vs.85).aspx#Tessellator_Stage
OpenGL - must be at least 64: https://www.khronos.org/opengl/wiki/Tessellation#Tessellation_levels

Changing the factor behind the back of the developer is cheating.

Factors don't equal performance, and even if they did, AMD are entirely capable of using 64, so that wouldn't make them somehow non-compliant with DX11 or DX12, as the person I was replying to implied.

Again with the cheating! Nothing is happening behind anyone's back or in a dishonest way; if anyone has moral misgivings about changing a tessellation factor in Mass Effect then they are free to turn the optimisation off. I've heard a few Nvidia owners voice the opinion that they would also enjoy the option to gain some performance for a negligible visual difference.
 
Reactions: Krteq

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Haven't seen GPU usage hit 99% (closer to 90%); CPU usage occasionally cracks 90%, and no huge FPS drops or spikes so far. Although if you want 120FPS you'd want an i7. For 60FPS an i5 is sufficient.

I bet you're missing that 5930K! This game hammers the CPU at times, especially when loading an area. I've seen my CPU usage climb to 100% momentarily, and I'm on 8 cores now!
 

Samwell

Senior member
May 10, 2015
225
47
101
Again with the cheating! Nothing is happening behind anyone's back or in a dishonest way; if anyone has moral misgivings about changing a tessellation factor in Mass Effect then they are free to turn the optimisation off. I've heard a few Nvidia owners voice the opinion that they would also enjoy the option to gain some performance for a negligible visual difference.

It's not dishonest, that's true, but you have to check in every review whether they really turned off the optimization for fairness reasons. It should be off by default with the possibility to turn it on, not the other way around. Looking at the scaling numbers of the 480, tessellation performance was fixed and AMD loses the same amount of performance as Nvidia with the 1060. It's just a problem in AMD's older GPUs.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
It's not dishonest, that's true, but you have to check in every review whether they really turned off the optimization for fairness reasons. It should be off by default with the possibility to turn it on, not the other way around. Looking at the scaling numbers of the 480, tessellation performance was fixed and AMD loses the same amount of performance as Nvidia with the 1060. It's just a problem in AMD's older GPUs.

Honestly I'm not sure I agree with this.

AMD's primary responsibility is to provide its consumers with the best experience possible, not to provide tech sites with the best experience possible. Obviously for enthusiasts like the people who frequent this site it might be a better experience to have it off by default, since we generally don't mind fiddling around with stuff like that and we obviously read the various reviews. However, for the 90%* or more of AMD's consumer base that does not belong to the enthusiast category, never reads any reviews, and doesn't have the technical know-how to turn this setting on/off in the drivers, the best experience would actually be to have it turned on by default.

So from AMD's point of view it then becomes a question of whether or not they should cater to the 90% or to the 10% plus tech sites. And as long as they are open about what they are doing and provide the possibility to turn it off, then I don't blame them for going with the 90%.

*This number is of course just a guess, since I don't really know how many belong in either category, but I think it's safe to say that the non-enthusiasts greatly outnumber the enthusiasts.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
This is false. The max tessellation level must be at least 64, but there is no min level, which means you can go as low as you want for the effective level (with 1 effectively being no tessellation).

You can in fact also use levels below 1 for both DX12 and OpenGL, which in both cases results in the tessellation patch being culled.

Developers can choose whatever factors they want. The APIs specify what the hardware has to support. In both cases it is a factor of 64. AMD is reducing the maximum factor from 64x to whatever they want. In Mass Effect it is 8x instead of 64x.

Then there's of course the even bigger issue that you completely misread Blockheadfan's post, seeing as he never said that tessellation factors weren't defined; he said that tessellation performance wasn't defined.

Performance is related to the applied factor. A higher factor means more information and more data transfer. Why do you think AMD is reducing the factor when performance doesn't matter?

APIs don't specify performance. So using "performance" to justify AMD's cheating is just misleading.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Developers can choose whatever factors they want. The APIs specify what the hardware has to support. In both cases it is a factor of 64. AMD is reducing the maximum factor from 64x to whatever they want. In Mass Effect it is 8x instead of 64x.

AMD is not reducing the max factor supported by their hardware, they are reducing the effective factor run by the game (and again there is no minimum effective tessellation factor specified by any of the APIs), so they are still perfectly in compliance with the APIs.
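
As an illustration of what such a driver-side override amounts to (a hypothetical sketch, not AMD's actual driver logic; the 8.0 cap is just the value discussed in this thread), the cap simply lowers the factor the application requested before it reaches the tessellator, and the result still sits inside the range the API requires the hardware to support:

DRIVER_TESS_CAP = 8.0  # e.g. the "AMD optimized" tessellation setting discussed here

def apply_driver_override(app_requested_factor):
    # Hypothetical override: never raise the factor, only cap it.
    return min(app_requested_factor, DRIVER_TESS_CAP)

print(apply_driver_override(16.0))  # 8.0 - capped
print(apply_driver_override(4.0))   # 4.0 - below the cap, left untouched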

Performance is related to the applied factor. A higher factor means more information and more data transfer. Why do you think AMD is reducing the factor when performance doesn't matter?

APIs don't specify performance. So using "performance" to justify AMD's cheating is just misleading.

You seriously need to work on your reading comprehension.

No one said performance and tessellation factor weren't related, nor did anyone say that performance didn't matter (quite the contrary, actually); people are simply saying that it doesn't constitute cheating.

Secondly, no one used performance to justify "cheating". What actually happened is that nvgpu tried to argue that tessellation performance determines whether or not a GPU can be called compliant with the API ("If your hardware can't handle high tessellation well, stop calling it a DX11 or DX12 card."). Blockheadfan then pointed out that the APIs have no performance level specified. You then completely misread Blockheadfan's post and started talking about tessellation factor specifications.

In ME:A the max tess factor used by the devs is 16x, not 64x.

Those cheating scumbags.
 
Last edited:

Samwell

Senior member
May 10, 2015
225
47
101
Honestly I'm not sure I agree with this.

AMD's primary responsibility is to provide its consumers with the best experience possible, not to provide tech sites with the best experience possible. Obviously for enthusiasts like the people who frequent this site it might be a better experience to have it off by default, since we generally don't mind fiddling around with stuff like that and we obviously read the various reviews. However, for the 90%* or more of AMD's consumer base that does not belong to the enthusiast category, never reads any reviews, and doesn't have the technical know-how to turn this setting on/off in the drivers, the best experience would actually be to have it turned on by default.

So from AMD's point of view it then becomes a question of whether or not they should cater to the 90% or to the 10% plus tech sites. And as long as they are open about what they are doing and provide the possibility to turn it off, then I don't blame them for going with the 90%.

*This number is of course just a guess, since I don't really know how many belong in either category, but I think it's safe to say that the non-enthusiasts greatly outnumber the enthusiasts.

In my opinion AMD's first responsibility is to show the game the way the developer wants it, even if the difference is practically not visible. Maybe in some special cases in the game you can see it. After that they can give people the option to reduce stuff and enhance performance. With your argumentation it would also be OK to leave other details out, because they're getting "the best" experience. Hey, why doesn't Nvidia, instead of living with their bad async performance, just skip these effects and enhance the experience? Anti-aliasing, come on, if the game wants 8xAA we give it 4xAA because no one sees it. In the end we live in a downward spiral where Nvidia and AMD provide you with what they think is the best experience by reducing details. We already had this during the DX8/DX9 days and it wasn't nice.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
In my opinion AMD's first responsibility is to show the game the way the developer wants it, even if the difference is practically not visible.

Why in the world would AMD have a greater responsibility towards the developers than their customers? Unless the developers are paying AMD money like AMD's customers are, I really don't see how this makes any sense.

Obviously AMD would want to stay on friendly terms with developers, but I very much doubt anyone at Bioware is even remotely bothered by this.

Maybe in some special cases in the game you can see it. After that they can give people the option to reduce stuff and enhance performance. With your argumentation it would also be OK to leave other details out, because they're getting "the best" experience. Hey, why doesn't Nvidia, instead of living with their bad async performance, just skip these effects and enhance the experience? Anti-aliasing, come on, if the game wants 8xAA we give it 4xAA because no one sees it. In the end we live in a downward spiral where Nvidia and AMD provide you with what they think is the best experience by reducing details. We already had this during the DX8/DX9 days and it wasn't nice.

The "best" experience is obviously a mix of performance and image quality, and not just purely performance, and exactly what kind of mix of the two a given individual prioritizes is highly subjective. As such if the majority of Nvidia's costumers prefer the async effects to be skipped or the AA lowered in the name of performance, then I don't see the problem with Nvidia doing so (again this obviously assumes that Nvidia makes it clear that they are doing so and provides an on/off option, similar to what AMD is doing here).

In this particular case I think it's safe to say that the visual impact is minimal, whereas the performance impact is quite significant, as such I would be surprised if the majority of AMD's costumers didn't prefer running the game with the tweak turned on.

I really don't see any issue with Nvidia and AMD providing us with what they think is the best mix of image quality and performance for a given piece of hardware, as long as they are open about it and don't take away control. In fact this is arguably exactly what Nvidia's Geforce Experience optimization does. The difference between this and the DX8/DX9 days, is that Nvidia and AMD wasn't always open about their tweaks back then and you couldn't necessarily turn them on and off.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
AMD and Nvidia should by default provide exactly what the devs asked for. If I run the game at max settings I expect exactly the same result whoever's graphics card I am using. That is equally how the game should be tested - if it says max settings then it should be max settings.

Then it's my choice to lower settings as I see fit to bring my framerate up to an acceptable level - AMD and Nvidia are both welcome to suggest tweaks that cause the minimal visual impact for the greatest gains.

As for automatically setting tessellation to 8x, well, if it makes no difference why did the devs choose 16x? There are plenty of settings I can barely see the difference on, but that doesn't make it OK for my GPU maker to lower the settings behind my back.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
To some of those who want to complain about cheating - people here don't take you seriously because you selectively complain only when one particular company does things like this, and keep your lips sealed any other time. Don't shoot the messenger. But this is why no one can believe you.

Personally, it seems clear to me that the line between optimization and quality degradation is very fine. IMO if it increases performance for zero IQ reduction, it is optimization. If it reduces quality, then it is "turning down settings." This seems to be a very small visual hit, but still perceptible. So I'd say it's worth doing, but also worth being aware of and turning off if you want that tess factor back.

AMD and Nvidia should by default provide exactly what the devs asked for. If I run the game at max settings I expect exactly the same result whoever's graphics card I am using. That is equally how the game should be tested - if it says max settings then it should be max settings.

I would like to agree, but both AMD and nVidia already outright fully replace shaders through the driver and we encourage them to do so because it improves performance. I don't see how this is much different.
 
Last edited:
Reactions: Bacon1

nurturedhate

Golden Member
Aug 27, 2011
1,761
757
136
If we REALLY wanted to go down that road then we should only be using the launch drivers plus bug fixes for games, and not all of these game ready drivers that change how several things are handled. Bring it all back to exactly how the devs wrote the code. But that wouldn't be fair to Nvidia, would it? Those game ready drivers are great, aren't they?

A handful of one-sided posters have turned a thread about a game into accusations that AMD isn't DX11/12 compliant and engages in outright cheating. Bravo!
 