Various Wolfenstein II Benchmarks


Krteq

Senior member
May 22, 2015
993
672
136
Yeah, reactivated A/C is doing some magic for GCN cards, but something still seems borked in the worst-case scenario for them; nothing serious, though.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Yeah, reactivated A/C is doing some magic for GCN cards, but something still seems borked in the worst-case scenario for them; nothing serious, though.

There's no way a gain that large is solely from AC. It might have to do with a combination of factors, like 16-bit precision and A/C. It will be impossible to really get an accurate answer though, as this game has so many specific optimizations for Vega.

In fact, I hope AMD paid Bethesda well for this level of optimization, because the lack of stability optimization for NVidia GPUs at launch wasn't very good (and still isn't even now), and it cost them a lot of refunded games and bad user reviews, since the majority of the PC gaming market is NVidia.

Computerbase also says AC still needs a new driver for Nvidia

Yeah, AC has been deactivated on all NVidia cards. This ticked me off, because I haven't had any issues with stability due to turning AC on. Hopefully NVidia gets a new driver out the door ASAP, because the performance hit is actually noticeable to me, having played most of the game with it on since it launched.

This game is really a showcase for AC, as it uses it VERY heavily. AMD users can no longer say that NVidia's AC solution is half baked or software driven.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
The claims about Nvidia's AC solution being half baked go back to the pre-Pascal GPUs though. Maxwell couldn't even enable AC in Computerbase's first test.

I personally couldn't care less how Nvidia's solution works as long as it nets a gain in performance. Still using Kepler myself though, it's a bit annoying to watch AMD's equivalents usually age better.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The claims about Nvidia's AC solution being half baked go back to the pre-Pascal GPUs though. Maxwell couldn't even enable AC in Computerbase's first test.

Still, quite a few people claimed that Pascal's solution was half baked as well, specifically that it was rooted in software, which isn't true. This rumor was spread mostly by Mahigan, if you recall.

I personally couldn't care less how Nvidia's solution works as long as it nets a gain in performance. Still using Kepler myself though, it's a bit annoying to watch AMD's equivalents usually age better.

The notion that AMD GPUs age better is kind of true, and there are reasons for this. The biggest reason of course is that AMD GPUs share a similar architecture with the GPUs in the consoles. That's a big advantage, but it's taken a long time to manifest. Also, not every vendor is willing to up the ante when it comes to enabling console-style optimizations for AMD GPUs. NVidia, remarkably, is still easily capable of competing due to their amazing software scheduler.

The second reason is that AMD GPUs always take a long time over their life cycle to reach optimal performance through driver updates, which creates the illusion that they are aging better when, in reality, it's just taking longer for their architecture to peak. NVidia is much faster than AMD when it comes to pushing driver updates that exploit new architectures.

I'd wager that Vega at the end of its life cycle should be solidly outperforming the GTX 1080 despite being mostly slower today. GTX 1080 is already topped out, but Vega still has room to grow. It won't reach GTX 1080 Ti levels of performance though.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Still, quite a few people claimed that Pascal's solution was half baked as well, specifically that it was rooted in software, which isn't true. This rumor was spread mostly by Mahigan, if you recall.



The notion that AMD GPUs age better is kind of true, and there are reasons for this. The biggest reason of course is that AMD GPUs share a similar architecture with the GPUs in the consoles. That's a big advantage, but it's taken a long time to manifest. Also, not every vendor is willing to up the ante when it comes to enabling console-style optimizations for AMD GPUs. NVidia, remarkably, is still easily capable of competing due to their amazing software scheduler.

The second reason is that AMD GPUs always take a long time over their life cycle to reach optimal performance through driver updates, which creates the illusion that they are aging better when, in reality, it's just taking longer for their architecture to peak. NVidia is much faster than AMD when it comes to pushing driver updates that exploit new architectures.

I'd wager that Vega at the end of its life cycle should be solidly outperforming the GTX 1080 despite being mostly slower today. GTX 1080 is already topped out, but Vega still has room to grow. It won't reach GTX 1080 Ti levels of performance though.

Whilst I agree for the most part, I would swap the importance of drivers vs. console influence. The AMD "fine wine" thing really came to the fore as GCN 1.0 products aged; you can even notice it within the first 6-9 months of Tahiti. And that is much too small a time frame for the consoles' influence to come through. So I think AMD's driver team has a lot to answer for.

That being said, and what does tie into the console thing, I believe AMD's GCN GPUs are fundamentally more flexible and "forward looking" in the sense of pure hardware capabilities. Anyone is welcome to disagree, but as far as I'm concerned it's not a surprise game devs and driver teams take some time to catch up.

And Vega 64 is now averaging over 25% faster than a 1080 with patch 2 in this game. So it's a surprise to see the easy claim it "won't reach GTX 1080 Ti levels of performance though". It's not that I'm a believer in the upcoming magic drivers or anything. Just saying this is a pretty bold claim... Especially given Vega is already around "GTX 1080 Ti levels of performance" with this game.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Whilst I agree for the most part, I would swap the importance of drivers vs. console influence. The AMD "fine wine" thing really came to the fore as GCN 1.0 products aged; you can even notice it within the first 6-9 months of Tahiti. And that is much too small a time frame for the consoles' influence to come through. So I think AMD's driver team has a lot to answer for.

The console influence has taken a lot longer to manifest, but the effects are now indisputable. I mean, look at Wolfenstein 2 and Doom. Both games use shader intrinsic functions to increase performance substantially, something that's only really possible because of the consoles.
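
For anyone wondering what "shader intrinsics" actually look like from the application side: on Vulkan they are exposed as AMD device extensions that a renderer queries and enables before compiling shaders against them. Below is a minimal sketch of that query step; the three extension names are real Vulkan extensions commonly associated with GCN intrinsics, but whether id Tech 6 uses exactly this set is an assumption on my part.

```cpp
// Sketch: list which AMD "shader intrinsics" extensions a Vulkan device exposes.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    // Commonly cited AMD intrinsics extensions (illustrative selection).
    const char* wanted[] = {
        "VK_AMD_shader_ballot",          // cross-lane (wave) operations
        "VK_AMD_shader_trinary_minmax",  // min3/max3/med3 style instructions
        "VK_AMD_gpu_shader_half_float",  // FP16 arithmetic in shaders
    };

    for (VkPhysicalDevice gpu : gpus) {
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        for (const char* name : wanted) {
            bool found = false;
            for (const auto& e : exts)
                if (std::strcmp(e.extensionName, name) == 0) { found = true; break; }
            std::printf("%-32s %s\n", name, found ? "supported" : "not supported");
        }
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

On GeForce hardware these AMD extensions simply don't show up, which is part of why a renderer tuned around them behaves so differently across the two vendors.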

That being said, and what does tie into the console thing, I believe AMD's GCN GPUs are fundamentally more flexible and "forward looking" in the sense of pure hardware capabilities. Anyone is welcome to disagree, but as far as I'm concerned it's not a surprise game devs and driver teams take some time to catch up.

I would agree with this statement. On the NVidia side, Kepler was a particularly short lived architecture because of how it sacrificed compute performance at a time when the use of compute in games was beginning to skyrocket. NVidia corrected themselves very fast though with Maxwell, and even more with Pascal. So while I would agree that AMD GPUs are typically designed to achieve a long shelf life (while NVidia design their GPUs more for the present), this can be a double edged sword. Look at asynchronous compute for instance. AMD has had this functionality built into their GPUs since Tahiti, but only very recently is it being used to any great effect. So for however many years, all the transistors used for the ACEs were just sitting there taking up space on the die.
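
To make "asynchronous compute" a bit more concrete: in Vulkan the game looks for a compute-capable queue family separate from the graphics family and submits compute work (particles, post-processing and so on) there, so it can overlap with rendering; on GCN those queues are what the ACEs service. A minimal sketch of that queue selection follows, assuming a generic Vulkan renderer rather than anything specific to id Tech 6.

```cpp
// Sketch: find a dedicated compute queue family for async compute work.
#include <vulkan/vulkan.h>
#include <cstdint>
#include <vector>

// Returns the index of a compute-capable queue family that is not the graphics
// family, or -1 if the device only offers a combined graphics+compute queue.
int findAsyncComputeQueueFamily(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, nullptr);
    std::vector<VkQueueFamilyProperties> families(count);
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, families.data());

    for (uint32_t i = 0; i < count; ++i) {
        const bool compute  = (families[i].queueFlags & VK_QUEUE_COMPUTE_BIT) != 0;
        const bool graphics = (families[i].queueFlags & VK_QUEUE_GRAPHICS_BIT) != 0;
        if (compute && !graphics)
            return static_cast<int>(i);  // dedicated compute family found
    }
    return -1;  // compute shares the graphics queue; no true async path
}
```

How much that overlap actually buys you then comes down to how the hardware and driver schedule the two queues, which is exactly where the AMD and NVidia implementations differ.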
And Vega 64 is now averaging over 25% faster than a 1080 with patch 2 in this game. So it's a surprise to see the easy claim it "won't reach GTX 1080 Ti levels of performance though". It's not that I'm a believer in the upcoming magic drivers or anything. Just saying this is a pretty bold claim... Especially given Vega is already around "GTX 1080 Ti levels of performance" with this game.

That's only in the best case scenario. If you look at the worst case scenario, then the GTX 1080 is still ahead of Vega 64, although its lead has diminished somewhat due to having asynchronous compute disabled in the game.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
That's only in the best case scenario. If you look at the worst case scenario, then the GTX 1080 is still ahead of Vega 64, although its lead has diminished somewhat due to having asynchronous compute disabled in the game.
It's not "best case scenario", it's average fps.

If you mean this game is a "best case scenario" then I still think you could be wrong. It's obviously the best case for Vega atm, but Vega still has some features to enable and optimise as well, and there are plenty more games/GPUs coming out in the future. Some nV cards might catch up with newer drivers/patches, I don't know. But I'm not the one who's claiming to know the future:
I'd wager that Vega at the end of its life cycle should be solidly outperforming the GTX 1080 despite being mostly slower today. GTX 1080 is already topped out, but Vega still has room to grow. It won't reach GTX 1080 Ti levels of performance though.
Vega 64 is already averaging in the same ballpark as a 1080Ti in this game atm. So I think your claims of certainty are a bit rich. Personally I don't see Vega 64 matching a 1080Ti across the board any time soon, if ever, but I think it's a bit presumptuous to discount the possibility. Let alone make a claim, like you have, that they will never (in any situation?) have equal performance.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's not "best case scenario", it's average fps.

Evidently you didn't read the review. Computerbase.de tested several scenarios, one in which the Radeon has a large lead (best case scenario), and the other in which the Geforce cards have a large lead (worst case scenario). So the game is not universally faster on Radeons.

Vega 64 is already averaging in the same ballpark as a 1080Ti in this game atm. So I think your claims of certainty are a bit rich. Personally I don't see Vega 64 matching a 1080Ti across the board any time soon, if ever, but I think it's a bit presumptuous to discount the possibility. Let alone make a claim, like you have, that they will never (in any situation?) have equal performance.

Vega 64 is not averaging the same performance as the GTX 1080 Ti in this game. Read the entire review instead of just looking at a single graph. Also, I don't think it's unreasonable to say that Vega 64 will never reach GTX 1080 Ti levels. The GTX 1080 Ti FE is a good 30% faster than Vega 64 on average. No amount of driver optimizations is going to make up such a large gap. And AMD doesn't have the money to finance all of the special optimizations that the id Tech team did for Wolfenstein 2. These games are going to be a rarity compared to the vast majority of titles that use DX11, or even DX12.
 

Krteq

Senior member
May 22, 2015
993
672
136
Evidently you didn't read the review. Computerbase.de tested several scenarios, one in which the Radeon has a large lead (best case scenario), and the other in which the Geforce cards have a large lead (worst case scenario). So the game is not universally faster on Radeons.
Yes, it is. Check some side-by-side performance review videos on YouTube, there are tons of those and there is a variety of scenes tested in them. Still Radeons are faster across the board.

And there is nothing like "best case scenario" in CB.de tests.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Vega 64 is not averaging the same performance as the GTX 1080 Ti in this game. Read the entire review instead of just looking at a single graph.
OK, I admit they didn't take an fps reading over the course of the entire game. Shame on them. What they did was take a common "demanding" scene and graph the averages. From the review (Google translated, of course):
The first test scene corresponds to a demanding scenario, as it occurs in the game again and again. It is a sequence inside and outside a building in combination with various particle effects. The second scene, on the other hand, is a worst-case scenario. In the first hour and a half of the game, there has been only one sequence of this kind.
Now I think we arrive at the problem:
Vega 64 is not averaging the same performance as the GTX 1080 Ti in this game.
Yes. Yes it is. Vega 64's average performance is about the same as a 1080 Ti's in this game at this time. It just is.

Roughly, at least. I can't see any direct comparisons, but we can deduce that Vega 64 is easily averaging within 10% of it with the current patch. Heck, since nV products have lost a little performance with the most recent patch, I would bet Vega 64 is within a couple of percent.
The GTX 1080 Ti FE is a good 30% faster than Vega 64 on average. No amount of driver optimizations is going to make up such a large gap.
I don't want to accuse you of grasping at straws, or even building straw men out of them, but nobody thought just drivers would make up ~30% performance. Other things can make up ~30% performance, like this game engine...

And if a game engine can make Vega 64 over 25% faster than a 1080, then how can you deny the possibility that a different game engine and/or drivers can make it 30% faster? It doesn't need much (if any) more performance to match a 1080 Ti (in this situation), and we've already agreed in theory about fine wine. So it seems to me you're sticking your head in the sand.

Edit: I probably should have checked before, but in the initial testing the 1080 Ti performed anywhere from 15-26 percent faster than the 1080. So, assuming that proportion remains the same, Vega 64 is already faster than a 1080 Ti in this situation. Obviously this is the best case we have for Vega at the moment, but why do I have to bother correcting people who insist Vega will never match a 1080Ti in gaming performance??
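
To put rough, purely illustrative numbers on that: if the GTX 1080 averages 100 fps in a given scene, the launch-day figures would put the 1080 Ti at roughly 115-126 fps, while a Vega 64 running 25%+ ahead of the 1080 lands at about 125 fps or more, i.e. at or above the bottom of the 1080 Ti's range.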
 
Last edited:
Reactions: Det0x

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Yes, it is. Check some side-by-side performance review videos on YouTube, there are tons of those and there is a variety of scenes tested in them. Still Radeons are faster across the board.

A test that was uploaded a week ago which compares Vega 64 to GTX 1080 Ti and GTX 1080 in Wolfenstein 2:


Vega manages to outclass the GTX 1080 for the most part, but it still doesn't catch up to the GTX 1080 Ti across most of those scenes, even with asynchronous compute disabled on the latter. Also, this game is still brand new and has a bunch of optimizations for Vega off the bat. Geforce optimizations will be coming down the pipeline assuredly. It will be similar to Doom, where the Geforce GPUs will get faster over a longer period of time.

And there is nothing like "best case scenario" in CB.de tests.

Well German isn't my native language, but it seemed to me like that was their intent.
 
Reactions: tviceman

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Yes. Yes it is. Vega 64's average performance is about the same as a 1080 Ti's in this game at this time. It just is.

Vega 64 is still well behind the GTX 1080 Ti from that YouTube video I posted above, and that was with asynchronous compute disabled for the GTX 1080 Ti. To be honest, this game's performance optimization will occur over a long period of time just like Doom.

When Doom's Vulkan patch first debuted, Fury X picked up a massive performance gain, almost enough to reach the GTX 1080. Later on though, subsequent patches and driver updates boosted NVidia's performance in the game substantially and the gap between the GTX 1080 and the Fury X became wider.

Roughly, at least. I can't see any direct comparisons, but we can deduce that Vega 64 is easily averaging within 10% of it with the current patch. Heck, since nV products have lost a little performance with the most recent patch, I would bet Vega 64 is within a couple of percent.

The performance loss can be explained by the disabling of asynchronous compute. Also, NVidia optimizations will eventually be implemented in the game as well. It will just take longer to manifest.

Other things can make up ~30% performance, like this game engine...

Do you honestly think this game engine will be representative of the large majority of other 3D engines on the market? If so, I have an igloo to sell you somewhere in the Sahara desert.

AMD PAID to have these optimizations implemented. So unless AMD has enough wealth to convince all the other major 3D engine developers to implement these changes, I don't see it happening across the board.
And if a game engine can make Vega 64 over 25% faster than a 1080, then how can you deny the possibility that a different game engine and/or drivers can make it 30% faster? It doesn't need much (if any) more performance to match a 1080 Ti (in this situation), and we've already agreed in theory about fine wine. So it seems to me you're sticking your head in the sand.

Um, we're talking about averages here. As magnificent as id Tech 6.5 is, it's proprietary tech that won't be adopted by developers outside of Bethesda. Also, as I've said repeatedly, this game's optimization is not finished. NVidia's turn will come in this game, and performance will increase much like it did with Doom.

Obviously this is the best case we have for Vega at the moment, but why do I have to bother correcting people who insist Vega will never match a 1080Ti in gaming performance??

Vega is using shader intrinsics, 16-bit floats and probably some other architectural optimizations, yet it still cannot completely close the gap to the GTX 1080 Ti. So if this is the best case scenario, it doesn't look so good for Vega matching the GTX 1080 Ti across the board in the future.
 
Reactions: tviceman
May 11, 2008
20,055
1,290
126
snip

Also, NVidia optimizations will eventually be implemented in the game as well. It will just take longer to manifest.


I sure hope not; then we'd get another Wolfenstein: The New Order, which crashed often on AMD cards and needed workarounds that solved the crashes but also slowed down performance.
id Tech 5 was riddled with Nvidia-specific optimizations that caused many problems.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Do you honestly think this game engine will be representative of the large majority of other 3D engines on the market? If so, I have an igloo to sell you somewhere in the Sahara desert.

AMD PAID to have these optimizations implemented. So unless AMD has enough wealth to convince all the other major 3D engine developers to implement these changes, I don't see it happening across the board.

AMD itself has described their Shader Intrinsics as bringing the GCN optimizations from the consoles over to PC. Both the PS4 Pro and Xbox One X support FP16. If other engines start supporting FP16 and the other new features, it's only to be expected that AMD will try getting them supported in the PC versions as well.
It's different for Nvidia, since their specific features would require all-new code and be PC only (it would be sweet if we got news about Switch optimizations benefiting Nvidia on PC though).
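
For context on why FP16 matters here: "rapid packed math" on Vega executes one instruction across a pair of half-precision values held in a single 32-bit register, which is where the doubled FP16 rate comes from. Below is a rough sketch of that packed layout; the float-to-half conversion is deliberately simplified (no rounding or denormal handling) and is only meant to illustrate the bit layout, not how any engine actually does it.

```cpp
// Sketch: two FP16 values packed into one 32-bit word, the unit "packed math" operates on.
#include <cstdint>
#include <cstdio>
#include <cstring>

uint16_t floatToHalfApprox(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    uint16_t sign     = (bits >> 16) & 0x8000u;                    // move sign to bit 15
    int32_t  exponent = int32_t((bits >> 23) & 0xFFu) - 127 + 15;  // rebias 8-bit -> 5-bit exponent
    uint16_t mantissa = (bits >> 13) & 0x3FFu;                     // keep top 10 mantissa bits
    if (exponent <= 0)  return sign;                               // flush tiny values to zero
    if (exponent >= 31) return sign | 0x7C00u;                     // clamp overflow to infinity
    return sign | uint16_t(exponent << 10) | mantissa;
}

int main() {
    // 1.5f -> 0x3E00 and 0.25f -> 0x3400; packed together they fill one 32-bit register.
    uint32_t packed = (uint32_t(floatToHalfApprox(1.5f)) << 16) | floatToHalfApprox(0.25f);
    std::printf("packed FP16 pair = 0x%08X\n", packed);
    return 0;
}
```

The catch, as discussed above, is that shaders have to be written (or annotated) to use half precision only where it's safe, and that porting work is exactly what someone has to pay for.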
 

CPUGuy

Junior Member
Nov 20, 2008
16
3
76
I wouldn't count your chickens just yet for this title. There are many more patches and drivers to come.
Primitive Shaders are not enabled nor working in either id Tech 6 or AMD's drivers.
Although RPM is enabled in id Tech 6, it's not enabled in AMD's drivers yet.

All we have is async compute from the beta patch, and it's only active with AMD's 17.10.3 driver while disabled for Nvidia's 388.13.

No offense, but I think that's a silly justification for the poor performance. I think that because the game is released and I've already spent my hard-earned money on it, as have many others. I shouldn't have to wait for patches that may or may never come to make the game playable or give it decent performance.

I guess what I'm saying is that if those graphical features and performance patches that you mentioned weren't ready by the scheduled release date, maybe they should've delayed the game until they were implemented.


A test that was uploaded a week ago which compares Vega 64 to GTX 1080 Ti and GTX 1080 in Wolfenstein 2:


Vega manages to outclass the GTX 1080 for the most part, but it still doesn't catch up to the GTX 1080 Ti across most of those scenes, even with asynchronous compute disabled on the latter. Also, this game is still brand new and has a bunch of optimizations for Vega off the bat. Geforce optimizations will be coming down the pipeline assuredly. It will be similar to Doom, where the Geforce GPUs will get faster over a longer period of time.



Well German isn't my native language, but it seemed to me like that was their intent.




Wege12, I'm sure they had their reasons for not delaying the game. But as you can see, those performance gains are there as evidence.

The problem is we are only seeing the tree instead of the forest before us. This sort of development is pioneering stuff, and it will take time before it's fully utilized. It should have been pioneered a few years ago, and I agree with that. Now we are finally starting to see the DX12/Vulkan train gaining momentum and finally leaving the station. Anyone thinking that DX11 will never die needs to stop smoking the stuff. Sure, you will have an outlier here and there, but overall the market is flexible.


A cause for concern is that, from what is being said about Volta, it's not specifically targeted at gaming like Maxwell and Pascal were. Also, from what the head of Nvidia is saying, they are focusing on non-gaming enterprises. I am not sure if Volta will ever be released as a viable upgrade from Pascal at established pricing, as its design will be something similar to what AMD is doing: non-gaming. Now you've got Intel involved. Three major players are all looking to get a piece of that non-gaming pie.

Addition:
https://twitter.com/idSoftwareTiago/status/913947204146073601
This is not some sort of dubious post about using A/C in games. We already know that other developers are doing this. What this says to me is that they see the obvious writing on the wall. The pic looks like job security. Gaming is going compute because there is profit to be made for GPUs in non-gaming enterprises. Raja leaving AMD and going to Intel is a huge sign that Intel wants to be part of this. Now, other game developers must see the writing on the wall.
 
Last edited:

dogen1

Senior member
Oct 14, 2014
739
40
91
I sure hope not; then we'd get another Wolfenstein: The New Order, which crashed often on AMD cards and needed workarounds that solved the crashes but also slowed down performance.
id Tech 5 was riddled with Nvidia-specific optimizations that caused many problems.

Nah, I think it was just that it was an OpenGL game doing unusual things for the time. Actually, that was Rage..., which IIRC had similar issues at launch for AMD. Strange they weren't able to do the same for Wolfenstein. I guess the fixes were more ad hoc for Rage instead of applying to any OpenGL games with similar problems.

That one doesn't.

It does actually, along with the other Polaris GPUs.
 