AMD Radeon RX Vega 64 and 56 Reviews [*UPDATED* Aug 28]


tential

Diamond Member
May 13, 2008
7,355
642
121
I've looked at two different reviews, TPU and AnandTech, and both show Vega 64 about 1% faster than a 1080 based on the 99th-percentile results (and averages where those aren't available). These are the only two reviews I've actually looked at, and I should clarify that neck-and-neck refers to 4K gaming. I haven't done a weighted comparison of any other sites yet.

4K FreeSync monitors are cheap, and 4K is so good for general PC usage (text looks really nice) that, personally, I won't game at a lower resolution; there's just no value in it for me. Both Vega and the 1080 are marginal at 4K at max/near-max details (the kind of settings sites like to test), and, unfortunately, sites don't test at settings I'd actually game at. But you go with what you can get, right?

In any event, 2 of 2 review sites say Vega 64 is faster at 4K than a 1080, even if just barely.

As a 4K FreeSync monitor owner, I see no point in throwing $600 at Vega for 4K FreeSync. I'll cut my losses, get a GTX 2070 and a G-Sync monitor instead, and actually have a viable upgrade path. AMD won't have anything to replace Vega 64 for a long time; 4K gaming is just ALWAYS going to be a game of large compromises with AMD moving forward.

I'd focus on 1440p or lower if I were using FreeSync.
 

Elixer

Lifer
May 7, 2002
10,376
762
126
As a 4K FreeSync monitor owner, I see no point in throwing $600 at Vega for 4K FreeSync. I'll cut my losses, get a GTX 2070 and a G-Sync monitor instead, and actually have a viable upgrade path. AMD won't have anything to replace Vega 64 for a long time; 4K gaming is just ALWAYS going to be a game of large compromises with AMD moving forward.

I'd focus on 1440p or lower if I were using FreeSync.
This is why Nvidia really sucks.
Instead of supporting the open standard, which they could very easily do, they force people into much more expensive purchases just so they can tie them to their hardware.

Of course, with the release of Vega 64 and it not being able to compete against the 1080 Ti, people with 4K monitors are left in a big bind.
They made a tangible investment in an expensive 4K monitor, and they are stuck unless they want to sell their stuff and start over again.
While you could still use an Nvidia card on the same monitor, it wouldn't do adaptive sync.

However, the new HDMI 2.1 specification does list Game Mode VRR (variable refresh rate, which enables a 3D graphics processor to display the image the moment it is rendered, for more fluid and better-detailed gameplay and for reducing or eliminating lag, stutter, and frame tearing) as a feature. So Nvidia might just be forced to support it after all, if they want to be HDMI 2.1 compliant.
http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Still not seeing where most sites show the 1080 faster than Vega 64 by more than Vega 56 is faster than a 1070.

The first review I read was HardOCP's, where Vega's performance was described as subpar (and MSAA really killed it).
https://www.hardocp.com/article/2017/08/14/amd_radeon_rx_vega_64_video_card_review/17

http://techreport.com/review/32391/amd-radeon-rx-vega-64-and-rx-vega-56-graphics-cards-reviewed/12
Has a clear lead for the 1080 over V64, and a tie between the 1070 and V56.

http://www.pcgamer.com/the-amd-radeon-rx-vega-56-and-vega-64-review/
Has a clear lead for the 1080 over V64, and a tie between the 1070 and V56 (yeah, same as Tech Report).

Techspot showed a best-case 2% win for Vega 56 over the 1070 across 25 games; given the behavior at Tech Report and PC Gamer, I think they would almost certainly show a bigger win in favor of the 1080.
https://www.techspot.com/review/1468-amd-radeon-rx-vega-56/page8.html

Techspot has a nice graph that really explains what is going on:
https://techspot-static-xjzaqowzxaoif5.stackpathdns.com/articles-info/1468/bench/GameComparison.png

Basically, there are big swings depending on which card a game favors. So reviews can swing back and forth depending on the included games, but it gets closer to a tie with a larger number of games, with the 1080 winning its battle overall and Vega 56 winning its contest. Overall, the 1080's edge looks bigger to me, but really it's moot: the individual game swings are huge in both directions, swamping the minuscule average one way or the other.
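To make that concrete, here's a minimal sketch of how the game list alone can swing the headline number. The per-game ratios are invented purely for illustration, not taken from any review:

```python
from statistics import geometric_mean

# HYPOTHETICAL per-game ratios (Vega 64 fps / GTX 1080 fps), invented
# to illustrate the aggregation effect only; not from any review.
ratios = {
    "Title A (AMD-leaning)": 1.15,
    "Title B (AMD-leaning)": 1.10,
    "Title C (neutral)":     1.00,
    "Title D (NV-leaning)":  0.90,
    "Title E (NV-leaning)":  0.85,
}

# Across all five games the cards land in a near dead heat (~0.99).
print(f"All five games: {geometric_mean(ratios.values()):.3f}")

# Drop the two NV-leaning titles and the verdict flips decisively (~1.08).
amd_suite = [v for k, v in ratios.items() if "NV" not in k]
print(f"AMD-leaning suite: {geometric_mean(amd_suite):.3f}")
```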
 
Reactions: dlerious

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
This is why Nvidia really sucks.
Instead of supporting the open standard, which they could very easily do, they force people into much more expensive purchases just so they can tie them to their hardware.

Of course, with the release of Vega 64 and it not being able to compete against the 1080 Ti, people with 4K monitors are left in a big bind.
They made a tangible investment in an expensive 4K monitor, and they are stuck unless they want to sell their stuff and start over again.
While you could still use an Nvidia card on the same monitor, it wouldn't do adaptive sync.

However, the new HDMI 2.1 specification does list Game Mode VRR (variable refresh rate, which enables a 3D graphics processor to display the image the moment it is rendered, for more fluid and better-detailed gameplay and for reducing or eliminating lag, stutter, and frame tearing) as a feature. So Nvidia might just be forced to support it after all, if they want to be HDMI 2.1 compliant.
http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

I think the need for adaptive refresh is vastly overrated. If you have a high-refresh monitor, lag is greatly reduced even with Vsync on. So if you can tolerate a tiny bit of lag, turn on Vsync and the tearing problem is solved.

OTOH, if you can't stand any lag, run with Vsync off. That has even less lag than G-Sync/FreeSync, and at higher refresh rates tearing is still less noticeable than it was on 60Hz monitors. Or run Vsync on for eye-candy, non-twitch games and Vsync off for competitive twitch games. It's what we did a few short years ago before FreeSync/G-Sync, and with higher refresh rates the downsides of just toggling Vsync on/off are much reduced.
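Back-of-the-envelope on why high refresh shrinks the Vsync penalty, assuming (simplistically) about one extra refresh interval of buffering:

```python
# Worst-case extra display latency with Vsync on, under the simplifying
# assumption of roughly one additional refresh interval of buffering.
for hz in (60, 100, 144, 240):
    print(f"{hz:>3} Hz -> refresh interval ~{1000 / hz:5.1f} ms")
# ~16.7 ms at 60 Hz vs ~6.9 ms at 144 Hz: the same "one frame" of Vsync
# buffering costs far less on a high-refresh panel.
```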

I agree it will be interesting to eventually see what Nvidia does with HDMI 2.1, though I expect this is a long way from mattering.
 
Reactions: tviceman
Mar 10, 2006
11,715
2,012
126
This is why Nvidia really sucks.
Instead of supporting the open standard, which they could very easily do, they force people into much more expensive purchases just so they can tie them to their hardware.

Or maybe there are tangible user experience benefits that NVIDIA is delivering by tightly controlling the hardware ecosystem w/ the proprietary G-Sync module, high standards for the panels, etc.

Just saying.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Or maybe there are tangible user experience benefits that NVIDIA is delivering by tightly controlling the hardware ecosystem w/ the proprietary G-Sync module, high standards for the panels, etc.

Just saying.
Not really; it just ensures the experience is only available at the top end. Top-end FreeSync monitors are comparable in performance and frequency range to top-end G-Sync ones, for $200 less, but midrange and low-end G-Sync monitors don't even exist.

Plus, you're acting like this is a zero-sum game, as if NVIDIA can't maintain high standards without moving off G-Sync. Of course they can. DisplayPort Adaptive-Sync technology is just as capable as G-Sync; NVIDIA could simply refuse to let makers brand a monitor as "G-Sync 2" if it doesn't meet certain standards. They don't have to follow AMD's lead; they can continue to support only the high end. The difference is, the monitor would work with any GPU and wouldn't need an extra ASIC. No one benefits from NVIDIA's redundant proprietary solution other than monitor makers and NVIDIA itself.

NVIDIA is free to do what's going to make them the most money, but I don't see how it can be argued that buyers benefit from it.
 
Reactions: kawi6rr and guachi

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,425
8,388
126
almost no *gaming* reason to buy a 1070 right now.
Can you just complete the thought with the rest of the explanation as to why this is the case?
TPU Charts or something.
I understand your reasoning, but the numbers make it a lot more compelling.
Do I really need to provide a chart showing that the 1080 is a bunch faster than the 1070?
 

tential

Diamond Member
May 13, 2008
7,355
642
121
This is why Nvidia really sucks.
Instead of supporting the open standard, which they could very easily do, they force people into much more expensive purchases just so they can tie them to their hardware.

Of course, with the release of Vega 64 and it not being able to compete against the 1080 Ti, people with 4K monitors are left in a big bind.
They made a tangible investment in an expensive 4K monitor, and they are stuck unless they want to sell their stuff and start over again.
While you could still use an Nvidia card on the same monitor, it wouldn't do adaptive sync.

However, the new HDMI 2.1 specification does list Game Mode VRR (variable refresh rate, which enables a 3D graphics processor to display the image the moment it is rendered, for more fluid and better-detailed gameplay and for reducing or eliminating lag, stutter, and frame tearing) as a feature. So Nvidia might just be forced to support it after all, if they want to be HDMI 2.1 compliant.
http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx
When does this spec go mainstream? I'm guessing not Volta, but the generation after it?
Same with Navi?
Or even the generation after that, actually, given how late AMD is to the party in supporting standards. Remember Fury and HDMI 2.0...
 
Last edited:
Reactions: jrphoenix

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
The first review I read was HardOCP's, where Vega's performance was described as subpar (and MSAA really killed it).
https://www.hardocp.com/article/2017/08/14/amd_radeon_rx_vega_64_video_card_review/17

http://techreport.com/review/32391/amd-radeon-rx-vega-64-and-rx-vega-56-graphics-cards-reviewed/12
Has a clear lead for the 1080 over V64, and a tie between the 1070 and V56.

http://www.pcgamer.com/the-amd-radeon-rx-vega-56-and-vega-64-review/
Has a clear lead for the 1080 over V64, and a tie between the 1070 and V56 (yeah, same as Tech Report).

Techspot showed a best-case 2% win for Vega 56 over the 1070 across 25 games; given the behavior at Tech Report and PC Gamer, I think they would almost certainly show a bigger win in favor of the 1080.
https://www.techspot.com/review/1468-amd-radeon-rx-vega-56/page8.html

Techspot has a nice graph that really explains what is going on:
https://techspot-static-xjzaqowzxaoif5.stackpathdns.com/articles-info/1468/bench/GameComparison.png

Basically, there are big swings depending on which card a game favors. So reviews can swing back and forth depending on the included games, but it gets closer to a tie with a larger number of games, with the 1080 winning its battle overall and Vega 56 winning its contest. Overall, the 1080's edge looks bigger to me, but really it's moot: the individual game swings are huge in both directions, swamping the minuscule average one way or the other.

I'm not sure why anyone would pass over Vega 56. Use AA that's not MSAA and you're much faster than the 1070.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I'm not sure why anyone would pass over Vega 56. Use AA that's not MSAA and you're much faster than the 1070.

I think you are reading that wrong. Most of the reviews aren't using MSAA and they are in a dead heat. MSAA is crippling AMD.

MSAA is one of the best AA modes. The post-processing AA modes tend to make everything blurry.
 
Reactions: tviceman and Phynaz

Elixer

Lifer
May 7, 2002
10,376
762
126
When does this spec go mainstream? I'm guessing not Volta, but the generation after it?
Same with Navi?
Or even the generation after that, actually, given how late AMD is to the party in supporting standards. Remember Fury and HDMI 2.0...
It was officially announced January 4, 2017. Too late for Vega, not too late for Navi, and it should be possible on Volta.
Though I'm not sure Nvidia wants to go anywhere near it; I bet they would drop HDMI and just go all-DP.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
When does this spec go mainstream? I'm guessing not Volta, but the generation after it?
Same with Navi?
Or even the generation after that, actually, given how late AMD is to the party in supporting standards. Remember Fury and HDMI 2.0...

I don't think Nvidia will do anything about it until it becomes a competitive disadvantage, which only happens after many screens support the feature and AMD supports it.
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
I think you are reading that wrong. Most of the reviews aren't using MSAA and they are in a dead heat. MSAA is crippling AMD.

MSAA is one of the best AA modes. The post-processing AA modes tend to make everything blurry.

I don't care about 8x or 4x MSAA. I'm sure I'm one of the few, but it does nothing for me. I haven't seen a review that doesn't use 4x MSAA.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I don't care about 8x or 4x MSAA. I'm sure I'm one of the few, but it does nothing for me. I haven't seen a review that doesn't use 4x MSAA.

Most reviews are just using defaults, so no MSAA. Most games default to one of the ugly post-processing AA modes because they are low-overhead, and some games don't support MSAA at all (BF1).

HardOCP always tweaks for the best visuals in their reviews, so they turned on MSAA in some games and noticed the big performance hit. I suspect no one else bothered, which is why no one else noticed it.
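For a rough sense of why MSAA is so much heavier than the post-process modes on a bandwidth-limited card, here's a simplified upper-bound sketch of render-target size. Real GPUs compress color/depth, so actual memory traffic is lower; this only illustrates the scaling:

```python
# Rough UPPER BOUND on render-target footprint at various MSAA levels,
# assuming 4-byte color + 4-byte depth per sample and no compression.
def target_mib(width, height, samples, bytes_color=4, bytes_depth=4):
    return width * height * samples * (bytes_color + bytes_depth) / 2**20

for samples in (1, 2, 4, 8):
    print(f"2560x1440 @ {samples}x MSAA: {target_mib(2560, 1440, samples):6.1f} MiB")
# Post-process AA (FXAA/CMAA) instead runs one full-screen pass over the
# resolved 1x image, which is why it is so cheap by comparison.
```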
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
MSAA is one of the best AA modes. The post-processing AA modes tend to make everything blurry.

CMAA doesn't blur. But then it's aimed at saving resources on bandwidth-constrained systems, so it won't look top-notch. FXAA is the worst for blurring.

For me, since I don't notice anything in action, any blur or performance loss is all negative, zero positives. There are a few games where aliased edges look particularly bad, but most don't.

If you are looking for the best, it's SSAA, assuming you are sensitive enough to notice the difference between AA implementations.

Vega is just mediocre, and the bad MSAA performance only adds to it. It's probably tied to the card behaving as if it were bandwidth-constrained (even though memory overclocks seem to do nothing).
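For reference, the paper-spec math behind the bandwidth debate; a minimal sketch using the commonly quoted launch figures (treat them as approximate):

```python
# Paper-spec memory bandwidth: bus width (bits) x data rate (Gbps/pin) / 8.
# Figures below are the commonly quoted launch specs, so approximate.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(f"RX Vega 64, 2048-bit HBM2 @ 1.89 Gbps: {bandwidth_gbs(2048, 1.89):.0f} GB/s")
print(f"GTX 1080, 256-bit GDDR5X @ 10 Gbps:    {bandwidth_gbs(256, 10):.0f} GB/s")
# ~484 vs ~320 GB/s on paper, which is part of why "Vega is bandwidth-
# starved" sits oddly with memory overclocks doing so little.
```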
 
Reactions: Phynaz

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Bandwidth constrained?
It does better at 4K, and memory OC does little, as you say.
All the facts point to it not being bandwidth constrained.
But apparently the memory not going up in frequency where it "should" automatically puts half the board members into schizophrenic mode.
Hype-train disease.

The color compression seems excellent to me. If anything, this actually seems to work fine in Vega and is a solid improvement even over Polaris.
Credit where credit is due.
That's crucial for a future APU, which surely is bandwidth constrained. So good news for APUs.
 

Tup3x

Golden Member
Dec 31, 2016
1,008
996
136
This is why Nvidia really sucks.
Instead of supporting the open standard, which they could very easily do, they force people into much more expensive purchases just so they can tie them to their hardware.

Of course, with the release of Vega 64 and it not being able to compete against the 1080 Ti, people with 4K monitors are left in a big bind.
They made a tangible investment in an expensive 4K monitor, and they are stuck unless they want to sell their stuff and start over again.
While you could still use an Nvidia card on the same monitor, it wouldn't do adaptive sync.

However, the new HDMI 2.1 specification does list Game Mode VRR (variable refresh rate, which enables a 3D graphics processor to display the image the moment it is rendered, for more fluid and better-detailed gameplay and for reducing or eliminating lag, stutter, and frame tearing) as a feature. So Nvidia might just be forced to support it after all, if they want to be HDMI 2.1 compliant.
http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx
Now would be the right time for them to introduce G-SYNC Lite, i.e. adaptive sync support. They could market it as cheaper G-SYNC: not necessarily as good, but better than nothing. AMD could not use the FreeSync display cost advantage as an excuse after that.
 
Reactions: Kuosimodo

Peicy

Member
Feb 19, 2017
28
14
81
I'm not sure why anyone would pass over Vega56. Use AA that's not MSAA and you're much faster than the 1070.

I don't care about 8x or 4x MSAA. I'm sure I'm one of the few, but it does me nothing. I haven't see a review that doesn't use 4x MSAA.
First things first: no, Vega 56 is not much faster when you don't use MSAA.
Also, could you be more specific when you write that you haven't seen a review that doesn't use 4x MSAA? Specific games? Specific reviews?

In addition, I wanted to point out that using Vega 64 or 56 as a 4K card is not the best idea. The entire argument I keep reading that "it's better at 4K" is highly flawed, because it simply does not provide the raw performance necessary. Sure, you can start reducing details, but it's highly unlikely that now, or in the future, you can maintain a stable >60 FPS with these cards. That goes for the 1070 and 1080 as well.
Of course there will be outliers such as Doom, but the vast majority of games will not be optimized like that.
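Rough pixel-rate arithmetic behind that point, ignoring everything but resolution:

```python
# Pixels that must be shaded per second to hold a steady 60 FPS; ignores
# geometry, CPU limits, etc., so purely illustrative.
for name, w, h in (("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)):
    print(f"{name}: {w * h * 60 / 1e6:6.0f} Mpix/s")
# 4K60 needs ~2.25x the pixel throughput of 1440p60, which is why a card
# that's comfortable at 1440p tends to be marginal at 4K max settings.
```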
 
Reactions: Muhammed

french toast

Senior member
Feb 22, 2017
988
825
136
I don't know what this Chill is.
Clearly it's AMD's responsibility how they run the card.
But why this obsession with reviews and how it's reviewed? Everyone can make their own evaluation for personal use, and everyone can change a simple power profile. This is not a BIOS change or similar, but a simple setting in the drivers.
There has to be a basic framework for conducting reviews; how else are you going to compare GPUs?
Of course all features and profiles should be sampled in a review, but the main review must be done in the standard, out-of-the-box mode, with overclocking and other features taken into consideration as well.
Power-saving mode makes Vega look better in perf/W, but then it falls behind a 16-month-old, smaller, and still much more efficient GPU. Whichever way you look at it, Vega looks bad. I think AMD should have made power saving the standard mode and launched with better drivers to pull ahead of the 1070.
It still would have been bad, but much better.

Vega's woes are compounded by numerous factors, many of which we have covered plenty. This is literally the very worst it could have been. I expect Navi to be much more competitive; I'm not so sure about Vega 11. The uarch has obvious bottlenecks and imbalances, and I'm not confident those will be fixed with Vega 11.
 

french toast

Senior member
Feb 22, 2017
988
825
136
Bandwidth constrained?
It does better at 4K, and memory OC does little, as you say.
All the facts point to it not being bandwidth constrained.
But apparently the memory not going up in frequency where it "should" automatically puts half the board members into schizophrenic mode.
Hype-train disease.

The color compression seems excellent to me. If anything, this actually seems to work fine in Vega and is a solid improvement even over Polaris.
Credit where credit is due.
That's crucial for a future APU, which surely is bandwidth constrained. So good news for APUs.
We don't have enough information yet, so I would hold off on judgement. It could be a memory controller issue, but HBCC is helping at higher resolutions?
Or it might be a geometry or fillrate issue? An unbalanced design? A hardware fault? Or maybe the hardware is as intended and they are waiting on game optimisations and better drivers over a three-year period, a long play so to speak, because they don't have the resources to do otherwise?

Who knows. I just want them to have it sorted by Navi.
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
First things first: no, Vega 56 is not much faster when you don't use MSAA.
Also, could you be more specific when you write that you haven't seen a review that doesn't use 4x MSAA? Specific games? Specific reviews?

In addition, I wanted to point out that using Vega 64 or 56 as a 4K card is not the best idea. The entire argument I keep reading that "it's better at 4K" is highly flawed, because it simply does not provide the raw performance necessary. Sure, you can start reducing details, but it's highly unlikely that now, or in the future, you can maintain a stable >60 FPS with these cards. That goes for the 1070 and 1080 as well.
Of course there will be outliers such as Doom, but the vast majority of games will not be optimized like that.

Look, I'm not going to get into a long, drawn-out argument over my personal preferences. I use 2xAA and don't require more. I game at 1080p on a 25-inch monitor.

I've seen reviews where they turned off MSAA in favor of CSAA, and the card ate the 1070 and rivaled the 1080.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
Or maybe there are tangible user experience benefits that NVIDIA is delivering by tightly controlling the hardware ecosystem w/ the proprietary G-Sync module, high standards for the panels, etc.

Just saying.
If my monitor were held to G-Sync standards, Vega wouldn't be an issue.
In fact, we saw this issue with the Vega launch and the dumb monitor it was bundled with.
Edit:
It's all worth the wait...
https://www.geforce.com/whats-new/a...tion-4k-trailer-nvidia-gameworks-enhancements
Final Fantasy.... Gameworks.... guess I only have one option anyway.

I honestly don't see a future for Vega if Nvidia ramps up Gameworks + DX12.
Look, I'm not going to get into a long, drawn-out argument over my personal preferences. I use 2xAA and don't require more. I game at 1080p on a 25-inch monitor.

I've seen reviews where they turned off MSAA in favor of CSAA, and the card ate the 1070 and rivaled the 1080.
Like I said before, the results are odd to talk about, given that it looks like there is a bug with AA. We need to hear more from AMD. The wait continues.
 
Last edited: