AMD Vega (FE and RX) Benchmarks [Updated Aug 10 - RX Vega 64 Unboxing]


cm123

Senior member
Jul 3, 2003
Just my opinion... but think of the Fury Pro Duo - even today it and the 1080 Ti are very close (as long as the software supports CrossFire).

So if AMD has Vega at roughly GTX 1080 speeds, why not repeat that? The delay could give AMD a couple of months to finish a Vega Pro Duo that was presumably coming anyway, and once again claim the fastest single card (which has been AMD's thing over the years) that beats them all. From an appearance standpoint that's better than what's on the table for AMD right now, unless they're about ready to pull a rabbit out of their hat.

This would also explain why AMD did a CrossFire Vega demo (even though that didn't seem to go over so well) - my opinion is they're working the bugs out of CrossFire and getting a Vega Pro Duo ready...
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
NVIDIA wouldn't delay Volta based upon how bad or good RX Vega is - they have their own internal cycle that is laid out two generations beyond Volta. It would cost them more money to delay Volta than to release it on schedule. Why wait for AMD to close the gap when NVIDIA could get another 30-50% performance increase over Pascal/Vega now? They wouldn't shoot themselves in the foot financially at all - they could simply charge even more for the 20xx series without losing money on existing 10xx inventory.
 

OatisCampbell

Senior member
Jun 26, 2013
Remember that this card seems to draw up to 300W. That's insane for this level of performance in this day and age.

With a good fan or AIO that shouldn't be a big concern outside the SFF market.

Which is the better upgrade for a gamer, assuming the same price for the video card, 180W for the 1080 and 300W for Vega?

http://www.ebay.com/itm/Acer-XR342C...634745&hash=item4655f58326:g:sTAAAOSwxKtX~4fL

Acer 75Hz 34" Freesync monitor, $780 shipped from newegg.

http://www.ebay.com/itm/Acer-Predat...456380&hash=item58e85fdb45:g:JKoAAOSwoydWo49O

Acer 100Hz 34" Gsync monitor, $1300 shipped from newegg.

$520 saved on the monitor seems a fair trade for the extra 120W, although 75Hz vs 100Hz isn't a direct comparison. However, neither is pro-level twitch-gamer fast, so I have to wonder how much that extra 25Hz matters for the average gamer.
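For perspective, a rough back-of-the-envelope on what that extra 120W actually costs in electricity - the 20 hours of gaming a week, $0.12/kWh and three years of use are my own assumptions, not figures from this thread:

# Rough sketch of the running-cost side of the trade-off.
# Assumed: 20 h of gaming per week, $0.12/kWh, 3 years of use.
extra_watts = 300 - 180              # ~120 W more under load for Vega
hours = 20 * 52 * 3                  # total gaming hours over 3 years
rate = 0.12                          # $ per kWh (assumed)
extra_kwh = extra_watts / 1000 * hours
print(f"~{extra_kwh:.0f} kWh extra, ~${extra_kwh * rate:.0f} in electricity")
# Roughly 374 kWh and ~$45 over three years - small next to the monitor savings.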

I'm gaming on a 3440x1440 60Hz monitor with a 1080 Ti. If Vega had been out when I was buying, I'd have a Vega and the Freesync panel.

Either Vega or a 1080 should handle 3440x1440 easily.
 
Reactions: Bacon1

Glo.

Diamond Member
Apr 25, 2015
From what I saw, Shrout tested the GPU in 4K Metro at 100% fan speed, and finally the temps were stable and the clock speed was stable at 1.6 GHz. At those settings the GPU was drawing 280W under load.

Well, this is definitely not an SFF GPU. However, this is actually good information. A 1.2 GHz RX Vega should be possible to squeeze into a 150W TDP, with just one 6-pin connector.
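A minimal sketch of the scaling logic behind that guess, assuming dynamic power goes roughly with f·V² and that voltage can come down along with the clocks - only the 280W-at-1.6GHz figure above is a measurement, everything else here is assumed:

# Crude power-scaling estimate for a down-clocked RX Vega.
# P ~ f * V^2; if V also falls roughly with f, then P ~ f^3 (optimistic case).
p_at_1600 = 280.0                      # W, from the Metro run above
ratio = 1.2 / 1.6
optimistic = p_at_1600 * ratio ** 3    # voltage drops with frequency
pessimistic = p_at_1600 * ratio        # frequency drop only, same voltage
print(f"1.2 GHz estimate: {optimistic:.0f} W to {pessimistic:.0f} W")
# ~118 W to ~210 W, so a single 6-pin (75 W slot + 75 W connector = 150 W)
# only works out if the voltage really can come down at the lower clock.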
 

cm123

Senior member
Jul 3, 2003
NVIDIA wouldn't delay Volta based upon how bad or good RX Vega is - they have their own internal cycle that is laid out two generations beyond Volta. It would cost them more money to delay Volta than to release it on schedule. Why wait for AMD to close the gap when NVIDIA could get another 30-50% performance increase over Pascal/Vega now? They wouldn't shoot themselves in the foot financially at all - they could simply charge even more for the 20xx series without losing money on existing 10xx inventory.

-------
I agree, however for different reasons - AMD releases Vega Pro Duo and takes top single card role, Nvidia follows and releases Volta Titan (which is coming no matter what) to reclaim crown, same as in the past.

-------

Just my opinion... but think of the Fury Pro Duo - even today it and the 1080 Ti are very close (as long as the software supports CrossFire).

So if AMD has Vega at roughly GTX 1080 speeds, why not repeat that? The delay could give AMD a couple of months to finish a Vega Pro Duo that was presumably coming anyway, and once again claim the fastest single card (which has been AMD's thing over the years) that beats them all. From an appearance standpoint that's better than what's on the table for AMD right now, unless they're about ready to pull a rabbit out of their hat.

This would also explain why AMD did a CrossFire Vega demo (even though that didn't seem to go over so well) - my opinion is they're working the bugs out of CrossFire and getting a Vega Pro Duo ready...
 

OatisCampbell

Senior member
Jun 26, 2013
From what I saw, Shrout tested the GPU in 4K Metro at 100% fan speed, and finally the temps were stable and the clock speed was stable at 1.6 GHz. At those settings the GPU was drawing 280W under load.

Well, this is definitely not an SFF GPU. However, this is actually good information. A 1.2 GHz RX Vega should be possible to squeeze into a 150W TDP, with just one 6-pin connector.
With this level of power, a blower-style fan on RX Vega would be pretty surprising.

Also, if it's another AIO card like the Fury X and can maintain 1600 MHz, a lot of the benches we've seen to date may be irrelevant, as most say the clock speed was variable with lots of throttling.
 

Malogeek

Golden Member
Mar 5, 2017
From what I saw, Shrout tested the GPU in 4K Metro at 100% fan speed, and finally the temps were stable and the clock speed was stable at 1.6 GHz. At those settings the GPU was drawing 280W under load.
He didn't test anything in between, such as simply raising the fan profile enough to compensate. He just set it to 100% and the card was maxing out in the mid-50s °C, which is overkill. The guy on YouTube the night before raised the fan profile and was able to sustain 1600 MHz without setting the fan to jet-engine levels, peaking around 82°C.
 
Reactions: ZGR

Malogeek

Golden Member
Mar 5, 2017
AMD Vega Frontier Edition is power-limited in Metro Last Light 4K. 100% fan speed and the frequency is < 1600 MHz: https://youtu.be/bhGAS_oGN3c?t=8472
You're right, so the higher workload likely caused the lower clock speed. This is why we needed better testing, thanks. So will the voltage regulators on the AIO version allow higher sustained clocks? The TDP by spec is 375W for the AIO vs 300W for the normal version.
 

Tup3x

Golden Member
Dec 31, 2016
Theoretically, from what I understand, the architecture improvements - load balancing, two times higher geometry throughput thanks to primitive shaders, tile-based rasterization, and the memory system - should make Vega twice as fast as it is right now.

Vega should fly with high-resolution (5K, 8K) textures thanks to all of those improvements. Why is it not happening? I have absolutely no idea. Either software or drivers.
That's what I'm wondering too. There are some great and welcome changes, but why don't they show? On paper everything looks great, but somehow that doesn't translate to real-world performance. This is basically a tweaked GCN, so I don't think drivers should be a big problem - at least not with this card at the moment. Or maybe they should just bin GCN for good and start from scratch.

The fact that they have water-cooled variants and didn't send Vega FE samples to the press makes me wonder... I have a bad feeling about this. If things don't improve a lot, this is going to be pretty bad for consumers. I really, REALLY hope this doesn't turn out to be their GeForce FX.
 

Peicy

Member
Feb 19, 2017
@Peicy
Is it confirmed that the air-cooled Vega FE can't keep 1600 MHz? Shouldn't custom models (better heatsink, 2-3 fans) be able to hit 1600-1650 MHz stable with 300W power draw? So if the 56 CU model has the same performance as the GTX 1080 (and the full 64 CU chip is 10% better) at the same price, only with a ~70W higher TDP, is that really such a big deal? How can 230W be amazingly efficient and 300W a total disaster?

This looks a lot like the Fiji launch, where the 56 CU Fury wasn't much faster than the 44 CU Grenada (~10% at 1440p), even though it had more than a 20% advantage in raw throughput (TFLOPS). So probably the biggest (not to say the only) mistake AMD made with Vega is that they didn't launch a mid-size chip first (44-48 CUs, 300-350 mm², probably GDDR) with performance around the GTX 1070 level, probably slightly faster. It would be cheaper to produce, test the architecture, gain more customers...
Nothing is confirmed at the moment. Sure, it's possible that a better-cooled version can clock to 1600 MHz (stable) and beyond - it could be the water-cooled one with a 375W TDP, for example.


What you are talking about has nothing to do with drivers.

Drivers can only expose the features to the application; it's the developer's job to use them. If they don't use them, the hardware will not benefit from them.

It is a form of hardware vendor lock-in: specific optimization for specific hardware. Only possible in Vulkan, DirectX 12 and Metal.

They did work with one company before the Vega release: Bethesda. Prey should be a Vega-optimized game. So far it baffles me that NOBODY thought about this and tested Prey.
I am aware of that. It doesn't make sense to me that something as basic as the tile-based rendering mode is a "feature" that works on a per-application basis, though; Nvidia seems to have it implemented in their driver. It's possible that's the case, but that would be a huge waste, since older games would not benefit, and if it's actually integral to getting this card to perform close to its specs... well, that doesn't sound good.

Same goes for their memory management. While Vulkan and DX12 give developers a lot more control, it's not wise to expect developers to do that much work for one architecture, hence the driver team comes into play again.

Primitive shaders - that's something that requires work from the developers. But there must be a way to get good performance out of this card without going near console levels of optimization for a single architecture.

Even if a lot of features need to be worked in by whoever creates the application... why didn't AMD work with Futuremark then? That's a major factor that does not add up.

In summary, it's possible that Vega needs a driver that does far more than AMD GPU drivers did in the past, plus application-specific optimization. Raja already hinted at the increased workload for the driver team. It would be baffling if the lowest-hanging fruit hadn't been picked by the driver team one month before release, though.
 

Despoiler

Golden Member
Nov 10, 2007
Some thoughts.

Remember not too long ago when Charlie @ S|A said this?

https://twitter.com/CDemerjian/status/872445675056898048
AMD made the right call to delay Vega, trust me here. They took a short term hit for long term gain and credibility. It took guts to do.

We now have Vega FE released first, and the performance is definitely not where it needs to be for gaming. AMD has said numerous times that this is not a gaming card; wait for RX Vega for gaming results. The question, as always, is why? I think it stands to reason that AMD believes they can fix the performance issues with more time. If they didn't think they could get more performance, they would have released RX Vega at whatever prices they needed to in order to compete. What the uplift will be I don't know, but we saw in PCPer's benching that Vega FE is at 1070-1080 levels depending on the game. The variance suggests AMD should at least be able to move the bar to solid 1080 levels. If a dev programs for the architecture, I think we can see closer to 1080 Ti levels. I expect AMD to have some game, probably AoTS, demoed with Vega using all of the new tricks.

The DSBR is not working in the PCPer benching - my Fury shows the same patterning. So either it's not working, or it's not triggering and falls back to traditional mode. We don't know what the deal is, but Vega does work in both modes dynamically.
http://techreport.com/review/31224/the-curtain-comes-up-on-amd-vega-architecture/3
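For anyone unfamiliar with the terms, here is a toy sketch of what binning versus traditional (immediate) mode means in general - this illustrates the generic tile-binning idea, not AMD's actual DSBR: triangles are first sorted into screen-space tiles, so each tile's pixels can stay in on-chip memory instead of being re-fetched from VRAM for every overlapping triangle.

# Toy illustration of binning rasterization (generic idea, not AMD's DSBR).
from collections import defaultdict

TILE = 32  # tile size in pixels (arbitrary for the example)

def bin_triangles(tri_bboxes):
    """tri_bboxes: list of (min_x, min_y, max_x, max_y) per triangle."""
    bins = defaultdict(list)
    for tri_id, (x0, y0, x1, y1) in enumerate(tri_bboxes):
        for ty in range(y0 // TILE, y1 // TILE + 1):
            for tx in range(x0 // TILE, x1 // TILE + 1):
                bins[(tx, ty)].append(tri_id)
    return bins

# Immediate mode walks triangles in submission order and may touch the same
# framebuffer region many times; binning visits each tile once with its list.
bins = bin_triangles([(0, 0, 40, 40), (10, 10, 20, 20), (100, 100, 130, 120)])
for tile, tris in sorted(bins.items()):
    print(tile, tris)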

Primitive shaders: when this was first announced, AMD stated that it would require developer work. However, several months later Raja said AMD could get it working on their end, which is also confirmed in the article below. AMD may need more time to fine-tune all of these game profiles. It's probably one of those things where if AMD does it, you get a performance uplift; if the devs do the work, it gives even more performance.
https://www.techpowerup.com/reviews/AMD/Radeon_Vega_GPU_Architecture/3.html

Vega FE marketing, or lack thereof: as long as AMD delivers the goods at SIGGRAPH, it won't matter what they did or did not do right now. The RX benchmarks will either reinforce the FE results or better them and change people's minds.
 

thilanliyan

Lifer
Jun 21, 2005
Custom GTX 1080 Ti models consume 250-290W, some of them ~320W at peak. Some better GTX 1080 models draw ~230W. I really don't think people who spend $1000+ on a gaming PC care about ~60W, especially when entire system consumption stays at 500W or less. And I really don't know anyone who owns a $500+ GPU with an average 500W PSU. People who have bought a GTX 980 or GTX 1080 are using 80+ Gold 700W+ PSUs, even if entire system power consumption is ~400W. Don't make a big deal out of this 300W.

The fact that the guy managed to run all the tests without issues on a 550W PSU says a lot. Sure, it's not good for a PSU to be under 90-100% load, but good 700-750W PSUs cost $100 and will be under 60-70% load (peak). If you really expected Vega to be as efficient as Pascal, then I don't know what to say.

It's not the power consumption per se that is the issue IMO, but the cooling that becomes the problem. An extra 50-60W is nothing to sneeze at when it comes to cooling a GPU. The harder it is to cool, the less likely it will run at max clocks, making it look even worse performance-wise.
 
Reactions: Phynaz

tential

Diamond Member
May 13, 2008
NVIDIA wouldn't delay Volta based upon how bad or good RX Vega is - they have their own internal cycle that is laid out two generations beyond Volta. It would cost them more money to delay Volta than to release it on schedule. Why wait for AMD to close the gap when NVIDIA could get another 30-50% performance increase over Pascal/Vega now? They wouldn't shoot themselves in the foot financially at all - they could simply charge even more for the 20xx series without losing money on existing 10xx inventory.
It's not like there are investors or management with compensation tied to company performance or anything....
/sarcasm
 
Reactions: nathanddrews

eek2121

Diamond Member
Aug 2, 2005
This is why people should wait for SIGGRAPH.

An AMD official has stated that the drivers are not gimped, but they are older drivers. If the drivers are old, they could be missing architecture-specific changes like tile-based rendering. The worst-case scenario is that the drivers only have a few tweaks to support Vega...in fallback mode. AMD may even be intentionally NOT releasing optimized drivers until RX Vega rolls out. I've read many claims that drivers cannot possibly account for a 30-50% speed boost...but if everything that makes Vega great is not implemented in the driver, then it's losing a ton of performance just by virtue of not having those features. PCPer themselves may have proved that the drivers aren't in great shape. They stated there was no difference between pro mode and gaming mode except a UI change. That tells me that something is definitely wrong. Gaming mode should be orders of magnitude faster, because gaming drivers are optimized for performance while pro drivers are optimized for accuracy.

As many others have stated, it makes no sense why Vega is performing similarly to an overclocked Fury. In theory, with a geometry engine 2.5x as fast, and with a faster shader engine, it should beat the Fury-X at the same clock speed. Just based on what I've read about the architecture thus far, it should be nearly 3 times faster than the Fury-X.

I'm betting that AMD wanted more time with the drivers, but they had to meet deadlines, so they launched Vega with the most stable driver they could. When RX Vega rolls around I'm sure we'll actually see a decent performance bump with the RX gaming drivers. I guess we'll find out the beginning of August.
 
Reactions: MangoX and Bacon1

[DHT]Osiris

Lifer
Dec 15, 2015
@JDG1980
If nVidia puts GTX 1080 performance at ~$250 in the next 6-12 months, they will shoot themselves in the foot, since no one with a 1080p monitor (95% of users) would bother to spend more than that on a GPU - even if they are looking for a high-FPS experience in popular multiplayer games.
4K@60 is rapidly becoming the new hotness for cheap, and 4K 144Hz monitors are coming online (albeit expensively). It won't be hard to convince people to buy in if they can get a 4K monitor plus a new GPU for under $500, and as that happens more and more, EVERYONE will expect great 4K performance.

The exact same thing happened with 1080p monitors once they crossed the $250-300 threshold (it also came along with a hard push to LCD from CRT).
 
Reactions: ZGR

Phynaz

Lifer
Mar 13, 2006
As no one has brought this up yet I'm just throwing it out here ...

Perhaps all the performance optimization features are enabled (culling, rasterizer, etc...), and it needs each and every one of them just to keep up with Fury in performance per clock?
This is my worst nightmare if true, but perhaps they just pulled a Bulldozer? In their quest to get higher clocks for NCUs vs CUs, they failed to get them anywhere near the design target, yet took all the performance hits from this new high-clock design?

This is what I'm thinking. The reason it appears to perform the same as an overclocked Fury is because that's basically what it may be. All the changes are to get the clocks up, not to increase IPC.
 

Malogeek

Golden Member
Mar 5, 2017
This is what I'm thinking. The reason it appears to perform the same as an overclocked Fury is because that's basically what it may be. All the changes are to get the clocks up, not to increase IPC.
Except for the actual part of the design to increase IPC as well, apparently.

 
Reactions: Bacon1

Phynaz

Lifer
Mar 13, 2006
From what I saw, Shrout tested the GPU in 4K Metro at 100% fan speed, and finally the temps were stable and the clock speed was stable at 1.6 GHz. At those settings the GPU was drawing 280W under load.

Well, this is definitely not an SFF GPU. However, this is actually good information. A 1.2 GHz RX Vega should be possible to squeeze into a 150W TDP, with just one 6-pin connector.

What would be the point? That's Polaris performance territory, right?
 
Reactions: raghu78

eek2121

Diamond Member
Aug 2, 2005
This is what I'm thinking. The reason it appears to perform the same as an overclocked Fury is because that's basically what it may be. All the changes are to get the clocks up, not to increase IPC.

AMD has clearly stated increased IPC over Fury in their Vega architecture presentation. If they simply wanted higher clocks, the die shrink alone would have gotten the clock speeds to at least 1.6 GHz.
 
Reactions: Bacon1

swilli89

Golden Member
Mar 23, 2010
This is why people should wait for SIGGRAPH.

An AMD official has stated that the drivers are not gimped, but they are older drivers. If the drivers are old, they could be missing architecture-specific changes like tile-based rendering. The worst-case scenario is that the drivers only have a few tweaks to support Vega...in fallback mode. AMD may even be intentionally NOT releasing optimized drivers until RX Vega rolls out. I've read many claims that drivers cannot possibly account for a 30-50% speed boost...but if everything that makes Vega great is not implemented in the driver, then it's losing a ton of performance just by virtue of not having those features. PCPer themselves may have proved that the drivers aren't in great shape. They stated there was no difference between pro mode and gaming mode except a UI change. That tells me that something is definitely wrong. Gaming mode should be orders of magnitude faster, because gaming drivers are optimized for performance while pro drivers are optimized for accuracy.

As many others have stated, it makes no sense why Vega is performing similarly to an overclocked Fury. In theory, with a geometry engine 2.5x as fast, and with a faster shader engine, it should beat the Fury-X at the same clock speed. Just based on what I've read about the architecture thus far, it should be nearly 3 times faster than the Fury-X.

I'm betting that AMD wanted more time with the drivers, but they had to meet deadlines, so they launched Vega with the most stable driver they could. When RX Vega rolls around I'm sure we'll actually see a decent performance bump with the RX gaming drivers. I guess we'll find out the beginning of August.

Except for the actual part of the design to increase IPC as well, apparently.


Right... given the information we have, I'm thinking RX Vega will be fine. There's going to be endless speculation for the next month, but we shouldn't be seeing an IPC regression. The only thing I can think of is that, if this is true, the IPC regression is only in DirectX software like games; it could be that it's fine in compute workloads, which would align well with AMD's vision of getting into enterprise.
 
Reactions: Bacon1

Phynaz

Lifer
Mar 13, 2006
AMD has clearly stated increased IPC over Fury in their Vega architecture presentation. If they simply wanted higher clocks, the die shrink alone would have gotten the clock speeds to at least 1.6 GHz.

1. Stop listening to the marketing people. Their job description is literally to influence your perception and behavior.
2. You have no proof a die shrink alone would increase clock speed.
 

swilli89

Golden Member
Mar 23, 2010
1. Stop listening to the marketing people. Their job description is literally to influence your perception.
2. You have no proof a die shrink alone would increase clock speed.

Well, that's just plain silly! (Sorry if you were being sarcastic - pretty much every new node has brought frequency improvements, as I'm sure you know. After all, Pascal, essentially a shrunk Maxwell, took advantage of 16nm's frequency gains!)

Samsung/GF got a 15% increase just going from 14LPE to 14LPP alone:
The second generation process called 14LPP (Low-Power Plus) is advertised as bringing performance as well as power improvements over the 14LPE (Low-Power Early) predecessor. The new node is described as being able to increase switching speed of up to 15%
http://www.anandtech.com/show/9959/samsung-announces-14lpp-mass-production
 
Reactions: Bacon1

SpaceBeer

Senior member
Apr 2, 2016
It's not the power consumption per se that is the issue IMO, but the cooling that becomes the problem. An extra 50-60W is nothing to sneeze at when it comes to cooling a GPU. The harder it is to cool, the less likely it will run at max clocks, making it look even worse performance-wise.
You mean it will need a huge heatsink and 3 fans, occupy 3 slots and weigh 3 pounds? Something like this:
http://i.imgur.com/3u710qB.jpg

Or they could make a lower-clocked model, let's say around 1450 MHz. It would have similar performance to a GTX 1080 and a similar TDP. And if it has a similar price, then what's the actual problem?
http://cdn.wccftech.com/wp-content/uploads/2017/01/AMD-VEGA-10-specifications.jpg
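Applying the same rough f·V² scaling used earlier in the thread to the 280W-at-1.6GHz Vega FE figure (a sketch only - that number may not carry over to RX Vega at all), a 1450 MHz part would land somewhere around:

# Crude estimate; assumes the 280 W @ ~1.6 GHz Vega FE figure carries over.
p_at_1600 = 280.0
ratio = 1.45 / 1.6
optimistic = p_at_1600 * ratio ** 3   # voltage falls along with frequency
pessimistic = p_at_1600 * ratio       # frequency drop only, same voltage
print(f"~{optimistic:.0f} W to ~{pessimistic:.0f} W at 1450 MHz")
# ~208 W to ~254 W: above the GTX 1080's 180 W reference TDP, but in the
# ballpark of the ~230 W custom 1080 boards mentioned earlier in the thread.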
 