Vega/Navi Rumors (Updated)


Paratus

Lifer
Jun 4, 2004
16,848
13,784
146
He says they are not enabled on the Frontier Edition, which everyone knows already.

I am talking about RX Vega, which is supposed to have them, and which AMD's slides with performance numbers should already reflect.

That can't be used as an excuse anymore for the poor results in AMD's own RX Vega numbers.
Ryan quite clearly states it will be enabled at launch. Launch is in two weeks.

While I don't expect it to make a huge difference, the comment that Vega currently has features disabled is supported by AMD's own words.
 
Reactions: Yakk

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Ryan quite clearly states it will be enabled at launch. Launch is in two weeks.

While I don't expect it to make a huge difference, the comment that Vega currently has features disabled is supported by AMD's own words.

The only "currently" for the public, is the Frontier Edition. It is inaccurate to say Vega RX doesn't have these features running.

Launch is when they go public. That doesn't mean they haven't been running for months internally.

If Vega hasn't already been running with these features enabled internally for weeks, if not months, they aren't shipping with them enabled publicly in two weeks.

Those AMD slides represent Vega RX with its features running. It is really grasping at straws to think AMD revealed performance numbers that are meaningless this close to launch.
 
Reactions: Muhammed

DownTheSky

Senior member
Apr 7, 2013
787
156
106
Taken from Beyond3d:
Wondering where most of Vega's transistors went? Just to increase clock speeds!
Talking to AMD’s engineers, what especially surprised me is where the bulk of those transistors went; the single largest consumer of the additional 3.9B transistors was spent on designing the chip to clock much higher than Fiji. Vega 10 can reach 1.7GHz, whereas Fiji couldn’t do much more than 1.05GHz. Additional transistors are needed to add pipeline stages at various points or build in latency hiding mechanisms, as electrons can only move so far on a single clock cycle; this is something we’ve seen in NVIDIA’s Pascal, not to mention countless CPU designs. Still, what it means is that those 3.9B transistors are serving a very important performance purpose: allowing AMD to clock the card high enough to see significant performance gains over Fiji.

Hah.
 

Roger Wilco

Diamond Member
Mar 20, 2017
3,955
5,826
136
If I was in the market for a 1070, the Vega 56 would be a strong consideration. Decent price, manageable TDP, cheaper monitor options, etc. But when Volta drops...
 

Veradun

Senior member
Jul 29, 2016
564
780
136
Do we have rumors about "when", regarding the Nano?

And most importantly, what about getting one if you already own a Ryzen and a freesync monitor?
 

leoneazzurro

Golden Member
Jul 26, 2016
1,015
1,610
136
The only "currently" for the public, is the Frontier Edition. It is inaccurate to say Vega RX doesn't have these features running.

Launch is when they go public. That doesn't mean they haven't been running for months internally.

If Vega hasn't already been running with these features enabled internally for weeks, if not months, they aren't shipping with them enabled publicly in two weeks.

Those AMD slides represent Vega RX with its features running. It is really grasping at straws to think AMD revealed performance numbers that are meaningless this close to launch.

This is your opinion, not corroborated by FACTS. You could also look at the driver version used in the test (it is written at the end of the slides) and the driver version that, for example, enables faster geometry throughput (also written there), and see that the two are not the same (17.30 vs 17.32). So we don't know which features are enabled in 17.30; we do know for sure that faster geometry is not, and maybe others will also only be available at launch or even later. Not to say this is a good policy, but it is what it is; programming such a complex architecture is probably not easy when you have a much lower R&D budget than your competitor.

PS: it also seems power-saving features were turned off in Vega FE. Not to say RX Vega will magically consume half as much, but maybe we will see somewhat lower power draw in the RX version.

For more info, look at the whole architectural slide package here:
https://www.techpowerup.com/reviews/AMD/Vega_Microarchitecture_Technical_Overview/7.html
 
Last edited:
Mar 11, 2004
23,181
5,646
146
I think they aimed for much higher clock frequency but failed. I guess GCN just doesn't scale.

That seems very plausible. Nvidia admitted it took a lot to push things with Pascal, and AMD doesn't have the resources Nvidia does. It seems like that might be a major dud in the design so far (all the extra transistors added for higher clockspeeds). It is likely something they were going to have to address at some point, but the tradeoff on Vega doesn't seem terribly worth it so far.

As a professional/compute card, Vega isn't too bad. Radeon WX 9100 should sell decently; at $2,199 it's only a bit more expensive than Quadro P5000, but far more powerful.

The problem isn't that AMD is not competing in the high-end gaming segment; writing that off is a plausible business decision. The problem is that their marketing team was so flagrantly dishonest about it. I've lost a lot of respect for AMD over this fiasco. Any gamer who bought into the hype has been stuck with lesser products for months "waiting for Vega" when they could have had a superior Nvidia card at a price no higher than Vega will now cost. Worse, prices have become inflated from the mining craze, with the GTX 1070 almost unavailable and the GTX 1080 temporarily back up in price, so anyone who waited is likely to now incur a real financial loss unless they are willing to wait longer.

If you wanted 1070/1080 performance before now, you should have bought. We've been seeing this exact whining for over a year now. There have been multiple great deals on 1070/1080s (and the 1080 Ti is a great card, even a pretty good value); it made no sense to wait. AMD was not going to seriously undercut Nvidia on perf/$. It seems like there are people who were expecting a 1080 Ti killer at $500. That simply was not going to happen.

It is not AMD's fault that mining has caused prices to get stupid. Holding that against them is pure idiocy. Unsurprisingly, it seems the people blaming AMD for their own waiting are the same people using this argument. It's almost becoming a recurring theme (along with the general "I was betrayed by AMD!!!" nonsense).

Well, Vega is a complete failure for gamers. Months ago, I was hoping for at least -10% from a 1080ti. Total disappointment. AMD have regressed in essentially every parameter, perf/watt, perf/mm^2 etc, without solving the essential issues that held Fiji back. The only thing that's holding me back from buying nvidia is Freesync (I haven't bought a monitor yet, I'm waiting for the C27HG70).

It's not good, but it's not nearly the disaster people make it out to be. At release, if the performance doesn't fit your needs/price, don't buy. I'm guessing prices will decline on it sooner rather than later (especially if mining starts to fade, as people claim is starting to happen), and you can decide then (or buy a cheap used card when the miners dump their stock). By then we'll hopefully know more about Volta. But again, unless you can wait (in which case, why bother frothing at the stupid speculation in video card subforums? Just see what options you have at your price range when you're ready to buy, then maybe check whether there's some impending release that might change things), don't.

That's why "ecosystems" are crap and people should stop buying into them unless there's very good reasons. Variable refresh isn't good enough that I'd let that dictate multiple $300+ purchases. I'm genuinely baffled at how obsessive over this people seem to be, as I think it is one of the biggest shams perpetrated in gaming. It never should have happened either because this never should have been an issue (adaptive sync should have been something that occurred long before now). Hopefully it being part of DP spec will put an end to this particular madness.

All AMD had to do was remove the bottlenecks of Fiji, then add +50% to everything. Think about it. A Vega with 1 TB/s memory bandwidth, 6000+ SPs, and a crapload of raw geometry performance, at ~1400-1500 MHz, would have destroyed the 1080 Ti. But what do they do? They go for the professional market and spend the entire transistor budget on features no gamer needs.

Hell. Even 2x Polaris would have been better.

Really, they probably didn't need to add 50% more to Fiji across the board; they could probably have done better just by taking whatever clockspeed increases the process gave them (probably up to Polaris level) and adding 25% to the ROP and texture units, along with whatever other architecture changes aren't tied to upping the clockspeed. I do agree the decision to halve the bus width (and thus limit memory bandwidth) was odd/bad too. I'm not even sure they would have needed to keep the full width; just 3 stacks would have still offered plenty of bandwidth (~750 GB/s), and they could have gone with 9GB and 12GB. Would be interesting to know the cost/complexity issues in determining this stuff.
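As a rough sanity check on that ~750 GB/s figure, here is a minimal sketch of the HBM2 bandwidth arithmetic. It assumes a 1024-bit interface per stack and an illustrative ~2.0 Gbps pin rate (shipping RX Vega runs a bit lower, around 1.89 Gbps), so treat the results as ballpark only:

```python
# Ballpark HBM2 bandwidth: stacks x 1024-bit interface x pin data rate.
# The 2.0 Gbps pin rate is an assumption for illustration; Vega 64 actually
# runs its two stacks at roughly 1.89 Gbps (~484 GB/s total).
def hbm2_bandwidth_gb_s(stacks: int, pin_rate_gbps: float = 2.0) -> float:
    bits_per_stack = 1024                                # HBM2 width per stack
    return stacks * bits_per_stack * pin_rate_gbps / 8   # GB/s

for stacks in (2, 3, 4):
    print(f"{stacks} stacks: ~{hbm2_bandwidth_gb_s(stacks):.0f} GB/s")
# 2 stacks: ~512 GB/s, 3 stacks: ~768 GB/s (the ~750 GB/s above), 4: ~1024 GB/s
```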

I'm not sure I agree with that. Not that it would necessarily be worse, but I think there's a decent chance it wouldn't have been better either. Since quite a few people already think the ROP and TMU counts are limiting for gaming on Vega (based on Fury), only the TMU count would be a bit higher. But Polaris doesn't clock as high (and is already fairly power hungry at its lower clock speed, about in line with Vega even), and a larger version very possibly wouldn't clock any better (maybe even worse). And it likely would not have some of Vega's features (which might not be terribly beneficial now, but getting developers access to those features, with cards in their hands, could help them become beneficial later).

Vega seems very much to be a transitional design with forward-looking aspects that aren't likely to be well utilized soon. People keep lamenting this about AMD's GPUs, but I think it's their only choice, as they can't dictate that people cater to them or hire a lot of software developers to get things implemented. They have to think ahead. It's often frustrating as average consumers/gamers (why don't we talk about how amazing a gaming design they'd have if they'd stuck with VLIW4!), but it hasn't been nearly as bad as people have been acting like for years. Seriously, this has got to be about the 5th year in a row we've heard how GCN was the worst design ever and AMD is doomed to failure because of it. We'll see how Vega ages, but the doom and gloom that pervades this forum at every AMD GPU release is almost outright laughable.

Thankfully, Ryzen is good, because otherwise, this could be company destroying for AMD.

They must have known it was going to be this bad, even before tape out, and yet they just went along with this train wreck. It doesn't bode well for bouncing back.

I really think they need outside talent to straighten out their GPU design, but where would that come from these days?

I don't agree. It could also be that they focused performance on things other than gaming. And those happen to have more growth and higher profit margins than the high end consumer video card segment. They released a card that is ok in gaming performance, which keeps developers at least considering their hardware (and gives them hardware that can utilize newer features so they can work on implementing them and get software support for them which will benefit later GPUs).

WTF is all this talk about their design team and junk? Seriously, I'm confident that almost none of the people constantly spouting off about this know much of anything about the architecture design and engineering of GPUs, let alone the internal setup of AMD/RTG or what their targets are/were. And yet we're getting "they need to fire Raja and clean house" from random people (many of whom don't even have many posts on here). Especially considering that AMD has been involved in multiple GPU projects (Sony, Microsoft, their own stuff; there are even hints that they've been helping Apple develop their mobile GPU), I'm not sure how people can say some of the stuff they do (let alone state it definitively like they're some authority on the subject). I'd say disgruntled former employees, but they don't really act like that (then again, I seem to remember people saying Nvidia needed to ditch JHH after Fermi). It's clear that their GPUs have plenty of performance capability; it's just that they might not be stupendous for gaming. They're still OK for that use.
 
Reactions: Feld

Slappi

Member
Dec 7, 2002
72
31
86
None of this matters anyway..... miners will grab every available card for any price and most gamers will never see one of these even if they wanted to.
 
Reactions: Head1985

Yakk

Golden Member
May 28, 2016
1,574
275
81
Vega Technical Overview.

https://www.techpowerup.com/reviews/AMD/Vega_Microarchitecture_Technical_Overview

Welp, AMD just solidified their hold on mining. Vega has specific instructions for mining. This explains the reports I've seen that Vega was really good for mining.


AMD has always had more direct hardware command calls on their GPUs; they are just continuing their lead in that area. The side effect of this has been their improved mining performance for many years. Looks like that may continue.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Vega 56 at $400 and 210W TDP (and going by recent AMD offerings, about 90-95% of Vega 64) sounds like an excellent deal. It reminds me of R9 290. I expect good overclocks as well, getting the card to stock Vega 64 performance levels.

It really shouldn't remind you of the 290. The 290 was actually (barely) faster than the $500 780, matched the $1000 Titan, and the $700 GTX 780 Ti was only 15% faster. I'm not expecting the Vega 56 to be faster than the $500 1080, or within 15% of the $700 1080 Ti. And the 290 had more VRAM than the Ti, which is the opposite this time.

Mining ruined these prices, FWIW.

It looks like AMD is aiming for the Vega 56 to edge out the reference 1070 but not reach the higher card, so I'm not expecting the 290 parallel to hold up. 1070 max OC can match 1080 stock, so I will be curious about Vega 56 max OC. Besides power consumption, it could be a very viable alternative to the 1070. Looking forward to reviews, just out of curiosity.
 
Reactions: tential

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
No reviews yet, but I've seen reports of Vega 64 being 15% faster than the 1080. The Vega 56 should be equal or maybe slightly better for $399. And they're listing TDP at ~220W IIRC. Assuming performance is as I've seen rumored most recently, why wouldn't that $399 card sell really well? Keep in mind that quite a few people prefer AMD's open ecosystem where they aren't getting gouged for every new feature. I'll take FreeSync support, the best support for Vulkan and DX12, and GTX 1080 performance for $400.

I haven't looked back to verify those numbers. It's my best recollection. I believe they are correct though.

LOL! Dude you'd have way more credibility if you just came out and said "I love every AMD GPU no matter what and will always find something positive to say, even if it's a completely unfounded rumor that is being proven false this very moment."
 
Last edited:
Reactions: tential

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Well, if there is any good news here, it sure isn't aimed at the gaming crowd; for them it is more of a thud than anything else.
Vega is squarely geared toward the more professional crowd, and from what we see from the show, that will be their main focus.
At least they will make more $$$ selling to that crowd than the gaming crowd.
If they had wanted the pro crowd, they would have spent a bit more on the hardware - quad-stacked HBM with ECC support - while dropping clocks so it worked more efficiently. Instead of wasting all those transistors on failing to clock high, they would have used some of them on giving decent 32 float support.

Vega is very much a gaming card.
 

pj-

Senior member
May 5, 2015
481
249
116
That's why "ecosystems" are crap and people should stop buying into them unless there's very good reasons. Variable refresh isn't good enough that I'd let that dictate multiple $300+ purchases. I'm genuinely baffled at how obsessive over this people seem to be, as I think it is one of the biggest shams perpetrated in gaming. It never should have happened either because this never should have been an issue (adaptive sync should have been something that occurred long before now). Hopefully it being part of DP spec will put an end to this particular madness.

VRR is awesome and I will not buy a monitor in the future that doesn't have it. The ability to play games at 80-120 fps without ever having to worry about screen tearing or leaving my hardware underutilized is great.

It is ridiculous that nvidia doesn't support the standard. I wouldn't be surprised if monitor manufacturers liked the extra margins and have told nvidia to take their time moving away from gsync.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
AMD has always had more direct hardware command calls on their GPUs; they are just continuing their lead in that area. The side effect of this has been their improved mining performance for many years. Looks like that may continue.

Correct. AMD had 1 instruction that was very useful for mining computations that Nvidia didn't support.

What reports? Everything I've seen points to ~33 MH/s for ETH, which is OK until you look at the power consumption needed to do it, which is over 2x what a 1070 pulling ~31 MH/s uses.

From people writing the mining software.
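The post above doesn't name the instruction, but the usual candidate people point to is a bit-align/rotate style operation that lets the 32-bit rotates in SHA-type hashing be done in one step instead of two shifts plus an OR. A minimal Python sketch of the difference (purely illustrative; the function names are made up, not actual GPU intrinsics):

```python
# SHA-256 style hashing (used in some mining) leans heavily on 32-bit rotates.
# Hardware without a rotate/bit-align op must emulate each rotate with two
# shifts and an OR; hardware with one does it in a single instruction.
MASK32 = 0xFFFFFFFF

def rotr32_emulated(x: int, n: int) -> int:
    """Rotate built from shift + shift + OR (three-operation emulation)."""
    return ((x >> n) | (x << (32 - n))) & MASK32

def rotr32_bitalign(x: int, n: int) -> int:
    """Models a bit-align style op: concatenate (x:x) and shift right by n."""
    return (((x << 32) | x) >> n) & MASK32

assert rotr32_emulated(0x12345678, 7) == rotr32_bitalign(0x12345678, 7)
```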
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
It is ridiculous that nvidia doesn't support the standard. I wouldn't be surprised if monitor manufacturers liked the extra margins and have told nvidia to take their time moving away from gsync.
With 75% of the market and little competition at the high end, they don't really have to care about VRR. They've solidified themselves in consumers' minds as the premium product, and G-Sync is just that, the premium VRR solution.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
With 75% of the market and little competition at the high end, they don't really have to care about VRR. They've solidified themselves in consumers' minds as the premium product, and G-Sync is just that, the premium VRR solution.

Nvidia's performance lead increases every generation.
 

Yakk

Golden Member
May 28, 2016
1,574
275
81
And what made Sweeney show up at an AMD presentation? He would be one of the last people I'd ever have expected to see there.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Correct. AMD had 1 instruction that was very useful for mining computations that Nvidia didn't support.

From people writing the mining software.

Hopefully that's the case, for AMD's sake. Mining is the only chance they have to sell these things, although with the power consumption and cost, they'd better pull 50 MH/s or more in ETH and do well in the other cryptos.
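For a rough sense of why the power draw matters, here is a quick hashes-per-watt comparison. The wattages are illustrative assumptions plugged in around the thread's ~31/~33 MH/s figures, not measurements:

```python
# Rough mining-efficiency comparison: hashrate per watt of board power.
# All wattages are assumptions for illustration, not measured values.
cards = {
    "GTX 1070 (~31 MH/s, assumed 150 W)":       (31, 150),
    "Vega FE (~33 MH/s, assumed 300 W)":        (33, 300),
    "RX Vega (50 MH/s hoped for, assumed 300 W)": (50, 300),
}

for name, (mh_s, watts) in cards.items():
    print(f"{name}: {mh_s / watts:.3f} MH/s per W")

# At roughly double the power, Vega would need ~62 MH/s (0.207 * 300) just to
# match a 1070's MH/s per watt, so even 50 MH/s at that draw still trails it
# on efficiency; raw hashrate is the only lever left.
```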
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
VRR is awesome and I will not buy a monitor in the future that doesn't have it. The ability to play games at 80-120 fps without ever having to worry about screen tearing or leaving my hardware underutilized is great.

It is ridiculous that nvidia doesn't support the standard. I wouldn't be surprised if monitor manufacturers liked the extra margins and have told nvidia to take their time moving away from gsync.
Monitor manufacturers don't care - they probably get a bigger cut with the more expensive G-Sync monitors, so if people buy them, they're happy. The only people who can change Nvidia's mind are consumers, by not buying Nvidia because of that feature. It also has to be said that VRR is not the better solution, just the cheaper one, and it's only cheaper because Nvidia charges for G-Sync, not because the hardware really costs much more. VRR has a distinct lack of quality control, with half the monitors having bad implementations (e.g. 48-75 Hz only), while all the G-Sync monitors work well because Nvidia won't certify anything that doesn't. It's this sort of thing that makes Nvidia appear "premium" and hence people buy them, not the competition, despite the cost.
 
Reactions: xpea and Phynaz