Vega/Navi Rumors (Updated)

Status
Not open for further replies.

jpiniero

Lifer
Oct 1, 2010
15,177
5,717
136
I agree it just doesn't make any sense either. AMD and their partners would just be thrilled to sell every single card they can produce.

Last time this happened AMD lost out on sales because consumers (gamers) couldn't get cards; and then when Bitcoin's price crashed the miners put their cards up on auction sites and people bought those instead of buying new.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
What if rev 02 or whatever was supposed to be the late June launch was found to not have enough usable chips per wafer in March?
A few tweaks refining the chip and process, and rev 03 becomes the volume batch in September/October?

Wasn't the original launch for this supposed to be Q4 last year? It's not like they quit working on a chip once they launch it or make engineering samples; it would make sense to me to launch with the higher-margin prosumer or professional (but lower-volume) parts.

If they launch low-yield chips in a high-volume part, they risk enraging the fans and reviving the "phantom edition" bad press.
AMD has never said anything about launching Vega in 2016 - that's speculation based on a very vaguely labeled roadmap image where the dot for "Vega" is slightly to the left of the text "2017", but with no indications of when years actually begin or end.

Not to mention that GloFo's 14nm process seems to have fantastic yields at this point, at least if Ryzen is any measure (apparently Ryzen yields are better than expected). Of course CPU and GPU architectures are very different, but I'd say it's rather safe to call that process mature by now. What seems most likely to be holding consumer Vega back, in my eyes, is a combination of factors: drivers still needing work, HBM2 being relatively scarce (with the 16GB Frontier Edition gobbling up a lot of it), and power/voltage/temperature tuning and refinements. AMD seems far more conscious than before of the negative effects driver and heat issues have had on their brand, and I can't help but think they're fine-tuning Vega as much as possible before launch. Combine this with brand-new, untested tech like the HBCC, and you're bound to see development take time.
 
Reactions: Kuosimodo and w3rd

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Last time this happened AMD lost out on sales because consumers (gamers) couldn't get cards; and then when Bitcoin's price crashed the miners put their cards up on auction sites and people bought those instead of buying new.

It's impossible for AMD to have lost out on sales if people were buying them used, as that means AMD had already sold the card.

What hurts them is the card's price skyrocketing while the retailers pocket the difference, rather than AMD charging more. So the price is higher than they want, and thus worse on price/perf, while they don't benefit from the extra income (the retailers do).
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I agree it just doesn't make any sense either. AMD and their partners would just be thrilled to sell every single card they can produce.

AMD spent a lot of time and money marketing Polaris to gamers and releasing it at a price point for those gamers.
How are you unable to see that selling out to a completely different market, with completely different pricing, marketing, etc., is not the same as AMD selling out to that target market of gamers?

You're completely ignoring the customer acquisition cost AMD is incurring only to be taken advantage of by savvy entrepreneurs looking to make a good profit.

There's no point in having a conversation about business if you're going to take such a short-sighted approach to analyzing the situation.

AMD selling 10k GPUs to miners vs 10k GPUs to gamers is equal revenue, but signals COMPLETELY different things.

It's impossible for AMD to have lost out on sales if people were buying them used, as that means AMD had already sold the card.

What hurts them is the card's price skyrocketing while the retailers pocket the difference, rather than AMD charging more. So the price is higher than they want, and thus worse on price/perf, while they don't benefit from the extra income (the retailers do).

This as well, and this is assuming the retailers are able to react fast enough. Almost all AMD cards are sold out right now. Think that's because of gaming?
 

OatisCampbell

Senior member
Jun 26, 2013
302
83
101
AMD spent a lot of time and money marketing Polaris to gamers and releasing it at a price point for those gamers.
How are you unable to see that selling out to a completely different market, with completely different pricing, marketing, etc., is not the same as AMD selling out to that target market of gamers?

You're completely ignoring the customer acquisition cost AMD is incurring only to be taken advantage of by savvy entrepreneurs looking to make a good profit.

There's no point in having a conversation about business if you're going to take such a short-sighted approach to analyzing the situation.

AMD selling 10k GPUs to miners vs 10k GPUs to gamers is equal revenue, but signals COMPLETELY different things.



This as well, and this is assuming the retailers are able to react fast enough. Almost all AMD cards are sold out right now. Think that's because of gaming?
AMD doesn't care if they sell the cards to salvage yards wanting to melt them down for scrap metal, sales are sales.

The same can be said of NVIDIA and Intel.

They marketed to gamers at that price point because gamers would pay more for the product than cubicle farms, and their competition in the gamer market charged that for that level of performance.

Anything else is PR speak, salesmanship.

Raja doesn't care if you can "be proud of your card", but he can't tell you the real reason the cards aren't here because his first concern is the stock price.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Remember: If AMD can make a 250W Vega card that performs at a level X, a dual GPU version will need to be a 500W card to potentially perform 2X with perfect scaling (which doesn't happen).
That is not how power draw scales.
 

w3rd

Senior member
Mar 1, 2017
255
62
101
Didn't spot that this was a reply directed at me. This doesn't qualify as a board shot, though. Sure, it shows the PCB extending the length of the shroud. It also has a matte rubberized cover, not the brushed metal aesthetic shown in their renders, not to mention the 6+8 power delivery rather than 8+8. Engineering sample? Non-final board? No matter what, the core of it is that we don't know anything about what's on the board. From what we know of Vega's die/package size, that board could be 1/3 or more bare PCB to make room for a honking huge cooler.

We have no indication that Vega has the capability of having multiple GPUs act as one over Infinity Fabric. While this is theoretically possible with the technology, and no doubt something AMD is working towards, there is unfortunately nothing indicating that this is ready for Vega. Navi, on the other hand, might have it - who knows? And as has been stated by multiple people above: a fast interconnect does not make two GPUs appear or act like one to the OS and software. That's an entirely different ball game.

Please explain how you know this. Please. I've been asking you this for months, and you seem entirely unable to come up with an argument. Do you have any basis for this outside of your own gut feeling?

Really? You seem to have a very limited grasp of how tech enthusiasts think, and how new games are always ever more demanding. The way you're talking, everyone would stop looking at new GPUs if one were to appear on the market capable of fulfilling their wishes. That's bull, plain and simple. There's always a new, hot, game with amazing graphics launching in a few months that promises to crush even the fastest GPUs. Thus, gamers and tech enthusiasts are always looking for the next piece of shiny silicon.

Slightly OT here, but as someone who does textual analysis for a living, I find it fascinating that you consistently use language that conveys belief rather than knowledge. Your writing has striking similarities with the manners of speech used in various charismatic religions. Makes me wonder whether this is conscious or not, and regardless of this, what your thinking behind your way of expressing yourself is. As you seem singularly unwilling to answer critical questions with anything but reiterations of your unfounded statements, unfortunately we're not getting any closer to this. But nevertheless, it's truly fascinating.

Show us where Infinity fabric doesn't scale. Please...

Vega x2 is real, and I know this through a trusted friend. I am not new to the industry; I just don't play in it anymore. I am sure AMD doesn't like my knowledgeable posts and would rather keep trolling Nvidia with beliefs. But we all know AMD just spent their entire company's fortune on fabric. So Dr. Su is going to leverage that against Intel and Nvidia.

Whether fanbois like it or not. You do not have to believe me at all; I don't expect you to. But you don't even have to take what I am saying with a grain of salt: just look at what AMD's engineers have said already. (As a matter of fact, and to your point: does it really matter if Vega isn't X2 but Navi is? The outcome is still the same, isn't it? Nvidia getting rolled over in 2018, because they can't compete on economy of scale.)

Saying you don't believe in Vega x2 is like saying you don't believe in ThreadRipper... it leaves you kind of in the dark.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Show us where Infinity fabric doesn't scale. Please...

Vega x2 is real, and I know this through a trusted friend. I am not new to the industry; I just don't play in it anymore. I am sure AMD doesn't like my knowledgeable posts and would rather keep trolling Nvidia with beliefs. But we all know AMD just spent their entire company's fortune on fabric. So Dr. Su is going to leverage that against Intel and Nvidia.

Whether fanbois like it or not. You do not have to believe me at all; I don't expect you to. But you don't even have to take what I am saying with a grain of salt: just look at what AMD's engineers have said already. (As a matter of fact, and to your point: does it really matter if Vega isn't X2 but Navi is? The outcome is still the same, isn't it? Nvidia getting rolled over in 2018, because they can't compete on economy of scale.)

Saying you don't believe in Vega x2 is like saying you don't believe in ThreadRipper... it leaves you kind of in the dark.
Yeah, I'm sure NVIDIA will have a lot of difficulty competing with GloFo churning out those 500mm^2 Vega dies...? Not seeing the advantage.

Navi isn't coming until 2019, by the way. The OP of this thread includes the relevant slides. Yeah, it's Sweepr but the slides are real.

AMD's slides from their latest presentation corroborate this as well: https://www.extremetech.com/wp-content/uploads/2017/05/GPU-Growth-Large.jpg

I don't see Navi in their stack for 2018.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
AMD doesn't care if they sell the cards to salvage yards wanting to melt them down for scrap metal, sales are sales.

The same can be said of NVIDIA and Intel.

They marketed to gamers at that price point because gamers would pay more for the product than cubicle farms, and their competition in the gamer market charged that for that level of performance.

Anything else is PR speak, salesmanship.

Raja doesn't care if you can "be proud of your card", but he can't tell you the real reason the cards aren't here because his first concern is the stock price.
This is 100% wrong and you have no business doing or commenting on financial analysis with comments like that.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Looks like Vega has a ton of rumors/speculation/gossip surrounding it. I guess this is what happens when AMD decides to start hyping up a product over six months before its retail release.

Aside from all that at least we will know something more in 3 days.
 

davide445

Member
May 29, 2016
132
11
81
@OatisCampbell @Valantar
On how relevant the mining topic is, let me make a comparison: Apple invests billions in creating and marketing a product for a target audience that values thinness, UX, ecosystem, price, and status.
All of this creates loyal, returning customers who purchase many iterations of the product even if the next iteration isn't the market's best in every spec. Apple also works hard to maintain secondary-market price and availability in a way that reinforces the perception of those values, focusing on long-term brand building, since a returning loyal customer is far more profitable than a new one.
Now suppose at a certain point those customers cannot purchase the product, or only at a far higher price, because a group on another planet finds its shape perfect as a dog bone.
Apple would then have invested in features completely useless to these new customers, who are interested not in the product but in just one of its attributes, who dispose of it at a lower price once their dogs have finished crunching it, in a condition mostly uninteresting to the original customers, and who have no interest in the brand, so they will switch as soon as they find a bone with, say, a different flavor.
Summing up: you spend more than needed, your customers didn't receive the product, your brand gained nothing and may even be wounded, the market price communicates unreliability instead of value, the secondary market competes with the primary on price with worn products, and you have to reinvest from scratch since your targets are still new customers.
So IMHO there are probably many factors behind the delay, the shortage, and the pro-first presentation, but I have a strong feeling the mining topic is not irrelevant.
 
Last edited:

w3rd

Senior member
Mar 1, 2017
255
62
101
It is about connecting the dots.

At the Capsaicin & Cream event, Bethesda said they have near 100% scalability. AMD engineers have already confirmed this, and not only that, but also the ability to de-couple the GPUs and use them for different tasks, etc. (real-time TrueAudio and other 3D environmentals).

Not sure why so many people are feigning ignorance on any of this. This is the future and something AMD has been aiming at since acquiring ATi; plus, they already have their APUs as proof of design for what they can do with Vega on fabric (or, refined, on Navi).

None of that matters to the average consumer, really. But upon release, they will see Radeon Vega as a premium choice, which will also elevate the Radeon brand, giving it massive headroom.



Might as well get on board with (wrap your head around) the idea that AMD's "secret sauce" is not hype; it is a revolution taking place. And we (as consumers) are about three months away from watching the floodgates open. Some just can't see it.

Obviously, single Vega (and cut-downs) is the entry level to the 4K gaming market.
And being the cheapest tier, the initial premium for Vega's HBM2 will be a costly step for many consumers, but also worth it because of the other aspects of what Vega's platform brings. One of those is the simple fact that HBM2 will not be about high peak frame rates but about sustained minimums. Another is that FreeSync 2 is available on Vega's platform.

Play along and, for the sake of argument, pretend to know:
  • baby vega is single die gpu
  • big vega is double die gpu

Ponder on that^ strategy for a bit....
Then envision Vega's entire product stack/sku and what it is going to look like in 6 months from now. ($399 ~ $1,200?)

tick
tock
Holiday shopping Season!

"and you get fabric.. and you get some fabric.. everyone gets fabric..!"



Economics would suggest that by December the not-quite-perfect "cut down" x2 versions of Vega will be the best buys, while still way more capable than an $800 product from a competitor.

Can you wait..? Is all...!
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
This is 100% wrong and you have no business doing or commenting on financial analysis with comments like that.
Can you expand on this? Accepting your statement as-is implies a degree of faith. Maybe it's just your opinion. I see a lot of speculation [not bad] that some are claiming as truths [bad]. We can all fabricate what-if scenarios, but this is all guesswork at the end of the day.
 
Reactions: Kuosimodo

Ajay

Lifer
Jan 8, 2001
16,094
8,109
136
That is not how power draw scales.
Uh, yeah, it does scale that way with two independent dice. Of course, 500W would be nutz - that's a different story.
Hopefully, AMD will give us some data @ Computex, but we'll have to wait and see.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Uh, yeah, it does scale that way with two independent dice. Of course, 500W would be nutz - that's a different story.
Hopefully, AMD will give us some data @ Computex, but we'll have to wait and see.
No. The way he intended it, it does not work this way. Nobody ever said or even thought that AMD's goal with a dual-GPU card would be _200% performance_; his post lacks any real-life meaning. To keep this thread about decent discussion and not about trolling over semantics, I'd propose that we all try to avoid w3rd's insanity and keep in mind that in the case of dual-GPU graphics cards, power draw NEVER scales this way.

Sent from my VTR-L09 using Tapatalk
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
That is not how power draw scales.
Really? A single-die card is 250W. How would a card with two of those dice at the same clocks and with the same memory per die not be 500W? A graphics card can roughly be divided into four power consuming parts: power delivery, GPU, RAM, and I/O. Outside of the I/O, which is by far the lowest power draw of those four, everything would need to double (unless they purposefully used inefficient VRM components on the 250W card...).

The solution to this is usually to clock the twin-die card lower, which gives you the silly scenario where your $1000 graphics card ends up performing worse than a $500 card in a majority of games (especially in launch reviews, which is what matters most in terms of sales). Which is downright embarrassing. Of course, with modern clock scaling tech it would probably be trivial to bump up clocks on the core with the most utilization if the headroom is there/the utilization delta is above X% - but that's just putting a band-aid on a turd.
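The four-part breakdown above lends itself to a quick back-of-the-envelope check. Here is a toy Python model; the 180/30/30/10 W split is purely an illustrative assumption, not a measured figure, but it shows why everything except I/O roughly doubles on a twin-die board:

```python
# Toy power model for a hypothetical 250W single-die card, split as:
# GPU 180W, RAM 30W, VRM conversion losses 30W, I/O 10W.
# These numbers are illustrative assumptions, not AMD specifications.

def card_power(dies: int) -> int:
    gpu_w, ram_w, vrm_w, io_w = 180, 30, 30, 10
    # The GPU, its RAM, and the VRM losses feeding them scale with die
    # count; display/PCIe I/O is shared and does not.
    return dies * (gpu_w + ram_w + vrm_w) + io_w

print(card_power(1))  # 250
print(card_power(2))  # 490, i.e. essentially double
```

However the budget is actually split, as long as I/O is the smallest slice, the dual-die total lands close to 2x.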
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Really? A single-die card is 250W. How would a card with two of those dice at the same clocks and with the same memory per die not be 500W? A graphics card can roughly be divided into four power consuming parts: power delivery, GPU, RAM, and I/O. Outside of the I/O, which is by far the lowest power draw of those four, everything would need to double (unless they purposefully used inefficient VRM components on the 250W card...).

The solution to this is usually to clock the twin-die card lower, which gives you the silly scenario where your $1000 graphics card ends up performing worse than a $500 card in a majority of games (especially in launch reviews, which is what matters most in terms of sales). Which is downright embarrassing. Of course, with modern clock scaling tech it would probably be trivial to bump up clocks on the core with the most utilization if the headroom is there/the utilization delta is above X% - but that's just putting a band-aid on a turd.
I was gonna give you the benefit of the doubt and assume you'd let w3rd steer you off the course of reality, but then you reply to me with this biased nonsense. I don't think I should add anything; it would just make it personal. Instead I'll just rest my case.

Sent from my VTR-L09 using Tapatalk
 

Ajay

Lifer
Jan 8, 2001
16,094
8,109
136
No. The way he intended it, it does not work this way. Nobody ever said or even thought that AMD's goal with a dual-GPU card would be _200% performance_; his post lacks any real-life meaning. To keep this thread about decent discussion and not about trolling over semantics, I'd propose that we all try to avoid w3rd's insanity and keep in mind that in the case of dual-GPU graphics cards, power draw NEVER scales this way.

Sent from my VTR-L09 using Tapatalk
I see, no, dual GPU cards don't run @ double the TDP or provide 2x the performance. I have some ppl on ignore - so I miss out on context sometimes.
 
Reactions: lobz

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I was gonna give you the benefit of the doubt and assume you'd let w3rd steer you off the course of reality, but then you reply to me with this biased nonsense. I don't think I should add anything; it would just make it personal. Instead I'll just rest my case.

Sent from my VTR-L09 using Tapatalk
Sorry, biased? For or against what, exactly? Didn't I go through AMD's actual previous dual GPU cards in my previous post? I was replying to a comment stating that "that's not how power scaling works", which is utterly false - which I showed. If you have an issue with my reasoning, argue for it. Of course a 500W GPU is insane (it was insane with the 295X2, even more so now), and I don't believe AMD would make another one, but that means lowering clocks, and brings back the issue of games without profiles performing worse than a far cheaper single GPU card. Even though rather small clock speed drops can yield big power savings, that's an undeniable truth. Of course, again, it can be mitigated as I suggested above, but then you're still left with an overpriced GPU unless the main games you play are ones with gold Crossfire profiles.

If you have a 250W full power single die card, and a 350W 75% power per die dual die card, that card is, unfortunately, going to perform at a level around 75% of the single die card in every single game that lacks a CF profile. And it would still have cost the buyer far more than the 250W single die card. Which is why a single die is, until Infinity Fabric can change this, the only sensible solution.
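The 250W vs. 350W scenario can be put into numbers. This sketch follows the post's assumption that performance tracks the per-die power fraction, and additionally assumes ~80% Crossfire scaling when a profile exists; both figures are illustrative, not benchmarks:

```python
# Hypothetical dual-die card with each die run at 75% of the single-die
# card's performance level (the 350W-vs-250W scenario described above).
PER_DIE = 0.75       # assumed per-die performance relative to the 250W card
CF_SCALING = 0.80    # assumed gain from the second die when a profile exists

# Game without a Crossfire profile: the second die sits idle.
perf_no_profile = PER_DIE                        # 0.75x the cheaper card

# Game with a good Crossfire profile: the second die adds ~80%.
perf_with_profile = PER_DIE * (1 + CF_SCALING)   # 0.75 * 1.8 = 1.35x

print(perf_no_profile, perf_with_profile)
```

Under these assumptions, the expensive dual-die card loses to the cheaper single-die card in every unprofiled game, which is exactly the embarrassment described above.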
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Can you expand on this, as accepting your statement, implies a degree of faith. Maybe, it's just your opinion. I see a lot of speculation [not bad] that some are claiming as truths [bad]. We can all fabricate what if scenarios, but this is all guesswork at the end of the day.
If you truly believe that a sale in which I trash the GPU is equivalent to a sale in which I use the GPU, potentially review the GPU, etc., then there really isn't a point in me bothering to explain it. It's just a hopeless endeavor, since the basic principles are beyond the grasp of the person attempting to understand.

That's why financial conversation sucks to have on here: something this basic is apparently hard to grasp. It's not an opinion, it's a fact.

If you think a sale to JayZTwoCents is equivalent to a sale in which I buy the GPU and dump it in the trash, I can't even begin to have an intellectual conversation with you on this matter.

Let's just move on to tech stuff.
 
Last edited:

OatisCampbell

Senior member
Jun 26, 2013
302
83
101
If you truly believe that a sale in which I trash the GPU is equivalent to a sale in which I use the GPU, potentially review the GPU, etc., then there really isn't a point in me bothering to explain it. It's just a hopeless endeavor, since the basic principles are beyond the grasp of the person attempting to understand.

That's why financial conversation sucks to have on here: something this basic is apparently hard to grasp. It's not an opinion, it's a fact.

If you think a sale to JayZTwoCents is equivalent to a sale in which I buy the GPU and dump it in the trash, I can't even begin to have an intellectual conversation with you on this matter.

Let's just move on to tech stuff.
You're adding other conditions to the equation.
You don't think miners recommend cards to each other? Scrap metal people might recommend them as well.
My point was AMD isn't on some serve the gamers altruistically mission. They're just a company trying to make as much money as they can, where they can. They probably "like" video cards and gaming about as much as your accountant likes your W2s.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Uh, yeah, it does scale that way with two independent dice. Of course, 500W would be nutz - that's a different story.
Hopefully, AMD will give us some data @ Computex, but we'll have to wait and see.
They'd likely underclock it to fit in a 375W TDP or so. On the other hand, the 295x2 drew 430-500W. Nonetheless, if we do get a dual Vega, it won't be using an interposer/infinity fabric interconnect.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You're adding other conditions to the equation.
You don't think miners recommend cards to each other? Scrap metal people might recommend them as well.
My point was AMD isn't on some serve the gamers altruistically mission. They're just a company trying to make as much money as they can, where they can. They probably "like" video cards and gaming about as much as your accountant likes your W2s.
No, I'm not adding other conditions. Conditions isn't even the right term, which again is why this is pointless and is the last post I'm making on the topic because we're simply not having an equivalent conversation. This is a similar conversation to a person who claims the world is flat because that's all they can see. Without being able to understand how to make that next mental leap to understand anything beyond the initial thing you first see, there really is no hope for further education on this matter.
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
If you truly believe that a sale in which I trash the GPU is equivalent to a sale in which I use the GPU, potentially review the GPU, etc., then there really isn't a point in me bothering to explain it. It's just a hopeless endeavor, since the basic principles are beyond the grasp of the person attempting to understand.

That's why financial conversation sucks to have on here: something this basic is apparently hard to grasp. It's not an opinion, it's a fact.

If you think a sale to JayZTwoCents is equivalent to a sale in which I buy the GPU and dump it in the trash, I can't even begin to have an intellectual conversation with you on this matter.

Let's just move on to tech stuff.
My experience is that anytime someone uses this statement, they really can't explain their theory or belief. No problem however.
 