Vega/Navi Rumors (Updated)


railven

Diamond Member
Mar 25, 2010
6,604
561
126
Click on the user name to get the profile pop-up and then click ignore. Easy peasy

While I've got nothing against the ignore list, it turns forums/social media into echo chambers. Also, in this situation, it feels like people would be ignoring the voice of reason and letting the "troll" off the hook.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Guys, just ignore the troll. There is no reason to even entertain their asinine statements; just ignore them and hope they fade away into oblivion like so many of their fallen brothers have before.

Or, worst case, we have a good laugh as they are proven wrong and try to hold on to that last shred of pride with excuses.

Is there any basis for thinking the statements will be proven wrong? From what I can see, there is just as much of a chance that they will be proven right as wrong. If this thread isn't about discussing Navi/Vega rumors, then the title should be changed to "Navi/Vega 'FACTS'", or closed.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
Guys, just ignore the troll. There is no reason to even entertain their asinine statements; just ignore them and hope they fade away into oblivion like so many of their fallen brothers have before.

Or, worst case, we have a good laugh as they are proven wrong and try to hold on to that last shred of pride with excuses.

I never thought we'd see the AMD version of Rollo, but we're seeing it.
 
Reactions: Headfoot

zinfamous

No Lifer
Jul 12, 2006
110,821
29,576
146
Is there any basis for thinking the statements will be proven wrong? From what I can see, there is just as much of a chance that they will be proven right as wrong. If this thread isn't about discussing Navi/Vega rumors, then the title should be changed to "Navi/Vega 'FACTS'", or closed.


We know some things about the Vega uarch. We can speculate by extrapolating some numbers. We have one demo in a controlled test, where we don't know anything beyond a simple FPS counter. We have what, two synthetics?

That's it.

That's not nothing, but none of that says anything about actual performance when it matters. What we actually have, though--since you agree that we are speculating--is a long history of AMD with interesting tech on paper that, for various reasons, never pans out in actual performance. Theoretical performance--which is all this speculation is about--is completely meaningless. People purchase on actual performance. Time and time again, we can argue "well, AMD's card will probably be better than the same nVidia card 6 months, 1 year, 2 years from now!" Yeah, that's probably right, but making a bold claim such as "more powerful than nVidia's best for cheaper"--a claim that runs completely against the last 5 or so years of actual releases and actual data--is ridiculous and requires a preponderance of evidence to back it up.

To this day, the Polaris 480 still has a theoretical 1 or so TFlop advantage over the 1060, but now comes out only even or just barely ahead overall. Where the hell is that "missing" TFlop? Theory really doesn't mean anything in the end, no matter how true it may be that the tech within the chip is "superior." That hardware doesn't mean anything if it isn't being utilized.
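
For reference, here's the back-of-envelope math behind that figure (reference boost clocks and the usual FP32 rate of 2 ops per shader per clock; the gap shrinks somewhat at base clocks):

Code:
# Rough FP32 throughput: 2 ops per shader per clock.
def tflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz * 1e6 / 1e12

rx_480 = tflops(2304, 1266)    # ~5.8 TFLOPS at reference boost
gtx_1060 = tflops(1280, 1708)  # ~4.4 TFLOPS at reference boost
print(f"gap: {rx_480 - gtx_1060:.1f} TFLOPS, ratio: {rx_480 / gtx_1060:.2f}x")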

This troll offers no evidence. He confuses his wild speculation for actual reality, then demands that others prove him wrong using evidence that even he doesn't have. There is a difference between rumors and wild fantasies that you claim are actual reality, based on nothing.

I want AMD to "win" here--I'd absolutely love it if they manage to beat nVidia with Vega 10. I've been building AMD CPU and ATI/AMD GPU boxes since 2002. Only one Intel chip and one nVidia card in all that time. This troll thinks I'm an nVidia shill because I'm calling him out on his nonsense.

Bold claims require real evidence. Challenging such a ridiculous claim requires nothing more than for the troll to put forth real numbers. No numbers exist. None of what he is doing is speculating: it's pure trollery. Anyway, I'm done with this kid. I wonder why anyone would defend his nonsense as "just rumor." If you read his comments, he is not proposing rumors. He is claiming some truth and attacking anyone who dares to call him out on it. Trolls like this do absolutely nothing but blow up the hype train for everyone, leading to nothing but profound disappointment when AMD doesn't "deliver" on some absurd performance that they never promised.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Warning: Wall of text incoming. This is in the hope of eliciting some sort of actual response, though, as I'm getting rather tired of the ad hominem arguing here.
I don't think you have to explain things to that guy. He is completely ignorant of technology & His line of reasoning is the same in every post..
AMD sucks and can't do anything right. Nvidia Yo...

Too many people have had to explain things to him.


Ironically, he doesn't even know that Polaris isn't Vega & to him, they are the same uarch. Nvidia is the extent of Valantar's knowledge.
Uhm... You're making no sense. I have actually never owned an Nvidia GPU in my life. Currently, I have a Fury X, which I'm very happy with. Before that, a 6950, before that dual 4850s (god, that single-slot cooler was awful! especially in stereo! ended up replacing them with Arctic Accelero S1s), and before that a 9600 All-in-wonder or some such (I believe I "Omega drivered" it to act as a 9700-something at the time).

Also, I don't know that Polaris isn't Vega? What? Huh? Where on earth are you getting that from? Are you denying that Vega is built on GCN (and thus at least in part on Polaris)? I don't even understand what you're getting at with this. Saying that something is based on something else doesn't mean the same as saying they're the same. That should be pretty obvious ...

But back to the point. Simply because I'm arguing caution against getting our hopes up too high, I'm somehow saying that AMD sucks? Sorry, but you need to work on your reading skills, logic, or both. I'm simply saying that it's foolish to buy into rumors just because we want them to be true. Do I want them to be true? Of course I do. Every single GPU I've ever bought with my own money has been ATI/AMD. I really, really don't want to support companies with massive market leads unless I damn well have to. But I've seen the devastating effect overblown rumors have had on AMD's reputation time and time again (1600MHz Polaris OCs for all! The Fury X will revitalize AMD's high-end offerings! The Fury X is "an overclocker's dream"! (that's a direct quote, btw)), and I don't want to see that happen yet another time. I desperately want AMD to succeed. Heck, I've bought a Ryzen CPU too. The point is, overhyping products hurts AMD, just like overhyping No Man's Sky hurt Hello Games (just one example among many).
He has repeatedly suggested, there is not enough information out there, therefore AMD sucks. He has said, he can't even speculate (or imagine) that AMD tech could be better? And is backing up his remarks by further suggesting, someone to knock on his door & hand him proof (Because Nvidia Yo..)
Stop putting words in my mouth. Please, if you can, show me a quote from one of my posts to demonstrate that I've said anything to that effect. Have I said that there's not enough data to assume huge IPC gains? Absolutely. Does that mean that AMD sucks? Hell no. If that is your conclusion from reading that, you need to check your paranoia (and remember that you are not AMD, nor does the success or failure of any AMD project (or any other brand) reflect on your character as a person). Have I said that I "can't even speculate (or imagine) that AMD tech could be better"? F*ck no. Stop making sh*t up. Please, show me a quote of me saying that. I dare you. Oh, and check a dictionary to see what "speculate" means and how that word is used, because you're using it wrong (and certainly not in a way I'd ever use it). Hint: it's not synonymous with "imagine".
What is funny, is when people engage me (attack me?), then don't show a logical reasoning behind their anti-AMD rebuttal.
Have I attacked you? Really? Put that victim card back in your wallet, please. You won't get it stamped here.
Also, what exactly in my arguments against extreme optimism and evangelizing based on speculation and hope shows any anti-AMD bias?
But what kills me, is when they do speak about AMD, the do so with ignorance and dismissive nature. Because they are not interested, or concerned with GPU tech, but with holding a torch on a forum.
Really? What ignorance? What dismissiveness? Again, I'd love to see some examples. Quote me all you like.
Who should be able to defend themselves, and their words and their own actions.
I have no problem doing that. It is however refreshing to see other people who actually want something resembling a realistic and balanced debate rather than screaming and running in circles.
You both are defending a person who is feigning ignorance, because they don't want to learn. And who has repeatedly said AMD tech is dismissive, because we have no facts. Therefore suggesting, we can't even have a conversation about AMD future, because they don't like it, because (again) nobody knocked on their door and handed them a packet of info..

That is Nvidia Yo...
This is interesting. I'm feigning ignorance? So I'm pretending not to know something that I actually do? So you're saying that in reality, I'm aware of AMD's massive, overpowering technological lead and how they will crush Nvidia in gaming performance, but that I'm consciously denying it to somehow promote Nvidia? A company that I've never supported economically, rarely advise people to buy, and don't have any feelings towards except a vague antagonism due to their huge lead in market share? That's a pretty big leap, I have to say. Again, check your paranoia. Psychoanalyzing your opponents in a discussion is rarely a good idea even when you know them personally. You clearly haven't read a single post I've ever written about GPUs in these forums, let alone ever talked to me, so you don't exactly have the best basis for it either, I'm sorry to say.

Also, again, look up the meaning of "dismissive". A product or technology can't be dismissive, though an attitude can. Nor have I dismissed AMD's technological improvements with Vega, I've simply pointed out that as of now, we have no data to show how they affect gaming performance. The HBCC is a prime example of this - architecturally, it seems like a perfect fit for data centers, workstations and the like (as I've stated before). It might be really good for gaming performance too, of course. What do I know? The point is, you don't know either, yet you act as if you do, and that what you're saying is fact, not speculation. Which is blatantly not true. All we've seen so far is that it should significantly improve memory management in low-memory situations, allowing 4GB or even 2GB GPUs to improve drastically compared to today. But so what? Do you think high-end Vega will have less than 8GB of VRAM? That seems extremely unlikely, given that high-end GPUs have had 8GB or more for quite a few years.

Also, please leave the "you're stifling discussion" card off the table. That's bull. Just because I don't agree with your wild speculation doesn't mean I'm against discussing. Heck, the fact that I'm arguing against you should prove that quite clearly. This is a discussion, no?
I played a game with their own logic, and suggest their line of reasoning applies to Volta too. That it can't possibly be better than Pascal, because it comes afterwards and has no specs. Which kinda illustrated the ignorance & bias in this thread.
Yep, and I responded to your horribly flawed logic. I wonder why you haven't argued against that. Can you? You're saying that just because Vega is newer, it must be more powerful than Pascal. My counter-example was that if I go out tomorrow and build a GPU, it will also be newer than Pascal. Must it also thus be faster? There is no correlation between the two. By your logic, there is. If you disagree, please argue your point. Don't mistake correlation for causation. A 2017 Fiat Punto isn't faster than a 2000 Ferrari just because it's newer - and those are even made by the same company!

What we can be sure of is that Vega will be more powerful (and, within reason, more power efficient) than Polaris. Why? Because it's made by the same people, developed on the basis of the same technology, just added to, improved and augmented. See? That's logical. From the same logic, we can assume that Volta will (in all likelihood) be more powerful than Pascal. How much? Nobody knows (well, outside of some product designers, design leads and engineers at Nvidia, I assume). Will it be more powerful than Vega? Again, we can't know that until we have data, as the two have no relation to each other. Heck, the only reason we can assume Volta to be faster than Polaris is that it's based on Pascal, which is already faster than Polaris, and that performance regression between architectures is very unlikely. Otherwise, logically, it could be slower. Is that so hard to grasp?

To put it simply:
Polaris < Vega
Pascal < Volta
Vega ??? Pascal
Volta ??? Vega

The last two are unknown, and can't be known until we know the performance of Vega (and in time, Volta) based on more than hopes and speculations.

Again: if you disagree with my logic, show me how it's faulty. Argue against it. Don't say I'm a shill (as that doesn't disprove anything, just inflames the debate in silly ways), don't say I'm denying things, show me how I'm wrong. Seriously. I have yet to see you try.
refuting my w3rds. Which are, that Vega is more powerful than Pascal and the Radeon RX Vega is more powerful than the 1 year old 1080ti. And cheaper...
(emphasis added: stating speculation as fact. None of those are known. We know that one example of a Vega ES was faster than the 2nd-tier Pascal. That doesn't in any way show us that Vega as an architecture is more powerful (and equally or more efficient, as that would be a necessity in today's market) compared to Pascal as an architecture, just that it can get close. Nor does it show us how whatever the top Vega card will be called (heck, we don't even know if there'll be a card specifically named RX Vega, which you are also assuming) will compare to the 1080Ti. Again: it just shows us that they're somewhat close. For all we know the 1080+10% Battlefront demo was consuming 225W. Or 150W. Or anything else. Again: we don't know.)

Again, you're presenting rumor and speculation as fact. You might of course be right. I don't doubt that Vega will be cheaper than Pascal - AMD traditionally is. AMD is also usually good at beating Nvidia at price/performance. Unfortunately, at the high end they've faltered for two generations now. My dear Fury X (seriously, I really really like that card, I'm planning to keep it for years) still performs worse than the 980Ti in most benchmarks and games. Polaris was a good step towards catching up to Nvidia's efficiency lead, but not nearly enough. AMD has yet to compete with high-end Pascal. I think this was a good choice strategically from AMD (rather than rushing some huge, inefficient Polaris die or launching a brand-killing 300W+ dual-GPU card), but it doesn't leave me overly confident in their ability to compete in the extreme high-end. For that, they need to at least come close in efficiency, if not catch up. Now, to clarify, since you seem unable, unwilling, or both to understand what I'm saying: I'm very sure AMD will launch a card that beats the 1080. No doubt about it. I mean, they've shown ES cards beating the 1080 (even if that's only in one relatively AMD-friendly game, it's proof of something). I have no trouble at all taking that at face value.

What I do have trouble taking at face value is that you're saying that this means that AMD will beat the 1080Ti - a ~25% faster card - at less power. The only indication of Vega's power consumption is that the ES cards have 8+6-pin power connectors, and that it can be cooled by a rather standard-looking blower cooler. That tells us nothing more than that Vega should need less power than the Fury X (same manufacturer, 8+8-pin power delivery). Even that can't be confirmed, though, as an 8+6-pin setup can deliver close to 300W. Given that it's already a few months since this was demonstrated, and we still don't know when Vega will launch, it's more than reasonable to expect both performance to increase and power draw to drop in final silicon. So, what can we reasonably say for sure? That Vega will perform more than 10% better than a GTX 1080 in Star Wars Battlefront at 4K resolution, at some power draw that can be cooled by a reasonably-sized blower. I'd say that means less than 250W, but even that's pure speculation. That the ES beat the 1080 by ~10% (that's a very rough estimate, since we haven't actually seen any numbers other than on-screen in-game) is a good sign - 20% or even 30% might be attainable through optimization. But we still don't know this. That's the point you don't seem to grasp. I'm not saying it's impossible. I'm not saying that AMD isn't capable of it - far from it! I'm not saying that it isn't going to happen. I'm simply objecting to you presenting speculation, inductive reasoning and unsubstantiated rumor as fact. Which it isn't. Plain and simple.
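
To spell out the connector math (these are spec ceilings for power delivery, not measured draw):

Code:
# PCIe power-delivery ceilings per spec, in watts.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150
print("Fury X, 8+8-pin: ", SLOT + 2 * EIGHT_PIN)        # 375 W
print("Vega ES, 8+6-pin:", SLOT + SIX_PIN + EIGHT_PIN)  # 300 W
print("GTX 1080, 8-pin: ", SLOT + EIGHT_PIN)            # 225 W
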
You can disagree, but you'd have to explain your reasons using logic, not emotion. (ie: try fab process, wafer yields, news, memory structure, patents, etc.)
Uhm, is that how you're arguing? I haven't seen that, at least. AMD is on GloFo's 14nm process, which seemingly is comparable to TSMC's 16nm, although we can't really know as no two identical chips are made on both processes. At the very least they're comparable, although we know that GloFo 14 isn't optimized for high clocks. This shouldn't matter much in a sub-2GHz GPU, though. Wafer yields? What does that have to do with anything? As long as they avoid a chip far above 500mm2, they should be fine. Memory structure? We don't yet know anything substantive about the gaming-related consequences of AMD's new memory technology. It might be revolutionary. It might be awesome, but require significant patching of games and/or Windows. We don't know. As such, assuming massive gains is putting the cart before the horse, so to speak.

Oh, and lastly, here's a list of my recent posts containing the words Nvidia, AMD, GPU or CPU. Show me some anti-AMD bias, please. I dare you.
 
Last edited:

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
To this day, the Polaris 480 still has a theoretical 1 or so TFlop advantage over the 1060, but now comes out only even or just barely ahead overall. Where the hell is that "missing" TFlop?

This is because of tiled rendering. Kepler had almost no advantage over GCN in performance per TFlop, at least once AMD had a couple of months to mature their drivers. Maxwell (1st generation) introduced tiled rendering, which gave it about a 33% per-TFlop advantage over Kepler and GCN in DX11 performance at 1080p. There have been incremental improvements on both sides, and the result is about the same: Polaris RX 480 needs about 33% more TFlops to get the same performance as Pascal GTX 1060.

That's why Vega is such a big deal: AMD is finally adding this secret sauce to their own cards to close the gap. GCN is about the only major GPU architecture that still lacks tiled rendering. Nvidia has it, and the best mobile architectures, such as PowerVR, have it as well. That gap is about to close, and Nvidia will no longer be able to count on a major advantage in perf/TFlop to keep them ahead.
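
What tiled rendering buys you, in a nutshell: the GPU bins triangles into small screen tiles and rasterizes one tile at a time, so each tile's pixels stay in on-chip cache instead of bouncing to VRAM. Here's a toy sketch of the binning step (tile size and names are made up for illustration; the real thing is fixed-function hardware):

Code:
# Toy binning step of a tile-based rasterizer (illustrative only).
TILE = 16  # tile edge in pixels; hypothetical value

def bin_triangles(triangles):
    """Map each triangle (three (x, y) vertices) to the tiles its bounding box touches."""
    bins = {}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
            for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

# Rendering bin-by-bin keeps each tile's color/depth on-chip, so overdraw
# and blending cost cache bandwidth instead of VRAM round-trips.
print(sorted(bin_triangles([((0, 0), (30, 5), (10, 40))])))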
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
We know some things about the Vega uarch. We can speculate by extrapolating some numbers. We have one demo in a controlled test, where we don't know anything beyond a simple FPS counter. We have what, two synthetics?

That's it.

That's not nothing, but none of that says anything about actual performance when it matters. What we actually have, though--since you agree that we are speculating--is a long history of AMD with interesting tech on paper that, for various reasons, never pans out in actual performance. Theoretical performance--which is all this speculation is about--is completely meaningless. People purchase on actual performance. Time and time again, we can argue "well, AMD's card will probably be better than the same nVidia card 6 months, 1 year, 2 years from now!" Yeah, that's probably right, but making a bold claim such as "more powerful than nVidia's best for cheaper"--a claim that runs completely against the last 5 or so years of actual releases and actual data--is ridiculous and requires a preponderance of evidence to back it up.

To this day, the Polaris 480 still has a theoretical 1 or so TFlop advantage over the 1060, but now comes out only even or just barely ahead overall. Where the hell is that "missing" TFlop? Theory really doesn't mean anything in the end, no matter how true it may be that the tech within the chip is "superior." That hardware doesn't mean anything if it isn't being utilized.

This troll offers no evidence. He confuses his wild speculation for actual reality, then demands that others prove him wrong using evidence that even he doesn't have. There is a difference between rumors and wild fantasies that you claim are actual reality, based on nothing.

I want AMD to "win" here--I'd absolutely love it if they manage to beat nVidia with Vega 10. I've been building AMD CPU and ATI/AMD GPU boxes since 2002. Only one Intel chip and one nVidia card in all that time. This troll thinks I'm an nVidia shill because I'm calling him out on his nonsense.

Bold claims require real evidence. Challenging such a ridiculous claim requires nothing more than for the troll to put forth real numbers. No numbers exist. None of what he is doing is speculating: it's pure trollery. Anyway, I'm done with this kid. I wonder why anyone would defend his nonsense as "just rumor." If you read his comments, he is not proposing rumors. He is claiming some truth and attacking anyone who dares to call him out on it. Trolls like this do absolutely nothing but blow up the hype train for everyone, leading to nothing but profound disappointment when AMD doesn't "deliver" on some absurd performance that they never promised.


The most recent example of execution, claims and theoretical vs actual performance is Zen, and that turned out rather well, despite history. You can't judge future products by history, or on marketing. His claims are valid to the degree that AMD does in fact have a cohesive strategy of GPU/CPU integration going forward. Having Infinity Fabric on both Vega and Ryzen suggests to me that there is a symbiotic nature between the two. IMO there is more to be optimistic about than there is to be pessimistic. I get that people want to keep expectations low, because there is an emotional investment that they can't overcome if the products don't live up to their own expectations, but everyone has a right to speculate in a 'rumor' thread, do they not? Are people no longer able to form an opinion on their own without the need to have it validated by someone else? I don't care what anyone says, I have my own opinions and people can write whatever they want. I have the ability, as I'm sure everybody does, to process the information I read and parse it as something that makes sense to me, or not. If someone writes that the moon is made of cheese, I'm not going to get upset and start a war over it if I don't believe it; I'll just dismiss the claim.
 
Last edited:
Reactions: w3rd

zinfamous

No Lifer
Jul 12, 2006
110,821
29,576
146
The most recent example of execution, claims and theoretical vs actual performance is Zen, and that turned out rather well, despite history. You can't judge future products by history, or on marketing. His claims are valid to the degree that AMD does in fact have a cohesive strategy of GPU/CPU integration going forward. Having Infinity Fabric on both Vega and Ryzen suggests to me that there is a symbiotic nature between the two. IMO there is more to be optimistic about than there is to be pessimistic. I get that people want to keep expectations low, because there is an emotional investment that they can't overcome if the products don't live up to their own expectations, but everyone has a right to speculate in a 'rumor' thread, do they not? Are people no longer able to form an opinion on their own without the need to have it validated by someone else? I don't care what anyone says, I have my own opinions and people can write whatever they want. I have the ability, as I'm sure everybody does, to process the information I read and parse it as something that makes sense to me, or not. If someone writes that the moon is made of cheese, I'm not going to get upset and start a war over it if I don't believe it; I'll just dismiss the claim.

What does Zen have to do with Vega? What does CPU design have to do with GPU design? On top of this, AMD rather loudly announced, some time ago, that they are building the RTG group as a more or less separate entity from AMD. Zen is pretty amazing considering the budget and the gains that they have made, but bear in mind that this is because Intel hit that wall already about 3 years ago, which is exactly where Zen is suddenly landing. GPU architecture isn't anywhere near hitting the thermal design limits that are now a matter of physical law with x86 CPU design. nVidia has been pumping out 30-40% gains in class year on year throughout the same time period that Intel has been pumping out 5% gains with core architecture.

I do think Vega and Zen share a lot of intentional design similarities and absolutely represent a new and hopefully great direction for AMD. I think Vega will be fantastic. I simply don't see any reason for people to hang themselves on paper design specs when they have absolutely no idea if these will be effectively utilized within the software that is supposed to benefit from them. AMD has scored a few coups with devs, and it's pretty clear from DX12 and repeated console wins that general game development is "probably" more favorable to AMD going forward, but we just have to see.

Speculating on this stuff is fine, but don't confuse what you are doing--speculating--with what troll bot is doing: claiming that all of these assumptions are the absolute and irrefutable reality. You only set yourself up for disappointment.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
What does Zen have to do with Vega? What does CPU design have to do with GPU design? On top of this, AMD rather loudly announced, some time ago, that they are building the RTG group as a more or less separate entity from AMD. Zen is pretty amazing considering the budget and the gains that they have made, but bear in mind that this is because Intel hit that wall already about 3 years ago, which is exactly where Zen is suddenly landing. GPU architecture isn't anywhere near hitting the thermal design limits that are now a matter of physical law with x86 CPU design. nVidia has been pumping out 30-40% gains in class year on year throughout the same time period that Intel has been pumping out 5% gains with core architecture.

I do think Vega and Zen share a lot of intentional design similarities and absolutely represent a new and hopefully great direction for AMD. I think Vega will be fantastic. I simply don't see any reason for people to hang themselves on paper design specs when they have absolutely no idea if these will be effectively utilized within the software that is supposed to benefit from them. AMD has scored a few coups with devs, and it's pretty clear from DX12 and repeated console wins that general game development is "probably" more favorable to AMD going forward, but we just have to see.

Speculating on this stuff is fine, but don't confuse what you are doing--speculating--with what troll bot is doing: claiming that all of these assumptions are the absolute and irrefutable reality. You only set yourself up for disappointment.

That's the thing. Neither you nor anybody else has a right to tell anyone else what they can or can't assume. People have a right to their own expectations; they don't need to be approved by anyone.
And Zen has everything to do with Vega, pertaining to the context of the post I replied to. The claim was that historical evidence predicts the future, and I said no it doesn't, as proven by Zen.
BTW, Polaris is every bit as good as Pascal, IMO. For Vega I do think it'll be quite a bit faster than the going narrative suggests. If only because a 1500MHz Fury would already be over 10% faster than a 1080.
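
Quick napkin math on that last claim (assuming performance scales linearly with clocks, and taking the 1080 as roughly 1.3x a stock Fury X at 4K - both rough assumptions):

Code:
# Naive check of "a 1500MHz Fury would be >10% faster than a 1080".
# Assumes linear scaling with clocks and 1080 ~= 1.3x a stock Fury X.
scaled_fury = 1500 / 1050   # ~1.43x a stock 1050 MHz Fury X
gtx_1080 = 1.30             # ~1.3x a stock Fury X, rough 4K ballpark
print(f"{scaled_fury / gtx_1080:.2f}x the 1080")  # ~1.10x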
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I get that people want to keep expectations low, because there is an emotional investment that they can't overcome if the products don't live up to their own expectations, but everyone has a right to speculate in a 'rumor' thread, do they not? Are people no longer able to form an opinion on their own without the need to have it validated by someone else?
There's a big difference between speculating on a rumor and having an interesting discussion about it, and stating your speculations as indisputable fact while getting up in arms whenever someone tries to argue a different perspective. Heck, that's the opposite of having a productive discussion, at least in my mind. Nobody is saying that people don't have the right to their opinions - in fact, that's what we're arguing for, and him against. I'm just arguing for applying common sense and healthy skepticism, not taking every rumor at face value, and applying simple principles of source criticism and verification to what we read. "Someone posted it on the internet" is a very, very low indicator of veracity, after all.
And Zen has everything to do with Vega, pertaining to the context of the post I replied to. The claim was that historical evidence predicts the future, and I said no it doesn't, as proven by Zen.
BTW, Polaris is every bit as good as Pascal, IMO. For Vega I do think it'll be quite a bit faster than the going narrative suggests. If only because a 1500MHz Fury would already be over 10% faster than a 1080.
Zen can be interpreted as a sign of AMD turning things around, sure. They seem to have a good strategy for development, and they're meeting or exceeding their stated goals. That's great. But it still doesn't change the fact that the CPU and GPU teams are entirely separate, with the RTG now semi-autonomous. On the other hand, there has been talk of tighter integration between the two and better coordination for upcoming APUs. Again, we don't know. You don't work in AMD management, do you? If so, I'd love to hear some inside info.

As to Polaris being "every bit as good as Pascal", that's... a stretch. That can only be true if you completely discount efficiency. The RX 480 uses as much power as the GTX 1070, which is far more powerful. Polaris doesn't clock very high either, and that seems to be more of an architectural problem than a process one (other chips on the same process clock far higher, after all). A 1500MHz Fury would consume roughly 390W even with naive linear scaling (275W at 1050MHz, times 1500/1050), but in reality far more, as power scales superlinearly with clock speed: dynamic power goes roughly with frequency times voltage squared, and voltage has to rise to sustain higher clocks. Not to mention that AMD has yet to show us an architecture where clocks like that are actually attainable.
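
A rough sketch of that scaling (the voltage bumps are illustrative assumptions, not measured figures):

Code:
# Dynamic power ~ frequency * voltage^2 (capacitance held constant).
f0, p0 = 1050, 275             # stock Fury X: MHz, board power in W
f1 = 1500
for dv in (0.00, 0.10, 0.20):  # extra voltage assumed for the clock bump
    print(f"+{dv:.0%} voltage -> ~{p0 * (f1 / f0) * (1 + dv) ** 2:.0f} W")
# 0%: ~393 W; +10%: ~475 W; +20%: ~566 W -- all well past air cooling.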

Where Polaris is great is in power per CU, but this unfortunately comes at the cost of performance per CU being lower than Pascal's per CUDA core.

Still, it's entirely possible for AMD to catch up. It can definitely happen. We just don't know yet.
 
Last edited:

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
While I don't trust VideoCardz further than I can throw them, that seems plausible at least. I wonder what kind of clock speeds they can get out of it, and what the price will be. I'd love one of these for my HTPC, although I'm worried it won't be much cheaper than the 460/560.
 

Mopetar

Diamond Member
Jan 31, 2011
8,024
6,489
136
Seems like it would be for the X50-and-below products, as everything from the 450 on down was just a rebrand.

I'm curious about the design, though. I assume it's just a single shader engine with 10 CUs, which is a step up from both Polaris 10 and 11.

I wonder if it could bottleneck the full die, or if it means AMD has made some improvements to Polaris and we'll eventually see a 40 CU RX 480 replacement.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,153
136
I'm actually confused here. Unless AMD has a contract from an OEM that would justify this, taking on the tapeout costs of a new chip on 14nm for this nearly dead and still dying market is simply a bizarre choice.

NVIDIA has money to throw around, so I wouldn't be as surprised by them doing this. But AMD isn't in a position to spend their money freely on such projects. I would not be surprised at all if this chip exists to satisfy Apple, as the first place we caught wind of Polaris 12 was Apple's drivers.

But hey, as consumers, we finally get freed from 28nm tech in the market for HTPCs without an iGPU.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I'm actually confused here. Unless AMD has a contract from an OEM that would justify this, taking on the tapeout costs of a new chip on 14nm for this nearly dead and still dying market is simply a bizarre choice.

NVIDIA has money to throw around, so I wouldn't be as surprised by them doing this. But AMD isn't in a position to spend their money freely on such projects. I would not be surprised at all if this chip exists to satisfy Apple, as the first place we caught wind of Polaris 12 was Apple's drivers.

But hey, as consumers, we finally get freed from 28nm tech in the market for HTPCs without an iGPU.
Given that 640SPs matches the Radeon Pro 450 in the base MBP, it's most likely the highest-volume mobile chip they have. Makes sense for it not to be a cut-down part once yields improve. Not to mention the cost savings of fitting ~50% more dies per wafer, given the reduction in die area from 16 CUs to 10. At the same time, they'd be the only player in the entry-level GPU market to support HDMI 2.0, HDR, HDCP 2.2 and so on. They would make excellent HTPC cards if nothing else - and while that's a tiny market, it's a place to put the Polaris 12 dies that don't make the cut for mobile in terms of power draw.
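
For a feel for those numbers, here's the usual first-order dies-per-wafer estimate (the die sizes are my guesses for illustration, not confirmed figures):

Code:
# First-order dies-per-wafer estimate with an edge-loss correction.
from math import pi, sqrt

def dies_per_wafer(die_mm2, wafer_d=300):
    return int(pi * (wafer_d / 2) ** 2 / die_mm2
               - pi * wafer_d / sqrt(2 * die_mm2))

print(dies_per_wafer(123))  # Polaris 11-sized die (~123mm2): ~514
print(dies_per_wafer(90))   # hypothetical ~90mm2 Polaris 12: ~715

Smaller dies also tend to yield better, so the per-chip cost advantage would be even bigger in practice.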

Still, this is of course a major expense. I'd imagine Apple has placed a significant order of these chips.
 

w3rd

Senior member
Mar 1, 2017
255
62
101
Warning: Wall of text incoming. This is in the hope of eliciting some sort of actual response, though, as I'm getting rather tired of the ad hominem arguing here.

Uhm... You're making no sense. I have actually never owned an Nvidia GPU in my life. Currently, I have a Fury X, which I'm very happy with. Before that, a 6950, before that dual 4850s (god, that single-slot cooler was awful! especially in stereo! ended up replacing them with Arctic Accelero S1s), and before that a 9600 All-in-wonder or some such (I believe I "Omega drivered" it to act as a 9700-something at the time).

Also, I don't know that Polaris isn't Vega? What? Huh? Where on earth are you getting that from? Are you denying that Vega is built on GCN (and thus at least in part on Polaris)? I don't even understand what you're getting at with this. Saying that something is based on something else doesn't mean the same as saying they're the same. That should be pretty obvious ...

But back to the point. Simply because I'm arguing caution against getting our hopes up too high, I'm somehow saying that AMD sucks? Sorry, but you need to work on your reading skills, logic, or both. I'm simply saying that it's foolish to buy into rumors just because we want them to be true. Do I want them to be true? Of course I do. Every single GPU I've ever bought with my own money has been ATI/AMD. I really, really don't want to support companies with massive market leads unless I damn well have to. But I've seen the devastating effect overblown rumors have had on AMD's reputation time and time again (1600MHz Polaris OCs for all! The Fury X will revitalize AMD's high-end offerings! The Fury X is "an overclocker's dream"! (that's a direct quote, btw)), and I don't want to see that happen yet another time. I desperately want AMD to succeed. Heck, I've bought a Ryzen CPU too. The point is, overhyping products hurts AMD, just like overhyping No Man's Sky hurt Hello Games (just one example among many).

Stop putting words in my mouth. Please, if you can, show me a quote from one of my posts to demonstrate that I've said anything to that effect. Have I said that there's not enough data to assume huge IPC gains? Absolutely. Does that mean that AMD sucks? Hell no. If that is your conclusion from reading that, you need to check your paranoia (and remember that you are not AMD, nor does the success or failure of any AMD project (or any other brand) reflect on your character as a person). Have I said that I "can't even speculate (or imagine) that AMD tech could be better"? F*ck no. Stop making sh*t up. Please, show me a quote of me saying that. I dare you. Oh, and check a dictionary to see what "speculate" means and how that word is used, because you're using it wrong (and certainly not in a way I'd ever use it). Hint: it's not synonymous with "imagine".

Have I attacked you? Really? Put that victim card back in your wallet, please. You won't get it stamped here.
Also, what exactly in my arguments against extreme optimism and evangelizing based on speculation and hope shows any anti-AMD bias?

Really? What ignorance? What dismissiveness? Again, I'd love to see some examples. Quote me all you like.

I have no problem doing that. It is however refreshing to see other people who actually want something resembling a realistic and balanced debate rather than screaming and running in circles.

This is interesting. I'm feigning ignorance? So I'm pretending not to know something that I actually do? So you're saying that in reality, I'm aware of AMD's massive, overpowering technological lead and how they will crush Nvidia in gaming performance, but that I'm consciously denying it to somehow promote Nvidia? A company that I've never supported economically, rarely advise people to buy, and don't have any feelings towards except a vague antagonism due to their huge lead in market share? That's a pretty big leap, I have to say. Again, check your paranoia. Psychoanalyzing your opponents in a discussion is rarely a good idea even when you know them personally. You clearly haven't read a single post I've ever written about GPUs in these forums, let alone ever talked to me, so you don't exactly have the best basis for it either, I'm sorry to say.

Also, again, look up the meaning of "dismissive". A product or technology can't be dismissive, though an attitude can. Nor have I dismissed AMD's technological improvements with Vega, I've simply pointed out that as of now, we have no data to show how they affect gaming performance. The HBCC is a prime example of this - architecturally, it seems like a perfect fit for data centers, workstations and the like (as I've stated before). It might be really good for gaming performance too, of course. What do I know? The point is, you don't know either, yet you act as if you do, and that what you're saying is fact, not speculation. Which is blatantly not true. All we've seen so far is that it should significantly improve memory management in low-memory situations, allowing 4GB or even 2GB GPUs to improve drastically compared to today. But so what? Do you think high-end Vega will have less than 8GB of VRAM? That seems extremely unlikely, given that high-end GPUs have had 8GB or more for quite a few years.

Also, please leave the "you're stifling discussion" card off the table. That's bull. Just because I don't agree with your wild speculation doesn't mean I'm against discussing. Heck, the fact that I'm arguing against you should prove that quite clearly. This is a discussion, no?

Yep, and I responded to your horribly flawed logic. I wonder why you haven't argued against that. Can you? You're saying that just because Vega is newer, it must be more powerful than Pascal. My counter-example was that if I go out tomorrow and build a GPU, it will also be newer than Pascal. Must it also thus be faster? There is no correlation between the two. By your logic, there is. If you disagree, please argue your point. Don't mistake correlation for causation. A 2017 Fiat Punto isn't faster than a 2000 Ferrari just because it's newer - and those are even made by the same company!

What we can be sure of is that Vega will be more powerful (and, within reason, more power efficient) than Polaris. Why? Because it's made by the same people, developed on the basis of the same technology, just added to, improved and augmented. See? That's logical. From the same logic, we can assume that Volta will (in all likelihood) be more powerful than Pascal. How much? Nobody knows (well, outside of some product designers, design leads and engineers at Nvidia, I assume). Will it be more powerful than Vega? Again, we can't know that until we have data, as the two have no relation to each other. Heck, the only reason we can assume Volta to be faster than Polaris is that it's based on Pascal, which is already faster than Polaris, and that performance regression between architectures is very unlikely. Otherwise, logically, it could be slower. Is that so hard to grasp?

To put it simply:
Polaris < Vega
Pascal < Volta
Vega ??? Pascal
Volta ??? Vega

The last two are unknown, and can't be known until we know the performance of Vega (and in time, Volta) based on more than hopes and speculations.

Again: if you disagree with my logic, show me how it's faulty. Argue against it. Don't say I'm a shill (as that doesn't disprove anything, just inflames the debate in silly ways), don't say I'm denying things, show me how I'm wrong. Seriously. I have yet to see you try.

(emphasis added: stating speculation as fact. None of those are known. We know that one example of a Vega ES was faster than the 2nd-tier Pascal. That doesn't in any way show us that Vega as an architecture is more powerful (and equally or more efficient, as that would be a necessity in today's market) compared to Pascal as an architecture, just that it can get close. Nor does it show us how whatever the top Vega card will be called (heck, we don't even know if there'll be a card specifically named RX Vega, which you are also assuming) will compare to the 1080Ti. Again: it just shows us that they're somewhat close. For all we know the 1080+10% Battlefront demo was consuming 225W. Or 150W. Or anything else. Again: we don't know.)

Again, you're presenting rumor and speculation as fact. You might of course be right. I don't doubt that Vega will be cheaper than Pascal - AMD traditionally is. AMD is also usually good at beating Nvidia at price/performance. Unfortunately, at the high end they've faltered for two generations now. My dear Fury X (seriously, I really really like that card, I'm planning to keep it for years) still performs worse than the 980Ti in most benchmarks and games. Polaris was a good step towards catching up to Nvidia's efficiency lead, but not nearly enough. AMD has yet to compete with high-end Pascal. I think this was a good choice strategically from AMD (rather than rushing some huge, inefficient Polaris die or launching a brand-killing 300W+ dual-GPU card), but it doesn't leave me overly confident in their ability to compete in the extreme high-end. For that, they need to at least come close in efficiency, if not catch up. Now, to clarify, since you seem unable, unwilling, or both to understand what I'm saying: I'm very sure AMD will launch a card that beats the 1080. No doubt about it. I mean, they've shown ES cards beating the 1080 (even if that's only in one relatively AMD-friendly game, it's proof of something). I have no trouble at all taking that at face value.

What I do have trouble taking at face value is that you're saying that this means that AMD will beat the 1080Ti - a ~25% faster card - at less power. The only indication of Vega's power consumption is that the ES cards have 8+6-pin power connectors, and that it can be cooled by a rather standard-looking blower cooler. That tells us nothing more than that Vega should need less power than the Fury X (same manufacturer, 8+8-pin power delivery). Even that can't be confirmed, though, as an 8+6-pin setup can deliver close to 300W. Given that it's already a few months since this was demonstrated, and we still don't know when Vega will launch, it's more than reasonable to expect both performance to increase and power draw to drop in final silicon. So, what can we reasonably say for sure? That Vega will perform more than 10% better than a GTX 1080 in Star Wars Battlefront at 4K resolution, at some power draw that can be cooled by a reasonably-sized blower. I'd say that means less than 250W, but even that's pure speculation. That the ES beat the 1080 by ~10% (that's a very rough estimate, since we haven't actually seen any numbers other than on-screen in-game) is a good sign - 20% or even 30% might be attainable through optimization. But we still don't know this. That's the point you don't seem to grasp. I'm not saying it's impossible. I'm not saying that AMD isn't capable of it - far from it! I'm not saying that it isn't going to happen. I'm simply objecting to you presenting speculation, inductive reasoning and unsubstantiated rumor as fact. Which it isn't. Plain and simple.

Uhm, is that how you're arguing? I haven't seen that, at least. AMD is on GloFo's 14nm process, which seemingly is comparable to TSMC's 16nm, although we can't really know as no two identical chips are made on both processes. At the very least they're comparable, although we know that GloFo 14 isn't optimized for high clocks. This shouldn't matter much in a sub-2GHz GPU, though. Wafer yields? What does that have to do with anything? As long as they avoid a chip far above 500mm2, they should be fine. Memory structure? We don't yet know anything substantive about the gaming-related consequences of AMD's new memory technology. It might be revolutionary. It might be awesome, but require significant patching of games and/or Windows. We don't know. As such, assuming massive gains is putting the cart before the horse, so to speak.

Oh, and lastly, here's a list of my recent posts containing the words Nvidia, AMD, GPU or CPU. Show me some anti-AMD bias, please. I dare you.


So you are essentially agreeing with me..?

That everything you have suggested or said, or favored within others' posts, was in a negative & dismissive light, and was being masked behind the "caution" light..?


The irony here is, that it took a well thought-out response to me... to illustrate that your "cautions" are easily rebutted (rationalizing), when you use so many "ifs" in describing your own logic^. And that my speculation is no more off than anyone else's.

I have offered reasonable prose and have injected more knowledge on the subject than what was in the thread before I joined. Look at my first posts. Nobody on these forums was talking about RX Vega2 (2x GPU, one card) until I brought it up. Which is a viable leap of faith to make, considering AMD tech & patents and projections.

Not like I am in a different sub-forum spouting this off. Saddening that you (not necessarily just you... but the bandwagon is starting) are here attacking me personally for my speculating on AMD's future. Instead of just countering my speculation, of why "baby Vega" can't be more powerful than Pascal..? <- What prohibits that comparison..? (Vega11 vs GP102?)

In explaining why they can't be compared, you will explain it to yourself. Ask Watson.


Understand, I am not one of you.
I am a gamer, a hardware enthusiast who holds no bias and just calls it as I see it, but I was attacked personally for projecting my views. Within a rumor thread, no less...

I am buying 3 new video cards in the next 4 months: one is a GTX 1080 Ti, the other two will be Vega. The Vegas will power my 4K displays; the 1080 Ti will power my Acer X34.


Again, the market is:
1080p
1440p
1440p wide
1600p
4k

That is how people shop for video cards.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
So you are essentially agreeing with me..?
Again, you really need to work on your reading comprehension if that's what you took away from that.
That everything you have suggested or said, or favored within others' posts, was in a negative & dismissive light, and was being masked behind the "caution" light..?
And in the complete opposite direction, I cannot for the life of me understand what you're saying here. Is what I'm saying in this thread actually anti-AMD negativity somehow hidden behind arguing for caution? Is that what you're trying to say? Seriously? You didn't get what I said about not psychoanalyzing the people you're debating, did you? Again: please show me an example of this. Quote me. You're the one making extravagant claims here, so I'm simply asking you to back them up. If you're so sure: show us all. Enlighten us.
The irony here is, that it took a well thought-out response to me... to illustrate that your "cautions" are easily rebutted (rationalizing), when you use so many "ifs" in describing your own logic^. And that my speculation is no more off than anyone else's.

Again, all ad hominem, no actual arguments, reasoning or sense. I'd love to see you put forward a line of logical reasoning without using the word "if", by the way. And if my arguments are so easily rebutted, how come you make no attempt at doing so? Simply saying "you're wrong" is neither an argument nor a rebuttal.

And again: your speculation might not be more off than anyone else's. You might be 100% correct for all we know (although I don't believe so, a point that I've argued extensively). We don't, and can't know that. But you're still presenting it as irrefutable and indisputable fact, which it isn't - unless you can show us some watertight data to verify it.
I have offered reasonable prose and have injected more knowledge on the subject than what was in the thread before I joined. Look at my first posts. Nobody on these forums was talking about RX Vega2 (2x GPU, one card) until I brought it up. Which is a viable leap of faith to make, considering AMD tech & patents and projections.
That might very well be - I have neither seen that discussion nor taken part in it. Don't quite see how it's relevant in this context.
Not like I am in a different sub-forum spouting this off. Saddening that you (not necessarily just you... but the bandwagon is starting) are here attacking me personally for my speculating on AMD's future. Instead of just countering my speculation, of why "baby Vega" can't be more powerful than Pascal..? <- What prohibits that comparison..? (Vega11 vs GP102?)
Have I been attacking you personally? Really? I have objected to how you present speculation as fact, have presented both historical and performance data and arguments to counter this, and have pointed out how you consistently argue ad hominem rather than present counter-arguments in response to this. Or are you saying that your stinging and eloquent term "Nvidia Yo..." is somehow an illustration of you being attacked, and you presenting.... what did you call it, "reasonable prose"? If so, I'd love to hear you argue this point. I did nothing but argue against your arguments and your way of presenting them (neither of which are personal attacks, if that's unclear to you). You respond by calling me an Nvidia shill, because... I don't agree with you? Because you sure don't have any other basis for that sentiment, outside of your own feelings.
In explaining why they can't be compared, you will explain it to yourself. Ask Watson.
Again with the not making sense. Oh well. It is of course possible that AMD has been pulling a fast one all this time and has been demonstrating their lowest-end Vega GPU. Again: it's possible. A lot of things are possible. And it would be a huge PR win for AMD if this was the case. What makes me skeptical of this? All we've seen of Vega is an ES card with 6+8-pin power running ~10% faster than a 1080 (non-Ti) in a single game that's known to be pretty friendly to AMD's architectures. You must then be assuming either that the ES is massively over-engineered in terms of power delivery, that high-end Vega will need more than 300W of power, or that they somehow will manage to cut power draw by 30-50% or more between Engineering Samples and production cards.

While the first is possible, I find it unlikely as AMD has a reputation for high power draw and low efficiency (which they've stated publicly and explicitly that they're working to improve), and showing off a card indicating ~50W higher power draw than the 1080 (which has a single 8-pin) at only marginally higher performance would not help them at all. If they have any sense, they wouldn't show off a card needing more power than it really does, as that could only hurt their image further. (Yes, this argument assumes that AMD's PR department is somewhat competent. Assuming otherwise would be silly.)

The second, if true, would mean that Vega is essentially DOA. Cooling 300W+ in a standard GPU form factor is nigh on impossible without an AIO, and if they need that much power for their high-end card, I'd argue it's a pretty clear indication that they haven't caught up to Nvidia's efficiency lead. Which would be a massive let-down, and would hurt AMD noticeably (as it would cement their reputation as "the hot-running and power-hungry brand").

And the third would be an unheard-of engineering miracle. Again, possible. But extremely unlikely.
Understand, I am not one of you.
I am a gamer, a hardware enthusiast who holds no bias and just calls it as I see it, but I was attacked personally for projecting my views. Within a rumor thread, no less...
Ah the classic "I'm not one of you (i.e. biased, emotional people who fail to uphold your utterly objective and distanced perspective)" argument. You know what that's called? Derailing the debate, and a suppression technique. This forum is made up of all kinds of people, with all kinds of perspectives. At most, you could argue that an interest in PC technology is the common denominator. Anything more than that is a stretch (unless you've done some extensive surveys that I'm not aware of). And again: nobody attacked you. You started calling me and others who criticized your argumentation shills, accusing us of wanting to somehow hurt AMD and promote Nvidia, which led to us calling out your way of arguing and resorting to personal attacks. Despite repeatedly asking you to show some example of bias on our part, you haven't done so. Again: the floor is yours. Quote me. Please.

And again: you claim that I'm somehow biased against AMD/a shill for Nvidia. Given how you don't know me at all, and I don't claim to be one, the burden of proof for this is on you. It's up to you to show me and the rest of the world how I've been somehow - implicitly or explicitly - promoting Nvidia and arguing that, as you said, "AMD sucks". If you're sure about this, it only takes rather basic textual analysis to argue a point like that. So please, go ahead. So far, you've utterly failed at substantiating this claim.

We/I started criticizing you because you were putting forward opinion and more or less unfounded speculation as fact - and subsequently attacking anyone criticizing this for being biased against AMD. You don't seem to grasp the difference here, so I'll say it again: there's a fundamental, undeniable difference between saying something is and something might be. Until you start choosing your "w3rds" according to what you claim to be saying (as you claim to be arguing your beliefs rather than facts), I'll keep calling you out. You're completely free to believe - and promote the belief that - small Vega will be more powerful than the 1080Ti. That's your right. I don't dispute that in any way. But it is our right to argue against you if we disagree. That's the nature of an open discussion forum. Of course, you neither have to respond, nor provide any kind of reasoning or proof for your claims and accusations. This isn't a courtroom, after all. But in the end, how you come off to people reading this is based on how well you argue your points. Refusing to do so doesn't reflect well on you in the end.

As for your description of yourself: I too am (somewhat of) a gamer, although I'm not too fond of that term as it's too often used as an exclusionary device to draw a line between those deemed "good enough" and those not. I don't quite see how that differentiates you from the rest of us, though. That aside (as that's another debate entirely and only tangentially related to this), you keep claiming to hold no bias, accuse us of being biased, yet consistently fail to show any examples of this. "I just call it as I see it" is also a complete cop-out, a technique to distance yourself from basic human qualities inherent in any form of expression. How you "see it" is colored by your perspective, likes, dislikes, preferences, biases, gender, age, sexuality, political and philosophical views, personal relationships, mental health, socioeconomic status, race, and all other kinds of underlying factors that form who you are as a person, and ultimately form what you stand for and how you argue those points. Attempting to sweep that under the carpet by saying "I just call it as I see it" just illustrates an unwillingness to either reflect on your own reasoning, or to share your reasoning with those you're arguing against. It also demonstrates a naïve belief (to clarify: this is not a personal attack, simply arguing a point) that your perspective and standing is somehow "the default," which is a classic sign of a lack of self-reflection and perspective on one's own privilege. Again: you're free to present arguments to show how I'm wrong in this. I'm simply presenting my take on how you come off in this debate.
I am buying 3 new video cards in the next 4 months: one is a GTX 1080 Ti, the other two will be Vega. The Vegas will power my 4K displays; the 1080 Ti will power my Acer X34.
Good for you. Don't see how that's relevant to this debate, though.
Again, the market is:
1080p
1440p
1440p wide
1600p
4k

That is how people shop for video cards.
Once again, rather than presenting counter-arguments, you argue something that ... well, is neither pertinent to this discussion nor has any relation to what you claim to be arguing against. Has anyone claimed this not to be true? Another paragraph, another attempt at derailing the debate. Well done.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
So based on rumors/leaks so far, the 2017 product stack is likely:

RX Vega (full Vega)
RX Vega Nano (cut Vega)
RX 580 (improved 480)
RX 570 (improved 470)
RX 560 (improved 460)
RX 550 (budget/OEM)
 
Last edited:
Reactions: Headfoot

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
So based on rumors/leaks so far, the 2017 product stack is likely:

RX Vega (full Vega)
RX Vega Nano (cut Vega)
RX 580 (improved 480)
RX 570 (improved 470)
RX 560 (improved 460)
RX 550 (budget/OEM)
I doubt they'd use the Nano moniker to designate a cut-down chip, given the relative success of the SFF-specialized R9 Nano. If there's a Vega Nano, I'd hope for it to be a full-on follow-up to the R9 Nano, complete with a full die, binned for power and with (slightly) lower clocks. Oh, and that awesome vapor chamber cooler, of course.

If history is anything to go by, RX Vega for the 2nd tier and RX Vega X for the 1st tier sounds more likely, but that doesn't quite sound right in my head either. Still, they did the same with the Fury series - R9 Fury for 2nd tier, R9 Fury X for 1st tier, and R9 Fury as a designation for the series as a whole as well (my Fury X identifies in all software as "R9 Fury Series").

Also, rumors seem pretty consistent on there being two separate Vega dies, not just a complete and a cut-down one. Unless yields are amazing from the get-go, that should indicate more than one SKU per die, to avoid throwing away partially defective dies. Then again, rumors might be wrong. Die area-wise, there's definitely room for two dies (one ~300mm2 and one ~400-450mm2) above the ~220mm2 Polaris 10 die, though. If Vega is able to truly compete at the high end architecturally/efficiency-wise, it makes little sense for AMD to use a single die to compete with everything from the 1070 to the 1080Ti. That would make their entry-level Vega cards more expensive to produce than a 1070, after all.
 
Reactions: prtskg