I think they aimed for a much higher clock frequency but fell short. I guess GCN just doesn't scale.
That seems very plausible. Nvidia admitted it took a lot of work to push clocks with Pascal, and AMD doesn't have the resources Nvidia does. It seems like that might be the major dud in the design so far: all the extra transistors added for higher clock speeds. It's likely something they were going to have to address at some point, but the tradeoff on Vega doesn't seem terribly worth it so far.
As a professional/compute card, Vega isn't too bad. Radeon WX 9100 should sell decently; at $2,199 it's only a bit more expensive than Quadro P5000, but far more powerful.
The problem isn't that AMD is not competing in the high-end gaming segment; writing that off is a plausible business decision. The problem is that their marketing team was so flagrantly dishonest about it. I've lost a lot of respect for AMD over this fiasco. Any gamer who bought into the hype has been stuck with lesser products for months "waiting for Vega" when they could have had a superior Nvidia card at a price no higher than Vega will now cost. Worse, prices have become inflated from the mining craze, with the GTX 1070 almost unavailable and the GTX 1080 temporarily back up in price, so anyone who waited is likely to now incur a real financial loss unless they are willing to wait longer.
If you wanted 1070/1080 performance before now, you should have bought. We've been seeing this exact whining for over a year now. There have been multiple great deals on 1070s and 1080s (and the 1080 Ti is a great card, even a pretty good value), so it made no sense to wait. AMD was never going to seriously undercut Nvidia on perf/$. It seems like some people were expecting a 1080 Ti killer at $500; that simply was not going to happen.
It is not AMD's fault that mining has caused prices to get stupid; holding that against them is pure idiocy. Unsurprisingly, the people blaming AMD for their own decision to wait seem to be the same people making this argument. It's almost becoming a recurring theme (along with the general "I was betrayed by AMD!!!" nonsense).
Well, Vega is a complete failure for gamers. Months ago I was hoping for performance at least within 10% of a 1080 Ti; total disappointment. AMD has regressed on essentially every metric (perf/watt, perf/mm², etc.) without solving the fundamental issues that held Fiji back. The only thing holding me back from buying Nvidia is FreeSync (I haven't bought a monitor yet; I'm waiting for the C27HG70).
It's not good, but it's not nearly the disaster people make it out to be. If the performance or price at release doesn't fit your needs, don't buy. I'm guessing prices will decline sooner rather than later (especially if mining starts to fade, as people claim is happening), so decide then, or buy a cheap used card when the miners dump their stock. By then we'll hopefully know more about Volta too. And if you can wait, why bother frothing at the stupid speculation in video card subforums? Just see what options you have at your price range when you're ready to buy, then maybe check whether some impending release might change things.
That's why "ecosystems" are crap and people should stop buying into them unless there's very good reasons. Variable refresh isn't good enough that I'd let that dictate multiple $300+ purchases. I'm genuinely baffled at how obsessive over this people seem to be, as I think it is one of the biggest shams perpetrated in gaming. It never should have happened either because this never should have been an issue (adaptive sync should have been something that occurred long before now). Hopefully it being part of DP spec will put an end to this particular madness.
All AMD had to do was remove Fiji's bottlenecks and then add 50% to everything. Think about it: a Vega with 1 TB/s of memory bandwidth, 6,000+ SPs, and a crapload of raw geometry performance, at ~1,400-1,500 MHz. It would have destroyed the 1080 Ti (rough math below). But what do they do? They go for the professional market and spend the entire transistor budget on features no gamer needs.
Hell, even 2x Polaris would have been better.
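A rough back-of-envelope of that claim, as a minimal Python sketch. The 2-ops-per-shader-per-clock FMA assumption is the standard way peak FP32 gets quoted; the "Fiji+50%" shader count and clocks are just the hypothetical numbers floated above, not anything AMD built:

    # Peak FP32 throughput: shaders * 2 ops/clock (FMA) * clock (GHz) -> TFLOPS
    def tflops(shaders, clock_ghz):
        return shaders * 2 * clock_ghz / 1000.0

    print(tflops(4096, 1.05))   # Fury X: ~8.6 TFLOPS
    print(tflops(4096, 1.55))   # Vega 64 at boost: ~12.7 TFLOPS
    print(tflops(6144, 1.45))   # hypothetical Fiji+50%: ~17.8 TFLOPS

On paper that hypothetical chip clears a 1080 Ti's ~11.3 TFLOPS by a wide margin, though paper TFLOPS were never Fiji's weakness; feeding them was.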
Really, they probably didn't need to add 50% to Fiji across the board. They could probably have done better just by taking whatever clock-speed increase the process allowed (probably up to Polaris levels) and adding 25% more ROPs and texture units, along with whatever other architectural changes weren't tied to raising clocks. I do agree the decision to halve the bus width (and thus limit memory bandwidth) was odd/bad too. I'm not even sure they'd have needed the full width; just three stacks would still have offered plenty of bandwidth (~750 GB/s; see the sketch below) and allowed 9GB and 12GB configurations. It would be interesting to know the cost/complexity issues behind these decisions.
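For the bandwidth piece, a minimal sketch assuming the nominal HBM2 pin rate of 2.0 Gbps over a 1024-bit interface per stack (Vega 64 actually runs its two stacks closer to 1.89 Gbps, hence its quoted 484 GB/s):

    # HBM2 bandwidth scales linearly with stack count:
    # GB/s = stacks * 1024 bits * pin_rate (Gbps) / 8
    def hbm2_bw(stacks, pin_rate_gbps=2.0):
        return stacks * 1024 * pin_rate_gbps / 8

    print(hbm2_bw(2))  # Vega 64's two stacks: 512 GB/s peak
    print(hbm2_bw(3))  # three stacks: 768 GB/s, the ~750 GB/s figure above
    print(hbm2_bw(4))  # Fiji-style four stacks: ~1 TB/s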
I'm not sure I agree with that. Not that it would necessarily be worse, but I think there's a decent chance it wouldn't have been better either. A doubled Polaris would have the same ROP count and only slightly more TMUs, and quite a few people already think those (carried over from Fury) are what's limiting Vega for gaming. Polaris also doesn't clock as high (and is already fairly power hungry at its lower clocks, roughly in line with Vega), and a larger version very possibly wouldn't clock any better (maybe even worse). And it likely wouldn't have some of Vega's features, which might not be terribly beneficial now, but getting developers access to those features, with cards in their hands, could help them become beneficial later.
Vega seems very much to be a transitional design with forward-looking aspects that aren't likely to be well utilized soon. People keep lamenting this about AMD's GPUs, but I think it's their only choice: they can't dictate that people cater to them, and they can't hire an army of software developers to get things implemented, so they have to think ahead. It's often frustrating for us average consumers/gamers (imagine how amazing a gaming design they'd have if they'd stuck with VLIW4!), but it hasn't been nearly as bad as people have been acting for years. Seriously, this has got to be about the 5th year in a row we've heard that GCN is the worst design ever and AMD is doomed to failure because of it. We'll see how Vega ages, but the doom and gloom that pervades this forum at every AMD GPU release is almost laughable.
Thankfully Ryzen is good, because otherwise this could be company-destroying for AMD.
They must have known it was going to be this bad, even before tape out, and yet they just went along with this train wreck. It doesn't bode well for bouncing back.
I really think they need outside talent to straighten out their GPU design, but where would that come from these days?
I don't agree. It could also be that they focused performance on things other than gaming, and those happen to have more growth and higher profit margins than the high-end consumer video card segment. They released a card with OK gaming performance, which keeps developers at least considering their hardware, and it gives developers hardware that can use the newer features, so they can work on implementing them and build the software support that will benefit later GPUs.
WTF is all this talk about their design team and such? Seriously, I'm confident that almost none of the people constantly spouting off about this know much of anything about GPU architecture design and engineering, let alone the internal setup of AMD/RTG or what their targets were. And yet we're getting "they need to fire Raja and clean house" from random people (many of whom don't even have many posts here). Considering that AMD has been involved in multiple GPU projects (Sony, Microsoft, their own stuff; there are even hints they've been helping Apple develop its mobile GPU), I'm not sure how people can claim some of the things they do, let alone state them definitively like they're some authority on the subject. I'd say disgruntled former employees, but they don't really act like that (then again, I remember people saying Nvidia needed to ditch JHH after Fermi). It's clear their GPUs have plenty of performance capability; it's just that they might not be stupendous for gaming. They're still OK for that use.