Dat text wall! I read it all, and will be responding to specific parts (at work so time crunch):
This is at railven.
https://www.techpowerup.com/reviews/AMD/Vega_Microarchitecture_Technical_Overview/4.html
AMD actually added a feature that could help DX11 performance too (conservative rasterization). Not sure if it'll actually be backported to support the DX11 implementation or if it's just about DX12. Especially noteworthy is that the multi-view rendering is one of the things Nvidia talked up with Pascal, and it's what that video was talking about with the discarding. From what I gather, they basically check rendering from multiple angles and discard as much as they can up front, which keeps them from having to discard a lot of the same stuff over and over. That specifically targets some of the tessellation stuff Nvidia pulled with Gameworks, where they'd jack the tessellation rate to a ridiculous level for no benefit (there's a reason the video mentions culling things that are too small to even show up). And AMD has some Vega-specific discard path that is faster or discards a lot more, which apparently isn't implemented so far (although in general Vega has more discard capability than Fiji).
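To make the "too small to even show up" point concrete, here's a rough sketch (hypothetical illustration, not AMD's actual hardware logic) of the kind of small-primitive test a rasterizer can do: if a triangle's screen-space bounding box straddles no pixel centers, it can never produce a fragment, so it can be thrown out before rasterization. Cranking tessellation until triangles are sub-pixel just generates work that gets discarded like this:

```python
import math

def bbox_covers_pixel_center(v0, v1, v2):
    """Conservative small-primitive cull test (hypothetical sketch).

    Pixel centers sit at integer + 0.5 in both axes. If the triangle's
    screen-space bounding box contains no pixel center in x or in y,
    the triangle cannot generate any coverage and may be discarded.
    Vertices are (x, y) tuples in pixel coordinates.
    """
    min_x = min(v0[0], v1[0], v2[0]); max_x = max(v0[0], v1[0], v2[0])
    min_y = min(v0[1], v1[1], v2[1]); max_y = max(v0[1], v1[1], v2[1])
    # An integer i with min <= i + 0.5 <= max exists iff
    # ceil(min - 0.5) <= floor(max - 0.5).
    has_x = math.ceil(min_x - 0.5) <= math.floor(max_x - 0.5)
    has_y = math.ceil(min_y - 0.5) <= math.floor(max_y - 0.5)
    return has_x and has_y

# A triangle spanning a couple of pixels survives:
print(bbox_covers_pixel_center((0.0, 0.0), (2.0, 0.0), (0.0, 2.0)))  # True
# A sub-pixel triangle sitting between pixel centers gets culled:
print(bbox_covers_pixel_center((0.1, 0.1), (0.4, 0.1), (0.1, 0.4)))  # False
```

The real hardware paths (and the Vega-specific one mentioned above) are obviously far more involved, but this is the basic idea behind why over-tessellated geometry is wasted work.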
I believe that feature was introduced with Polaris, which is overall a good GPU in my book. However, there's a caveat, as I'll explain in a following paragraph.
That article also absolutely supports that GCN5/Vega gives developers extra robustness for managing issues in the pipeline (see slide 20). It will obviously be up to developers to really make good use of it, though. But it means that if they optimize, they can mitigate what would otherwise be stalls or work they'd be stuck waiting on (i.e. discarding instead of fully processing), and they could also adjust to get more work done (and it seems like they can do this at multiple stages, giving them more flexibility to manage things).
And that is Glo's point (even if I do think he's being pretty optimistic about how much it will improve things; from what I've seen, he's talking about the objective max potential, not saying it will absolutely happen). There are a lot of features that could make Vega much, much better, but it requires developers putting in the effort, and it will take time to implement. The reason he's probably so optimistic is that, from what I'm seeing, this is all part of the DX12 featureset moving forward, meaning Nvidia will almost certainly be doing the same (in some instances already is), so it's not like developers are likely to ignore it. Unless Nvidia really starts blackboxing stuff and does Nvidia-specific paths (while also putting in the work in the driver), developers putting in the work will benefit both. If I'm not mistaken, developers have even spoken out about how blackboxed stuff just ended up causing their games to be broken at release (not even just on AMD hardware), and generally when we saw a game with Gameworks (especially the broken ones), the sequel had the same feature implemented by the developer instead, and performance vastly improved while quality didn't seem to suffer much, if at all.
And this is where my bitterness and overall contempt for some posters here stems from. What Glo is basically arguing now, others argued about Async Compute and even the new feature in Polaris that alleviated some of the Gameworks foul play. Polaris was the next HD 4800; it was going to be 90% of a GTX 1080 in certain scenarios for half the price. It was going to use the new culling features and hit 85% or better of the GTX 1080's DX11 performance for half the price.
That never materialized. So I'm very skeptical of the same predictions especially when it's some of the same posters.
And you are right, NV will most likely follow suit. But as I started saying back in 2014 (when someone was telling me the list of DX12 games was going to be the most important metric, and then again in 2015, and then in 2016, you get the gist of it), when this does actually happen I expect NV to jump in once the systems are mature and the bugs are worked out. I wouldn't be surprised if NV's implementation ends up superior, making all the hardships of AMD pioneering the tech that much more agonizing.
As for the consoles, the PS4 Pro and Scorpio are really the first ones to offer even some of the new Vega features (but not the full set that Vega does, meaning it'll probably be the consoles after those that do; and I strongly disagree with your dismissal of how important being in the consoles is for AMD, as it is absolutely responsible for the gains we saw when just some of the DX12/Vulkan features were implemented in games like Doom). And that has been my point: Vega is another forward-looking design. So this implementation of the Vega architecture might be a dud (for various reasons, which could include hardware issues that can't be fixed or improved with software updates and newer software) and never fully show its potential. But it might also be like the 7970, which saw large gains some time later (meaning, yes, years) once its featureset was utilized. Yes, that really doesn't help now (once again, always always always always always buy based on perf/$ at the time you buy, not on future performance), but this stuff is the groundwork for what graphics processing will be doing (Nvidia is already saying they're going to support plenty of this in the future, RPM being one thing they specifically said Volta will bring). So people are dismissing the architecture as a failure before a lot of its features are even really utilized (I'd guess that even in games/engines that support some of this stuff, the support is quite limited right now).
You might not know my history on ATF, but I was a hardcore ATI/Radeon fanboy. I carried that badge proudly until I got pushed out by miners, when trying to buy a reference 290 meant paying almost 50% over MSRP. So I got a used GTX 780 Lightning (didn't even want to give NV my fresh dollars, yet).
The problem is, as a CFX 7970 user, those improvements came far too late. I upgrade regularly (on my own schedule), a cadence AMD had no problem keeping up with until recently. But it was the frame-pacing issues and the constant borking of advertised features that got to me, and I sold off my 7970s during that first mining craze (of course, I did make some money mining myself).
So back to this post: the DOOM benefits didn't even come at launch. And if you want an interesting little tale, DOOM was slammed as an NV-sponsored game because it was featured in the GTX 1080 presentation, and it had issues on AMD cards at launch. So yeah, I'm glad AMD can get these kinds of gains, but in typical AMD fashion, it's often far too late. By the time DOOM got sorted out on AMD hardware, I had already beaten it and moved on. Yes, others benefited from it, but DOOM is now an old game and it's still at the forefront of what AMD hardware can do. That is a little discouraging. Hitman was another saving grace, and then that flopped. Deus Ex: another example of a botched launch. Just about every DX12 game that was supposed to make AMD shine faltered at launch. While I'm not putting the blame on AMD, this just made me more aware of AMD's lack of resources to court devs.
That's why both Sony and Microsoft have been saying the Pro and One X are not really next-gen consoles: they're extending the current generation with a few extra features that won't make a big difference right now, but will likely be a bigger part of things in the future. So that doesn't actually help Vega much, as it'll still be years before this stuff is common for developers, but that's how AMD has had to go about their GPU design: be forward-facing and hope that developers make good use of it. If not, they look worse (which is what has been happening, although plenty of that was due to other things unrelated to their GPU performance, which they've been working to recognize and improve across the board; time will tell how things work out).
The Pro and Scorpio are mistakes in my opinion. Mistakes that are going to cost AMD. They prolonged a generation in which the base models still have to be supported. We're basically stuck with stagnant 2013 consoles featuring GPU hardware that was already obsolete. Sure, the Pro and Scorpio might look better as consoles go, but progress overall falters. Sony, with their devs, will make beautiful games (looking at you, new Spiderman), but what good will that do AMD and other devs? Not much.
Would AMD be doing better if Xbox was selling in line with PS4? Possibly. But the thing is, that almost certainly would have come at the expense of PS4 sales, so unless they get a higher margin from Microsoft, it likely wouldn't make a big difference.
It's a dog eat dog world.
From what I've seen, Microsoft functions a lot like AMD in that they try to provide documentation but leave the end results up to developers (I'd say they've been much more developer-friendly than AMD, though AMD has had more of a focus on the hardware for obvious reasons; but Microsoft often has competing software, so they're not going to just give away their unique advantages either, and plenty of game developers just use middleware anyway). They do offer tools, but most of that isn't about eking out maximum performance; it's more about limiting outright problems. In the case of the Xboxes, it's likely mostly about CPU threading and managing the memory setup (eDRAM on the 360, eSRAM on the One), and then baseline GPU stuff (didn't they only start talking seriously about DX12 on the One like two years ago? Meaning it very clearly was not designed with DX12 in mind, even though it can support a fair amount of its features). Scorpio definitely has a DX12 focus (it has hardware specifically for scheduling DX12 calls to lessen CPU load), and Microsoft has been talking up their analysis tool, which both helped them figure out what they wanted in Scorpio hardware-wise and showed them the performance of games already on the Xbox to get an idea of where they're bottlenecked. I would almost guarantee that will pay dividends.
MSFT's lackluster efforts this generation should have been a red flag for AMD. In gen 6, MSFT was courting pretty much everyone they could, securing a great roster of titles to bolster their platform. This generation, barely anything, and then they collapsed and started making their own platform exclusives available on PC.
We can argue about what the Xbox One was designed for, but one of its huge marketing points was DX12. To the point where, just like on the PC side, I had to read through countless console fanboy wars about how, once DX12 was enabled, the Xbox One would match or eclipse the PS4. That also never happened.
Sony definitely is more interested in helping their own ecosystem. I believe they've had a few of their star developers help create tools and offer insight (although often it seems aimed at companies willing to do PS exclusives). Sony thinks their low level stuff on the PS4 is better than even Vulkan, but they are supporting it as well. I don't know if a developer could license DX12 for games on the PS4 or not.
Sony is a Japanese company through and through (insert the usual stereotypes of a xenophobic culture). It was only during the PS3's failings that Sony became a little more modest. But now look at them with the PS4: they are back to the "you will get two jobs to afford one" mentality.