That's cherry picking. Fermi was hyped to the moon for months; it was going to be nothing short of a revolution, a supercomputer on a chip. A fake, non-working product was shown on stage, for crying out loud. Besides, Bulldozer isn't even a GPU, and right now AMD has done precious little to hype anything; in fact, before today we had basically nothing from them.
The hype surrounding stacked memory is happening all by itself; AMD doesn't have to do anything, since it's something very different from what we've seen before on a graphics card.
Agreed, Deckscrewgate was pretty embarrassing for Nvidia, and the six-month delay of the Fermi launch was also a clear sign that something had gone very wrong. Delays usually are; that's one reason the Fiji launch has a lot of people nervous. Even though AMD never specified an official release date prior to the E3 announcement, there is a perception that Fiji is "unofficially" late to the party.
By the way, here's the conclusion to AnandTech's review of the GTX 480 (Fermi) at the time. It shows that power consumption isn't an issue that was recently made up to embarrass AMD; it's been part of the conversation about video cards for a long time, and prior to Maxwell, AMD had the upper hand as often as not. Here was a card that beat AMD's contemporary top-tier SKU by over 10%, yet reviewers were reluctant to recommend it for the same reason people are reluctant to recommend Hawaii today: it was too hot, too loud, and too power-hungry.
The GTX 480 (Fermi) could consume up to 320 W under a stress load. That's pretty bad, but not as bad as I thought I remembered. Gaming loads peaked at 257 W, which was about the rated TDP, so Fermi was roughly on par with Hawaii in terms of power consumption. The GTX 580 was a respin (GF100 -> GF110) on the same process, and it
dropped gaming power consumption to a 226 W peak. I'd be disappointed if the Hawaii respin didn't deliver at least that much power saving. It should be able to do even better, since the 480 was a cut-down part (480 shaders) while the 580 was fully enabled (512 shaders). A more apples-to-apples comparison is the GTX 480 versus the GTX 570, which has the same 480 shaders on the same GF1x0-series silicon; the 570 used only 190 W on gaming loads. If AMD could do that well with a respin, then Grenada wouldn't be too far behind GM204 in performance per watt...
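Just to put numbers on those respin savings, here's a quick back-of-the-envelope calculation using the gaming-load figures quoted above (the variable names are just labels for this sketch):

```python
# Peak gaming-load power draw quoted above, in watts
gtx_480 = 257  # cut-down GF100, 480 shaders
gtx_580 = 226  # full GF110, 512 shaders
gtx_570 = 190  # cut-down GF110, 480 shaders (apples-to-apples vs. the 480)

def pct_drop(before, after):
    """Percent reduction in power going from one card to another."""
    return 100.0 * (before - after) / before

# 480 -> 580: the respin alone, despite enabling more shaders
print(f"480 -> 580: {pct_drop(gtx_480, gtx_580):.0f}% less power")  # ~12%
# 480 -> 570: respin with the same shader count
print(f"480 -> 570: {pct_drop(gtx_480, gtx_570):.0f}% less power")  # ~26%
```

So a straight GF100-to-GF110 style respin bought roughly a quarter less gaming power at the same shader count; that's the scale of improvement a Hawaii-to-Grenada respin would need to close most of the gap to GM204.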