What you are saying would only be true if flagship cards had gone from using 250-275W of power down to just 150-200W. That's not what's happening. What you described is exactly what NV wants us to believe. They created a new marketing strategy whereby a videocard's worth is tied directly to its perf/watt, rather than its marginal utility (price/perf). This is a brilliant marketing strategy because once the consumer buys into it, NV can double prices but still sell you a GTX 560 Ti (980) and a GTX 580 (Titan X) under the veil of perf/watt. Think about it: in the past, did newer GPUs improve perf/watt over older ones? Absolutely, but AMD/NV hardly used it as a major selling point to get you to upgrade. Today, perf/watt is marketed as THE most important factor for upgrading. That would make sense if the Titan X or a 980 used significantly less power than a GTX 580 or 560 Ti, but they don't. So why, all of a sudden, are they attractive enough to warrant double the price of their historical lineage predecessors?
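To make the perf/watt vs. price/perf distinction concrete, here's a quick back-of-the-envelope sketch. All numbers below are made up purely for illustration (not real benchmarks or prices); the point is that perf/watt can "improve" while the dollars you pay per unit of performance actually get worse:

```python
# Toy illustration (all numbers hypothetical): a new flagship at the same
# board power but double the price can show a nice perf/watt gain while
# price/perf (marginal utility) goes backwards.

def perf_per_watt(perf, watts):
    """Relative performance per watt of board power."""
    return perf / watts

def price_per_perf(price, perf):
    """Dollars paid per unit of relative performance (lower is better)."""
    return price / perf

# Hypothetical "old flagship": perf index 100, 250 W TDP, $499 MSRP
# Hypothetical "new flagship": perf index 150, 250 W TDP, $999 MSRP
old = {"perf": 100, "watts": 250, "price": 499}
new = {"perf": 150, "watts": 250, "price": 999}

ppw_gain = perf_per_watt(new["perf"], new["watts"]) / perf_per_watt(old["perf"], old["watts"])
ppp_change = price_per_perf(new["price"], new["perf"]) / price_per_perf(old["price"], old["perf"])

print(f"perf/watt gain:   {ppw_gain:.2f}x")   # 1.50x - the headline marketing number
print(f"$ per unit perf:  {ppp_change:.2f}x")  # 1.34x - you pay MORE per frame than before
```

Both statements are true at once, which is exactly why the metric you're told to care about matters.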
If you ask a new PC gamer, they wouldn't know that the GTX 560 Ti was from the 680-980 lineage and cost $249, while the GTX 280/480/580 were from the Titan lineage and cost $499. The perf/watt marketing worked, but NV still sells 250W flagships; today they just ask double the price. I think AMD will still give us a flagship 250W card that is 40-50% faster than the R9 290X. AMD has remained the price/performance king since the HD 4850/4870 series, and I don't see the R9 300 series changing that trend. Since the R9 290X sells for $280-300 today, even if AMD releases a flagship at $550-650, that should be enough to cover the 500mm²+ die size and HBM and still make more $ than they are currently making on those 290 cards.
While I hate to go off-topic on financials, in this case they're relevant. Increased manufacturing costs of larger-die GPUs on smaller nodes in no way offset the major price increases certain GPUs have seen in the last 5 years. NV's FY2010 gross margin was 35.4%, and it increased every single year to 55.5% as of FY2015. Therefore, the theory that GPU makers can't afford to manufacture GPUs with more transistors at similar die sizes as in the past doesn't fly. I truly believe perf/watt is used as a marketing tactic to justify today's price increases because performance increases have slowed down (it took 3 years for the 780 Ti to double the 580). That means GPU makers cannot sell us on raw performance as easily anymore (just look at the 960 vs. 760 = total disaster).
How do you market something a consumer doesn't really need? You devise a strategy that makes their existing product seem vastly inferior in some way, so that they think it's outdated tech. Today, the easiest way to do this is perf/watt marketing. Even Intel is doing it. Intel will probably try to launch 35W Skylake CPUs that are nearly as fast as the i5-4690K/i7-4770K. The focus on perf/watt suddenly makes your perfectly fine Haswell CPU look outdated. I think the focus on perf/watt today is because it's the easiest metric to market and the easiest one to hype, since it shows the greatest improvement from one gen to another among all the metrics consumers actually understand. All of a sudden you can market a 35W CPU that's only slightly faster than a 65W one as twice as good!