The 300 series needs to be a miracle to change anything. But the hybrid cooling leaves an impression that it's the last whistle before the game is over, like with the 220W FX chips, in a world where performance/watt gets ever more important.
So many flaws with that statement.
1. We don't know for a fact that Bulldozer is the last CPU architecture from AMD. Implying that FX is the last whistle of AMD's CPU division, while disregarding Zen in Q3 2016, is ignorance at its finest. Then you proceed to draw a correlation between AMD's CPU and GPU divisions, as if designing GPUs and CPUs were somehow directly related... :hmm:
2. Implying that the R9 300 series' architecture will form the backbone of the R9 400 or 500 series is also a theory with no proof -- and that is what you are implying, because if you don't believe it, you can't say anything about the efficiency of the R9 400 or 500 series, since you have no idea what the new architecture underpinning them will be like.
Trying to extrapolate perf/watt from AMD's 28nm R9 300 series to 14nm/16nm designs is a waste of time unless we know for a fact that AMD's 14nm chips will share a similar GCN architecture with the R9 300 series. We don't have such data.
Unless you can see the future, I am pretty sure you can't make any inferences whatsoever about 14nm/16nm GPUs from AMD and NV with respect to each other.
"The roadmaps will include AMD’s upcoming products based on the next-generation x86 high-performance Zen core and ARMv8 64bit K12 core. In addition to an all new family of FinFET based GPUs code named Arctic Islands. These will feature the company’s most significant architectural evolution on the GPU front since the introduction of GCN (Graphics Core Next) back in late 2011."
http://wccftech.com/amd-future-gpu-cpu-roadmap/#ixzz3TSC8RWEf
3. 300W TDP + WC tells us nothing about whether perf/watt of the 390X will be horrible, as you have been implying for the last 6 months. In all of your posts you cannot grasp how AMD could possibly improve perf/watt on the same 28nm node, so your post is hardly a surprise. However, you have also failed to take into account that a 250W GPU and a 300W GPU will both get paired with a high-end i5/i7 system, which means the true measure of perf/watt a gamer experiences is total system perf/watt. Your constant focus on GPU perf/watt and total disregard for total system perf/watt is amusing, because no GPU operates independently of the other system components.
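To put some numbers on the total-system point: all figures below are made up for illustration (the 150W rest-of-system draw and the 60 fps result are assumptions, not benchmarks), but they show how a GPU-only efficiency gap shrinks once the whole box is counted.

```python
# Hypothetical numbers: a 250W card vs. a 300W card, both paired with
# the same high-end i5/i7 system drawing an assumed 150W on its own.

def system_perf_per_watt(fps, gpu_watts, rest_of_system_watts=150):
    """Frames per second per watt for the whole system, not just the GPU."""
    return fps / (gpu_watts + rest_of_system_watts)

FPS = 60  # assume both cards deliver the same frame rate in this game

gpu_only_a = FPS / 250                      # GPU-only perf/watt, 250W card
gpu_only_b = FPS / 300                      # GPU-only perf/watt, 300W card
system_a = system_perf_per_watt(FPS, 250)   # whole-system perf/watt
system_b = system_perf_per_watt(FPS, 300)

# GPU-only, the 250W card looks 20% more efficient (300/250 = 1.20);
# at the wall, the gap is only 12.5% (450/400 = 1.125).
print(f"GPU-only advantage:     {(gpu_only_a / gpu_only_b - 1) * 100:.1f}%")
print(f"Whole-system advantage: {(system_a / system_b - 1) * 100:.1f}%")
```

Same cards, same game; the "horrible" 50W difference is a much smaller slice of what the gamer's system actually pulls.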
4. The R9 300 series doesn't need to win in perf/watt to be a good product. Again, not everyone is a tree hugger. Many people would take a 300W, $500 R9 390X with 10% less performance over a 250W, $700 GM200.
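The value argument is easy to quantify. Using the same hypothetical figures from the post (10% less performance, $500 vs. $700), the slower card comes out well ahead in performance per dollar:

```python
# Hypothetical value comparison: GM200 = 100 performance points at $700,
# R9 390X = 90 points (10% less) at $500. Figures are illustrative only.
gm200_perf_per_dollar = 100 / 700   # ~0.143 points per dollar
r390x_perf_per_dollar = 90 / 500    #  0.180 points per dollar

# The 390X delivers ~26% more performance per dollar despite being slower.
advantage = r390x_perf_per_dollar / gm200_perf_per_dollar - 1
print(f"R9 390X perf/$ advantage: {advantage * 100:.0f}%")
```

Which metric matters (perf/watt, perf/dollar, or absolute performance) depends entirely on the buyer.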
Also, it's interesting how you keep hyping up perf/watt as some savior and a requirement for GPUs yet when AMD owned NV for 3.5 consecutive generations in perf/watt with HD4000-7000 series, they hardly gained market share or made $. Therefore, it can be concluded with 100% certainty that even if AMD were to have class leading perf/watt with R9 300 series, it absolutely does not guarantee market share or financial success either. There are other factors that matter such as marketing, OEM/customer relationships, strong supply chain/manufacturing/logistics, PR, features, timing in getting design wins, etc. that can supersede perf/watt.
Look at the NV Shield console. It will hands down have the best perf/watt among the Wii U, PS4 and XB1, but it is going to bomb in sales compared to any of them. Perf/watt is not some magical metric that guarantees success. NV is successful not only because of perf/watt but because of many other factors.