The GPU in the PS4 must be clocked pretty low then, because an R9 270 with 1280 stream processors @ 900 MHz is 150 watts TDP by itself --->
http://www.anandtech.com/show/7503/the-amd-radeon-r9-270x-270-review-feat-asus-his
The PS4's iGPU is clocked at 800 MHz.
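To put those SP counts and clocks in perspective, here's a rough back-of-envelope comparison using the usual 2 FLOPs (one fused multiply-add) per stream processor per clock; the PS4's 1152-SP count is the commonly reported figure:

```python
# Back-of-envelope peak FP32 throughput: SPs * 2 FLOPs/clock * clock.
def gflops(stream_processors, clock_mhz):
    """Peak single-precision GFLOPS, assuming 2 FLOPs (one FMA) per SP per cycle."""
    return stream_processors * 2 * clock_mhz / 1000

print(gflops(1280, 900))  # R9 270: 1280 SPs @ 900 MHz -> 2304.0 GFLOPS
print(gflops(1152, 800))  # PS4 GPU: 1152 SPs @ 800 MHz -> 1843.2 GFLOPS
```

So the PS4's part sits well under the R9 270's throughput (and TDP) despite the similar architecture, which is consistent with the lower clock.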
If AMD is using 512 stream processors and paying the premium for that via a larger die that reduces yields (as well as increasing edge-of-wafer loss), then I would prefer to see desktop iGPU clocks of at least 1000 MHz (and ideally higher) to maximize the investment.
So I am not sure AMD is constrained by die area (in the current 95 watt OEM situation), but rather by thermals and bandwidth (for gaming).
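The thermals argument can be sketched with the usual dynamic power rule of thumb, P ~ C·V²·f: a clock bump from 800 to 1000 MHz costs disproportionately more power if the voltage has to rise with it. The voltages below are purely hypothetical numbers for illustration, not Kaveri's actual VID:

```python
def relative_power(f_new, f_old, v_new, v_old):
    """Dynamic power scales roughly with frequency * voltage squared (P ~ C*V^2*f)."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Hypothetical: 800 -> 1000 MHz with a voltage bump from 1.05 V to 1.15 V
print(round(relative_power(1000, 800, 1.15, 1.05), 2))  # -> 1.5 (~50% more dynamic power)
```

A 25% clock increase turning into ~50% more dynamic power is why higher iGPU clocks eat the TDP budget so quickly.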
As you yourself said, they are already paying a premium from having a large die that reduces yields, etc. Increasing die size further reduces yields, unless they go MCM . . . and I'm not sure Kaveri or Godavari support that. Bandwidth is another problem, naturally. But they are not constrained by thermals or power delivery, nor will they be on FM3.
Yes, they could increase TDP.
But I have been wondering if there is a cheaper, better way of doing this than the rumored Bristol Ridge reported here.
There might be, who knows? AMD has already cast its lot. They don't have the budgetary discretion to be "agile" and make too many changes to their plans. It looks like their lineup between now and Q3 2016 (or so) is set in stone, and that had better be good enough.
Certainly, I can see 4GB DDR4-3200 DIMMs being relatively affordable* (so 2 x 4GB DDR4-3200 @ 51.2 GB/s is realistic), but I wonder if going much higher than that will require 8GB DIMMs.
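For anyone wondering where the 51.2 GB/s comes from: it's just the transfer rate times the 64-bit channel width times the channel count. A quick sketch:

```python
def ddr_bandwidth_gbs(mt_per_s, bus_bits=64, channels=2):
    """Peak DRAM bandwidth in GB/s: transfer rate (MT/s) * bus width (bytes) * channels."""
    return mt_per_s * (bus_bits / 8) * channels / 1000

print(ddr_bandwidth_gbs(3200))  # dual-channel DDR4-3200 -> 51.2 GB/s
print(ddr_bandwidth_gbs(3400))  # dual-channel DDR4-3400 -> 54.4 GB/s
```

Note that's peak; real-world effective bandwidth is lower, which only makes the iGPU's situation tighter.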
That's going to be up to the market to decide what DIMM sizes are supported within any given market segment. Today, it's nearly impossible to get new (or even recent) 2x2GB kits in anything like DDR3-2400, since the market basically demanded 8GB as the new minimum memory configuration for a wide variety of systems. Are people between now and then going to start demanding 16GB memory capacity as a minimum in the enthusiast/DIY sector? How about OEM buyers? Will they?
So we may yet see DDR4-3400 (or higher) in 2x4GB kits. Personally, I think anyone who goes single-channel on a dual-channel (or higher) capable system is insane, but that's just me. Don't skimp on the memory, people!
Of course, if Bristol Ridge APU users need 2 x 8GB kits to get the bandwidth to fully utilize a fast-clocked 512 SP iGPU, that is going to throw a serious monkey wrench into gaming cost-effectiveness. (In contrast, an R7 250X with 640 SP at 1000 MHz has 72 GB/s of dedicated bandwidth.)
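The 72 GB/s figure checks out from the 250X's specs (128-bit GDDR5 at 4.5 GT/s effective), and the same arithmetic shows how far behind a dual-channel DDR4 APU sits:

```python
def mem_bandwidth_gbs(effective_mt_per_s, bus_bits):
    """Peak memory bandwidth in GB/s: effective transfer rate * bus width in bytes."""
    return effective_mt_per_s * (bus_bits / 8) / 1000

print(mem_bandwidth_gbs(4500, 128))  # R7 250X: 128-bit GDDR5 @ 4.5 GT/s -> 72.0 GB/s
print(mem_bandwidth_gbs(3200, 128))  # dual-channel (2x64-bit) DDR4-3200 -> 51.2 GB/s
```

And the dGPU gets its 72 GB/s all to itself, while the APU's iGPU has to share that 51.2 GB/s with the CPU cores.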
One major point that you and many other APU downers seem to be forgetting is that the iGPU/dGPU ecosystem is moving towards cooperation rather than exclusivity. Many APU buyers will also buy a dGPU and switch the iGPU towards compute functions thanks to DX12 (or Mantle, if it ever becomes more widely-accepted). It won't be dead silicon for games when you throw in a 290x or whatever. Offloading physics to the iGPU is just the beginning.