It's called R&D: die shrinks improve efficiency, render pipelines get better, and optimizations of current techniques are implemented or moved to dedicated hardware.
That's where the improvements mentioned are coming from.
You seem to think that if every game were more demanding, somehow they would have 10x the R&D budgets for the GPUs, and then the same for the processes they're being built on, and that even if they had that, they'd get 10x the improvements, too.
We're nowhere near the power envelope here. If anything, we're being held back by the 300W TDP that the PCI-E spec has in place; this has gimped ALL modern dual-GPU video cards. These are all fixable issues.
If we're stuck at 300W/card, it's not gimping, and they aren't fixable issues. Electricity isn't free either, and powerful PSUs are more expensive, as are multiple cards. The idea that we could just decide to ignore power issues is total BS.
We're not being held back in a reasonably fixable way by the 300W/card TDP. Even 300W is insane; 300W should be a whole gaming PC's peak power draw. You're in a fantasy land if you think a paper spec of 300W is holding us back. The number of cards that would be sold at >300W would be in the thousands, vs. millions to tens of millions for cards with decent thermal and noise characteristics. Most people don't want the computer in another room, or to need IEMs, a dedicated A/C unit, or a second job (depending on country), just to use their PCs.
It's holding us back in the same way most cars not having turbo 5L+ V8s is holding us back: sensible people don't want any of it.
First of all, a freeze on scene detail isn't required.
To rapidly increase resolution without rapidly dropping framerates, yes, it is. Otherwise, it won't be as playable as the lower resolution. Increasing detail will only increase the computational performance needed beyond what the higher resolution alone demands.
How are you figuring that we'll have some amazing disruptive surge in capability? The last 4x jump in pixel count (800x600 to 1080p) took about 10 years, and the rate hasn't been accelerating.
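As a quick back-of-envelope check on that 4x figure (the resolutions are just the ones mentioned above; this is purely illustrative arithmetic):

```python
# Pixel-count ratio between 800x600 and 1080p (1920x1080).
old_pixels = 800 * 600        # 480,000 pixels
new_pixels = 1920 * 1080      # 2,073,600 pixels

ratio = new_pixels / old_pixels
print(f"{ratio:.2f}x more pixels")  # prints "4.32x more pixels"
```

So even that decade-long jump works out to roughly 4.3x the pixels, which is the kind of increase the rendering workload has to absorb before any extra scene detail is added.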
That said, there most definitely IS a freeze on scene complexity. Any game targeted at the consoles looks fundamentally as bad today as it did 7 years ago; the PRIME example is the CoD MW series. They all look exactly the same, which is to say, for games in 2012, they all look really shit. There are a few very minor improvements year to year, but because they're aimed at the console platforms, there is no yearly progression in detail. They simply cannot do it; the console hardware is static.
They look bad because they're Madden for twitchies. They were never made to look good. BF is also more or less Madden for twitchies, but made to look good, and stress modern PC hardware. Skyrim and DE:HR would be good counterexamples, IMO.
All markets are powered by money, and the fight against physics takes cash, bottom line. That needs a healthy market with good demand, and demand to upgrade isn't high right now. Why would the average Joe with a 1080p monitor or less upgrade his midrange video card when it can run basically most games maxed out with all the bells and whistles, and has done so for several generations?
Because that's not the case. There are games right now that take a serious video card to look good at 1080p; you haven't been getting perfect play for several generations. A midrange card today can only barely manage decent 1080p.