I very much disagree with this analysis. Consoles have fallen behind PCs because the power budgets for top-end PCs have gone up VASTLY in the past few years. A top-end rig with a pair of R9 290Xs in Crossfire will consume over 700W!
Whereas a launch Xbox 360, notorious for running so hot that it melted its own solder, consumes a mere 180W, barely a quarter of that Crossfire rig's draw:
Microsoft and Sony both learnt their lessons from the RROD disaster: there is a hard limit to the amount of power you can let your console draw before serious reliability issues kick in.
Graphics card manufacturers just keep pushing power consumption higher and higher, going to ever more extreme lengths and requiring ever more elaborate cooling systems. Do you remember the days when a GPU came with a tiny cooler that covered just the chip itself, with a little fan on it? When the 360 launched, a TOP-END graphics card looked like this:
Today's equivalent, in power draw terms, would be the HD 7750! Yet people on this forum would rather have you compare it to something like this:
When you compare GPU performance-per-watt increases to console performance-per-watt increases, the XBone and PS4 look fine. It's just the insane state of the top-end GPU market that makes them look bad.
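To make that comparison concrete, here's a minimal back-of-the-envelope sketch in Python. The TFLOPS figures are approximate public specs, and the wattages (especially the PS4/Xbox One wall draws while gaming) are assumptions on my part, so treat the output as illustrative rather than authoritative:

```python
# Rough perf/W comparison. All figures are approximate public
# specs or assumed wall-draw numbers, used purely for illustration.

systems = {
    # name: (peak GPU TFLOPS, approximate power draw in watts)
    "Xbox 360 (2005)": (0.24, 180),  # Xenos GPU, launch-unit draw
    "PS4 (2013)":      (1.84, 140),  # assumed wall draw while gaming
    "Xbox One (2013)": (1.31, 120),  # assumed wall draw while gaming
    "R9 290X (2013)":  (5.60, 290),  # card TDP only, excludes rest of PC
}

for name, (tflops, watts) in systems.items():
    gflops_per_watt = 1000 * tflops / watts
    print(f"{name:16s} {tflops:5.2f} TFLOPS / {watts:3d} W "
          f"= {gflops_per_watt:5.1f} GFLOPS/W")
```

On those assumed numbers, the new consoles improve perf/W roughly tenfold over the launch 360 while holding or even shrinking the absolute power budget. The 290X is efficient per watt too, but only because its absolute budget has been allowed to balloon, which is exactly the point.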