Power consumption is part of TCO. It doesn't take many years of use before a 100W delta in power draw wipes out the price advantage behind a TCO-justified pro-AMD position.
Where I live I pay $0.13/kWh, and my house is air-conditioned. So every extra watt of power dissipated as heat into the air inside my house is another watt of heat the heat pump has to remove, and that removal is roughly a 1W-for-1W deal.
So the extra 100W my 8350 draws at the wall translates into roughly an extra 200W that I am paying for on my utility bill.
Now if I conservatively estimate my full-load usage at a mere 8 hours per day (my apps of interest actually run 24/7 at full load), then the TCO for my 8350 rig has to account for an additional ~$75/year on the power bill.
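That $75 figure is just back-of-the-envelope arithmetic; a quick sketch of it (assuming 365 days/year at my $0.13/kWh rate) looks like this:

    # rough annual power-bill impact of a 100W CPU delta, doubled for AC overhead
    delta_watts = 200          # 100W at the wall plus ~100W of AC load to remove the heat
    hours_per_day = 8
    rate_per_kwh = 0.13
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365  # = 584 kWh
    cost_per_year = kwh_per_year * rate_per_kwh              # ~= $75.92
    print(round(cost_per_year, 2))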
Or I could buy a 3770K (as I did), lower my power footprint by 200W (counting the reduced AC overhead), and get higher performance to boot, in exchange for paying an extra $130 up front. The TCO here is heavily in favor of the 3770K if I plan to use the computer for two years or more.
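To put a number on that break-even point (treating the $130 price difference and the roughly $75/year power savings above as the only variables):

    # years of use before the 3770K's higher purchase price pays for itself
    price_delta = 130.0        # extra up-front cost of the 3770K over the 8350
    yearly_savings = 75.92     # power-bill savings from the calculation above
    breakeven_years = price_delta / yearly_savings  # ~= 1.7 years
    print(round(breakeven_years, 1))

So at roughly 1.7 years of this usage pattern the cheaper chip stops being cheaper, and everything after that is pure savings for the 3770K.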
The only way to make a TCO argument that actually favors the less expensive but more power-hungry AMD processors is to craft a usage scenario dominated by long idle periods where the computer is essentially going unused.
And if that is the expected usage scenario, buying a 4GHz 8-core machine only to have it sit idle 95% of the day, then you should really be questioning the need for the purchase in the first place, because at that point your TCO would be far better served by something used off Craigslist.