Thanks for the effort, OP. I didn't go into detail on the accuracy of your calculations, but here is a much quicker assessment:
A mid-range videocard = 150W
A high-end videocard = 250W
Difference of 100W
100W x 6 hours a day at 100% load x 365 days a year = 219 kWh per year
219 kWh x $0.15/kWh = $32.85 per annum, or less than $3 a month.
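If you want to plug in your own numbers, here is a minimal sketch of that same back-of-the-envelope math (the 100W delta, 6 hours a day, and $0.15/kWh are just the assumptions above, not anything measured):

```python
# Rough yearly/monthly cost of the load power difference between two videocards.
# All inputs are assumptions from the post above - swap in your own values.
watt_delta = 100        # W difference at 100% load (250 W - 150 W)
hours_per_day = 6       # hours of gaming per day
rate_per_kwh = 0.15     # electricity price in $/kWh

kwh_per_year = watt_delta / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh

print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year "
      f"(about ${cost_per_year / 12:.2f}/month)")
# 219 kWh/year -> $32.85/year (about $2.74/month)
```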
Ok, what about idle power consumption? At idle, most modern GPUs are within 15-20W of each other (as has been noted in this thread), which is a smaller gap than the idle differences between the CPUs a high-end videocard owner would likely pair with a 250W GPU: a Core i7 920-950, Core i5 750, 2500K, or Phenom II X6 1100T.
While your spreadsheet is useful in theory, it calculates efficiency per FPS, which isn't really relevant in the real world imo. If a person can afford a $350 HD6970, for example, it makes little difference to them how much more efficient the HD5770 is, since they won't buy a $100 videocard in the first place. Likewise, if you don't want to deal with CF/SLI game profiles, there is no way you'll purchase an HD6850 CF setup, regardless of how efficient it is. And while the GTX480 is rated #15 for efficiency vs. #11 for the HD5850, your chart implies the GTX480 isn't that bad for efficiency - yet the GTX480 cost almost double what the HD5850 cost on release day. Frankly, the HD6970 is ranked worse, but the GTX480 is so much louder that its efficiency numbers are largely irrelevant in that comparison. I only bring up these examples to explain why efficiency in terms of cost is generally weighed after other, more important factors.
Similarly, someone who can afford a $500 GTX580 or a $500 HD6950 CF setup shouldn't be trying to save $5 a month on electricity (after all, gaming is a very cheap hobby compared to other hobbies in life). If you do care that much about the electricity costs of a GPU, you really need to get your priorities straight (and I mean that in the nicest way possible). The irony here is that most of us run overclocked GPUs, which by themselves easily gain 150W+ in power consumption at load. So it's somewhat hypocritical for most of us to complain about GPU power consumption while running Phenom II X4/X6 and i5/i7 processors overclocked to the moon.
For instance, if you have time to game 6 hours a day (completely unrealistic), it's about time to think about getting a real job. And if you can afford to game 6 hours a day (say you work from home or have business ventures that provide you with sustainable cash flow), then you are probably also making enough money not to care about electricity costs. I realize you can control your electricity costs by buying more efficient videocards.
But who runs their GPU 6 hours a day x 365 days a year to play videogames? A student, maybe. In that case, your undergrad tuition and book costs are so high that the last thing in the world you care about is electricity costs.
All in all, let's take a look at what typical household appliances use:
1) Clothes Dryer - 2,790 W (if you have a family, you are using this very often, for 40 min at a time if not more, every week).
2) Coffee Maker - 1,200 W / or Kettle (probably using this 5-6x a week)
3) Oven - 12,000 W (if not, you are eating out, which costs more than cooking at home if you are eating healthy food)
4) Microwave Oven - 1,200 W+
5) Toaster - 1,100 W
6) Water Heater - 2,500 W
7) Television - 200-500 W
Let's not even pretend your GF, mother, or wife doesn't use a 1000 W hair dryer on a weekly basis! What about a hair straightener? Think about that - 15 min with a blow-dryer x 5 workdays a week @ 1000 W. Go ahead and tell your wife to save on electricity by drying her hair for 1 hour on the balcony - see how that will fly.
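To put rough numbers on the appliance comparison, here is a sketch using the wattages from the list above; the weekly usage hours are my own loose guesses, so treat it as an illustration, not a measurement:

```python
# Weekly energy of common household appliances vs. the 100 W videocard difference.
# Wattages are from the list above; weekly usage hours are rough guesses.
appliances = {
    "clothes dryer": (2790, 2.0),    # ~3 loads x 40 min
    "coffee maker":  (1200, 1.0),
    "oven":          (12000, 1.0),
    "microwave":     (1200, 1.5),
    "toaster":       (1100, 0.5),
    "water heater":  (2500, 10.0),
    "television":    (300, 20.0),
    "hair dryer":    (1000, 1.25),   # 15 min x 5 days
}

total_kwh = sum(watts / 1000 * hours for watts, hours in appliances.values())
gpu_delta_kwh = 100 / 1000 * 6 * 7   # 100 W extra, 6 h/day, 7 days/week

print(f"appliances: ~{total_kwh:.0f} kWh/week, "
      f"GPU difference: ~{gpu_delta_kwh:.1f} kWh/week")
```

Even with these loose guesses, the extra 100 W at the GPU ends up being a small slice of the weekly household total.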
Basically, the amount of electricity your videocard consumes is peanuts compared to the sum of all the other appliances you probably use on a daily basis. And frankly, if $10-20 a month makes a huge difference, I strongly suggest getting a higher paying job before sitting there trying to count 50 cents a day saved on electricity. I never understood the importance of "1 penny saved is 1 penny earned". I believe you get rich from capital gains and cash-flow-generating assets (whether it's your human capital - i.e., earning a higher salary at a job - or a physical asset that generates cash flow, such as a rental property). Saving does not equal new cash flow and therefore contributes little to wealth. In other words, instead of trying to save $0.50 a day, try to get a job that pays $30-40 more a day. But that's just my thinking.
Honestly, you are better off drying your clothes outside or on a rack than using your dryer, or taking your bike/rollerblades to a convenience store 10 min away instead of driving. All of these changes will bring far greater benefits than trying to save money on energy costs from GPUs.
Should we also stop using cell phones (smartphones usually require charging every day) and switch to a landline to save costs from charging our wireless devices?
In conclusion, I would say the greatest benefits of GPU efficiency to me are reduced heat and noise levels, as has been mentioned by many posters here. :thumbsup: