Efficiency and its importance in gaming - updated GPU cost effectiveness list


pandemonium

Golden Member
Mar 17, 2011
I like the data you're collecting and the charts you're making.

I disagree with the whole harping on the energy bit. I'm as much for conserving as the next guy, but the cost savings due to hardware changes and usage patterns are insignificant compared to the amounts squandered by typical American activities including alcohol (after work happy hour, friday and saturday night festivities, and that stash you keep at home), services not actually used (like cable tv and unlimited data wireless plans), transportation (gas, tolls), excessive social entertainment (movies, restaurants, outdoor team sports/hobbies), and impulse shopping.

Well, since you opened that can of worms, no, I do not agree with squandering money on anything in life, regardless of hobby. In some things, though, you get what you pay for. Such is not the case with graphics cards (as seen here).

But you may have missed where your SLI 580s sit on those charts for efficiency, even compared to other setups that can produce 60+ FPS.
 

darckhart

Senior member
Jul 6, 2004
But you may have missed where your SLI 580s sit on those charts for efficiency, even compared to other setups that can produce 60+ FPS.

Unfortunately, it's easy to make charts say whatever we want. In the restricted variables you compare, the results come out as they do. (As for my cards, I use them for more than gaming. But even in gaming, they shouldn't be on your list at the paltry 1920 res.) Regardless, I was only arguing what I feel is a misplaced emphasis on power costs.
 

pandemonium

Golden Member
Mar 17, 2011
Unfortunately, it's easy to make charts say whatever we want. In the restricted variables you compare, the results come out as they do. (As for my cards, I use them for more than gaming. But even in gaming, they shouldn't be on your list at the paltry 1920 res.) Regardless, I was only arguing what I feel is a misplaced emphasis on power costs.

Restricted variables? It's FPS/cost. How restricted is that? That's baseline information not even considering heat generation. Cards that don't sit well on these charts will sit worse if you factor in heat generation as well.

While 1920x1200 isn't the highest resolution for comparing benchmarks, it sits in the middle of the range and is about average for what most people use. What resolution do you play at? What is your desktop resolution in your OS?
 

pandemonium

Golden Member
Mar 17, 2011
Ok, all spreadsheets have been updated, with each card ranked by each kWh cost and upgrade frequency. Sorry it took so long. Don't you hate it when, halfway through something tedious, you figure out a much less redundant way to do it? >.<
 

SolMiester

Diamond Member
Dec 19, 2004
Yet you are using a 9600GT? Surely some other considerations entered into that purchase.

That's a budget constraint, and it hasn't been upgraded along with the SB upgrade.
Power efficiency doesn't even factor unless it means a new PSU!
 

Lonbjerg

Diamond Member
Dec 6, 2009
The poll doesn't suit the thread.
How often you upgrade has nothing to do with the topic of the thread.

Where is the poll option to choose performance over "efficiency"?
 

RussianSensation

Elite Member
Sep 5, 2003
Thanks for the effort, OP. I didn't go into detail as to the accuracy of your calculations.

But here is a much quicker assessment:

A mid-range videocard = 150W
A high-end videocard = 250W
Difference of 100W

100W x 365 days a year x 6 hours a day at 100% load x $0.15/kWh = $32.85 per annum, or less than $3 a month.

Ok what about idle power consumption? At idle, most modern GPUs are within 15-20W of each other (as has been noted in this thread), far less than the difference between modern processors that a high-end GPU gamer would have: i.e., Core i7 920-950, Core i5 750, 2500k or a Phenom II X6 1100T at idle (likely the type of CPUs that a high-end videocard owner would pair with a 250W GPU).

While your spreadsheet is useful in theory, it calculates efficiency per FPS. This is not really relevant in the real world imo. If a person can afford a $350 HD6970, for example, it makes little difference to them how much more efficient an HD5770 is, since they won't buy a $100 videocard in the first place. Also, if you don't want to deal with CF/SLI game profiles, there is no way you'll purchase an HD6850 CF setup, regardless of how efficient it is. Also, while a GTX480 is rated #15 for efficiency vs. #11 for the HD5850, your chart implies that the GTX480 isn't that bad for efficiency. But the GTX480 cost almost double what the HD5850 cost on release date. Frankly, the HD6970 is ranked worse, but the GTX480 is so much louder that its efficiency parameters are largely irrelevant in that comparison. I only give these examples to explain why efficiency in terms of cost is generally not considered before other, more important factors.

Similarly, someone who can afford a $500 GTX580 or a $500 HD6950 CF setup shouldn't be trying to save $5 on electricity costs a month (after all gaming is a very cheap hobby compared to other hobbies in life). If you do care so much about electricity costs of a GPU, you really need to get your priorities straight (and I mean that in the nicest way possible). The irony here is that most of us run overclocked GPUs which by themselves easily gain 150W+ in power consumption at load. So it's somewhat hypocritical for most of us to complain about GPU power consumption while running Phenom II X4/X6 and i5/i7 processors overclocked to the moon.

For instance, if you have time to game 6 hours a day (completely unrealistic), it's about time to think about getting a real job. And if you can afford to game 6 hours a day (i.e., say you work from home or have business ventures which provide you with sustainable cash flow), then you are also probably making a decent amount of $ to not care about electricity costs. I realize, you can control your electricity costs by buying more efficient videocards. But who runs their GPU for 6 hours a day x 365 days a year to play videogames? A student maybe. In that case, your undergrad tuition and book costs are so high, the last thing in the world you care about are electricity costs.

All in all, let's take a look at what typical household appliances use:

1) Clothes Dryer - 2,790 W (if you have a family, you are using this very often, for 40 min at a time if not more, each week).
2) Coffee Maker - 1,200 W / or Kettle (probably using this 5-6x a week)
3) Oven - 12,000 W (if not, you are eating out, which costs more than cooking at home if you are eating healthy food)
4) Microwave Oven - 1,200 W+
5) Toaster - 1,100 W
6) Water Heater - 2,500 W
7) Television - 200-500 W

Let's not even pretend your GF, mother, wife doesn't use a 1000 W hair dryer on a weekly basis! What about a hair straightener? Think about that - 15 min of a blow-dryer x 5 workdays a week @ 1000W. Go ahead and tell your wife to save on electricity by drying her hair for 1 hour on the balcony - see how that will fly.

Basically the amount of electricity your videocard consumes is peanuts compared to the sum of all the other appliances you probably use on a daily basis. And frankly, if $10-20 a month makes a huge difference, I strongly suggest and encourage getting a higher-paying job before sitting there and trying to count 50 cents a day saved on electricity. I never understood the importance of "1 penny saved is 1 penny earned". I believe that you get rich from capital gains and cash-flow-generating assets (whether it's your human capital asset, i.e., earning a higher salary at a job, or a physical asset that generates cash flow, such as a rental property). Savings do not equal new cash flow and therefore contribute little to wealth. In other words, instead of trying to save $0.50 a day, try to get a job that pays $30-40 more a day. But that's just my thinking.

Honestly, you are better off drying your clothes outside or on a rack than using your dryer, or taking your bike/rollerblades to a convenience store 10 min away instead of driving. All of these changes in your life will bring far greater benefits than trying to save money on energy costs from GPUs.

Should we also stop using cell phones (smartphones usually require charging every day) and switch to a landline to save costs from charging our wireless devices?

In conclusion, I would say the greatest importance of GPU efficiency to me are reduced heat and noise levels, as has been mentioned by many posters here. :thumbsup:
 

pandemonium

Golden Member
Mar 17, 2011
The poll doesn't suit the thread.
How often you upgrade has nothing to do with the topic of the thread.

Where is the poll option to choose performance over "efficiency"?

Already covered in earlier posts. It was for me to determine what to display on the spreadsheets and also out of simple curiosity.

Thanks for the effort, OP. I didn't go into detail as to the accuracy of your calculations.

But here is a much quicker assessment:

A mid-range videocard = 150W
A high-end videocard = 250W
Difference of 100W

100W x 365 days a year x 6 hours a day at 100% load x $0.15/kWh = $32.85 per annum or less than $3 a month.

Ok what about idle power consumption? At idle, most modern GPUs are within 15-20W of each other (as has been noted in this thread), far less than the difference between modern processors that a high-end GPU gamer would have: i.e., Core i7 920-950, Core i5 750, 2500k or a Phenom II X6 1100T at idle (likely the type of CPUs that a high-end videocard owner would pair with a 250W GPU).

I appreciate your attempt at the comparisons, but unfortunately they don't provide the necessary baseline that FPS/cost produces. Hence the purpose of the spreadsheets. Also, take note that how much you pay for your energy and how long you own that GPU will make a larger difference in its effectiveness.

I will be exploring CPUs eventually on these spreadsheets as well.

While your spreadsheet is useful in theory, it calculates efficiency per FPS. This is not really relevant in the real world imo. If a person can afford a $350 HD6970, for example, it makes little difference to them how much more efficient HD5770 is since they won't buy a $100 videocard in the first place.

It's not all that it calculates, but it is the driving factor for the ranking. It's a matter of budget versus investment. Most people forget how investment works: while you may spend less money on a device that uses more energy, in the long run you'll spend more money just using it. While I agree most people probably wouldn't think twice about saving a little money over the long term, this is to get them to start considering it. Can you tell me that if you saw a similarly performing card for $20 cheaper, you wouldn't buy that one instead, simply because it's cheaper? What if it was only cheaper initially, then more expensive later on due to operational expense? This is the essence of the spreadsheets.

Also, if you don't want to deal with CF/SLI game profiles, there is no way you'll purchase an HD6850 CF setup, regardless how efficient it is.

That is also the point of breaking the spreadsheet down by FPS/cost: to show you objectively how efficient each setup is compared to the others. No bias. If you could run a pair of cards that would produce more FPS, cost less initially, and cost less overall instead of a 'single more powerful card', wouldn't you at least consider it?

Also, while a GTX480 is rated #15 for efficiency vs. #11 for the HD5850, your chart implies that the GTX480 isn't that bad for efficiency. But the GTX480 cost almost double what the HD5850 cost on release date. Frankly, the HD6970 is ranked worse, but the GTX480 is so much louder, its efficiency parameters are largely irrelevant in that comparison. I only provide these examples to provide an explanation why efficiency in terms of costs is generally not considered before other more important factors.

This isn't about release date. This is about NOW. Prices change for a reason, as companies realize where their cards sit and have to adjust: either milk the market by increasing the price or attract more sales by lowering it. I'm confused as to how the initial market release cost has any weight on the subject.

The spreadsheets show exactly where each card sits by its performance marks. You can spend countless hours reading reviews and trying to weigh how well cards do, but there isn't a way to get a true baseline without factoring in FPS/cost and then considering how hot the card runs and also how loud it is (if those attributes matter to you).

*****
I'm considering how to weigh noise and heat generation into these numbers to provide a more "realistic" approach to the ranking system, but that would make it more subjective (depending on how important those attributes are to you). This is also why I provided the dB(A) and running temperatures on the charts, so one can weigh this information directly against FPS/cost. The point of the ranking system is to show where each card lines up against the others in terms of general performance marks. It's obvious a very powerful setup, GTX580 SLI for example, will be poor in efficiency, but it also produces the highest FPS. You must then consider that, if you don't mind losing a few FPS from 111.5 (still well above 60 FPS, which many consider optimal for gaming), you could run several other setups which will be much cheaper. To put it appropriately, <1% of the gaming population needs 100+ FPS to play their games comfortably.
*****

If you do care so much about electricity costs of a GPU, you really need to get your priorities straight (and I mean that in the nicest way possible). The irony here is that most of us run overclocked GPUs which by themselves easily gain 150W+ in power consumption at load. So it's somewhat hypocritical for most of us to complain about GPU power consumption while running Phenom II X4/X6 and i5/i7 processors overclocked to the moon.

I care about how much money I'm spending on everything in my life and the return on investment compared to what else is out there. Being a better consumer is what I strive for. I refuse to mindlessly run around buying whatever just to suit my needs and fit the stereotypes of the power-hungry age we live in. I live within reason and balance everything in my life. And no, I don't overclock much (<5%). Extremism is something that has been adopted far too easily in the technology age, and I refuse to accept it.

For instance, if you have time to game 6 hours a day (completely unrealistic), it's about time to think about getting a real job. And if you can afford to game 6 hours a day (i.e., say you work from home or have business ventures which provide you with sustainable cash flow), then you are also probably making a decent amount of $ to not care about electricity costs. I realize, you can control your electricity costs by buying more efficient videocards. But who runs their GPU for 6 hours a day x 365 days a year to play videogames? A student maybe. In that case, your undergrad tuition and book costs are so high, the last thing in the world you care about are electricity costs.

I'm commuting to or at work 48+ hours a week. I game 0-6 hours a day depending on the day and what else I have to do. Six hours is not unrealistic; perhaps it is to you and your lifestyle. Some people game 12 hours a day. Some people game 30 minutes. Six hours of game time and 2 hours of idle time is a happy medium I picked to give an example. You can tit-for-tat all you want and criticize other people's lifestyles, but that won't get us anywhere, nor will it change the facts of how each card compares to the others. I could've just as easily chosen 5 minutes or 24 hours a day for these numbers and you'd still see the differences. The numbers would just be that much more extreme in what they represent.

Basically the amount of electricity your videocard consumes is peanuts compared to the sum of all the other appliances you probably use on a daily basis. And frankly if $10-20 a month makes a huge difference, I strongly suggest and encourage getting a higher paying job before sitting there and trying to count 50 cents a day saved on electricity.

Why is trying to be a more conscientious consumer in everything you do, including the hardware you employ, a bad thing? Getting a higher-paying job isn't as easy as you make it out to be.

I never understood the importance of" 1 penny saved is 1 penny earned". I believe that you get rich from capital gains and cash flow generating assets (whether its your human capital asset - i.e, earning a higher salary at a job, or your physical asset that generates cash flow - such as a rental property). Savings does not equal new cash flow and therefore contributes little to wealth. In other words, instead of trying to save $0.50 a day, try to get a job that pays $30-40 more a day. But that's just my thinking.

It appears that you're still learning the importance of "waste not, want not". This is not about getting rich. It's about living effectively within your means and knowing how hard your money is working for you.

In conclusion, I would say the greatest importance of GPU efficiency to me are reduced heat and noise levels, as has been mentioned by many posters here. :thumbsup:

I do agree that heat and noise are pretty important as well. My problem is: how do I incorporate that data objectively?

Finally, I would like to point out that if you find value in performance reviews, you'll also probably find value in baseline statistics; if not now, then eventually. These charts aren't for everyone, but it is my hope that more people will realize the value of efficiency and how it can affect your life - even in your gaming.
 

notty22

Diamond Member
Jan 1, 2010
Unfortunately, it's easy to make charts say whatever we want. In the restricted variables you compare, the results come out as they do. (As for my cards, I use them for more than gaming. But even in gaming, they shouldn't be on your list at the paltry 1920 res.) Regardless, I was only arguing what I feel is a misplaced emphasis on power costs.
Agreed.
 

RussianSensation

Elite Member
Sep 5, 2003
Six hours is not unrealistic.

Even if I were to agree with this statement, I already provided you with a real world comparison based on your 6 hours a day assumption. I said the difference between a mid-range card and a high-end card will be roughly 100W (and even less when comparing AMD cards). I didn't check your calculations for accuracy but I think you are making it seem worse than it is.

I'll even go into more detail:

HD5750 (low-end) = 54W
HD6850 (mid-range) = 96W
HD6950 2GB (high-end) = 153W
GTX580 (very high-end) = 229W
Source: Crysis 2 (peak)

The difference between low-end and high-end is only ~ 100W
The difference between mid-range and high-end is ~60W.

People who buy a GTX580 for $440 don't care about another 60W of power consumption from a cost perspective over the $250 HD6950 2GB.

So again, you are looking at a 50-100W power difference for most users. Compare an HD6850 (mid-range) to a GTX580 (very high-end) as our worst case scenario for single GPUs (there is no point in looking at the low-end since those GPUs aren't good enough for modern games). We get 133W x 6 hours per day x 365 days.

At 15 cents / kWh = $43.69
At 25 cents / kWh = $72.82

But how many people are using a $440 GTX580? They can probably afford the extra cost over an HD6850, no? The same people who buy a $440-500 GTX580 will also be buying a GTX680/HD7970 for $450-550 in the next 6 months - I can almost guarantee you that this group of buyers is price inelastic to electricity costs.

For most of us the difference will be 60-100W at load between GPUs. This will come out to ~$16-32 per annum at 15 cents / kWh, for example. I don't think that's a lot of $ for an annual cost of gaming. Like I said, if it was, they'd be gaming on a console which consumes even less electricity (if we are that concerned about electricity costs).

I know that you tried to show the cost difference but I don't think most adults are gaming 6 hours a day like you do. So it's a very extreme case that you presented imho.

Some people game 12 hours a day.

Yes, either students (who have far greater costs to worry about), unemployed (who should be looking for work or getting new skills instead of gaming 6+ hours a day) or people who have a lot of free time (this likely means they already have enough $ not to work "regular jobs").

I could've just as easily chosen 5 minutes or 24 hours a day for these numbers and you'd still see the differences. The numbers would just be that much more extreme in what they represent.

Tom's Hardware did a similar study to yours. For an "Enthusiast" profile, the difference was only $17 for an HD5850 and $40 for a GTX580 per year at 22 cents / kWh from a base case.

For a "Gamer" profile, the difference was $50 per year between a GTX480 and an HD5850. The reason I compared launch prices is because gamers who bought a GTX480 probably bought it close to launch, as other far more efficient cards were available later on. Which brings me to the same point I keep repeating - the difference isn't as drastic since most modern cards (outside of Fermi) are relatively close with one another in load power consumption.

Finally, I would like to point out that if you find value in performance reviews, you'll also probably find value in baseline statistics; if not now, then eventually. These charts aren't for everyone, but it is my hope that more people will realize the value of efficiency and how it can affect your life - even in your gaming.

I think it would be more interesting to see if CPUs, when overclocked, are a worse culprit than GPUs. So many people in the GPU forum love to discuss how "hot, power hungry and inefficient" certain brands of GPUs are, and yet the same people are using Phenom II X4/X6 processors.



You can see that the power consumption differences between overclocked CPUs far exceed the small 50-100W differences between GPUs. So I think you should have focused here first. :thumbsup:

You imply significant differences at 6 hours of gaming per day from GPUs. But based on my 2 min calculations it should only come out to about $20-40 extra a year if you live in North America. Still, it is more expensive cards that generally consume more electricity (such as an HD6970/GTX570/GTX580). Chances are people who buy $250 GPUs are making a little bit more $ than those who are buying frugal HD5770s. The difference between say an HD5770 and an HD5870 is only 50-60W at load btw, and you get double the performance!
 

WMD

Senior member
Apr 13, 2011
Excellent post. LOL. My thoughts exactly.

 

pandemonium

Golden Member
Mar 17, 2011
/sigh

Once more, and then I'll concede this to being contextually challenged, cognitive bias, and endless internet debate.

If you don't like what's being portrayed in the spreadsheets, either change your mind and accept it, or disprove the calculations.

I'll provide you with a pretty easy comparison for why I created the spreadsheets, since this appears to be completely missed as of late.

HD 6850 CF is a high-end GPU setup (average of all benches = 84.31 FPS)
GTX 580 SLI is a high-end GPU setup (average of all benches = 111.56 FPS)
The cost per year to OPERATE AND PURCHASE HD 6850 CF @6 hours a day (full load) with 2 hours idle, 300/365 days/year @$0.21/kWh = $484.28
The cost per year to OPERATE AND PURCHASE GTX 580 SLI @6 hours a day (full load) with 2 hours idle, 300/365 days/year @$0.21/kWh = $1,164.19

Per year, a difference of $679.91, or per month, $56.66. All to receive an extra 27.25 FPS average over all benchmarks provided (well above 60 FPS).
 

Lonbjerg

Diamond Member
Dec 6, 2009
/sigh

Once more and then I'll concede this to contextually challenged, cognitive bias and endless internet debate.

If you don't like what's being portrayed in the spreadsheets, change your mind and accept it or disprove the calculations.

I'll provide you with a pretty easy comparison for why I created the spreadsheets, since this appears to be completely missed as of late.

HD 6850 CF is a high-end GPU setup (average of all benches = 84.31 FPS)
GTX 580 SLI is a high-end GPU setup (average of all benches = 111.56 FPS)
The cost per year to OPERATE AND PURCHASE HD 6850 CF @6 hours a day (full load) with 2 hours idle, 300/365 days/year @$0.21/kWh = $484.28
The cost per year to OPERATE AND PURCHASE GTX 580 SLI @6 hours a day (full load) with 2 hours idle, 300/365 days/year @$0.21/kWh = $1,164.19

Per year, a difference of $679.91, or per month, $56.66. All to receive an extra 27.25 FPS average over all benchmarks provided (well above 60 FPS).


You pick multi-GPU setups and talk about "cost"?

Funny...
 

RussianSensation

Elite Member
Sep 5, 2003
Per year, a difference of $679.91, or per month, $56.66. All to receive an extra 27.25 FPS average over all benchmarks provided (well above 60 FPS).

Yes, you are talking about Performance per Dollar, not Performance per Watt (i.e., efficiency).

Right, but you didn't need to compute the electricity costs to conclude that a GTX580 SLI setup for $900-1000 is overkill for 1920x1200 and represents poor value. We already know that from a performance-per-dollar perspective a GTX580 SLI setup is worse value than an HD6850 CF setup. So your conclusion that the total cost of ownership for the GTX580 SLI is much greater than for an HD6850 CF setup is just stating the obvious, since $900 of that cost is the cost of the videocards (not incremental electricity!).

In fact, based on your linked Anandtech benches, a single $220 HD6950 1GB achieves an average of 60-61 fps. So what incentive would someone have to buy a faster videocard than that from your chart? I thought the point of your analysis was to single out efficiency and express the cost of ownership specifically as it relates to the added electricity costs. Yet I can't quickly determine this, because nowhere does it tell me how much extra it would cost to operate an HD6970 over, say, an HD6850. Of course, I can do this task "manually" by subtracting Column J from Column L. You should probably provide this as a standalone column.

Let's just look at 1 example from your data:

HD5770 ($147.46 Total cost - $110 purchase price = $37.46 in annual electricity costs).
HD5870 ($425.95 Total cost - $380 purchase price = $45.95 in annual electricity costs).

You see how there is only an $8 difference per annum between owning a high-end HD5870 vs. a low-end HD5770. So how are you concluding that electricity costs are so significant in the total cost of ownership?

Looks to me like that $380 purchase price for the 5870 is completely skewing your findings in favor of the HD5770 from an FPS/$ perspective. Not to mention you could have picked up an HD5870 for $180-200 4-5 months ago.

I think you should create 3 charts: (1) showing ONLY the total electricity costs - this is required when trying to show how in/efficient each card is in terms of power consumption; (2) FPS / $ graph to show which card provides the best value from a performance per $ perspective; (3) Combine the electricity costs + cost / FPS. This way, we'll be able to use any one of those 3 charts and isolate for electricity costs.

Can you elaborate on how you are deriving the upgrade costs every 2 years? This would be extremely difficult to do since it doesn't take into account reselling your previous hardware (a GTX580 will re-sell for a higher price than a GTX460, which makes it difficult to determine the future cash outlay for either user).

To summarize, I understand why you clump together purchase price + electricity cost because you are deriving the total cost of ownership to the end user and then comparing it to the total performance you get. But you should separate the electricity cost entirely so that we can see the extent to which efficiency impacts the total cost of ownership. Since you are including the purchase price you are greatly reducing the value proposition for high(er)-end cards (which most of us understand would be the case since you pay top dollar for best performance, and every marginal dollar spent above say $250 brings diminishing returns). In the end, I really like what you have done, but I am not sold on the execution. The data tells us little about how energy efficient high-end cards are relative to each other or say how much more energy efficient mid-range cards are compared to high-end cards.

If you can alter your excel sheet to isolate specifically for electricity costs, that would get the point across much better.
 

pandemonium

Golden Member
Mar 17, 2011
Right, but you didn't need to compute the electricity costs to conclude that a GTX580 SLI setup for $900-1000 is overkill for 1920x1200 and represents poor value. We already know that from a performance per dollar perspective a GTX580 SLI setup is worse value than an HD6850 CF setup. So by you concluding that the total cost of ownership for the GTX580 SLI is much greater vs. an HD6850 CF setup is just stating the obvious since $900 of that cost is the cost of the videocards (not incremental electricity!).

In fact, based on your linked Anandtech benches, a single $220 HD6950 1GB achieves an average of 60-61 fps. So what incentive would someone have to buy a faster videocard than that from your chart? I thought the point of your chart was to single out efficiency in terms of electricity costs (which is what we care the most about in this thread, right?). Yet I can't quickly determine this, because nowhere does it tell me how much extra it would cost for me to operate an HD6970 over, say, an HD6850. I can do this task "manually" by subtracting Column J from Column L. You should probably provide this as a standalone column.

Let's just look at 1 example from your data:

HD5770 ($147.46 Total cost - $110 purchase price = $37.46 in annual electricity costs).
HD5870 ($425.95 Total cost - $380 purchase price = $45.95 in annual electricity costs).

So you see how there is only an $8 difference per annum between owning a high-end 5870 vs. a low-end HD5770? So how are you showing that electricity costs are so substantial in the total cost of ownership? Looks to me like that $380 purchase price for the 5870 is completely skewing your findings in favor of the HD5770 from an FPS/$ perspective. Not to mention you could have picked up an HD5870 for $180-200 4-5 months ago.

I think you should create 3 charts: (1) showing ONLY the total electricity costs - this is required when trying to show how in/efficient each card is in terms of power consumption; (2) FPS / $ graph to show which card provides the best value from a performance per $ perspective; (3) Combine the electricity costs + cost / FPS. This way, we'll be able to use any one of those 3 charts and isolate for electricity costs.

Can you elaborate on how you are deriving the upgrade costs every 2 years? This would be extremely difficult to do since it doesn't take into account reselling your previous hardware (a GTX580 will sell for a higher price than a GTX460, which makes it difficult to determine the future cash outlay for that user).

To summarize, I understand why you clump together purchase price + electricity cost. But you should separate the electricity cost entirely. We want to know how in/efficient a particular card is from a power consumption perspective. My main point is that you are including the purchase price -- but this component greatly reduces the value proposition for high-end cards (which most of us understand, since you pay top dollar for best performance) - thus it skews the results unfavourably for high-end cards like the HD6970/GTX570/580 and at the same time tells us little about how energy efficient they are relative to each other.

If you can alter your excel sheet to isolate specifically for electricity costs, that would get the point across much better.

Now we're bringing up good points. Thank you for the response.

The primary focus of the charts is to show how efficiently your money is being spent compared to what else is on the market. You simply can't discard the fact that you spend more money up front on a card and value only how much it costs to operate. That's irresponsible logic.

I do see where the confusion appears in the points you've made. Efficiency is pretty synonymous with cost effectiveness, and I failed to label it correctly. I'll change the title of the spreadsheets and the second post to correctly reflect their purpose.

Resale is not considered for the upgrade frequency. That would be difficult to determine, IMHO - especially after 4 years of ownership... I could generate an extra column to include a generalized resale value for each frequency of ownership, but as I've emphasized before, this would make the data subjective rather than actual. I'm trying to keep the charts as unbiased as possible, so if you have your own considerations or information to interpret, you can use the variables on the charts as a baseline.

These charts took a tremendous amount of time to create, and this is only a hobby. They're evolving as time passes and as people provide feedback. So far, your points are the only constructive information I've received - let the evolution begin!

The problem now will be how to simplify the charts so they aren't so huge that people won't even look, much less be able to read them. They're big as of now...
 

KingFatty

Diamond Member
Dec 29, 2010
I am also interested in seeing a column with the electricity costs only, without mixing in the price of the card.

I would guess that AMD card prices might fluctuate wildly depending on how things go with Bitcoin - so it would be helpful to have a way to see the electricity-only cost. I was shocked to see recent prices on eBay for 5870 cards, since I bought mine new for $200 after rebate, but see that it has sold on eBay for $350 used.
 

RussianSensation

Elite Member
Sep 5, 2003
The primary focus of the charts are to show how efficiently your money is being spent compared to what else is on the market.

Yes, I eventually understood that's what you were trying to portray. That can be your "Total Cost of Ownership" scenario.

Resale is not considered for the upgrade frequency. That would be difficult to determine IMHO. Especially after 4 years of ownership...

Agreed. It would be too difficult to estimate. I think you should just create a scenario for owning a particular card for 2 years. I don't think it's necessary for you to include the future upgrade costs since that adds too much ambiguity. In fact, even a 1-year cost of ownership is sufficient since we can just double or triple it in our heads.

I could generate an extra column to include a generalized resale value on each frequency of ownership, but like I've emphasized against before, this will cause the data to be subjective and not actual.

Ya, don't do that. Too subjective.

I'm trying to keep the charts as unbiased as possible so if you have your own considerations or information to interpret, then you can use the variables on the charts as baseline.

Just include 2 main scenarios:

1) Efficiency - Electricity Costs per Annum (and if you want, add a Performance/Watt metric -- how many FPS per watt of power the card uses, or FPS per $ spent on electricity).

2) Total Cost of Ownership - this will include the purchase price + electricity costs vs. FPS achieved. You can include a side-column with FPS / $ spent as well.

These charts took a tremendous amount of time to create, and this is only a hobby.

You've done a great job. All the information is there. You just need to reorganize it.

I suggest you create 3 basic Bar Graphs. Make them simple like this:
http://www.xbitlabs.com/images/coolers/120mm-fan-roundup-2/186_dmax-cfm_xbt.png

One chart will show a summary of the annual electricity cost per videocard, assuming 6 hours of load usage per day.

A 2nd chart would show us Performance (FPS) per $$$ spent on the videocard or per $$$ spent on annual electricity (it's up to you which you choose).

Finally, a 3rd chart will show the Total/Aggregate Cost of Ownership (purchase price + electricity cost per annum) per videocard vs. the FPS it achieves.
 

pandemonium

Golden Member
Mar 17, 2011
Well, adding an extra column to show operational costs separately from the purchase price will be easy. I've also considered calculating heat dispersal costs, but that would be too variable depending on room size, A/C unit efficiency, and airflow. I suppose a generalized figure could be used. We'll see.

I'll look into creating bars and graphs for easier display. That'll definitely give me some goals to work towards since that'll be a first endeavor.

Again, I appreciate the feedback that helps mold the focus of where these sheets are going. The data here is certainly not for myself alone, and any way to make it easier or more usable for others is welcomed. I'm willing to entertain any other ideas or criticism!
 

pandemonium

Golden Member
Mar 17, 2011
Updated the spreadsheet.

Added a row for averages at the very bottom, a column for rankings by total cost/FPS, and a column for operational kWh used. I didn't get around to making easy-to-read charts this time. I know these are becoming blinding to look at; we'll get there eventually!

The old spreadsheet is still available if you're curious to see how pricing trends are going. Unfortunately, performance numbers didn't change between the old and the new. Old. New.
 

RussianSensation

Elite Member
Sep 5, 2003
Great!!

Now you can clearly see from your charts that the difference in power costs between a low-end card (HD5770) and a high-end card (HD6970) is very small: $37 vs. $49.

The difference between the HD5870 and GTX480 is also pretty small: $46 vs. $61.

In other words, most of the difference comes from Price of Card/Setup vs. FPS achieved.

Obviously, mileage will vary for people who game > 6 hours a day and where electricity costs are much higher (say, Hawaii). But generally speaking, it appears that for people already buying $200+ videocards, the difference in electricity cost between mid-range and high-end cards is only about $10-15, which is pretty minor considering some of these videocards cost $250-500 at launch!
 

Ben90

Platinum Member
Jun 14, 2009
I believe you should pull the Heat/FPS column as it really doesn't make sense as it stands. Heat isn't a unit of measurement. We can measure a card using 100 watts, but you can't measure a card creating 100 heat.

Second, the premise of the column is off. I understand why you wanted to put it there: to inform customers that 480 SLI does indeed generate more heat than a 5770. However, the amount of heat generated has nothing to do with the temperature of the card as you are using it.

Fortunately, the information you want to convey is already in the graph. Since nearly all of the electricity running through our video cards gets turned into thermal energy, the "Watts/FPS" column tells the user nearly exactly how much heat is generated using watts as our unit of measurement.

A very, very informative chart despite my nitpicking.
 

BFG10K

Lifer
Aug 14, 2000
But generally speaking, it appears that for people already buying $200+ videocards, the cost of electricity among mid-range and high end cards is only about $10-15, which is pretty minor considering some of these videocards cost $250-500 at launch!
Yep, that pretty much nullifies the whole electricity costs argument. If you can't afford an extra 10-15 bucks a year for electricity then you can't afford a $300 video card to begin with.

TDP only makes sense from the point of view of noise given hotter cards tend to be louder than colder cards. You’ll be affected by the extra noise every time you game, so it becomes very relevant in that case.
 

RussianSensation

Elite Member
Sep 5, 2003
Yep, that pretty much nullifies the whole electricity costs argument. If you can't afford an extra 10-15 bucks a year for electricity then you can't afford a $300 video card to begin with.

TDP only makes sense from the point of view of noise given hotter cards tend to be louder than colder cards. You’ll be affected by the extra noise every time you game, so it becomes very relevant in that case.

Ya, I agree with you, at least in most of North America. The situation may be different for those with very high electricity rates or perhaps those who run their card at 100% load 24/7. Otherwise, the added heat (esp. in the summer) and noise are far more negative side effects of say choosing a GTX480 over a GTX460.

I am not sure the person who is buying a $300-500 graphics card over a $100 low-end card is going to be sweating the extra $15 per year, when they are getting at least 50-75% higher framerate (which we reason is why they bought a high-end card in the first place). Also, such a system is likely to be paired with a good PSU, $150-300+ CPU, etc. In other words, the electricity cost difference is there but it is unlikely to be a deterrent for a hardware enthusiast who is already spending hundreds of dollars on a customized PC system.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Yep, that pretty much nullifies the whole electricity costs argument. If you can't afford an extra 10-15 bucks a year for electricity then you can't afford a $300 video card to begin with.

TDP only makes sense from the point of view of noise given hotter cards tend to be louder than colder cards. You’ll be affected by the extra noise every time you game, so it becomes very relevant in that case.

You seem to have a short-term memory... the price of running cards has many times (falsely) been used to attack GPU vendors on this forum.

I hope this thread will kill that strange misconception... even if that was unintentional, since the OP seemed to have different plans for this thread:

Before I get into the real math and figures, I will say that in a year's time, the differences in costs are pretty substantial. Substantial in the sense that we critically argue over how crucial the initial costs of CPUs and GPUs are relative to their performance marks, but when power efficiency comes into play we seem to forget operational cost per performance. If the video card you just bought for $30 cheaper performs on par with another, but is significantly less efficient in its watt usage and heat generation in comparison, such that you'll spend that saved $30 over the next 6 months - then more in the following months - wouldn't that be something you'd consider when originally purchasing it? Don't think energy costs that much? I dare you to continue reading.


The problem is this:
Great!!

Now you can clearly see from your charts that the difference in power costs between a low-end card (HD5770) and a high-end card (HD6970) is very small: $37 vs. $49.

The difference between the HD5870 and GTX480 is also pretty small: $46 vs. $61.

In other words, most of the difference comes from Price of Card/Setup vs. FPS achieved.

The perf/watt angle has been inflated past its real-world value... FOTM... now let it die.
 