Efficiency and its importance in gaming - updated GPU cost effectiveness list


jhansman

Platinum Member
Feb 5, 2004
2,768
29
91
And yes, AMD pretty much slaps Nvidia around when it comes to performance per watt. Let's just see them do it with their CPUs vs Intel

If AMD had Intel's fab resources, market share, and influence, they might. Any way you slice it, that's a David & Goliath situation, and likely always will be. But I digress. This thread is about energy efficiency; for my part, I pulled two rotational drives out of my system and put in a boot SSD. My vidcard, admittedly low end, is passively cooled.

This discussion interests me because I am readying a GPU upgrade, and until now, never considered power consumption beyond the PSU. Oh, and I am at the 2.5 yr mark since my last build, which squares with the chart.
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,713
142
106
my power bill recently went down by $100 because I stopped running distributed computing

to think of all the money wasted over the last 12yrs that i've run distributed computing non-stop nearly 24/7

geeze ... what a costly addiction
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
my power bill recently went down by $100 because I stopped running distributed computing

to think of all the money wasted over the last 12yrs that i've run distributed computing non-stop nearly 24/7

geeze ... what a costly addiction

Wow, that's a lot.

How many GPUs are you running? Even at 400W x 24 hours x 30 days = 288 kWh.

$100 / 288 kWh ≈ $0.35 per kWh.
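A minimal sketch of that back-of-the-envelope math (the 400 W continuous draw and the 30-day month are the assumptions from the post above):

```python
# Continuous GPU draw -> monthly energy -> implied electricity rate.
watts = 400                  # assumed combined draw while crunching (W)
hours_per_day = 24
days_per_month = 30
monthly_kwh = watts * hours_per_day * days_per_month / 1000   # 288 kWh
bill_drop = 100.0            # reported monthly savings ($)
implied_rate = bill_drop / monthly_kwh                        # ~$0.35 per kWh
print(f"{monthly_kwh:.0f} kWh/month -> implied rate ${implied_rate:.2f}/kWh")
```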
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,713
142
106
2 computers, was pulling 700-800 kWh per month
the tiered price was like $0.25 per kWh in the upper bracket
i've got to stay under 350 kWh per month (baseline) to get the lowest price (under $0.10)

not counting the 2 roommates and other appliances
the breakdown per computer/component was something like this:
225W workstation (full load)
160W server (full load)
100W crt tv
100W crt monitor

i've gotten down to the following:
135W workstation (idle @stock)
85W server (idle under-clocked)
20W lcd replaced the tv
100W crt monitor still

I also upgraded the cable box ... but unfortunately it still pulls the same 20W even when turned off
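For illustration, a rough sketch of what that before/after breakdown works out to per month (the hours of on-time per day are an assumption, not something stated above):

```python
# Before/after wattage from the component breakdown, converted to monthly kWh.
before = {"workstation (load)": 225, "server (load)": 160, "crt tv": 100, "crt monitor": 100}
after  = {"workstation (idle)": 135, "server (idle)": 85, "lcd tv": 20, "crt monitor": 100}

def monthly_kwh(watts_by_item, hours_per_day=12, days=30):   # 12 h/day is an assumed duty cycle
    return sum(watts_by_item.values()) * hours_per_day * days / 1000

saved = monthly_kwh(before) - monthly_kwh(after)
print(f"~{saved:.0f} kWh/month saved")   # roughly 88 kWh at these assumptions
```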
 
Last edited:

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
You seem to have a short term memory...the price of running cards has many times (falsely) been used to attack GPU vendors on this forum.

I hope this thread will kill that strange misconception...even if it was unintentional, the OP seemed to have different plans for this thread:

The perf/watt has been inflated past its real-world value...FOTM...now let it die.

Look below for "false attacks" and "inflated past real world value."

Great!!

In other words, most of the difference comes from Price of Card/Setup vs. FPS achieved.

Indeed, however, the numbers that you're citing are nearly as minimal as they can be. See the following.

Here is the clearest purpose for why I created these spreadsheets:

For all intents and purposes, let's compare the HD6870 in CF versus the GTX470 in SLI.
As per 08-20-11, they're very similar in average FPS over all benches: 96.25 versus 98.82
For someone who spends $.21 per kWh, the differences in operational costs per day will be: $.59430 versus $.81564 (or $.22134)
On a monthly basis of ~30.4 days (365/12) this equates to: $6.728

Also consider the fact that GTX470 SLI costs $475.72 initially compared to a pair of HD6870s at $349.98, $125.74 less.

Also include that the HD6870 CF runs idle at 45°C, load at 76°C, versus a pair of GTX470s idling at 54°C, load at 96°C. This will make a large impact on A/C costs.

Quite a few people in this thread have mentioned paying more than $.21 per kWh, so this gets multiplied. For instance, Concillian mentioned paying $.32 per kWh if they reach tier 3. For a house with a family of 4, reaching tier 3 really isn't that unrealistic. What are the operational costs at $.32 per kWh versus $.21 per kWh? $10.252/month. All for a setup that provides relatively the same framerate (an unnoticeable increase of 2.57 FPS over all benches).

Hopefully this will provide perspective for those that still think GPU performance efficiency isn't something to consider.
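As a rough sketch of how those per-day and per-month figures are derived (the system load wattages below are placeholders, not the spreadsheet's exact values):

```python
# Daily and monthly operational-cost difference between two setups.
RATE = 0.21                  # $ per kWh
HOURS_PER_DAY = 6            # assumed gaming time per day
DAYS_PER_MONTH = 365 / 12    # ~30.4

def daily_cost(load_watts, rate=RATE, hours=HOURS_PER_DAY):
    return load_watts * hours / 1000 * rate

hd6870_cf = daily_cost(472)   # placeholder load draw (W)
gtx470_sli = daily_cost(647)  # placeholder load draw (W)
diff = gtx470_sli - hd6870_cf
print(f"${diff:.5f}/day, ${diff * DAYS_PER_MONTH:.3f}/month")
```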
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
I believe you should pull the Heat/FPS column as it really doesn't make sense as it stands. Heat isn't a unit of measurement. We can measure a card using 100 watts, but you can't measure a card creating 100 heat.

Second, the premise of the column is off. I understand why you wanted to put it there, to inform customers that 480SLI does indeed generate more heat than a 5770. However, the amount of heat generated has nothing to do with the temperature of the card as you are using it.

Fortunately, the information you want to convey is already in the graph. Since nearly all of the electricity running through our video cards gets turned into thermal energy, the "Watts/FPS" column tells the user nearly exactly how much heat is generated using watts as our unit of measurement.

A very very informative chart despite my nitpickings

Thanks for the feedback and praise. Much appreciated!

I'm confused on what you're saying. Watt consumption does not directly equal energy loss in the form of heat when comparing different cards. Efficiency in the architecture of the card and how it processes data, the design and volume of air flow (or heat dispersal via water cooling) modify those numbers. The comparison comes into play for purposes of cooling the heat generated (A/C costs).

Oddly enough, you got me thinking about how to provide energy lost through heat / wattage used. This would need to be done by comparing the two columns of Watts/FPS and heat generated. This would show another aspect of how efficiently a card is utilizing the energy it drains.

I've also considered using the average amount of heat generated and wattage used for all the cards compared and creating an [unfortunately] subjective scale on top of FPS/costs to get at a more universal comparison level. Subjective complications of this are deterring me from doing so, however. Say, anything above average would receive a penalty. The penalty scale would be where issues arise.

Yet more columns...lol. This thing's getting ridiculously large. I hate leaving anything out, but... Ugh...

Thoughts anyone?
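If it helps the discussion, here is one possible shape for that above-average penalty idea; the card numbers and the penalty weight are made up purely for illustration, not taken from the spreadsheet:

```python
# Sketch: penalize cards whose load draw sits above the group average,
# then fold the penalty into the Watts/FPS figure.
cards = {"Card A": (96.0, 470), "Card B": (98.0, 640)}   # placeholder (FPS, load W)

avg_watts = sum(w for _, w in cards.values()) / len(cards)
for name, (fps, watts) in cards.items():
    watts_per_fps = watts / fps
    penalty = max(0.0, (watts - avg_watts) / avg_watts)   # fraction above the average draw
    print(f"{name}: {watts_per_fps:.2f} W/FPS, penalized {watts_per_fps * (1 + penalty):.2f}")
```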
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Quite a few people in this thread have mentioned paying more than $.21 per kWh, so this gets multiplied.

FYI, here in the Netherlands we pay 21 eurocents per kWh. Now for technical reasons, I am forced to heat my water with an electric boiler (instead of using gas, like everybody else here does). Because of that I have a subscription where I pay 14 eurocents during the night, and 28 eurocents during the day.

28 eurocents = $0.40.

Over half of that price is a special energy tax. Because of the mild climate, it is very very rare for people to have AC in their homes.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Indeed, however, the numbers that you're citing are nearly as minimal as they can be. See the following.

Disagree. 6 Hours of gaming a day 300 days a year is definitely on the high-end for most people. You are providing one of the worst case scenarios outside of people doing distributed computing. Like I said before, people who are gaming 6 hours a day / 300 days a year and are complaining about electricity cost should invest that time into getting a better paying job, upgrading their skills, etc. 6 hours a day x 300 days a year of gaming is an addiction, not a typical scenario.

For all intents and purposes, let's compare the HD6870 in CF versus the GTX470 in SLI. Also include that the HD6870 CF runs idle at 45°C, load at 76°C, versus a pair of GTX470s idling at 54°C, load at 96°C.

My GTX470 idled at 38-42°C and loaded at 76-78°C. So I can't agree with the data you provided. I provided screenshots on our forum many times and can do so again. That's with the stock TIM too. I always stress how important it is to have a case with excellent air ventilation. Most people don't.

Regardless, your correlating GPU temperatures with A/C is questionable. US homes tend to have central air. So if you have 4-5 rooms and 1 room is hot, the A/C won't continue working just to keep that 1 room cold. You are making it sound like the A/C will continue to work because 1 room has a computer in it.

Also consider the fact that GTX470 SLI costs $475.72 initially compared to a pair of HD6870s at $349.98, $125.74 less.

I got 2 GTX470s for $404 in July 2010. HD6870 didn't launch until October 22, 2010 (nearly 5 months later). Its launch price was $239. I would have had to wait another 5 months before I could buy 2 of those for $350. HD6870s didn't dip to $175 until GTX560 Ti launched. This comparison is completely flawed because:

1) People who bought GTX470 bought it when HD6870 wasn't on the market at $175
2) People who bought GTX470 bought it because it cost $100+ less than HD5870 and performed within 10% of that card, so the value was actually greater than the HD5870.
3) NV shipped 2-3 games with the GTX470 when it launched for a period of 3+ months, providing additional value in the US.

You shouldn't compare older setups with newer setups, because people who are buying today are choosing between an HD6870 and a GTX560 Ti; the comparison should be made based on what's available today. It wouldn't make any sense to compare the operational and FPS costs of an HD7970 to an HD6970 8 months from today, would it? But you just did that with a GTX470 vs. HD6870. No one is going to be choosing an HD6970 vs. HD7970 8 months from now. It's a good theoretical comparison.


28 eurocents = $0.40.

Ok, but you earn your salary in euro right?
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Disagree. 6 Hours of gaming a day 300 days a year is definitely on the high-end for most people. ...6 hours a day x 300 days a year of gaming is an addiction, not a typical scenario.
How many people watch TV, 6 hours a day - 365/365? Think of the electricity used for those 50" screens going 24/7 in some households.
- And there are eighteen other hours per day; 6-8 of them are usually taken by sleep. There is always plenty of time for work if you have a job.

i'd say gaming is far more productive than TV watching and you could get into the industry.
- just for a counter-point.
 
Last edited:

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
RussianSensation, you're regurgitating a lot of points we've already discussed.

Disagree. 6 Hours of gaming a day 300 days a year is definitely on the high-end for most people.

The amount is not meant to be taken as an exact figure that represents all computer users. It's a representation of the differences found between each GPU in their efficiencies. As I stated before, it doesn't matter if it was 5 minutes or 24 hours a day. The number is there to easily show you the scale of the differences.


My GTX470 idled at 38-42°C and loaded at 76-78°C. So I can't agree with the data you provided. I provided screenshots on our forum many times and can do so again. That's with the stock TIM too. I always stress how important it is to have a case with excellent air ventilation. Most people don't.

I can't take your values for a certain card as verbatim. The numbers I have on the spreadsheets are directly from Anandtech's Bench tool. If you have a problem with their results I suggest you talk to their benchmarkers.

Regardless, your correlating GPU temperatures with A/C is questionable. US homes tend to have central air. So if you have 4-5 rooms and 1 room is hot, the A/C won't continue working just to keep that 1 room cold. You are making it sound like the A/C will continue to work because 1 room has a computer in it.

Irrelevant. Not all computers are in rooms that are closed off to general air flow. Not only that, but with this logic, you're saying heat generated isn't worth considering at all since central air doesn't run specifically for 1 room that's hot.



I got 2 GTX470s for $404 in July 2010. HD6870 didn't launch until October 22, 2010 (nearly 5 months later). Its launch price was $239. I would have had to wait another 5 months before I could buy 2 of those for $350.

None of these cards came out at the same time, but the point is they are all available NOW. You can't argue past and future in a present situation, lol. How would you have known? You wouldn't have! Perhaps you need to read the disclaimers again in post #2.

No one is going to be choosing an HD6970 vs. HD7970 8 months from now. It's a good theoretical comparison.

You don't know that. Honestly, you don't. Not everyone buys the latest and the greatest right when it comes out. I for one don't. I wait until the prices drop off that steep ledge and the software matures to meet the capabilities of the hardware. In 8 months, the HD6970 could very well be the most efficient buy for your money, depending on revamped aftermarket designs, what the next generation brings, price fluctuations, and market values. What if the 7xxx series completely bombs and the HD6970 still rocks? You can't theorize about the future and deny anything from being possible. That's incredibly poor rationalization.

Ok, but you earn your salary in euro right?

Now you're just trollin' for the sake of trollin', lol.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I'm confused on what you're saying. Watt consumption does not directly equal energy loss in the form of heat; when comparing different cards.
Yes it does
pandemonium said:
Efficiency in architecture of the card and how it processes data, design of air flow and volume of air flow (or heat dispersal via water cooling) modify those numbers. The comparison comes into play for purposes of cooling the heat generated (A/C costs).
Actually, electronics are essentially 100% efficient at converting electricity to heat. Even if you don't want to believe me, it's very easy to see why the column is useless by applying some common sense.

The first and easiest way to show that GPU temperature isn't really related to power consumption as you are suggesting is to stick your finger into your GPU fan while running FurMark. It's not suddenly using a crapton more power because the fan stopped.

Another fun example is to use different scales of temperature. What if we measured temps using something made-up called a BATTSECKS unit? The freezing and boiling points of water on the BATTSECKS scale are -50° and 50° respectively. The idle temperature of a 5770 is 313.15 K (40°C), or -10° BATTSECKS. According to the same math as your chart, a 5770 uses negative heat when idling. While the values don't mean anything, since power consumption again has nothing to do with GPU temperature, at least use the Kelvin scale so you don't get outputs that break the laws of physics.

Let's look back at the load temps for a bit. A GTX 480 gets up to 94°C in Crysis. Adding a second card for SLI brings it up to 96°C. If we use the same metric that heat produced = GPU temperature, that second card only adds 2.1% more heat.
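To spell that out in code (the FPS value is a placeholder; the point is only that a temperature-based "heat" number changes with the scale you pick):

```python
# The same idle temperature expressed on three scales, divided by FPS as the
# chart does. The ratio is arbitrary and can even go negative.
def c_to_battsecks(temp_c):
    return temp_c - 50          # made-up linear scale: water freezes at -50, boils at +50

idle_c = 40                     # HD5770 idle temp (313.15 K)
fps = 60                        # placeholder frame rate
for label, temp in [("Celsius", idle_c), ("Kelvin", idle_c + 273.15),
                    ("BATTSECKS", c_to_battsecks(idle_c))]:
    print(f"{label:9s}: 'Heat'/FPS = {temp / fps:+.3f}")
```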
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
I understand where you're going, Ben, and you're right. I'm definitely not disputing Joule's first law or anything here. The point I'm trying to make is that no two cards are equal when it comes to their productivity versus the energy consumed - then in parallel - the heat dispersed.

I'll try to illustrate what I'm saying. In order to keep the comparison as objective as possible, I'll pick 2 cards that have nearly the same Load noise levels. This is to reduce the comparative amount of heat dispersal that is being done by heat-sinking and fan assemblies. I'm not comparing FPS directly between these two cards, but relating their performance measures based on how much energy they consume and the heat retained on the card, baselined by their ambient dB levels. They're the HD5870 in CF and the GTX470 (non-SLI).

Per the 8-20-11 charts:
-HD5870CF averages 97.55FPS, uses 460 watts under load, runs at 61.7 dB and 80°C.
-GTX470 averages 58.16FPS, uses 366 watts under load, runs at 61.5 dB and 93°C.

The HD5870CF is generating a higher FPS than the GTX470 (+39.39FPS), it's using more energy (+94 watts), runs .2 dB louder (negligible for this comparison), and runs 13°C cooler. Can you tell me that the .2 dB increase in ambient volume makes up for the 13°C difference in heat dispersal? Will an increase of 39.39FPS and 94watts consumed scale even remotely close to a drop in 13°C against other card comparisons? If heat dispersal efficiency (even at a negligible audible level of +.2 dB) is the only consideration for weighing Watt/FPS against Heat/FPS, then how is this possibly so adverse against other comparisons?

Does that make sense?
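For reference, here is the Watts/FPS arithmetic for the two cards quoted above, using the figures as given in the post:

```python
# Watts per FPS for the two setups being compared.
cards = {
    "HD5870 CF": {"fps": 97.55, "load_w": 460, "db": 61.7, "temp_c": 80},
    "GTX470":    {"fps": 58.16, "load_w": 366, "db": 61.5, "temp_c": 93},
}
for name, d in cards.items():
    print(f"{name:10s}: {d['load_w'] / d['fps']:.2f} W/FPS, "
          f"{d['db']} dB, {d['temp_c']}°C under load")
# ~4.72 W/FPS for the CF setup vs ~6.29 W/FPS for the single GTX470, even though
# the CF setup draws more total power and runs cooler.
```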
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I'll try to illustrate what I'm saying. In order to keep the comparison as objective as possible, I'll pick 2 cards that have nearly the same Load noise levels. This is to reduce the comparative amount of heat dispersal that is being done by heat-sinking and fan assemblies. I'm not comparing FPS directly between these two cards, but relating their performance measures based on how much energy they consume and the heat retained on the card, baselined by their ambient dB levels. They're the HD5870 in CF and the GTX470 (non-SLI).
You can't just use dB levels as a baseline of heatsink performance. If the above were true, the Intel stock cooler would be a beast and Honda Civics with Fartpipes would be fast.

Imagine a video card that had a huge phase change unit integrated within itself for a cooler. The GPU temperature could be below zero. According to your chart the card is creating negative heat. That makes no sense.
 
Last edited:

kalrith

Diamond Member
Aug 22, 2005
6,630
7
81
First of all, THANKS OP! The spreadsheet is great. I'm very much on board with this. For those of you who keep saying you don't care about power consumption, that's bull (unless you don't pay for your electricity). If you're comparing two cards of similar performance and one costs $40 more than the other, of course you'll choose the cheaper card. It would be the same thing with two equally priced and equally performing cards with one of them using $40 more in electricity per year.

I built a low-power HTPC/file server that draws 40W at the wall at idle, so I did a lot of research about the impact of energy costs. I am also very much an advocate for not wasting money on "unnecessary" (IMO) things. I canceled my satellite, use free TV only, and switched to a combination of prepaid cell phones and Magic Jack for my communications. Those changes have saved me $1500 per year for the past 3 years, and my only costs for going that route were about $500 - 600 for the HTPC and TV antenna, plus about $30 per year in electricity to run the HTPC.

I'm considering upgrading my gaming PC, and I hate how much it heats up the room even at idle.

I am going to make a few recommendations for the spreadsheet, which might be a lot of work, but I'll recommend them anyways. All of us have different computer-usage amounts and different electricity costs. I recommend adding three fillable fields to contain electricity cost, idle usage per day, and load usage per day. Then, make your formulas based on these amounts. You can prefill them with your amounts so that the spreadsheet will start off being populated, but people will be able to tweak your findings to suit their own situation.

The other recommendation is to make the rankings column dependent on one of the other columns (probably total cost per FPS). That will allow the rankings to change depending on that column.

Thanks for all the hard work!

Edit: Adding a fourth fillable field for frequency of GPU upgrade would be helpful as well.

Edit2: And as a caveat to my statement above about why people should care about efficiency, I finished reading the thread, and I'm not saying anything about people buying one 6950 instead of a 580 SLI setup for efficiency. I'm saying that efficiency should be looked at as much as initial purchase price. The initial price of card A might be less than card B, but the total cost of ownership could be more for card A if it's less efficient. And I think total cost of ownership is what pandemonium is trying to drive home. If you spend $3,000 on a computer, then you might not care about total cost of ownership (much like someone who buys an AMG doesn't care that it's a gas guzzler). However, I don't think it's a stretch to say that the majority of forum members fall into the sub-$300 video card category, in which efficiency can make a very big difference for the total cost of ownership.
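A rough sketch of what those fillable fields could feed into; all the input values here are placeholders, not spreadsheet data:

```python
# Four user inputs driving a per-card total-cost-of-ownership formula.
ELECTRIC_RATE = 0.11          # $ per kWh (fillable field 1)
IDLE_HOURS_PER_DAY = 16       # fillable field 2
LOAD_HOURS_PER_DAY = 2        # fillable field 3
YEARS_BEFORE_UPGRADE = 2.5    # fillable field 4

def total_cost_of_ownership(card_price, idle_watts, load_watts):
    daily_kwh = (idle_watts * IDLE_HOURS_PER_DAY + load_watts * LOAD_HOURS_PER_DAY) / 1000
    electricity = daily_kwh * 365 * YEARS_BEFORE_UPGRADE * ELECTRIC_RATE
    return card_price + electricity

print(f"${total_cost_of_ownership(card_price=240, idle_watts=20, load_watts=150):.2f}")
```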
 
Last edited:

kalrith

Diamond Member
Aug 22, 2005
6,630
7
81
Well, I had a few extra minutes, so I made my proposed changes at the following link:



Since the inputs can be changed, only one set of data needs to be included, which I think makes it a little less overwhelming. I also froze the first pane, so you'll always know what cards you're looking at.

Anyways, let me know what you guys think.

will fix link in a minute
 

gevorg

Diamond Member
Nov 3, 2004
5,075
1
0
Awesome analysis! This should go on the main Anandtech page. Maybe a new section for hand-picked reviews by AT forum members.
 

dualsmp

Golden Member
Aug 16, 2003
1,626
44
91
Can someone upload pandemonium's spreadsheet to Rapidshare or another site? Thanks.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
You can't just use dB levels as a baseline of heatsink performance. If the above was true, the Intel stock cooler would be a beast and Honda Civics with Fartpipes would be fast.

Hah, definitely no. You're missing the other variables in those analogies that are oh-so important for the comparison; such as displacement, mpg, bhp, whp, curb weight, etc.

The point was to eliminate, as much as theoretically possible, the differences in heat-dispersal performance between the two cards so we could compare how Watt/FPS aligns with Heat/FPS between different cards. If all cards were equal, they'd scale along roughly the same line with each other in power consumed and power lost through heat. Since we know they're not equal due to design differences, we can then compare Watt/FPS and Heat/FPS as a property of efficiency as well as how each card affects our budgets. That's how I see it anyways.

Imagine a video card that had a huge phase change unit integrated within itself for a cooler. The GPU temperature could be below zero. According to your chart the card is creating negative heat. That makes no sense.

True, but you're going too extreme for comparative purposes. All of the cards on this chart are actively air cooled. In order to equally use my comparison you'd have to apply the same cooling techniques to all the video cards.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
pandemonium said:
In order to keep the comparison as objective as possible, I'll pick 2 cards that have nearly the same Load noise levels. This is to reduce the comparative amount of heat dispersal that is being done by heat-sinking and fan assemblies.
Hah, definitely no. You're missing the other variables in those analogies that are oh-so important for the comparison; such as displacement, mpg, bhp, whp, curb weight, etc.
Perfectly phrased: there are other things that determine how well a cooler cools vs. just the noise it makes. Case in point: GTX 470 vs. GTX 580.

The 580 uses more power yet has a lower load temperature and a lower noise signature. This is due to the switch to vapor chamber cooling.

The way you calculate "Heat" would incorrectly put the 470 as using more power when it doesn't.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
First of all, THANKS OP! The spreadsheet is great. I'm very much on board with this.

You're welcome! I'm glad another person sees use for the sheets.

I am going to make a few recommendations to the spreadsheet, which might be a lot of work, but I'll recommend them anyways. All of us have different computer-usage amounts and different electricity costs. I recommend putting three fillable fields to contain electric costs, idle usage per day, and load usage per day. Then, make your formulas based on these amounts. You can prefill them with your amounts so that the spreadsheet will start off being populated, but people will be able to tweak your findings to suit their own situation.

The other recommendation is to make the rankings column dependent on one of the other columns (probably total costs per FPS). That will allow that rankings to change dependent on that column.

Thanks for all the hard work!

Edit: Adding a fourth fillable field for frequency of GPU upgrade would be helpful as well.

I agree that having fillable fields would be perfect and would actually reduce the size of the sheets, since I wouldn't have to calculate separate frequencies out. I'm actually learning how Excel works by doing these sheets, so I apologize if they're not the best way of doing things. I love learning it, but if anyone is an expert you're more than welcome to modify the sheets. Send me a message and I'll send the sheets over your way.

Well, I had a few extra minutes, so I made my proposed changes at the following link:

I'm not seeing a link. :/
 
Feb 19, 2009
10,457
10
76
Crazy how cheap electricity is for Americans.

It's around 1/6 - 1/10th the price compared to a lot of EU countries.
 

FredGamer

Junior Member
Jul 27, 2011
7
0
0
For someone who spends $.21 per kWh, the differences in operational costs per day will be: $.59430 versus $.81564 (or $.22134)
On a monthly basis of ~30.4 days (365/12) this equates to: $6.728


It would be difficult to spend $.21 per kWh in the USA, where residential electricity costs $.11 per kWh on average across the country:

http://www.eia.gov/cneaf/electricity/epm/table5_6_a.html

So I say "HD6990 buyers, enjoy."
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
None of these cards came out at the same time, but the point is they are all available NOW. You can't argue past and future in a present situation, lol. How would you have known? You wouldn't have! Perhaps you need to read the disclaimers again in post #2.

You are missing my point because of the statement you made. For instance, sure, the GTX480 is available NOW for $300. But hardly anyone would buy that card when you can get a GTX570 for the same price.

I mean, yes, I suppose it's useful to see how much more efficient modern cards are. However, since you brought price, performance (FPS), and wattage into consideration, it's almost a certainty that newer generations will be superior to older generations (i.e., GPU speed at a given price keeps increasing from one generation to the next).

Essentially, in order for you to make a point, you have to compare similar setups and see the difference in electricity costs. I have gone ahead and done this, as outlined below.

You don't know that. Honestly, you don't. Not everyone buys the latest and the greatest right when it comes out. I for one don't.

I get that. I can tell you that the HD6870 for $150 is one of the best bang-for-the-buck cards right now based on Price / FPS. The electricity cost differences between an HD6870 and HD6770 won't suddenly make the HD6770 more attractive. Just like 8 months from now, a $350-400 HD7970 isn't going to be a better bang for the buck than a used HD6970 for $180 based on electricity costs.

My point is in North America, electricity cost is a very small fraction of the total ownership cost (bar the extremely power hungry GTX480). From your own graphs it's evident it amounts to $5-15 annually between a worst and best case scenario (it's even smaller than that).

Look at what happens when you compare apples to apples as a buyer would do today:




Annual Electricity Costs - using YOUR data:

Single GPUs: Annual electricity costs for comparable single GPUs

Ultra-High End = GTX580 = +$6.64 over HD6970 but there is already a $100-150 price premium for this card over the HD6970/GTX570. GTX580 users tend to be price inelastic. You think they would care about $7 a year in electricity premium over HD6970 @ 6 hours of gaming/day for 300 days a year!?

High-End = GTX570 vs. HD6970 = +$2.94 difference
Mid-High-end = GTX560 Ti vs. HD6950 = +$3.10 difference
Low-Mid-range = GTX460 vs. HD6850 = +$4.41 difference

Older generation:

HD5870 vs. GTX470 (+$7.43), but the HD5870 cost $350 at the time when the GTX470 cost $280-300. So hardly anyone would have chosen the 5870 for electricity "savings". The GTX480 cost $500! So hardly anyone who was eyeing it over the $350 HD5870 would have cared either. So again, electricity costs between GTX470/480/5870 wouldn't have mattered (heat and noise would have).

Now let's move on to SLI/CF:

Multiple GPUs: Annual electricity costs for comparable Multiple GPU setups

Ultra-High End = GTX580 SLI vs. HD6970 CF = +$8.48 (when the cash outlay is $850-1000 vs. $700 on the AMD side, electricity cost is irrelevant). Regardless, the proper opponent is GTX570 SLI, I would say, based on price.

High-End = GTX560Ti SLI vs. HD6950 CF = You don't have any data for GTX560 Ti SLI....Ok fair enough.

Let's look at HD6870 CF vs. HD6950 CF vs. HD6970 CF = $59.43 vs. $66.42 (+$6.99) vs. $78.25 (+$11.83)

Mid-Range = GTX460 SLI vs. HD6850 CF = +$6.59 difference

Your own analysis shows that the annual electricity cost difference between comparable setups ranges from about $3-12 per annum.

The only serious outliers are GTX470/480. However, both of those cards were special cases because GTX470 was significantly cheaper than the HD5870 and GTX480 was significantly more expensive than the HD5870 on release. And again, most GTX470/480/5870 users aren't going to side-grade to current generation for electricity savings.
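To put that range in generic terms, here is how a load-power gap between two comparable setups maps to dollars per year at the thread's assumptions (6 h/day, 300 days/yr, $0.21/kWh); the wattage gaps themselves are hypothetical:

```python
# Annual electricity cost of an extra N watts under load.
def annual_extra_cost(extra_watts, hours_per_day=6, days_per_year=300, rate=0.21):
    return extra_watts * hours_per_day * days_per_year / 1000 * rate

for gap in (10, 25, 50, 100):           # hypothetical load-power gaps (W)
    print(f"{gap:3d} W gap -> ${annual_extra_cost(gap):.2f}/yr")
```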
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
First of all, THANKS OP! The spreadsheet is great. I'm very much on board with this. For those of you who keep saying you don't care about power consumption, that's bull (unless you don't pay for your electricity). If you're comparing two cards of similar performance and one costs $40 more than the other, of course you'll choose the cheaper card. It would be the same thing with two equally priced and equally performing cards with one of them using $40 more in electricity per year.

Except, if you looked at the data presented for comparable setups, the electricity cost differences between NV and AMD tend to hover at about $5-10 (+/- $2-3). So it's nowhere near $40, and therefore hardly material, despite being projected at higher electricity rates than the US average already.
 