kalrith
Diamond Member
- Aug 22, 2005
Except, if you looked at the data presented for comparable setups, the electricity cost differences between NV and AMD tend to hover around $5-10 (+/- $2-3). So it's nowhere near $40, and therefore hardly material, even though it was projected at electricity rates higher than the US average.
I read your posts and concede your point. At this point there doesn't seem to be a large difference in energy costs between cards of comparable performance. As you've already mentioned, the only recent outliers in energy usage are the GTX 470 and GTX 480, so this spreadsheet might have been useful for evaluating those against another comparable video card.
It seems to me that the real impact of this spreadsheet could be in deciding to choose a lower-tier card over a higher-tier card in order to save additional money. While someone looking to spend $300 on a video card probably isn't worried about the extra energy costs, someone who's scraping together pennies to save up for a card might appreciate the lower future energy costs of a lower-tier card.
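For anyone curious how those yearly figures shake out, here's a quick back-of-the-envelope sketch in Python. The wattage delta, daily hours, and the $0.12/kWh rate are all illustrative guesses on my part, not numbers taken from the spreadsheet:

```python
# Rough yearly cost difference between two cards under load.
# All inputs are assumptions for illustration, not measured data.

def yearly_cost_delta(extra_watts, hours_per_day, rate_per_kwh=0.12):
    """Extra dollars per year for a card drawing `extra_watts` more."""
    return extra_watts / 1000 * hours_per_day * 365 * rate_per_kwh

# A ~30 W difference at 4 hours of gaming a day:
print(f"${yearly_cost_delta(30, 4):.2f} per year")  # about $5.26/yr
```

A roughly 30 W gap at a few hours of gaming a day lands right in that $5-10/yr range, which is why the difference rarely matters for comparable cards.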
Another applicable area would be something like an HTPC or a media server, especially one that's left on 24/7. Someone might have a spare 4870 lying around (or buy a used energy hog) and figure they should use it in their HTPC/media server, when they'd be better served by a card that uses far less energy. One of the reasons I put a 4550 in my HTPC is that it uses very little power.
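The always-on case is where the gap really adds up. A minimal sketch, again assuming a $0.12/kWh rate and made-up idle wattages rather than measured figures for either card:

```python
# Sketch: yearly electricity cost of a card that's powered on 24/7.
# Wattages and the $0.12/kWh rate are illustrative assumptions.

def always_on_cost(watts, rate_per_kwh=0.12):
    """Dollars per year for a constant draw of `watts`, 24/7."""
    return watts / 1000 * 24 * 365 * rate_per_kwh

hog = always_on_cost(60)     # e.g. an older card idling near 60 W
sipper = always_on_cost(10)  # e.g. a low-power HTPC card near 10 W
print(f"hog ${hog:.2f}/yr vs sipper ${sipper:.2f}/yr "
      f"-> ${hog - sipper:.2f}/yr saved")
```

With those guesses the savings run around $50/yr, so for an always-on box the card choice actually moves the needle in a way it doesn't for a gaming rig.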