lavaheadache
Diamond Member
Do you seriously need me to spell this out for you? Average is what matters, since that's what adds up over time.
Taking your logic to the extreme: if a card averages 1 watt for an hour but has one spike up to 100 watts for a fraction of a second before settling back down, then that card would consume more power than another card that averages 90 watts for an hour with no variation.
No. It doesn't work that way. The first card consumes ~89 watts less on average than the second card, regardless of any random spikes here or there. I don't care if you don't believe the math. Your electric company surely agrees with my math, and they're the ones who matter.
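To put actual numbers on it, here's a quick back-of-the-envelope sketch (the one-hour session and half-second spike duration are assumed values, not measurements):

```python
# Back-of-the-envelope check of the example above: energy (what the
# electric company bills for) is average power integrated over time,
# so a sub-second spike barely moves the total.

hours = 1.0
spike_seconds = 0.5  # assumed length of the "fraction of a second" spike

# Card A: 1 W baseline with a single 100 W spike lasting spike_seconds
card_a_wh = (1 * (hours * 3600 - spike_seconds) + 100 * spike_seconds) / 3600

# Card B: steady 90 W for the whole hour
card_b_wh = 90 * hours

print(f"Card A: {card_a_wh:.2f} Wh over the hour")  # ~1.01 Wh
print(f"Card B: {card_b_wh:.1f} Wh over the hour")  # 90.0 Wh
```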
It's a 19W difference on average for the cards they measured, in the game they measured. Peak only matters for determining what size PSU you need. When comparing power draws, you need to compare averages.
By the way: average power draw across many games will be different from what a one-game comparison says. Power draw could be more or less depending on the particular card and the binning of the chip, since there isn't even a single "stock" voltage anymore with these newfangled GPUs. See, e.g., HardOCP for how energy usage varies depending on the game: http://www.hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/13
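As a rough illustration (made-up per-game numbers, not HardOCP's data), a single-game comparison can overstate or understate the overall gap:

```python
# Hypothetical per-game average draws in watts for two cards
# (illustrative numbers only, not real measurements).
card_x = {"Game A": 180, "Game B": 200, "Game C": 165}
card_y = {"Game A": 199, "Game B": 185, "Game C": 190}

# Judged on Game A alone, card X looks 19 W better...
print(card_y["Game A"] - card_x["Game A"])  # 19

# ...but averaged across all three games, the gap shrinks.
avg_x = sum(card_x.values()) / len(card_x)
avg_y = sum(card_y.values()) / len(card_y)
print(f"{avg_y - avg_x:.1f} W average difference")  # ~9.7
```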
Commendable job.
My former 7970 (1200MHz) rig averaged about 430-450W while gaming, with peaks in the 465W range (GTX 285 for PhysX and a 2600K OC'd to 4.4GHz).
My current 680 rig (same system, only the card swapped; 1216MHz boost clock) roughly averages 400-405W with peaks of 420W+.