Oh, thanks - didn't see that you had a separate spreadsheet linked in the OP.
You know, I'd be curious whether those load power consumption figures hold with the Gigabyte card. 412 W just seems way too high, especially considering it was well above your OC/OV/unlocked 6950. The 7970 should come in maybe just above a stock 6970, not way above a hyperclocked one, and only about 65 W higher than a stock 6950, not 130 W higher. Even your slightly higher idle power draw was suspect - maybe it really was a bad card.
This is just going on Anandtech's review:
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/27
That spreadsheet is where you'll find all updates first, so always check that. The power consumption is very odd, even at stock. It could be that Crysis 2 just really pushes the GPU, whereas the 6950 might have been limited by its tessellation engine or something. Like you said, the Gigabyte's results will shed some light on this. Also, the higher idle result might be my own fault. I generally let it idle for a minute or so and take the recorded number. It might take 10 minutes, or having the monitor off, to get the GPU to drop down to its 3 W state. On the other hand, maybe this shows how irrelevant that 3 W idle state is, i.e. if you're actually using your computer, it's never going to enter it. I'll look into it.
@ OP/Thread: I'm really thankful for you taking the time to test these things out.
I do hope the XFX was just a bad sample of what we can expect.
As for your videos of the stock cooler: at 50% it's probably a little louder than my HD 5870 at around 50%. But with the recent grinding noise, I'm not sure, haha.
Hopefully the MSI Lightning is in the pipeline for March.
Unfortunately, the camera doesn't pick up the sound pressure that well. The quality is relatively accurate, but the overall intensity is "fuller" in person, I would say. In either case, it's an improvement over previous generations, so anyone upgrading should expect a positive change.
Interested in what you find with the Gigabyte card. FWIW, my XFX Black Edition DD card doesn't respond to voltage either. I've used both Afterburner and Trixx and can't break past an 1100 MHz wall on the core. It seems like both Afterburner and Trixx are still getting the kinks worked out for our cards, but I'm not optimistic my card will do any better once they do. Anyway, looking forward to your results with the Gigabyte card.
According to Unwinder, AB's voltage control is complete, and there will be no more updates. That's one of the reasons I was a little more inclined to blame the card here rather than the programs. As you said, the Gigabyte will at least give another data point, although it will be hard to draw any firm conclusion from only two data points.
Trixx can go up to 1.3 V.
Not mine; mine is still being shipped to me :|
I tried AB, Trixx, and GPUTweak. They all had the same result, so I'm guessing they all access the same voltage controllers.