Mr. K6's 6950 -> 7970 Overclocking Review

Page 7 - AnandTech Forums

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Figured I'd chime in with my results since I've been rock solid for over a week now.

I have a Gigabyte 7970 reference card.

CCC sliders all the way to the right.

core 1125
mem 1575

I don't even need to set PowerTune to +20%, but I did after a couple days just because I was bored.

My VID according to Afterburner is 1.175

and my highest load temp (that I have noticed) was 78c

I don't continuously monitor temps because I haven't had to and the cooler never gets too loud, just a whooooooosh to let me know she's doing her job.

I'm thoroughly impressed with the card. Wicked easy to get a sizable overclock, and it has provided me with increased gaming performance over my previous GTX 580. Much appreciated when using a 30-incher and 120Hz panels alike.

All this and my stocks have shot up!!!!
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
WOOOOOOOOO here's a huge update, let me reply to some of you guys first:

Congrats. Definitely sounded like that XFX was messed up.

Honestly, I'm not so sure that launch-day non-reference cards are the way to go. After reading up on the XFX cards, it seemed like they weren't quite ready for prime time. I've never had any problem with reference cards, except maybe their being a bit loud.
In my experience, launch day has always been more of a gamble - since there isn't a lot of stock built up, every piece of silicon is available, for better or for worse. Therefore, you can nab a really nice card or a barely passable one (especially since we don't even know how much they've binned for the 7950 stock). I always stick with reference because, if anything, they're overengineered.
Glad your Gigabyte is working out better for you. Magic 8 Ball says 1350/1600 under water.
1325/1800 @ 1.3v set in AB
Thanks, you're both close. Maximum clocks are in the area of 1325-1350 @ 1.3V on the core and 1675-1700 on the memory, depending on the application. I'll be running the review at 1300/1650 since those will be my 24/7 overclocks. Also, this card is insanely bandwidth limited; more on that below.
Mr. K6, are you still planning on using the mcw82 on the 7970? If you are, then you should be aware that universal gpu blocks require a spacer since the 7970 gpu die is surrounded by a shim.
Yep, thanks for the heads up. I read about it early on, but never saw a picture. It's a complete PITA and most universal blocks will not fit without some modding. More on that below as well.
yep, in every forum that i read (guru3d, techpowerup, and tomshardware) it looks like every xfx board is a bad overclocker
I can't say much since my personal two data points hardly make a trend, but if this is true I also wouldn't be surprised. On a personal level, this is enough for me to stay away from XFX from now on. Granted my experiences could easily be coincidental, but there are so many other options out there, why would I bother?
Hi,
could also try some undervolting?
A stock-voltage HD 7970 overclocks very well, so I think it actually got a lot more voltage than it really needs.

Thanks
Certainly. I've also had requests to do analysis on minimum FPS as well. Once I run the benchmark suite and crunch the analyses, I'll see what kind of time I have left and hopefully will get some good data. I can tell you right now that because these cards are so bandwidth limited, your most "efficient" overclock will probably be maxing your core overclock at stock voltage and then cranking the vRAM as far as it will go.
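To put a number on why that stock-voltage strategy is "efficient": CMOS dynamic power scales roughly linearly with frequency but quadratically with voltage (P ∝ f·V²). Here's a rough Python sketch of the tradeoff, using the stock 925MHz/1.175V operating point mentioned in this thread; the scaling is purely illustrative, not measured draw from this card:

```python
# Rough CMOS dynamic-power scaling: P ~ f * V^2
# (illustrative relative numbers, not measured card power)
def relative_power(f_new, f_old, v_new, v_old):
    """Dynamic power of a new operating point relative to the old one."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Core overclock at stock voltage: 925 -> 1125 MHz @ 1.175 V
stock_volt_oc = relative_power(1125, 925, 1.175, 1.175)

# Core overclock with a voltage bump: 925 -> 1300 MHz @ 1.30 V
volted_oc = relative_power(1300, 925, 1.30, 1.175)

print(f"1125 MHz @ stock volts: {stock_volt_oc:.2f}x core power")
print(f"1300 MHz @ 1.30 V:      {volted_oc:.2f}x core power")
```

In other words, the stock-voltage overclock costs roughly a 22% power bump, while the volted 1300MHz clock costs about 72% more core power for only ~16% more frequency on top of that, which is why cranking the vRAM (a comparatively small power consumer) is the better value.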
Hehe... I'm just getting started. Playing Skyrim @ 1075MHz on the core, stock volts. I have Sapphire Trixx installed, I think I'll be gunning for 1.3GHz soon.

I had two Sapphire VaporX cards... for the first time in a couple of years, I can hear my graphics card fan. Not loud, but a low hiss from the blower as I game. Not too bad, but I wonder how things will go with more clockspeed and voltage. That's probably what will limit my OC, the noise.
Go water, my friend :thumbsup:. This kind of performance in a near-silent PC just gets the enthusiast in me all giddy.
Figured I'd chime in with my results since I've been rock solid for over a week now.

I have a Gigabyte 7970 reference card.

CCC sliders all the way to the right.

core 1125
mem 1575

I don't even need to set PowerTune to +20%, but I did after a couple days just because I was bored.

My VID according to Afterburner is 1.175

and my highest load temp (that I have noticed) was 78c

I don't continuously monitor temps because I haven't had to and the cooler never gets too loud, just a whooooooosh to let me know she's doing her job.

I'm thoroughly impressed with the card. Wicked easy to get a sizable overclock, and it has provided me with increased gaming performance over my previous GTX 580. Much appreciated when using a 30-incher and 120Hz panels alike.

All this and my stocks have shot up!!!!
Glad your card is clocking well too :thumbsup:. It really is amazing how easy it is to crank these things out of the box. And I agree, if you have 30", 120Hz, or multi-monitor, there is no other card on the market for you.
afterburner beta 11 is out http://forums.guru3d.com/showthread.php?t=357421 supposed to have memory voltage control for the 7970
Heh. In about six months we'll start seeing a lot of "Why is my 7970 artifacting?" threads now.
Thanks for the heads up, just installed beta 11. However, I agree with SlowSpyder: despite how tempting it may be to overvolt the vRAM (especially since clocking it gives such noticeable gains), it's not worth it at all in the long run. vRAM is very sensitive and too easy to kill IMO.

See the next post for the update (darn image restrictions).
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Alrighty, so pulling apart and cleaning my loop was a pain in itself (glad I only do it every 18-24 months), but the kicker is that shim on the 7970. The 7970 is actually designed well for the enthusiast who sticks with the reference cooler. For instance, by taking out only 6 screws, you can take off the shroud to get full access to the heatsink to clean the dust out of it:

Anyway, here's the dreaded shim:

It's huge and would be a PITA to remove. It looks to me like it's there to protect the exposed circuitry on the package. Why they designed the package this way I don't know, but it is what it is. Because of that shim, you can't get decent contact with a heatsink that has a flat mounting surface. If you look closely at the picture, you can see little blobs where I tested TIM spread (using MX-4 now), and it's just miserable. The reference heatsink has a little raised area around the core to make contact, so how does one remedy this? My line of thinking - I'm not waiting for a full cover block, nor am I paying for one after I just got this MCW82. Therefore, I made my own shim:

This is simply copper plating. You can get it at any hardware store or superstore (Home Depot, Lowe's, True Value, Ace Hardware, etc.) as it's used for decorating/plating/moldings, etc. It's ~1mm thick (I forget the actual thickness), and I picked up a 12" x 6" plate of it years ago for ~$6. It's probably more expensive now since copper prices have gone through the roof, but it's a lot cheaper than a new heatsink. All I did was round the edges with 400 grit sandpaper and then sand the entire piece with 1000 grit to remove oxidation. Here's what it does for the mount:

BAM! We're in business. Of course this totally screws the mounting for the MCW82, since there's no way that's what it's set up for (regarding total thickness of back plate -> water block, screw length, etc.). I toyed with a few ideas; I actually wanted to do a spring-loaded design to introduce some compliance in the system, but my local Ace Hardware (the only place that carried half of what I needed) didn't have springs small enough. Therefore, this is what I did:

Sorry for the poor quality; I'm actually using a flashlight to illuminate the block so the flash doesn't hide the detail. Basically, you can see the block making excellent contact with the shim. This is a direct screw-on mount, a la water cooling from a decade ago. This is a very "dangerous" mounting mechanism as there's little forgiveness for overtightening (think cracked core). I probably pansied out and the mounting pressure isn't that great, but temps are actually fine. Again, I would have liked to use springs, but they weren't available. Basically, I used the nylon washers to judge thread lengths and eyeballed the rest. These are 2/56 x 1/2 screws + metal washers on the top, for those of you interested. The final product:

Got everything installed. You can see the VRMs are cooled by MOSFET heatsinks cut in half. You can also see there's still a lot of air to bleed out of the loop, but that will come (I run a closed loop with no reservoir, so it takes a while for air to make it out the T-line). Still, it looks a heck of a lot better than the old loop, right?


See the next post for performance updates.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
As I mentioned in the previous post, it looks like final 24/7 clocks are going to be 1300/1650. 1.3GHz is what I wanted to hit on the core, and I'm very happy I got it. I did some quick benches just to throw something out there to talk about; I'm going to run through the suite now and get all the numbers up for both stock and overclocked. Keep checking the spreadsheet for live updates.

Some performance numbers at 1300/1650MHz:
Temps - 34C idle/46C load
3DMark11 Performance - 10158
Crysis 2 Ultra DX11, 2560x1600, 4xAA (Laplace) - 46.8FPS

First, as you can see, my jury-rigged MCW82 works darn well. Secondly, OH MY GOD THIS PERFORMANCE IS UNHOLY :twisted:. I knew the card would be much faster than reviews made it out to be, but this is just insane. Crysis 2, cranked, and 46.8FPS. This card schools anything else on the market today, even the GTX 590 and HD 6990. As previously mentioned, if you're running 2560x1600, 120Hz, or multi-monitor, this is the card to get; there is no other. Check out the spreadsheet for the calcs, but you're looking at up to 125% faster than a 6950, and almost 100% faster than my overclocked 6950 @ 1GHz. One of the interesting things I found while testing for my highest overclock is that the card responded better in Crysis 2 to memory frequency increases than to core frequency increases. I got greater FPS gains going from 1375MHz -> 1650MHz on the memory with the core at 1125MHz than going from 1125MHz -> 1325MHz on the core with the vRAM at 1375MHz. Very interesting. I can't remember the last time I saw a card that was this bandwidth starved. Granted, this is only in Crysis 2, but I think it's valid to make a generalization from such a modern game.
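As a rough sanity check on the bandwidth point: the 7970's peak memory bandwidth follows directly from its 384-bit bus and GDDR5's four data transfers per clock, so the 1375MHz -> 1650MHz memory overclock is a straight 20% bandwidth increase (versus ~18% for the 1125 -> 1325MHz core bump). A quick sketch of the standard GDDR5 arithmetic; nothing here is measured from the card itself:

```python
# Peak GDDR5 bandwidth: (bus width in bytes) * (memory clock) * (4 transfers/clock)
def bandwidth_gbps(mem_clock_mhz, bus_width_bits=384):
    """Peak bandwidth in GB/s for a GDDR5 card at a given memory clock in MHz."""
    return bus_width_bits / 8 * mem_clock_mhz * 4 / 1000

for clk in (1375, 1650):
    print(f"{clk} MHz -> {bandwidth_gbps(clk):.1f} GB/s")
# 1375 MHz (stock) gives the rated 264 GB/s; 1650 MHz gives 316.8 GB/s
```

That the Crysis 2 framerate tracks a 20% bandwidth bump more closely than a similar-sized core clock bump is exactly what "bandwidth starved" looks like in practice.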

Anyway, just insane, absolutely insane. More to come :thumbsup:.

EDIT: Power consumption looks to be ~500W from the wall maximum (~400W real), so it's definitely chomping down the power. I think it's probably still more efficient than a 40nm chip though; we'll see when I go through the data :thumbsup:.
 
Last edited:

Tencntraze

Senior member
Aug 7, 2006
570
0
0
This is all really great information! Sorry if I missed it, but any idea how much power that OC'd card is drawing? Debating updating my current trifire rig at some point when prices go down and want to make sure that my AX1200 can handle 3 OC'd beasts.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
I'm going to interrupt your updates just to give you a shout out for some amazing work (as usual). Really interesting to read about how you used some old school ingenuity to set up your water cooling, and of course, the performance speaks for itself.

A job well done, sir. You definitely earned the right to brag...hope there wasn't too much blood, sweat, and tears.

Time to update your sig!

Edit: Following up on the above poster, did you figure out if the high idle/load power draw at stock was a fluke limited to the XFX card?
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
This is all really great information! Sorry if I missed it, but any idea how much power that OC'd card is drawing? Debating updating my current trifire rig at some point when prices go down and want to make sure that my AX1200 can handle 3 OC'd beasts.
Ah yes! It's in the spreadsheet, but I'll add it to the post above. I'm looking at ~500W from the wall at full load @ 1300/1650MHz @ 1.3V. It definitely chomps the power, but it's probably still more efficient than any 40nm card. We'll see when I crunch the numbers. That said, 500W from the wall is ~400W actual load, so you're looking at probably 225W per card. If you want to cut power consumption, running maximum clocks at stock voltage and cranking the RAM nets a ton of performance for little increase in power usage. All that said, your AX1200 should cut it, but it will be close.
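For anyone sizing a PSU off these numbers, the wall-to-DC conversion above is just PSU efficiency. Here's a quick sketch assuming roughly 80% efficiency at that load point (an assumption for illustration; the real figure depends on the specific unit and where on its efficiency curve the load sits):

```python
# Convert wall draw to actual DC load, assuming ~80% PSU efficiency
# (assumption for illustration, not a measured figure for any particular unit)
def dc_load_watts(wall_watts, psu_efficiency=0.80):
    return wall_watts * psu_efficiency

system_dc = dc_load_watts(500)                    # ~400 W real load, as quoted above
per_card_estimate = 225                           # MrK6's per-card estimate at 1300/1650 @ 1.3V
trifire_headroom = 1200 - 3 * per_card_estimate   # DC watts left on an AX1200 for CPU + rest
print(system_dc, trifire_headroom)                # 400.0 525
```

Three overclocked cards alone would eat ~675W of DC, leaving ~525W for the CPU and the rest of the system on an AX1200, which matches the "should cut it, but it will be close" call.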
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I'm going to interrupt your updates just to give you a shout out for some amazing work (as usual). Really interesting to read about how you used some old school ingenuity to set up your water cooling, and of course, the performance speaks for itself.

A job well done, sir. You definitely earned the right to brag...hope there wasn't too much blood, sweat, and tears.

Time to update your sig!

Edit: Following up on the above poster, did you figure out if the high idle/load power draw at stock was a fluke limited to the XFX card?
Thanks! I'm glad you're enjoying it as much as I am :thumbsup:. To be honest, I post a lot of what I do because when I was starting out with computers and modding, I never felt there was enough information available to explain what some of the more advanced users were doing. I always felt there was this experience gap that had to be crossed to get a lot of the references/vague advice that was given. Therefore, I take a lot of pictures and try to properly but succinctly explain what I've done and why I did it (although I do stray into verbosity sometimes).

That said, I don't want to give the false impression that any of this is simple, per se. IRL, I do general surgery regularly and am comfortable with the patience, focus, and technicality that a lot of this takes, and I don't think it's prudent to rush into something like this without that composure, and definitely not without the experience. However, I'm nobody's keeper; the information is there, and anyone is more than welcome to do what they will with it.

And I just updated my sig
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Edit: Following up on the above poster, did you figure out if the high idle/load power draw at stock was a fluke limited to the XFX card?
The power draw is basically identical. I think the 6950's power draw is simply overstated in a lot of reviews, so the 7970 seemingly consumes a lot more.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Nice MacGyver work on your block. I have the parts coming in to put my CPU under water tomorrow! Whenever I upgrade my video cards they will be joining my CPU.

Now give us some benchmarks at those clocks. I'd really like to see a Crysis Warhead run using http://downloads.guru3d.com/Crysis-WARHEAD-Benchmark-Tool-BETA-download-2072.html on the frost level @ 2560x1600 4xAA and enthusiast settings. Just to compare to Ryan's results here at Anand for a stock 7970.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Thanks, you're both close. Maximum clocks are in the area of 1325-1350 @1.3V/1675-1700 depending on the application.


want to be right about 1325/1800 HAHAHA!

Hey MR.K6, Afterburner now has memory voltage control, (beta11). So shoot for 1800-1900, instead of 1600. Test out that "bandwidth starved" crysis 2.

Nice work so far dude!
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Nice MacGyver work on your block. I have the parts coming in to put my CPU under water tomorrow! Whenever I upgrade my video cards they will be joining my CPU.

Now give us some benchmarks at those clocks. I'd really like to see a Crysis Warhead run using http://downloads.guru3d.com/Crysis-WARHEAD-Benchmark-Tool-BETA-download-2072.html on the frost level @ 2560x1600 4xAA and enthusiast settings. Just to compare to Ryan's results here at Anand for a stock 7970.
All done. My results pretty much correlate with his, so I think you can extrapolate performance comparisons with the other cards on that list. At 1.3GHz, the 7970 is at ~GTX 590 performance at 2560x1600 (not quite a 6990).
want to be right about 1325/1800 HAHAHA!

Hey MR.K6, Afterburner now has memory voltage control, (beta11). So shoot for 1800-1900, instead of 1600. Test out that "bandwidth starved" crysis 2.

Nice work so far dude!
Lol, like I said, I'm a little wary of pumping volts through this RAM. As it is, I don't think it's entirely stable at 1650, and I may back it down to 1625 or 1600 to give myself more headroom. Although, between the shim and the potential for more memory performance, this whole card makes an argument for picking up a full cover block.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Nice work MrK6, thanks for the info as we (the ones who are contemplating) really appreciate this. What a monster card.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
MrK6, When you run Metro 2033 benchies at your clock speeds you are going to be blown away.

I ran them at 1920x1080 very high, AAA, tess on, dof off. Those are the settings I played the game with. On my 580 @ 840 core I was scoring 44-45 FPS.

Same settings with my 7970 @1125/1575 I was pushing 69-70 fps. Huge increase!
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
OMG.

BF3 1080p ultra at over 100fps.

Holy crap.

BTW, I see that Witcher 2 and Skyrim both scale negatively with your overclock - could you have some memory correction going on there? Might want to determine if that OC is completely stable.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Maybe RAM, but I've found Metro 2033 to not give two licks about CPU clock speed.

I get the same fps with my i5-2500k clocked at 3GHz as I do with it at 5.2GHz.
 

superjim

Senior member
Jan 3, 2012
293
3
81
BTW, I see that Witcher 2 and Skyrim both scale negatively with your overclock - could you have some memory correction going on there? Might want to determine if that OC is completely stable.

Skyrim is known to be CPU limited, but still, his i5 at 5GHz "should" be enough. Goes to show how much of a console port Skyrim was.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Nice work MrK6, thanks for the info as we (the ones who are contemplating) really appreciate this. What a monster card.
What an excellent thread.
You deserve serious props for this MrK6 :thumbsup:
Thanks guys, glad you're enjoying it :thumbsup:.
MrK6, When you run Metro 2033 benchies at your clock speeds you are going to be blown away.

I ran them at 1920x1080 very high, AAA, tess on, dof off. Those are the settings I played the game with. On my 580 @ 840 core I was scoring 44-45 FPS.

Same settings with my 7970 @1125/1575 I was pushing 69-70 fps. Huge increase!
Just did a Metro run @ 1300MHz core. 73 fps with the previously stated settings. Holy crap!!
I agree, this card's a monster. I ran my benches with 4xMSAA enabled in game, and I don't think the results are as impressive. I'm only pegging 33FPS at 2560x1600 w/ 4xMSAA, and it's one of the poorer-scaling games when it comes to the overclock (without any major CPU bottleneck). That said, I think the AA implementation is just coded poorly in this game.
Is there a way to upload some high res videos so we can see what you are seeing?
I intend to upload videos of runs of the custom benchmarks so you guys can see what I'm doing. I hope to get to them today.
OMG.

BF3 1080p ultra at over 100fps.

Holy crap.

BTW, I see that Witcher 2 and Skyrim both scale negatively with your overclock - could you have some memory correction going on there? Might want to determine if that OC is completely stable.
This card absolutely slaughters BF3, no doubt. What you're seeing in Skyrim is a complete CPU bottleneck. In Witcher 2, you're seeing an asymptotic decrease in performance improvement as it approaches the 60FPS limit of the engine. In either case, these are good scenarios in that the graphics card is more than doing its fair share of work. However, this also means that doing "summary" performance gains won't be accurate since the graphics card wasn't being pushed. I'll think about how I want to approach this in the final analysis.
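The Witcher 2 behavior is what you'd expect from any hard framerate cap: once the GPU-bound framerate passes the engine's limit, further overclocking shows no gain in the average. A minimal sketch of the idea (the 60FPS cap is from the post above; the framerate numbers are hypothetical):

```python
# Observed FPS under an engine cap: gains shrink to zero as the GPU-bound
# framerate approaches and then exceeds the cap (hypothetical numbers)
def observed_fps(gpu_bound_fps, engine_cap=60.0):
    return min(gpu_bound_fps, engine_cap)

for gpu_fps in (45, 55, 62, 75):  # GPU-bound framerate rising with the overclock
    print(f"GPU-bound {gpu_fps} fps -> observed {observed_fps(gpu_fps)} fps")
```

In a real benchmark run the average is a mix of capped and uncapped frames, so the measured gain tapers off gradually rather than stopping dead, which is exactly the asymptotic curve described above.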

EDIT:

Skyrim is known to be CPU limited but still, his i5 at 5Ghz "should" be enough. Goes to show how much Skyrim was a console-port.
Yep, pretty much.
 
Last edited: