GTX 1080 Overclocking

Page 6 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Piroko

Senior member
Jan 10, 2013
905
79
91
Found this interesting video:
https://www.youtube.com/watch?v=p_2_iCCesxI

So high-side and low-side MOSFETs in one package, which at least explains what stumped me about the single-MOSFET-per-phase part. On the other hand, yep, that's a budget board. Aftermarket ones should do better for OC and stability.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If the FE1070/FE1080 ain't your jam, why not buy a card that better suits your needs from one of the AIB vendors?

I can't honestly believe that somebody who goes to such great lengths to try to educate people on how to get the best value for their $ on these boards would purposely buy an inferior product at a worse price just to send a message to a corporation that will not be received.

It's not inferior, since it makes $$$. The irony of your post is that any AMD card above the R9 370 is still technically a better value than a GTX 1080, because it pays for itself faster with mining. So I am still consistent in my message. The gamer would actually earn $$$ towards 400mm2+ flagship cards by forgoing the 1080(s) and getting as many 380X/390/390X/Fury/Polaris 10/Nano cards for the same $700 as they can. That makes almost any AMD card out now superior to any 1080, and it's the same reason no NV cards since the HD4000 generation made sense either. Since you buy stocks: AMD cards are like a stock with a huge dividend component. You sell the shares later on, but the dividend is where the meat and potatoes are.
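The "pays for itself" claim above is just a payback-period calculation. A minimal sketch, with purely hypothetical prices, mining revenue, and power costs (none of these numbers come from the thread):

```python
# Hypothetical payback sketch for the "card pays for itself" argument.
# All numbers are illustrative placeholders, not measured hashrates or prices.

def payback_days(card_price, daily_revenue, daily_power_cost):
    """Days until net mining income covers the purchase price."""
    net = daily_revenue - daily_power_cost
    if net <= 0:
        return float("inf")  # card never pays for itself
    return card_price / net

# e.g. a $300 card earning $2.00/day gross, minus $0.50/day in electricity
print(round(payback_days(300, 2.00, 0.50)))  # 200 days
```

The argument then reduces to whichever card has the shortest payback period per dollar spent, which is why it favors cheaper cards with decent hashrates.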

Back to topic.
--------

Why do you keep defending a $699 card that, out of the box, fails to deliver its advertised specifications in a case with poor airflow? Leave the AIB cards out of it for now.

Is the 1080 FE not an engineering/marketing failure without manual tweaking of the power target and fan speeds by the end user? Yes or no?

NV showed real-time sub-70C temperatures with a 2.1GHz overclock during the presentation, but after 20 minutes of gaming the card cannot even maintain its 1733MHz boost clock without any overclocking. That's a marketing lie, and the blower design is an engineering failure.
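For readers unfamiliar with GPU Boost: the advertised boost clock is not a guaranteed floor, and the clock steps down as the die heats up, which is why a cool demo unit and a heat-soaked card behave differently. A toy model of that behavior, with an assumed throttle point and step size rather than NVIDIA's actual algorithm:

```python
# Toy model of temperature-based boost throttling. The throttle point and
# step size are assumptions, not NVIDIA's actual GPU Boost 3.0 algorithm.

ADVERTISED_BOOST = 1733  # MHz, the GTX 1080's advertised boost clock
TEMP_TARGET = 83         # deg C, assumed throttle temperature
STEP = 13                # MHz dropped per degree at/over target (assumed)

def boost_clock(temp_c):
    """Sustained clock in MHz at a given GPU temperature (toy model)."""
    if temp_c < TEMP_TARGET:
        return ADVERTISED_BOOST
    over = temp_c - TEMP_TARGET + 1  # degrees at or over the target
    return ADVERTISED_BOOST - STEP * over

print(boost_clock(67))  # 1733 -> holds spec while cool, as in the demo
print(boost_clock(84))  # 1707 -> below spec once heat-soaked in a case
```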

You seem to be defending it since you intend to purchase FE cards. If ATI/AMD/Intel/Matrox released such a product, do you honestly mean to tell us you would find it acceptable? And that's not even getting to the fact that NV is asking a $100 premium on top.
 
Last edited:
Reactions: Grazick

flopper

Senior member
Dec 16, 2005
739
19
76
What makes it worse is that Computerbase used a low-airflow case, the Fractal Design R5, exactly the "optimal" case that calls for a blower. Under these conditions, the card failed to reach its 1733MHz boost after 20 minutes of gaming in every game other than AotS. This proves the card fails exactly under the conditions it was designed to excel in over an open-air-cooled card. That's an engineering failure, because in a case with good to great airflow, the open-air-cooled card will destroy it. If the blower cannot even deliver promised spec performance in a case with poor airflow, then it has no purpose other than for those who water cool and/or intend to run SLI on mini-ITX boards. I bet NV loves that it will now shove $449/$699 FE cards into the Best Buy and OEM markets, where most consumers aren't tech-savvy enough to know how much of a blunder the FE cooler is.

Notice that Computerbase has "Max" results in their chart that prevent throttling? Getting there took 48 dBA in the same case/test bed where the Sapphire Fury Nitro is 36 dBA and the Fury X is 37 dBA:
http://www.computerbase.de/2016-05/geforce-gtx-1080-test/9/

I guess now noise levels, temperatures and GPU Boost throttling don't matter either, just performance. Goalposts moved again.

By far, the 1080 Fuxxxd Edition is hot, noisy, OCs badly and lacks power.
But you add more to the price and then it's a good buy?
How does such a buyer reason?
Got robbed and said, "I was expecting that"?
 

Timmah!

Golden Member
Jul 24, 2010
1,463
729
136
It means less crazy overclocking ability and more "responsible" amounts of performance. It also means AIB premium versions will eat this premium reference card for breakfast.

That sexy shroud though... makes you wanna reference everything

So when JHH said it offers an irresponsible amount of performance, he lied?
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
1080 is simply repeating 290x fiasco.

only way to fix this is to jack up the reference blower to a custom fan profile or go aftermarket cooling or go water cooling.

want to go out on a limb and say that all the benchmarks are wrong due to the fact that those benchmarks are cold starts.

fact. a 290x reference with a custom fan profile is rock solid. even in an 80 degree room.
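The "custom fan profile" fix described above is just a steeper temperature-to-fan-speed curve than the stock one. A sketch of how such a curve is typically evaluated, with invented set points (not a recommended 290X profile):

```python
# Sketch of a custom fan profile: a temperature -> fan-duty curve evaluated
# by linear interpolation. The set points below are invented for illustration.

CURVE = [(30, 20), (50, 35), (70, 60), (85, 100)]  # (deg C, fan %)

def fan_percent(temp_c):
    """Fan duty (%) for a given GPU temperature, interpolating CURVE."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding set points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at max above the last set point

print(fan_percent(60))  # 47.5 -> halfway between the 50C and 70C points
```

Tools like MSI Afterburner apply essentially this kind of curve; the stock profile simply uses quieter (lower) set points, which is why it throttles sooner.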
 
Last edited:

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
1080 is simply repeating 290x fiasco.

only way to fix this is to jack up the reference blower to a custom fan profile or go aftermarket cooling or go water cooling.

want to go out on a limb and say that all the benchmarks are wrong due to the fact that those benchmarks are cold starts.

fact. a 290x reference with a custom fan profile is rock solid. even in an 80 degree room.

I agree with you; it seems crazy to see Nvidia repeating what AMD did. I get that they need a blower, but damn.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
VRM Phases are like tiny power filters which clean up variances in the power delivery to the GPU. Basically, the GPU board is fed 12v of power but the GPU needs far less than that (usually between 1-1.3v). What the VRMs do is phase the power down from 12v to the required 1.xv. Each phase step cleans the power to a larger degree and ensures less vDroop. VDroop happens when the supplied voltage drops unexpectedly. Such drops cause stability issues in the operation of the GPU which are more apparent when overclocking.

This means that NVIDIA have decided to omit one phase which results in a higher chance of vDroop on a card which has a high operating GPU frequency. This inhibits overclocking on a board being sold based on marketing fluff claiming that it is well engineered. Basically the marketing is false.

NVIDIA are cutting corners in order to improve their profit margins whilst also charging a premium tax on early adopters.

Um, no. Clearly you don't have a good understanding of this field, and neither do most of the posters, which is OK because I can explain it.

The VRM is basically a buck converter, so it can step down (otherwise known as "bucking down", i.e. buck) the 12V to whatever voltage the GPU requires. This could be between 1~1.3V, as you have said.

Now normally, having one buck converter is enough for most applications. But because the GPU requires a lot of current (100~300A at full load), you need duplicates of the same buck converter to satisfy the power requirement. Or else you'll end up with very, very large components, especially the DC inductors, otherwise known as chokes (the large square things you see on the board; they come in all sorts of sizes/shapes/materials, btw).

The converters are synchronized to share the load, and the number of buck converters you have is often referred to as "the number of phases". E.g. 3 would mean 3 buck converters.

Now, the performance of these is normally judged on the following (alright, there is a LOT more than this, but let's keep it simple):
1) Output power capability
2) Output ripple voltage (also output noise)
3) Cost (for obvious reasons)

Output power capability often depends on the number of phases and how beefy the components are. More phases means more output power can be delivered to the load. The beefier the components (they can handle more current, with good electrical parameters like low on-resistance), the more output power one phase can deliver.

Output ripple voltage really depends on how well the circuit has been designed (even if they are all doing the same thing). Using expensive components can help, but the best performance comes from the design equations themselves. I remember that in the video presentation they showed how they reduced the output ripple noise by half, which means they really did their homework.

You talk about vdroop, but it essentially means how the buck converter copes with load steps. You'll see spikes and dips when the load is constantly changing (150W to 50W, 20W to 200W, etc.). The number of phases has NOTHING to do with this. It is purely dependent on how well the feedback loop of that converter is designed, i.e. how well it reacts to a sudden change in output load.

The conclusion I'd like to make is that more phases =/= more robust or better. Your idle power consumption could actually be worse. The only primary benefits you get are that the heat is more spread out and you have the ability to supply more power to the GPU if it requires it. One downside is that having more phases could result in more output noise, which of course isn't desirable (this one is arguable). The overall efficiency might be worse off too, but that also depends on how well it's designed.

However, the primary point I want to make is that "phases" aren't reflective of the actual performance of the VRM in terms of how clean the output voltage is or how well it responds to step loads.

So claiming that they've gimped something because you can see that not all components are soldered on is quite a rookie thing to say, if not ignorant (actually, it's quite funny). There is much more to it than what the untrained eye is telling you. If their slides showing the oscilloscope wave capture plus the efficiency curve are true, they've done an excellent job without having to use all the fancy components that the AIBs like to flaunt (which is largely meaningless, in my view anyway).
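To make the post above concrete: an ideal buck converter's output is its input voltage times the switching duty cycle, and synchronized phases split the load current roughly evenly. A back-of-envelope sketch using illustrative voltage and current figures, ignoring losses, feedback compensation, and interleaving effects:

```python
# Back-of-envelope numbers for the multiphase buck description above.
# Ideal-converter approximations with illustrative figures; real designs
# add switching/conduction losses, feedback compensation and interleaving.

def duty_cycle(v_in, v_out):
    """Ideal buck converter: Vout = D * Vin, so D = Vout / Vin."""
    return v_out / v_in

def per_phase_current(total_amps, phases):
    """Synchronized phases share the load current roughly equally."""
    return total_amps / phases

d = duty_cycle(12.0, 1.05)     # duty cycle for an assumed 1.05V core
i = per_phase_current(200, 5)  # assumed 200A GPU load across 5 phases
print(round(d, 4), i)          # 0.0875 40.0
```

This is why dropping one phase raises the current each remaining phase must carry, but says nothing by itself about ripple or transient response, which is exactly the post's point.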
 
Mar 10, 2006
11,715
2,012
126
Very informative post, Cookie Monster. Thanks. Going to read this through a couple of times :thumbsup:
 
Last edited:

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
1080 is simply repeating 290x fiasco.

only way to fix this is to jack up the reference blower to a custom fan profile or go aftermarket cooling or go water cooling.

want to go out on a limb and say that all the benchmarks are wrong due to the fact that those benchmarks are cold starts.

fact. a 290x reference with a custom fan profile is rock solid. even in an 80 degree room.

I like how suddenly people are comparing the R9 290 reference to the GTX 1080. The reference cooler on the R9 290 was terrible, especially its noise profile. And it was cooling an almost-300W GPU in the real world.

I don't seem to recall a negative reception for the Titan reference cooler, which was used to cool a 250W GPU. I do, however, recall people liking the reference design, the noise profile, etc.

Now all of a sudden we have the GTX 1080 (TDP of 180W), which uses a similar cooler, but people are complaining that it's a POS, loud and hot, etc. I'm literally lost on this one.
 
Last edited:

casiofx

Senior member
Mar 24, 2015
369
36
61
I like how suddenly people are comparing the R290 reference to the GTX1080. The reference coolers on the R290 was terrible especially the noise profile. And those were cooling almost 300W GPUs in the real world.

I dont seem to recall a negative reception over the Titan reference titan cooler used to cool a 250W GPU? I do however recall people liking the reference design, the noise profile etc.

Now all of a sudden, we have the GTX1080 (TDP of 180W) that uses a similar cooler but people are complaining that its a POS, loud and hot etc etc. Im literally lost on this one.
There are two reasons explaining this:
1) Nvidia charged $100 more and touted the craftsmanship of its card, including the cooler.
2) There's something called price anchoring. For example, Apple hinted the original iPad's launch price would be around $1,000, then revealed it was as low as $499. People immediately thought it was cheap; that's how the human mind works. Now, this situation is not exactly "price anchoring" but "performance anchoring". The GTX 1080 demonstration showed it running at 2100MHz+ at only 67C, and that sowed a seed in people's minds, such as "wow, this is an awesome cooler". So when the real reviews came out and it fell behind everyone's expectations, everybody started hating it, even though it performs like the previous coolers. This is what's called "anchoring".
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
It should be obvious why. Nvidia has posited that this same old tired reference cooler and PCB is suddenly deserving of a $100 premium. So it's being held to the scrutiny of that price hike, and it fails to justify that premium miserably.

If not for the Founders blunder, the card would have been much better received. They really screwed up the whole marketing, which is out of character for Nvidia, and they are reaping the bad PR from the mistake.
 
Last edited:

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
It should be obvious why. Nvidia has posited that this same old tired reference cooler and PCB is suddenly deserving of a $100 premium now. So it's being put to the scrutiny of that price hike and failing to meet that premium miserably.

If not for the founders blunder, the card would have been much better received. They really screwed up the whole marketing,out of character for nvidia and they are reaping the bad PR of the mistake.
They're also reaping the rewards of average buyers and enthusiasts who don't care, buying every single card they produce within minutes of availability.

Bad PR doesn't matter much, at least in this case, because they do at least deliver what the market wants: a high-performance FinFET card. Bragging rights and/or the thrill of having a new toy is worth a lot to some people.

That seems to be something that AMD doesn't really understand, or isn't up to the task of exploiting. Polaris may be a great product, or Polaris might suck, but either way it's not the card you'll be seeing in people's sigs, simply because it's not the high-performance FinFET card that everyone wants. If it's a $300, 120W 980 Ti, great, but it's still a 980 Ti. People don't want 980 Tis; we have those. They want 1080s.
 
Last edited:

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
I like how suddenly people are comparing the R290 reference to the GTX1080. The reference coolers on the R290 was terrible especially the noise profile. And those were cooling almost 300W GPUs in the real world.

I dont seem to recall a negative reception over the Titan reference titan cooler used to cool a 250W GPU? I do however recall people liking the reference design, the noise profile etc.

Now all of a sudden, we have the GTX1080 (TDP of 180W) that uses a similar cooler but people are complaining that its a POS, loud and hot etc etc. Im literally lost on this one.


The 290X is a 300W GPU; with a custom fan profile, the reference cooler can stand on its own.

The 1080 is a 180W GPU. There is absolutely ZERO reason why it should be throttling, POS cooler or not.

If this is any indication, the 1080 Ti/full Pascal will definitely need to go under water.
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Amazing.

Not a single retail card has been seen yet, and many have pronounced the card dead.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
I like how suddenly people are comparing the R290 reference to the GTX1080. The reference coolers on the R290 was terrible especially the noise profile. And those were cooling almost 300W GPUs in the real world.

I dont seem to recall a negative reception over the Titan reference titan cooler used to cool a 250W GPU? I do however recall people liking the reference design, the noise profile etc.

Now all of a sudden, we have the GTX1080 (TDP of 180W) that uses a similar cooler but people are complaining that its a POS, loud and hot etc etc. Im literally lost on this one.

You pretty much answered your own question.

The 290X had to cool 300 watts, and when doing so it got loud. It released as a top-end card at $550.

The 1080 has a "premium" cooler, touted as worth an extra $100, yet it fails to cool a mere 180W and throttles even at 100% fan when used in a case (which is what it's designed for, hence the blower design).

So the premium cooler on a $750, 180W card is worse than the cooler on a $550, 300W card, and you are asking why people are upset?

Everyone says the 290X reference was a bad buy and the customs were much, much better. Same deal here, except this cooler was specifically called premium.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Amazing.

Not a single retail card has been seen yet, and many have pronounced the card dead.

Not only is it dead, the supply was so low it didn't even make it to market! D:

I'm hoping to catch one of these unicorns on Friday
 

brandonmatic

Member
Jul 13, 2013
199
21
81
Yeah, the backlash here is over trying to justify the $100 early adopter's fee as the cost of a premium cooler, which it clearly isn't. If we actually saw the BOM and design costs, I would guess they are within +/-10% of those of all of Nvidia's previous reference coolers. The FE isn't a terrible cooler by any means; it's just slightly subpar and restricts the card's performance. But that sticks in the craw of many people when it's being touted as something special.

I don't in principle have a problem with an early adopter's fee, because those buyers, who are all willing consumers, are cross-subsidizing cost-conscious consumers like me who are looking for good value. Both Nvidia and AMD would love to have those high-spending customers. Unfortunately for AMD, they mostly get value buyers like me.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Overclocking doesn't matter anymore either. Oh and heat doesn't matter too.

I swear, you can't make this up. I'd like to create a collage of folks who make shifting claims as new generations of GPUs release, in an effort to back their favorite side. No doubt I'd get banned, though.
This is actually a very frustrating point about the forum rules, especially when this is supposed to be a top, serious hardware forum.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Um, no. Clearly you don't have a good understanding of this field, and neither do most of the posters, which is OK because I can explain it.

The VRM is basically a buck converter, so it can step down (otherwise known as "bucking down", i.e. buck) the 12V to whatever voltage the GPU requires. This could be between 1~1.3V, as you have said.

Now normally, having one buck converter is enough for most applications. But because the GPU requires a lot of current (100~300A at full load), you need duplicates of the same buck converter to satisfy the power requirement. Or else you'll end up with very, very large components, especially the DC inductors, otherwise known as chokes (the large square things you see on the board; they come in all sorts of sizes/shapes/materials, btw).

The converters are synchronized to share the load, and the number of buck converters you have is often referred to as "the number of phases". E.g. 3 would mean 3 buck converters.

Now, the performance of these is normally judged on the following (alright, there is a LOT more than this, but let's keep it simple):
1) Output power capability
2) Output ripple voltage (also output noise)
3) Cost (for obvious reasons)

Output power capability often depends on the number of phases and how beefy the components are. More phases means more output power can be delivered to the load. The beefier the components (they can handle more current, with good electrical parameters like low on-resistance), the more output power one phase can deliver.

Output ripple voltage really depends on how well the circuit has been designed (even if they are all doing the same thing). Using expensive components can help, but the best performance comes from the design equations themselves. I remember that in the video presentation they showed how they reduced the output ripple noise by half, which means they really did their homework.

You talk about vdroop, but it essentially means how the buck converter copes with load steps. You'll see spikes and dips when the load is constantly changing (150W to 50W, 20W to 200W, etc.). The number of phases has NOTHING to do with this. It is purely dependent on how well the feedback loop of that converter is designed, i.e. how well it reacts to a sudden change in output load.

The conclusion I'd like to make is that more phases =/= more robust or better. Your idle power consumption could actually be worse. The only primary benefits you get are that the heat is more spread out and you have the ability to supply more power to the GPU if it requires it. One downside is that having more phases could result in more output noise, which of course isn't desirable (this one is arguable). The overall efficiency might be worse off too, but that also depends on how well it's designed.

However, the primary point I want to make is that "phases" aren't reflective of the actual performance of the VRM in terms of how clean the output voltage is or how well it responds to step loads.

So claiming that they've gimped something because you can see that not all components are soldered on is quite a rookie thing to say, if not ignorant (actually, it's quite funny). There is much more to it than what the untrained eye is telling you. If their slides showing the oscilloscope wave capture plus the efficiency curve are true, they've done an excellent job without having to use all the fancy components that the AIBs like to flaunt (which is largely meaningless, in my view anyway).

+1 :thumbsup:
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Um no. Clearly you dont have a good understanding in this field neither do most of the posters which is ok because I can explain it.

Seems like you don't, either.

The converters are synchronized to share the load, and the number of buck converters you have are often referred as "the number of phases". E.g. 3 would mean 3 buck converters.

No. They are synced, but they are not sharing the load per se.

Start with what a MOSFET is and how it works. Then you would know the reason to have multiple phases in a power delivery circuit.

The first reason is scalability. The 1080's MOSFETs can push only 250 amps at room temperature.
The 290X's can handle 5x160 amps = 800 amps at room temp, but only 5x70 amps at 125'C (less than half):
https://www.youtube.com/watch?v=6i6TPriNXis

So how much current can the 1080's VRM push in the real world? Suddenly it becomes obvious why it is power throttling.

The second reason is ripple. You seem not to mention in your post how VRM phases and FETs actually work.

"Output ripple voltage really depends on how well the circuit has been designed (even if they are all doing the same thing). Using expensive components can help but best performance comes from the design equations themselves."

Really? If this is not about VRM phases, I don't know what it is about.
First and foremost, component quality: high-quality caps and inductors are required to even dream of good ripple. You want to generate as little ripple as possible; that is why many small VRM phases are better than a single beefy one.
Then there is ripple suppression (inductor + capacitor), which is lacking on GPUs compared to, for example, a power supply. But that is tied to how much ripple each of these devices has to cope with, and the PSU has a lot of ripple to smooth out.

For starters, think of a VRM phase as a cylinder in a car's engine.
We had 2-cylinder engines which ran like tractors. Anyone familiar with the smooth sound of a V6 or V8?
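The temperature-derating point above (e.g. the quoted 160A at room temperature vs 70A at 125'C per FET) can be sketched as interpolation between two datasheet points. The numbers below echo the ones quoted but are illustrative, not from any specific MOSFET datasheet, and real derating curves aren't perfectly linear:

```python
# Linear derating sketch between two datasheet-style points. The 160A/70A
# figures echo the ones quoted above but are illustrative, and real FET
# derating curves are not perfectly linear.

def derated_current(i_25c, i_125c, temp_c):
    """Continuous current rating at temp_c, interpolated from 25C to 125C."""
    if temp_c <= 25:
        return i_25c
    if temp_c >= 125:
        return i_125c
    return i_25c + (i_125c - i_25c) * (temp_c - 25) / 100

print(derated_current(160, 70, 25))  # 160 -> full rating at room temp
print(derated_current(160, 70, 75))  # 115.0 -> partway through the derate
```

This is the crux of the disagreement: a cold-start bench number uses the room-temperature rating, while a heat-soaked card in a case operates much further down the curve.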
 

Drderpinheimer

Junior Member
May 25, 2016
1
0
0
Excellent post.

I would add one more thing that makes the reference 290 >> the 1080 FE.
Both coolers are trash and can't handle the card, but at least the R9 290 has a quality PCB that can supply an immense amount of power:

290X: (board photo)

1080 FE missing one phase on the VRM - quality premium design D: : (board photo)
I guess you would have to pay a $200 premium to get a fully populated reference board! What a failure.

So here it is: for the 290 you could get $40-$70 aftermarket cooling:
http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html
Or put it under water and OC it to the moon. Something you can't do with the 1080 FE, because the premium design is missing a VRM phase!


They are asking a $100 premium, yet they skipped VRM components for a total of $1.50 in savings. Give me a break!
As if the 290X is capable of overclocking at all. 15% under air or 16% under water according to HWBot, which seems way higher than actual OCs.
 
Last edited: