GPU boost - gimmick?

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Perhaps "gimmick" is a bit strong, but what's the point of raising GPU clocks when load is low? Can this feature help minimum framerates, or is it just there to increase nV's average fps and impress us with large benchmark numbers? Maybe it helps in compute when resources are used very selectively?

I understand GPU load is a bit more complicated than CPU load, but my initial impression is that this is a mostly useless feature for gamers. I would much rather have a feature that can allow spikes over TDP while keeping average power lower, to raise fps when it's needed most.
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
I'd say increasing performance at no cost to the user is more than a gimmick. For those who don't overclock, this would be a pretty useful feature.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I can see how in some cases a GPU may be under load while total wattage stays fairly low, and in those cases this may help a tiny bit. But 50MHz isn't that much, so the difference is small. It also seems like it would do more to boost FPS when FPS are already high than when FPS are low.
 

IlllI

Diamond Member
Feb 12, 2002
4,927
10
81
With it being only a 50-ish MHz boost, it seems pretty useless. I wonder what happens if you try to overclock yourself: does the boost still kick in, or does it hinder manual overclocking in some way?
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
It'd be more useful if the 680 were released as a ~800MHz GTX 660 Ti and the boost went up to 1000MHz.

As it stands I'm not too impressed by it. And I'm still waiting on real overclocking info, not cherry-picked info. :awe:
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
It's called innovation.
Along with adaptive V-Sync. They have ICs monitoring many functions: fps, load, and TDP at every power connector.
You don't design a chip and then magically change directions and clock it 40% higher. They design for a power target with a certain clock in mind, and with luck they hit it.
When you go beyond that sweet spot, power starts going up exponentially. This was the problem with the early Fermis: they had 800MHz in mind and had to ship the GTX 470 at 607MHz.
IMO, AMD was aiming for 1GHz stock clocks on its new GCN GPUs and had to settle for a bit less with Tahiti. We see this in some overclocking tests, where it can pull an extra 100 watts with an 1100MHz overclock and some voltage bump.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/2.html

The last of the three big features is Adaptive V-Sync. The feature improves on traditional V-Sync, by dynamically adjusting the frame limiter to ensure smoother gameplay. Traditional V-Sync merely sends frame data to the screen after every full screen refresh. This means if a frame arrives slow, because the GPU took longer to render it, it will have to wait a full screen refresh before it can be displayed, effectively reducing frame rate to 30 FPS. If rendering a frame takes longer than two full refreshes, the frame rate will even drop down to 20 FPS. These framerate differences are very noticeable during gaming because they are so huge.
What Adaptive V-Sync does is, it makes the transition between frame-rate drop and synchronized frame-rate smooth, alleviating lag. It achieves this by dynamically adjusting the value that V-Sync takes into account when limiting frame-rates. I did some testing of this feature and found it to work as advertised. Of course this does not completely eliminate frame rate differences, but it makes them less noticeable.
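To put numbers on that excerpt, here's a quick back-of-the-envelope sketch (assuming a 60Hz display; this is only an illustration of the arithmetic, not how the driver is implemented) of how traditional V-Sync snaps the frame rate to 60/30/20 FPS while dropping the sync lets it degrade smoothly:

import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def traditional_vsync_fps(render_time_ms: float) -> float:
    # A finished frame waits for the next full refresh, so the effective
    # rate snaps to 60, 30, 20, 15 ... FPS as render time grows.
    refreshes_waited = math.ceil(render_time_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / refreshes_waited

def adaptive_vsync_fps(render_time_ms: float) -> float:
    # Sync stays on while the GPU keeps up (capping at 60 FPS) and is
    # dropped once it falls behind, so the rate degrades smoothly instead.
    if render_time_ms <= REFRESH_INTERVAL_MS:
        return float(REFRESH_HZ)
    return 1000.0 / render_time_ms

for t in (15.0, 18.0, 25.0, 35.0):  # render time in ms per frame
    print(f"{t:5.1f} ms  traditional: {traditional_vsync_fps(t):5.1f} FPS"
          f"   adaptive: {adaptive_vsync_fps(t):5.1f} FPS")

An 18ms frame drops a traditionally synced card all the way to 30 FPS, but only to about 56 FPS once the sync is relaxed, which is the "less noticeable" difference the review describes.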
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It's called innovation.
Along with adaptive V-Sync. They have ICs monitoring many functions: fps, load, and TDP at every power connector.
You don't design a chip and then magically change directions and clock it 40% higher. They design for a power target with a certain clock in mind, and with luck they hit it.
When you go beyond that sweet spot, power starts going up exponentially. This was the problem with the early Fermis: they had 800MHz in mind and had to ship the GTX 470 at 607MHz.
IMO, AMD was aiming for 1GHz stock clocks on its new GCN GPUs and had to settle for a bit less with Tahiti. We see this in some overclocking tests, where it can pull an extra 100 watts with an 1100MHz overclock and some voltage bump.

I highly doubt they are using external ICs to monitor anything more than power consumption (and even then it's extrapolating). Everything else is internal to the GPU itself.

The Tahiti power consumption only goes up when you start boosting voltage. My 7950's power consumption hardly went up at all at the clocks I am running (stock voltage). I measured power consumption at the wall, which includes power supply losses.

The turbo-boost-like functionality as it stands is somewhat worthless, since the boost is so small. As mentioned above, if this chip had shipped as a 660/670 as originally planned, the boosting would be far more useful. The adaptive V-Sync certainly sounds interesting, however.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I highly doubt they are using external ICs to monitor anything more than power consumption (and even then it's extrapolating). Everything else is internal to the GPU itself.

The Tahiti power consumption only goes up when you start boosting voltage. My 7950's power consumption hardly went up at all at the clocks I am running (stock voltage). I measured power consumption at the wall, which includes power supply losses.

The turbo-boost-like functionality as it stands is somewhat worthless, since the boost is so small. As mentioned above, if this chip had shipped as a 660/670 as originally planned, the boosting would be far more useful. The adaptive V-Sync certainly sounds interesting, however.
AnandTech, TPU, and HardOCP all give a little insight into how this works. It's new, so the functionality isn't the easiest thing to get across when many don't even read the reviews.

Three of these INA219 power monitor chips are located on the board (the third one is on the other side). These provide voltage, current and power monitoring for 12V PCI-E power, and both 6-pin power connectors. They are used to provide realtime power consumption numbers to the driver, which will then use that info to enable dynamic overclocking and ensure the board does not go above its rated TDP.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/5.html
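To make that concrete, here's a rough sketch of the driver-side arithmetic the quote implies: sum the monitored rails and compare the total against the boost power target. The rail names, the simulated readings, and this particular use of the 170W figure are placeholders for illustration; the real driver interface isn't public.

BOOST_POWER_TARGET_W = 170.0   # boost target NVIDIA quoted to reviewers

# Simulated per-rail readings standing in for real INA219 measurements
# (12V from the PCI-E slot plus the two 6-pin connectors).
SIMULATED_RAIL_POWER_W = {
    "pcie_slot_12v": 55.0,
    "pcie_6pin_1": 52.0,
    "pcie_6pin_2": 48.0,
}

def board_power_w(rail_power_w: dict) -> float:
    # Total board power as the driver would see it: the sum of the rails.
    return sum(rail_power_w.values())

def power_headroom_w(rail_power_w: dict) -> float:
    # Positive headroom means the card may step its clock up a boost bin.
    return BOOST_POWER_TARGET_W - board_power_w(rail_power_w)

print(board_power_w(SIMULATED_RAIL_POWER_W))     # 155.0 W total
print(power_headroom_w(SIMULATED_RAIL_POWER_W))  # 15.0 W of boost headroom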
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Base clock = the lowest it will go; 1058MHz (base + 52MHz) is the smallest boost clock possible with default power.

Depending on your sample you can get more than 52MHz of boost, but never less.

The boost is tricky, and totally different from anything we've seen before. Basically, the quality of your chip dictates how much boost you get out of the box: higher quality chips will boost beyond 52MHz, while even the lowest quality units will still boost 52MHz.

It's actually kind of crazy: Nvidia is now selling stock cards that have a performance variance out of the box.
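If it helps to picture that, here's a tiny sketch of the floor-plus-sample-dependent behaviour (the clamp is just an illustration, not NVIDIA's algorithm; the 1006MHz base and 52MHz guaranteed step are the GTX 680 figures discussed above):

BASE_CLOCK_MHZ = 1006        # the card won't drop below this under load
GUARANTEED_BOOST_MHZ = 52    # even the worst retail sample gets this much

def out_of_box_boost_clock(sample_max_boost_mhz: int) -> int:
    # Better silicon boosts further; the floor is base + 52MHz = 1058MHz.
    return BASE_CLOCK_MHZ + max(GUARANTEED_BOOST_MHZ, sample_max_boost_mhz)

print(out_of_box_boost_clock(52))   # 1058 MHz: the minimum any card reaches
print(out_of_box_boost_clock(104))  # 1110 MHz: a stronger sample out of the box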
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Not a gimmick: it protects partners and safely delivers higher clocks when there's headroom. Is Turbo Boost a gimmick? Of course their TDP limit is BS for extreme overclockers, but everything can be hacked. Stay tuned.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Not a gimmick: it protects partners and safely delivers higher clocks when there's headroom. Is Turbo Boost a gimmick? Of course their TDP limit is BS for extreme overclockers, but everything can be hacked. Stay tuned.

We already know it can be modded; EVGA did it.

I'm sure someone can fake a digital signature or whatever is needed; sooner or later this will be cracked.
 

mkmitch

Member
Nov 25, 2011
146
2
81
I am about as disinterested in this whole Nvidia vs. AMD battle as a person can be, and I have never owned an Nvidia card. Having said that, if you take the time to read the reviews that explain how it works and why Nvidia chose to do this, it makes perfect sense and is certainly not a gimmick. I rarely game, so a 2GB 6870 is more than enough card for me, but kudos for the 680. I only wish this infantile battle between the two camps would end. It's the most ridiculous thing I've witnessed in a long time: us geeks trying to change other geeks' minds. It's hilarious if you think about it.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I didn't mean to come off as inflammatory against nV (I own several nV products), but this seemed like one of the less practical additions in recent times. The 680 is definitely a great card and beats anything AMD has at the current price points, but I specifically don't see how GPU turbo can help gamers.

I operate under the assumption that when framerate is at its lowest (minimum fps), that's when the GPU is taxed most and power usage is highest, which is exactly when you won't get any boost.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I wouldn't be shrugging at the key-signed BIOS; it's very possible that by the time someone cracks it for retail cards, the card will be two generations old.

Now if EVGA or some other AIB were to leak retail key info...
 
Feb 19, 2009
10,457
10
76
It's a useful feature for marketing, because a lot of users don't bother to overclock and so would never have gained any benefit; here, they get extra performance out of the box.

It's a much less useful feature for enthusiasts, who often overclock.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
AnandTech, TPU, and HardOCP all give a little insight into how this works. It's new, so the functionality isn't the easiest thing to get across when many don't even read the reviews.


Right, so basically what I said. They would use an IC to measure power consumption, but not for the other things you mentioned.
 

kidsafe

Senior member
Jan 5, 2003
283
0
0
Perhaps "gimmick" is a bit strong, but what's the point of raising GPU clocks when load is low? Can this feature help minimum framerates, or is it just there to increase nV's average fps and impress us with large benchmark numbers? Maybe it helps in compute when resources are used very selectively?
Minimum FPS happens when either the core is dealing with entirely too many polys or the framebuffer is the bottleneck. In the first case, GPU Boost is unlikely since the shaders should already be working very hard. In the second case, no amount of extra core clock will really help.

This ends up being a case where it boosts average framerates, but not the 99th-percentile frame times that Tech Report is so fond of. It would also explain why the HD 7970 beats the 680 on that particular scatter plot.

Adaptive V-Sync, on the other hand, is a much more enticing technology.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Right, so basically what I said. They would use an IC to measure power consumption, but not for the other things you mentioned.

Yes, they are.
This thread is about whether turbo boost is a gimmick.
http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/4

GPU Boost: Turbo For GPUs
Now that we’ve had a chance to take a look at the Kepler architecture, let’s jump into features. We’ll start with the feature that’s going to have the biggest impact on performance: GPU Boost.
(Skipping a lot of the technical details in the actual article to get to another direct quote.)
As for deciding what clockspeed to step up to, GPU boost determines this based on power consumption and GPU temperature. NVIDIA has on-card sensors to measure power consumption at the rails leading into the GPU, and will only allow the video card to step up so long as it’s below the GPU Boost power target. This target isn’t published, but NVIDIA has told us that it’s 170W. Note that this is not the TDP of the card, which is 195W. Because NVIDIA doesn’t have a true throttling mechanism with Kepler, their TDP is higher than their boost target as heavy workloads can push power consumption well over 170W even at 1006MHz.
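For anyone who prefers pseudocode, a minimal sketch of the step-up rule that excerpt describes: the card only climbs a bin while measured board power sits below the 170W boost target (the 195W TDP is a separate, higher figure). The 13MHz bin size and the temperature check/limit here are my assumptions for illustration, not numbers from the article.

BASE_CLOCK_MHZ = 1006.0
BOOST_POWER_TARGET_W = 170.0
BIN_MHZ = 13.0                      # assumed boost-bin granularity

def next_clock_mhz(current_mhz: float, board_power_w: float,
                   gpu_temp_c: float, temp_limit_c: float = 70.0) -> float:
    # Step up one bin while under the power target and temperature limit;
    # otherwise step back down, but never below the base clock.
    if board_power_w < BOOST_POWER_TARGET_W and gpu_temp_c < temp_limit_c:
        return current_mhz + BIN_MHZ
    return max(BASE_CLOCK_MHZ, current_mhz - BIN_MHZ)

print(next_clock_mhz(1006.0, 150.0, 60.0))  # light load, headroom -> 1019.0
print(next_clock_mhz(1100.0, 180.0, 75.0))  # over target -> backs off to 1087.0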
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Right, so basically what I said. They would use an IC to measure power consumption, but not for the other things you mentioned.

Yes, they are.

They have ICs monitoring many functions: fps, load, and TDP at every power connector.

NVIDIA has on-card sensors to measure power consumption at the rails leading into the GPU, and will only allow the video card to step up so long as it’s below the GPU Boost power target.

Where does it state they have ICs measuring FPS and load? I agree with Stuka87 on this.


Anyway, I personally don't have a need for GPU Boost. It's an interesting take on things, but I'd say we won't see Intel-style performance boosts until perhaps the 2nd or 3rd iteration of it.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Where does it state they have ICs measuring FPS and load? I agree with Stuka87 on this.


Anyway, I personally don't have a need for GPU Boost. It's an interesting take on things, but I'd say we won't see Intel-style performance boosts until perhaps the 2nd or 3rd iteration of it.

Some of these features are tied together. Through some method they are measuring frame time in relation to refresh rate, to improve on the stutter that has become more talked about lately.
Nvidia’s Kepler sports a ‘game-changing’ adaptive vertical sync feature; pushed by John Carmack

The first benchmarks of Kepler will be released in the following days and we can’t wait to see what the green team has in store for us. It is said that the card is a beast when it comes to raw performance numbers and features some pretty interesting features. One of them is the dynamic framerate function when played in Surround and another one is an adaptive vertical sync solution. But did you know that John Carmack was the one that has pushed for such a feature?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Some of these features are tied together. Through some method they are measuring frame time in relation to refresh rate, to improve on the stutter that has become more talked about lately.

Yeah, that's referring to Adaptive V-Sync, which we all know about; it's done in the driver, not in a fixed-function hardware unit as you described. As for the 'dynamic framerate function when played in Surround', that could be anything; it could perhaps refer to a single GPU running three monitors in Surround, which is a launch feature.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I like to use extremes when explaining these things, as I find a concept is usually easier to grasp that way.
Furmark and Battlefield 3:
Furmark will push any GPU to, and beyond, its limits. It will cause the highest power consumption possible on any given GPU. There is nothing left: all of the GPU is pushed to 100%, power-wise and processing-wise.

Battlefield 3 may not utilize the GPU enough to push it to its 170W power consumption ceiling (the technical ceiling). Say it runs at 155W normally.

GPU Boost will raise the clock speed until just about 100% of the imposed power ceiling is reached.

This is completely dynamic and done in real time. Power consumption is measured every 100ms and the clock is adjusted accordingly. You will often observe the core clock jumping around as the power load changes subtly.

It's not a gimmick, as it's good for a few extra frames when there is some power window left at the top end.

Something like 3DMark11 would probably run the GPU at or very near its power limit and will not see much GPU Boost, unless you raise the base clock yourself manually.
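If it helps, here's a toy version of that 100ms loop (the read/set callbacks, the 13MHz step, and the 1110MHz cap are placeholders, not NVIDIA's actual numbers or interfaces):

import time

POWER_CEILING_W = 170.0
BASE_CLOCK_MHZ = 1006
STEP_MHZ = 13                       # assumed bin size

def boost_loop(read_board_power_w, set_core_clock_mhz, max_clock_mhz=1110):
    clock = BASE_CLOCK_MHZ
    while True:
        power = read_board_power_w()
        if power < POWER_CEILING_W and clock < max_clock_mhz:
            clock += STEP_MHZ       # e.g. BF3 at ~155W leaves room to climb
        elif power >= POWER_CEILING_W and clock > BASE_CLOCK_MHZ:
            clock -= STEP_MHZ       # e.g. Furmark pins power, so back off
        set_core_clock_mhz(clock)
        time.sleep(0.1)             # the 100ms sampling interval from the post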
 