I'm not saying that you're lying. Again, I'm saying that your results are unrealistic, made up and misleading.
They're based on the same numbers you were saying weren't valid, and are now attempting to use to prove your own point. How can they be all of that at once? They can't; you're just reaching randomly as you dig your hole deeper because you can't admit you're wrong.
The hardware.fr results for the GTX 780 show a 9.9% difference between stock and Uber at 1080p, while the average clocks are 876 MHz for stock and 993 MHz for Uber, a 13.3% increase in average clock speed. So just removing the thermal threshold and adding 6% to the power limit makes the card behave much closer to your locked frequency. But hold on: you can pick the Sleeping Dogs results, for example, with a 12% improvement between stock and Uber (15% faster average clock), or Splinter Cell Blacklist with 7% (8.5% faster average clock). Just from that you can tell GPU Boost is highly variable, and you can't emulate it with a locked clock speed.
This is the point of the discussion: using results you were attempting to discredit by saying they fluctuated.
Sleeping Dogs, 12% increase for a 15% increase in core, 0% increase in bandwidth.
Splinter Cell Blacklist, 7% faster with an 8.5% increase in core, 0% increase in bandwidth.
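The figures above can be sanity-checked with simple arithmetic. This sketch uses only the clocks and FPS gains quoted in this thread, and shows why the per-game scaling looks so variable:

```python
def pct_increase(base, new):
    """Percent increase from base to new."""
    return (new - base) / base * 100

# Average clock gain, stock vs. Uber (876 -> 993 MHz as quoted above).
print(f"clock: +{pct_increase(876, 993):.1f}%")  # -> clock: +13.4%

# (game, FPS gain %, average clock gain %) as quoted above.
cases = [("overall 1080p", 9.9, 13.3),
         ("Sleeping Dogs", 12.0, 15.0),
         ("Splinter Cell Blacklist", 7.0, 8.5)]
for game, fps, clock in cases:
    # How much of each clock gain actually shows up as FPS.
    print(f"{game}: {fps / clock:.2f} FPS% per clock%")
```

The ratio of FPS gain to clock gain ranges from roughly 0.74 to 0.82 across these cases, which is the variability being argued about.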
Boost doesn't fluctuate the clock. I can run mine all day and it will stick at 1137 MHz at stock without adjusting anything. The only reason boost would change is if it was a) used under V-Sync, or b) throttling due to heat/power.
By your own Tomb Raider figures, max FPS for the first bench is 27% higher than the average and min FPS is 28% lower, with a locked clock speed, showing that your GPU is far more or less efficient than the average in some parts of the benchmark. Depending on how GPU Boost behaves, you should add or subtract an additional X% based on other factors that your locked frequency doesn't take into account.
Tomb Raider had a 26% increase in core speed, coupled with a +400 offset on the memory.
Min increased by 29%
Max increased by 27%
Avg increased by 25%
Way more not found. GPU Boost is fixed; I'm not sure what to tell you. The clock speed does not change by even ±13 MHz during the entire test.
Your results are unrealistic because no card will behave like yours with different max and min figures.
They're made up because you removed the GPU Boost functionality, and misleading because you're trying to pass them off as the same as a stock card averaging those clock speeds.
Not sure how they're unrealistic when they're right in front of your face. How could they be any more real than that?
Based on what, exactly? Do you have a card to test yourself to back up your claims, or are they just claims with no proof?
This is with the stock GHz BIOS; boost is not disabled.
Stock: 1019/1500
OC: 1176/1750
Core OC: 15%
Min FPS increase: 16%
Max FPS increase: 12%
Avg FPS increase: 15%
Any other theories you want me to investigate?
It's like you bought your card and rediscovered the wheel for everyone. If no one has done the same testing as you since the Titan launch, it's because of all the reasons above.
It's like I'm an enthusiast end user discussing scaling in a thread full of water cooling, 1.4 V OCs, and 1440 MHz 780s... What did you think it would be like? Stock reference?
Edit: Oops, I had Adaptive V-Sync on @ 74 Hz, which is why the max didn't scale up like it should have percentage-wise, and why there is a slight dip in usage when it hits the FPS cap, as seen in the Afterburner graph.
Do I need to retest, or are we OK with these boost-enabled results?