Stock testing of GTX 680s would/should have been done without installing EVGA Precision or adjusting any parameters.
That gives you a base clock of 1006 MHz and a boost clock of 1058 MHz.
Under ideal temperature/TDP conditions there may be another 50 MHz on top of that, going by all the sites I've read, which are themselves still trying to understand the card's behavior and find the correct way to describe it.
Worst case, the card almost never drops below 1006 MHz; best case, it almost never goes above 1100 MHz, unless the TDP parameters are adjusted, the default fan curve is changed, or ambient temperatures are extreme in either direction, very hot or very cold.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/30.html
Temperature
In the briefings NVIDIA mentioned that GPU temperature is taken into account for dynamic overclocking. Wait, what? Will the card run slower when it's running hot? Yes, it does.
The following graph shows how changes in GPU temperature affect the selected clock. We tested this with a static test that renders the same scene each frame, resulting in constant GPU and memory load, which would otherwise affect the results.
GPU clock is plotted on the vertical axis using the blue MHz scale on the left. Temperature is plotted on the vertical axis too, using the red °C scale on the right. Time runs on the horizontal axis.
We see a clearly visible downward step pattern on the clock frequency curve as the temperature increases. This is not a gradual change; the steps happen at what look like predefined values of 70°C, 80°C, 85°C, and 95°C. Once temperature exceeds 98°C, thermal protection kicks in and the GPU clock drops like a rock to 536 MHz, and then even 176 MHz, trying to save the card from overheating.
Each step is 13.5 MHz in size, which results in a total clock difference of 40 MHz going from below 70°C to 95°C - with the exact same rendering load, all handled transparently by the NVIDIA driver.
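The step behavior described above can be sketched as a small model. This is purely illustrative, built only from the numbers quoted in the review (1058 MHz boost clock, 13.5 MHz steps at the observed thresholds, thermal protection above 98°C); it is not NVIDIA's actual boost algorithm, and the function name is made up for this example.

```python
BOOST_CLOCK = 1058.0             # MHz, typical boost clock from the review
STEP_MHZ = 13.5                  # size of each temperature step-down
THRESHOLDS_C = (70, 80, 85, 95)  # step-down points seen in the review's graph

def effective_clock_mhz(temp_c: float) -> float:
    """Illustrative model of the clock the driver selects at a given GPU temperature."""
    if temp_c > 98:
        # Thermal protection: the review reports a drop to 536 MHz, then 176 MHz.
        return 536.0
    # One 13.5 MHz step is subtracted for each threshold the temperature has crossed.
    steps = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return BOOST_CLOCK - steps * STEP_MHZ

print(effective_clock_mhz(65))   # 1058.0 -> full boost below 70 °C
print(effective_clock_mhz(90))   # 1017.5 -> three steps down, ~40 MHz lost
print(effective_clock_mhz(99))   # 536.0  -> thermal protection
```

Three crossed thresholds at 90°C give 3 × 13.5 = 40.5 MHz, matching the "up to 40 MHz less above 90°C" figure in the review.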
For end-users this means that to maximize dynamic overclocking potential, they would have to run at temperatures below 70°C; otherwise they will end up with up to 40 MHz less if their card runs above 90°C. Even users who don't care about manual overclocking will have to consider this. The dynamic overclocking in the driver is always active and cannot be turned off.
Performance now being tied to temperature will pose an interesting challenge for system assemblers and case manufacturers, as they will have to focus even more on thermals while still trying to keep noise levels acceptable. How will reviewers test their cards? An open bench? A normal case? A worst case [sic]?
I ran some additional testing with the card's fan speed set to maximum, which results in much lower card temperatures, directly increasing performance (without any manual overclocking).
It looks like on average just setting the fan to 100% results in a 0.8% performance increase. Again, this is without any overclocking or other tweaking. 0.8% is not very significant, but it still shows that there is now another variable that needs to be considered when trying to maximize performance. It also means that cases with really bad ventilation will suffer from a (small) performance penalty when a GeForce GTX 680 is installed in the case.