- Oct 5, 2004
- 63
- 0
- 0
I happened to stumble upon something peculiar last night. While using
the excellent program Powerstrip to create a custom resolution, I saw
that it detected my display adapter's default clocks as 425 MHz
(engine) and 1100 MHz (memory). The card I have is a standard
reference-design Nvidia Geforce 6800 Ultra, which should be clocked at
400/1100. To make sure Powerstrip was not to blame, I enabled
overclocking in the Forceware drivers with Coolbits.
Surprisingly, I found that when software runs in 3D mode, the official
Forceware drivers (61.77) automatically overclock the card's engine
from 400 MHz to 425 MHz. Is this common knowledge? This means all
benchmarks actually show the Geforce 6800 Ultra performing out of
spec! You can easily verify this with Rivatuner (Alt-Tab to Windows
during gameplay).
I am not sure if this really troubles me, but I can definitely see someone
getting angry.