Originally posted by: keysplayr2003
Originally posted by: jaredpace
Originally posted by: keysplayr2003
Originally posted by: BFG10K
If the shaders on the G94 were superior in some way, the 9600GT would have surpassed the 8800GT when the 8800GT ran at 64 shaders, not merely equaled it.
Not if your test didn't bottleneck the shaders enough.
With both cards at 64 shaders and identical clocks, does it matter if the shaders are bottlenecked? NO. Because the shaders are identical.
No matter what game was used: CoD4, Bioshock, Crysis, whatever.
This is simply untrue. As an extreme example, do you believe testing GLQuake is equivalent to testing Bioshock?
With both cards at 64 shaders and identical clocks, does it matter what game is tested? NO. Because the shaders are identical.
Why would it be required to run, say, CoD4 at 2560x1600 when all we are testing is power per shader?
Because you need a test situation that stresses the shaders in order to test shaders. If you aren't stressing the shaders, how can you possibly test them?
With both cards at 64 shaders and identical clocks, does it matter if the shaders are stressed or not? NO. Because the shaders are identical.
Do you believe an 8800GT at 64 shaders and the exact same clocks as a 9600GT would run STALKER any differently at any resolution? How about Bioshock? Lost Planet?
We don't have enough valid data to make that judgment.
Of course we have enough data.
Any game, at any resolution, at any settings, runs equally on a 9600GT and a 64-shader 8800GT with the same clocks. Just as we suspected about 10 pages ago, and now confirmed by Rollo and Nvidia. Run the most shader-intensive game on the planet on both of these cards at the highest possible resolution and settings, and they will be equal when clocks are the same and both are at 64 shaders. /fini
You have to use the 174 ForceWare that shipped with the 9600GT on both cards for this to be true, as it has massive improvements and contributes to some of the 9600GT's stock performance.
You missed it, Jared. A 9600GT and an 8800GT are identical when 48 shaders are disabled on the 8800GT. Both have identical architecture, shaders, texture units, a 256-bit bus, and 16 ROPs. Using the 174s on both cards would yield the same results. I know an 8800GT should walk away from a 9600GT when more shader power is required. For some reason, BFG wants to take things further, much further, than originally intended, blowing out the scope of our test in the process. So be it. I got what I needed, and I am willing to help him find what he needs, just so long as he understands what I was looking for in the first place.
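To make "identical" concrete, here's a rough sketch of the kind of check we're talking about (Python; the config and FPS numbers are made-up placeholders, not the actual benchmark results): match the shader count, clocks, bus, and ROPs on both cards, then see whether the per-game results agree within normal run-to-run variance.

    # Sketch of the 9600GT vs. 64-shader 8800GT comparison.
    # All numbers below are hypothetical placeholders, NOT the real results.

    RUN_TO_RUN_VARIANCE = 0.02  # assume roughly 2% benchmark noise

    # Both cards configured identically for the test:
    config = {"shaders": 64, "core_mhz": 650, "shader_mhz": 1625,
              "bus_bits": 256, "rops": 16}
    print("matched config:", config)

    # Hypothetical average FPS per game: (9600GT, 8800GT @ 64 shaders)
    results = {
        "CoD4":     (62.0, 61.5),
        "Bioshock": (55.0, 55.4),
        "Crysis":   (28.0, 28.2),
    }

    for game, (g94_fps, g92_fps) in results.items():
        delta = abs(g94_fps - g92_fps) / g94_fps
        verdict = "equal (within noise)" if delta <= RUN_TO_RUN_VARIANCE else "DIFFERENT"
        print(f"{game}: {g94_fps} vs {g92_fps} -> {delta:.1%} {verdict}")

The point is simply that once the configurations match, any per-game delta bigger than benchmark noise would have pointed to an architectural difference, and none showed up.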
I'd just like to make clear the point a few seem to be missing. I know the 8800GT will be a performance leader in games that require more shader power. This is known to me; it doesn't escape me. Do not misinterpret my posts as saying I don't understand this.
My original intention was to see, on a per-shader basis, whether the G94 had architectural improvements contributing to its better-than-usual performance. I now know the benchmarks were conducted using a driver that took advantage of compression technology; that was not known before all this started.
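For reference, the "per shader" comparison I had in mind boils down to normalizing throughput by shader count and shader clock, something like this (again, the FPS and clock figures here are hypothetical, not measured data):

    # Per-shader, per-clock normalization -- the metric the test was after.
    # FPS and clock figures are hypothetical placeholders.

    def perf_per_shader_per_mhz(fps, shaders, shader_mhz):
        """Frames per second delivered per shader per MHz of shader clock."""
        return fps / (shaders * shader_mhz)

    # Same workload on both parts, both at 64 shaders and the same shader clock:
    g94 = perf_per_shader_per_mhz(fps=55.0, shaders=64, shader_mhz=1625)
    g92 = perf_per_shader_per_mhz(fps=55.0, shaders=64, shader_mhz=1625)

    # If G94 had per-shader architectural improvements, g94 would exceed g92.
    print(f"G94: {g94:.6f}  G92: {g92:.6f}  ratio: {g94 / g92:.3f}")

If the G94 really had per-shader improvements, its normalized number would come out higher than the G92's at matched configurations; the equal results say it didn't.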