9600GT SLi review


MeSh1

Member
Jul 1, 2004
104
0
0
Originally posted by: jaredpace
I've also heard a lot of complaints about nvidia sli motherboards on various forums, although I have never owned one. Just seems everyone likes to recommend p35 boards when doing penryn overclocking. You can't get sli on a p35 afaik.

I'm sure nvidia chipsets are great for mild OCs and never give problems. But there should be a reason that 3 people in this thread alone have mentioned hearing of problems with nvidia boards.



I have read the same. First, they're known for running hot; one look at the 790i cooling solution confirms that. Also, for the predicted price of $350+, the only differences are DDR3/PCIe 2.0 and official 1600 support, so I just don't think it's worth it (to me). They're also using the same ol' MCP, and for that price and the next-gen tag they put on it, I just don't think it's worth it (again, for me). They're also claiming native PCIe 2.0, but one of the slots will be 1.0 (that's from my research). I would love an X48 board with SLI, but unless you wanna go Skulltrail, it looks like that's not likely to happen anytime soon.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Rusin
Originally posted by: SickBeast
Just to debunk these questions surrounding the 9600's shaders:

Why doesn't someone with an 8800GT open up RivaTuner and reduce the shader clockspeed to 50% of stock (should be around 800MHz)? That should, in theory, give the two cards equal shading power, so long as their SPs are the same.

Then, just benchmark a game. If it benchmarks the same (or very close to the same), it probably means that the G94 and G92 have identical SPs.

If I have time later tonight I'll see what happens when I run my 8800GTS 320MB's shaders at 800MHz. :moon:
I ran a few tests with my card, using the following clocks:
620/1440/902
620/720/902

Unreal Tournament 3: No difference
Lost Planet (DX9): No difference
3DMark06: No difference
Supreme Commander: No difference
I noticed the same thing in COD4 on my card. It looks like we're onto something...
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: thilan29
Originally posted by: Rusin
I ran a few tests with my card, using the following clocks:
620/1440/902
620/720/902

Unreal Tournament 3: No difference
Lost Planet (DX9): No difference
3DMark06: No difference
Supreme Commander: No difference

No difference whatsoever?? That's news for sure.
Differences were below 1% (and below 1 fps)... well inside the possible error margin.

I upped the ante, dropped the shaders to 595MHz, and ran 3DMark06:
620/1440/902: 10742
620/595/902: 10690
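
For what it's worth, here's the delta in percent (a quick sketch; the only inputs are the two scores above):

# 3DMark06 scores from the two runs above
stock_shader = 10742   # 620/1440/902
low_shader   = 10690   # 620/595/902
print(f"Shader clock cut:    {(1 - 595/1440) * 100:.0f}%")
print(f"3DMark06 score drop: {(stock_shader - low_shader) / stock_shader * 100:.2f}%")
# roughly a 0.5% score drop for a ~59% shader clock cut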
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: keysplayr2003
Would someone with an 8800GT be kind enough to use RivaTuner and disable its 48 additional shaders (if possible), clock the core and shaders exactly like a 9600GT, and run some comparable benches? Use the same drivers used in the 9600GT reviews. Even an 8800GTS 512 would do; just disable 64 shaders. Is this possible?

Does anyone find this method unacceptable to compare it to a 9600GT?
If so, please say why.

If it runs the same as a 9600GT, I will offer every apology to Azn possible. If it runs slower, I won't say a word, I promise.

If this compression technology is indeed available to all G92's as well as a G94, would the latest drivers used for 9600GT testing use this compression technology as well? Or are the G92 cores not capable somehow? This would be a good way to end a long debate here in this thread, and we can start another thread dedicated to this test if you like.

Heh, keys, you are trying too hard to use logic to make them understand... everyone realizes it except two people who seem oblivious to facts, logic, and reason. It was giving me a headache just to read their posts; trying to make him understand is like banging your head against a wall, so don't do it to yourself.



From what I hear, nvidia made the following changes:
1. Optimized their AA/AF implementations so they take significantly less RAM.
2. Supposedly they have new compression algorithms that increase effective bandwidth.
3. Other changes they are not telling anyone about. (The first two are rumors anyway; they aren't saying what they changed.)

I prefer comparing this card to an 8800GTS 512... same core clock, same shader clock, exactly half the SPs, slightly lower mem clock, very close performance...
EDIT: also half the "Texture Address / Filtering" units

I wonder though... maybe it's all just a VRAM issue... the new card might be so much more efficient with VRAM in its AA/AF that it improves performance tremendously across the board.
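
To put that comparison in rough numbers (a sketch; the clocks and unit counts below are the commonly published reference specs, so treat them as approximate):

# Theoretical throughput, 9600GT vs 8800GTS 512 (reference specs, approximate).
#            core MHz, shader MHz, mem MHz, SPs, tex units, bus width (bits)
cards = {
    "9600GT":      (650, 1625,  900,  64, 32, 256),
    "8800GTS 512": (650, 1625,  970, 128, 64, 256),
}
for name, (core, shader, mem, sps, tex, bus) in cards.items():
    gflops    = sps * shader * 3 / 1000       # MADD + MUL per SP per clock on G9x
    texfill   = core * tex / 1000             # Gtexels/s
    bandwidth = mem * 2 * bus / 8 / 1000      # GDDR3 effective, GB/s
    print(f"{name:12s} {gflops:4.0f} GFLOPS  {texfill:4.1f} GT/s  {bandwidth:4.1f} GB/s")

On paper the GTS 512 has double the shader and texture throughput but only about 8% more memory bandwidth, which is what makes the "very close performance" part interesting.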
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
If there is no difference between those runs, then what is the big deal about the shader clock?

It's all about the number of shader processors, then.
And about the core and memory clocks.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Thank you guys for testing it out. I knew that as long as shader power is ample for a game, it doesn't make that big of a difference. Fillrate, on the other hand, fed with high memory bandwidth always gives you big gains as long as your CPU isn't bottlenecking it.
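
To put some rough numbers on the fillrate/memory point (a sketch using approximate 9600GT reference specs; the 4 bytes per pixel figure is a big simplification, since real traffic also includes Z, blending and textures):

# Rough look at how fillrate leans on memory bandwidth (9600GT reference specs).
rops, core_mhz    = 16, 650
mem_mhz, bus_bits = 900, 256
pixel_fill   = rops * core_mhz / 1000             # Gpixels/s
write_demand = pixel_fill * 4                     # GB/s for colour writes alone (simplified)
available    = mem_mhz * 2 * bus_bits / 8 / 1000  # GB/s (GDDR3 effective)
print(f"Pixel fillrate:       {pixel_fill:.1f} Gpixels/s")
print(f"Colour-write traffic: {write_demand:.1f} GB/s")
print(f"Available bandwidth:  {available:.1f} GB/s")
# fillrate alone can soak up most of the bus, which is why feeding it with
# faster memory tends to show up in benchmarks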
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Cheex:
Well, shader performance is all about shader count + shader clocks.

It seems that shader performance is not as important anymore as it was in the GeForce 7 era.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Cheex
If there is no difference between those runs, then what is the big deal about the shader clock?

It's all about the number of shader processors, then.
And about the core and memory clocks.
Theoretically, running the shader clock at half speed is the same as having half the number of shaders; it should give you the same amount of shader power.
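
Putting rough numbers on that (a sketch under the simple "throughput = SP count x shader clock" model; it assumes Rusin's card is a stock 112-SP 8800GT and uses the published 9600GT reference clocks):

# Theoretical shader throughput under the simple SPs-x-clock model
# (x3 FLOPs per SP per clock for MADD + MUL on G8x/G9x).
def gflops(sps, shader_mhz):
    return sps * shader_mhz * 3 / 1000

print(gflops(112, 1440))  # 8800GT at Rusin's stock-ish shader clock -> ~484 GFLOPS
print(gflops(112,  720))  # same card, shader clock halved           -> ~242 GFLOPS
print(gflops( 64, 1625))  # 9600GT reference                         -> ~312 GFLOPS
# Halving the clock halves the theoretical shader throughput, i.e. the same
# effect on paper as halving the SP count at the original clock.

Note that on paper the halved-clock 8800GT actually lands a bit below a reference 9600GT, so if anything the test understates the 9600GT's shader power.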
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Rusin
Originally posted by: SickBeast
Just to debunk these questions surrounding the 9600's shaders:

Why doesn't someone with an 8800GT open up RivaTuner and reduce the shader clockspeed to 50% of stock (should be around 800MHz)? That should, in theory, give the two cards equal shading power, so long as their SPs are the same.

Then, just benchmark a game. If it benchmarks the same (or very close to the same), it probably means that the G94 and G92 have identical SPs.

If I have time later tonight I'll see what happens when I run my 8800GTS 320MB's shaders at 800MHz. :moon:
I ran a few tests with my card, using the following clocks:
620/1440/902
620/720/902

Unreal Tournament 3: No difference
Lost Planet (DX9): No difference
3DMark06: No difference
Supreme Commander: No difference

ok... now test it with the following games:
Crysis with quality shaders
Company of Heroes DX10 with high shaders
any other shader intensive DX10 game... (I am guessing WIC will also be one)

The whole point of shader performance is that it is a huge limitation in any game that uses DX10 realistic shadows...
They kick the card's ass...
I mean... does Supreme Commander even HAVE heavy shader usage? I don't see many shadows, nor do I see realistic light effects...

http://www.anandtech.com/video/showdoc.aspx?i=3029&p=4
This is the best side-by-side comparison of how shadows look in DX9 versus DX10... this kind of shadow rendering (horribly fake vs. completely realistic) completely kicks any card's ass, and there's a direct correlation between shader power and total FPS in the modern cards tested with it...
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
Originally posted by: keysplayr2003
Would someone with an 8800GT be kind enough to use RivaTuner and disable its 48 additional shaders (if possible), clock the core and shaders exactly like a 9600GT, and run some comparable benches? Use the same drivers used in the 9600GT reviews. Even an 8800GTS 512 would do; just disable 64 shaders. Is this possible?

Does anyone find this method unacceptable to compare it to a 9600GT?
If so, please say why.

If it runs the same as a 9600GT, I will offer every apology to Azn possible. If it runs slower, I won't say a word, I promise.

If this compression technology is indeed available to all G92's as well as a G94, would the latest drivers used for 9600GT testing use this compression technology as well? Or are the G92 cores not capable somehow? This would be a good way to end a long debate here in this thread, and we can start another thread dedicated to this test if you like.

Heh, keys, you are trying too hard to use logic to make them understand... everyone realizes it except two people who seem oblivious to facts, logic, and reason. It was giving me a headache just to read their posts; trying to make him understand is like banging your head against a wall, so don't do it to yourself.



From what I hear, nvidia made the following changes:
1. Optimized their AA/AF implementations so they take significantly less RAM.
2. Supposedly they have new compression algorithms that increase effective bandwidth.
3. Other changes they are not telling anyone about. (The first two are rumors anyway; they aren't saying what they changed.)

I prefer comparing this card to an 8800GTS 512... same core clock, same shader clock, exactly half the SPs, slightly lower mem clock, very close performance...

I wonder though... maybe it's all just a VRAM issue... the new card might be so much more efficient with VRAM in its AA/AF that it improves performance tremendously across the board.

You mean these tweaks?


http://www.pcper.com/article.php?aid=522&type=expert

Other than those feature changes, NVIDIA was eager to promote some "new" features of the GeForce 9600 GT. I say "new" like that only because these features ALREADY existed on the G92 cores of the 8800 GT and GTS; they just weren't advertised as heavily. Take this compression technology that allows more efficient transfer of data from memory to the GPU - NVIDIA is comparing it to the G80 in the graph above, not G92.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
There are no new FEATURES (aside from the PureVideo changes); there are new architectural modifications that provide tremendous improvement, and they don't say what they were...
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Rusin
Cheex:
Well, shader performance is all about shader count + shader clocks.

It seems that shader performance is not as important anymore as it was in the GeForce 7 era.

These SPs give you much more shader performance than GeForce 7-era GPUs. 16 SPs are probably equivalent to, say, a 7900GTX, I presume. Maybe a little more.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
There are no new FEATURES (aside from the PureVideo changes); there are new architectural modifications that provide tremendous improvement, and they don't say what they were...

Would you mind telling everyone where you got this information? A link would be nice.

As far as architectural changes go... I don't see any. Do you?

http://www.bit-tech.net/conten...s_512/8800gts-flow.jpg
G92GTS

http://www.bit-tech.net/conten..._graphics_card/g94.jpg
G94
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
Originally posted by: Rusin
Cheex:
Well, shader performance is all about shader count + shader clocks.

It seems that shader performance is not as important anymore as it was in the GeForce 7 era.

Precisely...:thumbsup:
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: taltamir
Originally posted by: Rusin
Originally posted by: SickBeast
Just to debunk these questions surrounding the 9600's shaders:

Why doesn't someone with an 8800GT open up RivaTuner and reduce the shader clockspeed to 50% of stock (should be around 800MHz)? That should, in theory, give the two cards equal shading power, so long as their SPs are the same.

Then, just benchmark a game. If it benchmarks the same (or very close to the same), it probably means that the G94 and G92 have identical SPs.

If I have time later tonight I'll see what happens when I run my 8800GTS 320MB's shaders at 800MHz. :moon:
I ran a few tests with my card, using the following clocks:
620/1440/902
620/720/902

Unreal Tournament 3: No difference
Lost Planet (DX9): No difference
3DMark06: No difference
Supreme Commander: No difference

ok... now test it with the following games:
Crysis with quality shaders
Company of Heroes DX10 with high shaders
any other shader intensive DX10 game... (I am guessing WIC will also be one)
Well perhaps I should install Vista (bought it some time ago, used it and returned to XP)
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Rusin
Originally posted by: taltamir
Originally posted by: Rusin
Originally posted by: SickBeast
Just to debunk these questions surrounding the 9600's shaders:

Why doesn't someone with an 8800GT open up RivaTuner and reduce the shader clockspeed to 50% of stock (should be around 800MHz)? That should, in theory, give the two cards equal shading power, so long as their SPs are the same.

Then, just benchmark a game. If it benchmarks the same (or very close to the same), it probably means that the G94 and G92 have identical SPs.

If I have time later tonight I'll see what happens when I run my 8800GTS 320MB's shaders at 800MHz. :moon:
I ran a few tests with my card, using the following clocks:
620/1440/902
620/720/902

Unreal Tournament 3: No difference
Lost Planet (DX9): No difference
3DMark06: No difference
Supreme Commander: No difference

ok... now test it with the following games:
Crysis with quality shaders
Company of Heroes DX10 with high shaders
any other shader intensive DX10 game... (I am guessing WIC will also be one)
Well perhaps I should install Vista (bought it some time ago, used it and returned to XP)

Rusin, you are certain you disabled 48 pipelines in your 8800GT?
Please check with GPU-Z after you disable them to make sure.

Thanks for doing this.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: keysplayr2003
Rusin, you are certain you disabled 48 pipelines in your 8800GT?
Please check with GPU-Z after you disable them to make sure.

Thanks for doing this.
How can you disable them?

We're running them at half the clockspeed to simulate turning off 48 of them. :light:
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: SickBeast
Originally posted by: keysplayr2003
Rusin, you are certain you disabled 48 pipelines in your 8800GT?
Please check with GPU-Z after you disable them to make sure.

Thanks for doing this.
How can you disable them?

We're running them at half the clockspeed to simulate turning off 48 of them. :light:

Riva Tuner? Same thing as unlocking pipes on a vanilla 6800, only in reverse.
I just looked at my 8800GTS in Riva Tuner. I sure as heck can't unlock any shaders. I will try to lock some and report back.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
well... I tested CoH opposing fronts... E8400, 8800GTS 512MB
Everything max (high / ultra / etc) 16xQ CSAA with 1920x1200 resolution...
Core/Shader/Mem
650/1625/976 - 16.7 fps on time demo
(stock)
650/1625/898 - 15.7 fps on time demo
(as close to the 9600GT clockspeeds I can get)
650/815/898 - 15.4 fps on time demo
(minimum shader clock, almost half the shader power, which is presumably equivalent to the 9600GT assuming shader power is exactly SP count x SP clock, except for the fact that it has lower texturing / filtering, whatever that means)
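
Quick scaling math on those three runs (a sketch; only the fps figures above go in):

# fps from the three CoH timedemo runs above
fps = {"650/1625/976": 16.7, "650/1625/898": 15.7, "650/815/898": 15.4}
clock_cut = (1 - 815 / 1625) * 100
fps_cut   = (1 - fps["650/815/898"] / fps["650/1625/898"]) * 100
print(f"Shader clock: -{clock_cut:.0f}%   fps: -{fps_cut:.1f}%")
# ~50% less shader clock costs under 2% fps here, consistent with the earlier
# runs -- this timedemo just isn't shader-bound.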

Actually loading a mission at those settings with stock speeds (the same ones that gave a 16.7fps average on the timedemo, with a max of over 40 and a min of over 10), and watching it with fraps:
1 to 7 fps... It was a solid 4fps with occasional spikes to 1 or 7fps. I didn't touch anything either; it was just sitting there looking at the slideshow.
The timedemo hasn't changed at all since v1.000 of the game, back when it was DX9 only and lacking many features that are here now... it is entirely a cutscene, with no actual gameplay, and it doesn't seem to have any scenes where shadows come into play. That is... there are no "light sources" within it to showcase the engine modifications...

So I would deem that test an epic fail.
PS: I have vsync and triple buffering forced on in the drivers; always have, I can't stomach tearing...

I will see about testing something more indicative of real gameplay later.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: keysplayr2003
Originally posted by: SickBeast
Originally posted by: keysplayr2003
Rusin, you are certain you disabled 48 pipelines in your 8800GT?
Please check with GPU-Z after you disable them to make sure.

Thanks for doing this.
How can you disable them?

We're running them at half the clockspeed to simulate turning off 48 of them. :light:

Riva Tuner?
Are you talking about the "nV Strap" driver? I didn't realize that it was possible to disable shader units in RivaTuner.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
As far as I understand, they are just downclocking the shaders to half the speed to simulate half the shaders... of course, that assumes a perfect clock x SP count = performance equation, which in my opinion is doubtful...

If there is a way to actually disable them I would like to know.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,413
401
126
Originally posted by: Azn
Nvidia updated their PureVideo, but there is no evidence in any of the articles out there that G94 has an advantage over G92.
Please enlighten me if there are some actual facts about these supposed tweaks.
IINM, PureVideo HD doesn't do VC-1 motion comp whereas ATI's UVD does. G94 (and G98 - looking forward to the new 8400GS) supposedly addresses this issue.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: taltamir
As far as I understand, they are just downclocking the shaders to half the speed to simulate half the shaders... of course, that assumes a perfect clock x SP count = performance equation, which in my opinion is doubtful...

If there is a way to actually disable them I would like to know.

Check the post above. I did it with Riva Tuner under the NVStrap section. I just disabled 2 "units" of 16 shaders each. I left the ROPs alone, of course. All we are interested in are the shaders.

A reboot is required each time it is configured.
 