9600GT SLi review

Page 8

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
Originally posted by: bryanW1995
Originally posted by: Cheex
Originally posted by: nitromullet
Originally posted by: Cheex
Originally posted by: Killrose
182% efficiency?

I think they need to rethink their logic, LOL

Why?

1 x 9600GT = 100%
2 x 9600GT = 182%

That is a performance ratio. What is there to rethink?

...that's 182 out of a possible 200... Which gives you 90% efficiency, not 182%. Still very nice though.

Well, if you are really going to look at it that way, then it is 91%...but I'm not nit-picking about it...

That is still amazing because we all know you're NOT going to get 100% all the time but 91% is VERY, VERY, GOOD!!

some of their numbers showed over 100% improvement in frame rate

Yes, and on only 2 occasions does it go below 50%.
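
For what it's worth, here is a minimal sketch (Python) of the arithmetic being argued about; the example fps numbers are made up purely for illustration and are not from the review:

def sli_scaling(fps_single, fps_sli):
    """Return (relative performance %, per-card efficiency %) for 2-way SLI."""
    relative = fps_sli / fps_single * 100          # e.g. 182% of a single card
    efficiency = fps_sli / (2 * fps_single) * 100  # e.g. 91% of the ideal 200%
    return relative, efficiency

# Illustrative numbers only:
rel, eff = sli_scaling(fps_single=50.0, fps_sli=91.0)
print(f"2x 9600GT = {rel:.0f}% of one card, {eff:.0f}% scaling efficiency")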
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
Why are we using CoD 4 to measure shader performance? That game has a relatively primitive engine compared to other modern games, especially with shaders; it doesn't even use FP HDR.

It's also generally CPU limited in most situations so staring at a FPS counter while playing a map isn't really going to show anything.

I ran tests on other games (including the four year old Riddick using the SM 2.0++ path) to demonstrate the impact of running 128 SPs vs 64 SPs on my 8800 Ultra (everything else was at stock) and here's what I got:

Game            64   128   Gain
===============================
Bioshock        34    53    56%
Call of Juarez  27    39    44%
Crysis          25    39    56%
Riddick         29    40    38%

As you can see, there is a substantial performance gain when going from 64 SPs to 128 SPs, especially in modern engines like UT3 and CryEngine.

Also I run those games with AA so it'll strain the memory bandwidth and ROPs more than otherwise.
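
For reference, a minimal sketch (Python) of how the Gain column follows from the two frame rates, i.e. the percentage increase going from 64 SPs to 128 SPs:

results = {
    "Bioshock":       (34, 53),
    "Call of Juarez": (27, 39),
    "Crysis":         (25, 39),
    "Riddick":        (29, 40),
}

for game, (fps_64, fps_128) in results.items():
    gain = (fps_128 / fps_64 - 1) * 100   # percent gain from 64 SPs to 128 SPs
    print(f"{game:<15} {fps_64:>3}  {fps_128:>3}   {gain:.0f}%")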
 

dadach

Senior member
Nov 27, 2005
204
0
76
Originally posted by: keysplayr2003
Ok, I ran CoD4 with 64 shaders and got ZERO difference on my 8800GTS G80. I further disabled another 32 shaders for a grand total of 32 enabled shaders. My framerate absolutely TANKED. So, it looks like 64 shaders is enough for anybody, in CoD4 at least. A lot of these shaders, it would appear, are sitting idle.

Azn, my hat is off to you. You were right. But this is all I really wanted out of our debate. I didn't care if I was right or wrong, but just wanted to know the "true" answers. And now I think we have them. There is no special compression technology it would appear. Looks like ROP's are a much bigger factor than shaders. Oh BTW, we can disable ROP's as well in RivaTuner. That would be an interesting thing to do. Maybe disable 4 or 8 of them to see how much it affects performance.
Well, time to put my shaders back online. Be back in a few.

Next time, keys, tone down the fanboyism and you will come off as more fair, like your forum title requires... and take some of your own advice:

"Let me tell you, it's better to say nothing and just observe."

way to put him in his place azn
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Those numbers look ridiculous to me. What resolution is that? A 9600gt with 64SP gets 30-31fps in Crysis @ high detail 1280x1024. I think your $650 ultra is broken.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: BFG10K
Why are we using CoD 4 to measure shader performance? That game has a relatively primitive engine compared to other modern games, especially with shaders; it doesn't even use FP HDR.

It's also generally CPU limited in most situations so staring at a FPS counter while playing a map isn't really going to show anything.

I ran tests on other games (including the four year old Riddick using the SM 2.0++ path) to demonstrate the impact of running 128 SPs vs 64 SPs on my 8800 Ultra (everything else was at stock) and here's what I got:

Game            64   128   Gain
===============================
Bioshock        34    53    56%
Call of Juarez  27    39    44%
Crysis          25    39    56%
Riddick         29    40    38%

As you can see, there is a substantial performance gain when going from 64 SPs to 128 SPs, especially in modern engines like UT3 and CryEngine.

Also I run those games with AA so it'll strain the memory bandwidth and ROPs more than otherwise.

Surely there will be games that stress shaders more than others. There are games that need more than 64 shaders. But CoD4 was just the game we happened to try out.

You can use RivaTuner to disable 64 shaders on your Ultra and run your same tests. And, in contrast, you can disable 8 ROP's instead for a second test.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: dadach
Originally posted by: keysplayr2003
Ok, I ran CoD4 with 64 shaders and got ZERO difference on my 8800GTS G80. I further disabled another 32 shaders for a grand total of 32 enabled shaders. My framerate absolutely TANKED. So, it looks like 64 shaders is enough for anybody, in CoD4 at least. A lot of these shaders, it would appear, are sitting idle.

Azn, my hat is off to you. You were right. But this is all I really wanted out of our debate. I didn't care if I was right or wrong, but just wanted to know the "true" answers. And now I think we have them. There is no special compression technology it would appear. Looks like ROP's are a much bigger factor than shaders. Oh BTW, we can disable ROP's as well in RivaTuner. That would be an interesting thing to do. Maybe disable 4 or 8 of them to see how much it affects performance.
Well, time to put my shaders back online. Be back in a few.

Next time, keys, tone down the fanboyism and you will come off as more fair, like your forum title requires... and take some of your own advice:

"Let me tell you, it's better to say nothing and just observe."

way to put him in his place azn

Actually, dadach. I put myself in my place. It was my idea to disable shaders and test in the first place. Maybe next time, you'll understand that I am after the truth, not magical shaders, or whatever the subject of discussion might be. Intellectual discussions often get heated, dadach. And some may perceive it as fanboyism as you certainly have. If you can tell me what I seemed to be a fan of in this thread, I'd appreciate it.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: keysplayr2003
Originally posted by: BFG10K
Why are we using CoD 4 to measure shader performance? That game has a relatively primitive engine compared to other modern games, especially with shaders; it doesn't even use FP HDR.

It's also generally CPU limited in most situations so staring at a FPS counter while playing a map isn't really going to show anything.

I ran tests on other games (including the four year old Riddick using the SM 2.0++ path) to demonstrate the impact of running 128 SPs vs 64 SPs on my 8800 Ultra (everything else was at stock) and here's what I got:

Game            64   128   Gain
===============================
Bioshock        34    53    56%
Call of Juarez  27    39    44%
Crysis          25    39    56%
Riddick         29    40    38%

As you can see, there is a substantial performance gain when going from 64 SPs to 128 SPs, especially in modern engines like UT3 and CryEngine.

Also I run those games with AA so it'll strain the memory bandwidth and ROPs more than otherwise.

Surely there will be games that stress shaders more than others. There are games that need more than 64 shaders. But CoD4 was just the game we happened to try out.

You can use RivaTuner to disable 64 shaders on your Ultra and run your same tests. And, in contrast, you can disable 8 ROP's instead for a second test.

I think BFG tested at uber high resolutions where 64 shaders were bottlenecking a bit more. When you increase resolution you also need more shader power, much like you need more bandwidth and fillrate.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Azn
Keys did you try disabling the rops?

No, not yet. Just the shaders. I will try later this afternoon after work.
I can also try the following games:

Crysis
BioShock
CoD4
STALKER

These would be the most recent games I have ATM. I like FPS's over RPG's. I've never even played Oblivion. hehe.

Maybe BFG can give it a whirl on his Ultra.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Keys - are you (and is everyone else) pretty sure that using RivaTuner to disable shaders is completely safe? If so, I'm more than willing to go ahead and disable 48 of my 8800GT's shaders and do a run through of Oblivion at 19x12 to compare performance. I've got a BFG version with a very mild (625 vs 600) overclock on the core. I could also run tests on Neverwinter Nights 2 and Lord of the Rings Online, just for fun.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
I got my 9600GT today, but can I just say that if I was getting a card for over 1680x res, I wouldn't have bothered with anything that's out now and I'd have just kept on waiting.


Other than that, this card is seriously fast and I'm really impressed with it.

Oh, and just a quick question: I'm using XP Home now and have been given XP MCE 2005. Will I lose any performance installing Media Center as opposed to XP Home?

Cheers


 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: dreddfunk
Keys - are you (and is everyone else) pretty sure that using RivaTuner to disable shaders is completely safe? If so, I'm more than willing to go ahead and disable 48 of my 8800GT's shaders and do a run through of Oblivion at 19x12 to compare performance. I've got a BFG version with a very mild (625 vs 600) overclock on the core. I could also run tests on Neverwinter Nights 2 and Lord of the Rings Online, just for fun.

I don't see how it can harm anything as you are just essentially "putting the shader blocks to sleep", but I cannot guarantee that it is 100% safe. I re-enabled my full 96 shaders without issue on my 8800GTS 640 and happily continued gaming. If you are concerned about damaging the shader units, then I would think twice about it. I don't really know what could potentially go wrong.

SickBeast, I believe, has tried it. Maybe Taltamir as well.

I think it is safe for the most part. I am going to try my ROP's in a little while. I have 20 by default. Folks with 8800GTXs and Ultras have 24. All G92/94 cards have 16 ROPs.

I wanted to try mine at 16, then 12, then 8 respectively and record performance differences for each (if any). Leaving my shaders at 96 throughout testing.

Just some quick notes to put down in text here:

My 8800GTS in CoD4 running at 1680x1050, all settings high, 4xAA.

96 shaders: Avg fps is 60 fps in multiplayer
64 shaders: Avg fps is 60 fps in multiplayer
32 shaders: Avg fps is 17 fps in multiplayer (yecchh!)

I can see why the 8600's didn't do so well.

WARNING!!!!!!! Do not disable Raster Units in Riva Tuner. I have trouble now. Working it out. So don't do it!!!

EDIT: Ok, back to normal. I had to run "Last Known Good Configuration" by pressing F8 before the operating system loads.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
whoa whoa whoa.... Just a shot in the dark here... BUT

Perhaps nvidia is taking the g92 architecture further. Why would they decrease the rops/tmus/memory size/memory bandwidth/etc and provide only a higher clock speed. I bet this g92 style die is capable of 32 or 48 rops, 1024 or 2048mb memory, 384 or 512-bit mem bus, and many other "limited" options. Wonder if they plan to sit on this til the next process drop and unleash a fully capable g92 at 45nm or 32nm. Anybody?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Keys, one quick question. How can you get at your keyboard with those enormous balls of yours in the way?

Definitely appreciate your testing. This is incredibly interesting.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
@ sniperdaws: possibly? Vista 32 Home sucks more than XP, that's a fact. My Vista came with Media Center built in. Don't know if there's any correlation. Google up XP vs. MCE gaming benchmarks.

I'm sure xp performs better, as it outperforms about every other os due to such mature drivers.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
I tried Google but I'm useless at googling.

I do have Vista Premium 32/64-bit but I don't like it, and my TV card hasn't got signed drivers so it's a ballache.

Thanks, and sorry for hijacking the thread.

Move along people nothing to see.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: keysplayr2003
Originally posted by: dreddfunk
Keys - are you (and is everyone else) pretty sure that using RivaTuner to disable shaders is completely safe? If so, I'm more than willing to go ahead and disable 48 of my 8800GT's shaders and do a run through of Oblivion at 19x12 to compare performance. I've got a BFG version with a very mild (625 vs 600) overclock on the core. I could also run tests on Neverwinter Nights 2 and Lord of the Rings Online, just for fun.

I don't see how it can harm anything as you are just essentially "putting the shader blocks to sleep", but I cannot guarantee that it is 100% safe. I re-enabled my full 96 shaders without issue on my 8800GTS 640 and happily continued gaming. If you are concerned about damaging the shader units, then I would think twice about it. I don't really know what could potentially go wrong.

SickBeast, I believe, has tried it. Maybe Taltamir as well.

I think it is safe for the most part. I am going to try my ROP's in a little while. I have 20 by default. Folks with 8800GTXs and Ultras have 24. All G92/94 cards have 16 ROPs.

I wanted to try mine at 16, then 12, then 8 respectively and record performance differences for each (if any). Leaving my shaders at 96 throughout testing.

Just some quick notes to put down in text here:

My 8800GTS in CoD4 running at 1680x1050, all settings high, 4xAA.

96 shaders: Avg fps is 60 fps in multiplayer
64 shaders: Avg fps is 60 fps in multiplayer
32 shaders: Avg fps is 17 fps in multiplayer
(yecchh!)

I can see why the 8600's didn't do so well.

WARNING!!!!!!! Do not disable Raster Units in Riva Tuner. I have trouble now. Working it out. So don't do it!!!

EDIT: Ok, back to normal. I had to run "Last Known Good Configuration" by pressing F8 before the operating system loads.


maybe try 48 shaders to get a more precise number between 60 and 17?
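
A minimal sketch (Python) of that idea: bisect between the last shader count that still held 60 fps (64) and the one that tanked (32) to narrow down where CoD4 stops being shader-limited. measure_fps() here is just a stand-in for a manual RivaTuner-plus-benchmark run, not a real API, the 48-shader result below is made up, and G80 shader clusters can only be disabled 16 at a time.

def find_threshold(measure_fps, good=64, bad=32, target=60, step=16):
    """Return the lowest shader count tried that still reaches `target` fps."""
    while good - bad > step:
        mid = (good + bad) // 2      # first pass lands on 48, as suggested above
        if measure_fps(mid) >= target:
            good = mid               # still fast enough, try fewer shaders
        else:
            bad = mid                # too slow, need more shaders
    return good

# Stand-in measurements: keys' real numbers for 64 and 32, a made-up value for 48.
fake_fps = {64: 60, 48: 60, 32: 17}
print(find_threshold(lambda n: fake_fps[n]))   # -> 48 with these made-up numbers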
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: v8envy
Keys, one quick question. How can you get at your keyboard with those enormous balls of yours in the way?

Definitely appreciate your testing. This is incredibly interesting.

LMAOOOO!!

I guess you could say that; it's not the end of the world for me if I kill my card. I'd be a bit upset, but not the end of the world. I'd go out and get another one. Don't get me wrong, I like my money as much as the next guy, it's just that I currently have the means to get another card. So I am not as frightened to try things.

When I disabled an ROP, I got SUPER garblage on the Windows startup screen, then a BSOD, then a reboot. I panicked, but worked it out. So now I can tell everyone NOT to try disabling them. Shader units do not seem to have any problems being disabled, however. Actually, I think they appreciate the rest!

Keys
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
whoa whoa whoa.... Just a shot in the dark here... BUT

Perhaps nvidia is taking the g92 architecture further. Why would they decrease the rops/tmus/memory size/memory bandwidth/etc and provide only a higher clock speed. I bet this g92 style die is capable of 32 or 48 rops, 1024 or 2048mb memory, 384 or 512-bit mem bus, and many other "limited" options. Wonder if they plan to sit on this til the next process drop and unleash a fully capable g92 at 45nm or 32nm. Anybody?

From G80 to G92, the goal was to cut cost and be more efficient. I don't see how they are going to raise the ROP count. Maybe they can add a bigger memory controller, but reducing ROPs would lower performance.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
with regards to the 9800gtx/GX2, read what this guy is saying:

"I think this will end up superior. Would it have less total bandwidth with the rumored specs even with the 2400mhz effective memory? Yes, yes it would. But G92's architectural enhancements (textures etc) over G80 and sheer clock speed should allow it to usurp the former high-end. Yes I understand people have already done this through overclocking current G92 products, but what matters in the market is the stock clocks, and on these products they are relatively higher than what we currently see.

That again all being said, it has been mentioned that this generation was supposed to launch in Q4 '07, a normal ~1 year between high-end products, but was held back when there was a surplus of G80 products in the channel, even at the reduced price.

My assumption has been that the 8800gt/s were only released because of nvidia needing products to compete with the 3800 series and now that excess G80's are out of the channel, they can go ahead with the launch of the products clocked higher and with more bandwidth at stock to actually call it the generation after the 8800 series.

I would just think of it this way... If you grabbed an 8800gt/gts 512MB, you got to jump on the next-gen train early. If you overclock, you have defeated nvidia's intention, with the exception of the new highest-end part using faster memory than what is currently available."
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Azn
Originally posted by: jaredpace
whoa whoa whoa.... Just a shot in the dark here... BUT

Perhaps nvidia is taking the g92 architecture further. Why would they decrease the rops/tmus/memory size/memory bandwidth/etc and provide only a higher clock speed. I bet this g92 style die is capable of 32 or 48 rops, 1024 or 2048mb memory, 384 or 512-bit mem bus, and many other "limited" options. Wonder if they plan to sit on this til the next process drop and unleash a fully capable g92 at 45nm or 32nm. Anybody?

From G80 to G92, the goal was to cut cost and be more efficient. I don't see how they are going to raise the ROP count. Maybe they can add a bigger memory controller, but reducing ROPs would lower performance.

That's the point, they are reducing rops compared to 8800gtx/ultras. wth?????????
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
Originally posted by: Azn
Originally posted by: jaredpace
whoa whoa whoa.... Just a shot in the dark here... BUT

Perhaps nvidia is taking the g92 architecture further. Why would they decrease the rops/tmus/memory size/memory bandwidth/etc and provide only a higher clock speed. I bet this g92 style die is capable of 32 or 48 rops, 1024 or 2048mb memory, 384 or 512-bit mem bus, and many other "limited" options. Wonder if they plan to sit on this til the next process drop and unleash a fully capable g92 at 45nm or 32nm. Anybody?

From G80 to G92, the goal was to cut cost and be more efficient. I don't see how they are going to raise the ROP count. Maybe they can add a bigger memory controller, but reducing ROPs would lower performance.

That's the point, they are reducing rops compared to 8800gtx/ultras. wth?????????

Well, that was the old core with 4 textures per clock. When the GTX came out it was like $500+. This G92 9800gtx is supposed to cost $399, is it? Nvidia is milking people?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
Those numbers look ridiculous to me. What resolution is that?
It depends on the game, but they vary between 1600x1200 and 1920x1440. Also, all of them were tested with 4xAA and TrMS except for Crysis, which used 2xAA.

I think BFG tested at uber high resolutions where 64 shaders were bottlenecking a bit more.
Well yeah, I tested the settings I actually play games at.

Testing low resolutions will simply move the bottleneck to the CPU, especially if it's a primitive engine like CoD 4. The tests I have seen so far have largely been flawed.

Also setting the shader core to half will simply not work as there's a limited range it can be out from the core and if it's too far RivaTuner will reset it to default.
 