9600GT SLi review


nRollo

Banned
Jan 11, 2002
10,460
0
0
OK guys, here is the official word from NVIDIA on this, quoted verbatim:

There is no mojo going on. The GeForce 8800 GT and 9600 GT use the same stream processors.

The following are synthetic shader and fillrate tests. Note the sizable difference between 8800 GT and 9600 GT.*

What should be noted is that the 8800 GT and 9600 GT have identical bandwidth. Real-world games do not simply run a Perlin noise function or fill polygons. They fetch a lot of textures, render targets and so on. These get swapped in and out of memory. With AA, bandwidth consumption goes up even more.

In general, games consume more bandwidth than shader power. This is why the GeForce 9600 GT performs 'almost' like an 8800 GT.

So there you have it, as I said: no new shaders, just new data compression in the G90s and the games using the cards' similar bandwidth.

*There was a graph in the email here showing the Perlin Noise score of the 9600 GT at 59 and the 8800 GT at 116, and another showing the multitextured fillrate of the 9600 GT at 16192 and the 8800 GT's; both 3DMark06 at 1920x1200.
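A quick back-of-envelope check of the "identical bandwidth" claim. The bus widths and memory clocks below are the commonly cited reference specs for the two cards, assumed here rather than taken from the email:

```python
# Peak memory bandwidth: bytes per transfer times transfers per second.
# Specs assumed: both cards use a 256-bit bus with 900 MHz (1800 effective) GDDR3.
def mem_bandwidth_gbps(bus_width_bits, effective_mem_mhz):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_mem_mhz * 1e6 / 1e9

cards = {
    "8800 GT": {"bus": 256, "mem_mhz": 1800},
    "9600 GT": {"bus": 256, "mem_mhz": 1800},
}

for name, c in cards.items():
    print(f"{name}: {mem_bandwidth_gbps(c['bus'], c['mem_mhz']):.1f} GB/s")
# Both come out to 57.6 GB/s, consistent with NVIDIA's "identical bandwidth" point.
```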


 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: nRollo
OK guys, here is the official word from NVIDIA on this, quoted verbatim:

There is no mojo going on. The GeForce 8800 GT and 9600 GT use the same stream processors.

The following are synthetic shader and fillrate tests. Note the sizable difference between 8800 GT and 9600 GT.

What should be noted is that the 8800 GT and 9600 GT have identical bandwidth. Real-world games do not simply run a Perlin noise function or fill polygons. They fetch a lot of textures, render targets and so on. These get swapped in and out of memory. With AA, bandwidth consumption goes up even more.

In general, games consume more bandwidth than shader power. This is why the GeForce 9600 GT performs 'almost' like an 8800 GT.

So there you have it, as I said: no new shaders, just new data compression in the G90s and the games using the cards' similar bandwidth.


Rollo, thanks for that tidbit!

So the shaders are the same, the core is the same architecture, and the drivers are the same?

What's different? The memory runs faster? The core is clocked higher?

I still would like to see some test results.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
If the shaders on the G94 were superior in some way, it would have surpassed the 8800GT when the 8800GT ran at 64 shaders, not equaled it.
Not if your test didn't bottleneck the shaders enough.

No matter what game was used. CoD4, Bioshock, Crysis, whatever.
This is simply untrue. As an extreme example, do you believe testing GLQuake is equivalent to testing Bioshock?

How is it required to run, say, CoD4 at 25x16 when all we are testing is power per shader?
Because you need a test situation that stresses the shaders in order to test shaders. If you aren't stressing the shaders, how can you possibly test them?

Do you disagree that an 8800 GT with 64 shaders and the exact same clocks as a 9600 GT would run STALKER any differently at any resolution? How about Bioshock? Lost Planet?
We don't have enough valid data to make that judgment.
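BFG10K's point, that a shader test must actually make the shaders the bottleneck, can be illustrated with a bit of Amdahl-style arithmetic. The frame-time split below is a made-up example, not measured data:

```python
# Frame time = shader work + everything else (texturing, ROPs, bandwidth).
# If shaders are only a small slice of the frame, even doubling shader
# power barely moves the frame rate.
def fps_after_shader_speedup(base_frame_ms, shader_fraction, speedup):
    shader_ms = base_frame_ms * shader_fraction
    other_ms = base_frame_ms - shader_ms
    return 1000.0 / (other_ms + shader_ms / speedup)

# A hypothetical 20 ms frame (50 fps) where shaders are 20% of the work,
# run on a card with 2x the shader power:
print(f"{fps_after_shader_speedup(20.0, 0.2, 2.0):.1f} fps")  # ~55.6 fps, not 100
```

So a game that isn't shader-bound can make two very different shader configurations look nearly identical, which is exactly why the choice of test matters.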
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
There is no mojo going on. The GeForce 8800 GT and 9600 GT use the same stream processors.

The following are synthetic shader and fillrate tests. Note the sizable difference between 8800 GT and 9600 GT.*

What should be noted is that the 8800 GT and 9600 GT have identical bandwidth. Real-world games do not simply run a Perlin noise function or fill polygons. They fetch a lot of textures, render targets and so on. These get swapped in and out of memory. With AA, bandwidth consumption goes up even more.

In general, games consume more bandwidth than shader power. This is why the GeForce 9600 GT performs 'almost' like an 8800 GT.
nVidia's answer is all well and good, but it ignores the following:


- The 9600 GT has higher core and memory clocks than the 8800 GT, and it has more than half the SPs to begin with.
- The 9600 GT tests were run on newer drivers than the rest of the cards.
- People overclocking the 8800 GT's memory bandwidth are seeing next to no performance gain compared to overclocking the core/shader.
- Despite all of this, in shader-intensive games the 8800 GT is 30% faster in some situations over the 9600 GT.
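A rough sanity check on that last point: using the commonly cited reference specs (112 SPs at 1500 MHz for the 8800 GT, 64 SPs at 1625 MHz for the 9600 GT; assumed here, not taken from the thread), the theoretical shader gap is much larger than 30%:

```python
# Raw shader throughput scales with SP count times shader clock.
def shader_throughput(sp_count, shader_mhz):
    return sp_count * shader_mhz  # arbitrary units; only the ratio matters

ratio = shader_throughput(112, 1500) / shader_throughput(64, 1625)
print(f"theoretical shader advantage: {ratio:.2f}x")  # ~1.62x
# A purely shader-bound game should show roughly a 62% gap; an observed
# ~30% gap means the games are only partially shader-bound.
```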


expreview used the new 174 WHQL drivers that the 9600 runs on and put them on an 8800 GT (G92). It boosts the 8800 GT's performance as well. The "mystery magical filters" are driver-side, and suitable for all G92s now, possibly via a modded driver INF...
Now we're getting somewhere, as this is quite interesting. I'd like to see an 8800 GT compared to the 9600 GT using these drivers (assuming they function correctly on an 8800 GT), and that will give us the true performance gap to work with.

As a hypothetical example, those drivers could enable the ROP compression that was thus far inactive, hence the 8800 GT isn't benefiting like the 9600 GT is despite having the same core.
 

greengrass

Banned
Sep 18, 2007
60
0
0
Originally posted by: BFG10K
There is no mojo going on. The GeForce 8800 GT and 9600 GT use the same stream processors.

The following are synthetic shader and fillrate tests. Note the sizable difference between 8800 GT and 9600 GT.*

What should be noted is that the 8800 GT and 9600 GT have identical bandwidth. Real-world games do not simply run a Perlin noise function or fill polygons. They fetch a lot of textures, render targets and so on. These get swapped in and out of memory. With AA, bandwidth consumption goes up even more.

In general, games consume more bandwidth than shader power. This is why the GeForce 9600 GT performs 'almost' like an 8800 GT.
nVidia's answer is all well and good, but it ignores the following:


- The 9600 GT has higher core and memory clocks than the 8800 GT, and it has more than half the SPs to begin with.
- The 9600 GT tests were run on newer drivers than the rest of the cards.
- People overclocking the 8800 GT's memory bandwidth are seeing next to no performance gain compared to overclocking the core/shader.
- Despite all of this, in shader-intensive games the 8800 GT is 30% faster in some situations over the 9600 GT.


expreview used the new 174 WHQL drivers that the 9600 runs on and put them on an 8800 GT (G92). It boosts the 8800 GT's performance as well. The "mystery magical filters" are driver-side, and suitable for all G92s now, possibly via a modded driver INF...
Now we're getting somewhere, as this is quite interesting. I'd like to see an 8800 GT compared to the 9600 GT using these drivers (assuming they function correctly on an 8800 GT), and that will give us the true performance gap to work with.

As a hypothetical example, those drivers could enable the ROP compression that was thus far inactive, hence the 8800 GT isn't benefiting like the 9600 GT is despite having the same core.

Good reviews here.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: nRollo
OK guys, here is the official word from NVIDIA on this, quoted verbatim:

There is no mojo going on. The GeForce 8800 GT and 9600 GT use the same stream processors.

The following are synthetic shader and fillrate tests. Note the sizable difference between 8800 GT and 9600 GT.*

What should be noted is that the 8800 GT and 9600 GT have identical bandwidth. Real-world games do not simply run a Perlin noise function or fill polygons. They fetch a lot of textures, render targets and so on. These get swapped in and out of memory. With AA, bandwidth consumption goes up even more.

In general, games consume more bandwidth than shader power. This is why the GeForce 9600 GT performs 'almost' like an 8800 GT.

So there you have it, as I said: no new shaders, just new data compression in the G90s and the games using the cards' similar bandwidth.

*There was a graph in the email here showing the Perlin Noise score of the 9600 GT at 59 and the 8800 GT at 116, and another showing the multitextured fillrate of the 9600 GT at 16192 and the 8800 GT's; both 3DMark06 at 1920x1200.

Thank you, nRollo, for that piece. Games today just aren't as shader-intensive as people think.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
There is no mojo going on. The GeForce 8800 GT and 9600 GT use the same stream processors.

The following are synthetic shader and fillrate tests. Note the sizable difference between 8800 GT and 9600 GT.*

What should be noted is that the 8800 GT and 9600 GT have identical bandwidth. Real-world games do not simply run a Perlin noise function or fill polygons. They fetch a lot of textures, render targets and so on. These get swapped in and out of memory. With AA, bandwidth consumption goes up even more.

In general, games consume more bandwidth than shader power. This is why the GeForce 9600 GT performs 'almost' like an 8800 GT.
nVidia's answer is all well and good, but it ignores the following:


- The 9600 GT has higher core and memory clocks than the 8800 GT, and it has more than half the SPs to begin with.
- The 9600 GT tests were run on newer drivers than the rest of the cards.
- People overclocking the 8800 GT's memory bandwidth are seeing next to no performance gain compared to overclocking the core/shader.
- Despite all of this, in shader-intensive games the 8800 GT is 30% faster in some situations over the 9600 GT.

Nvidia is on point with what they described. A game spends more of its bandwidth fetching textures than anything else.

The 9600 GT has the same memory bandwidth as the 8800 GT.

30% faster in rare situations. In the majority of games out today the gap is smaller than that, however.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
expreview used the new 174 WHQL drivers that the 9600 runs on and put them on an 8800 GT (G92). It boosts the 8800 GT's performance as well. The "mystery magical filters" are driver-side, and suitable for all G92s now, possibly via a modded driver INF...

http://www.laptopvideo2go.com/...ex.php?showtopic=17609

So what does it mean? That G92 can do the same thing as G94.

Notice there are no performance numbers posted. Looks like the 174 drivers are slightly tweaked versions of their old ones, much like the 169 drivers were tweaked before them.

1 or 2 odd fps. Yes, they are highly tweaked. :laugh:

Looks good so far, no real improvement in frame rate (the odd 1 or 2 fps). Temps are stable, lower than the 167 series for my laptop though. Unlike the 174.13, I can enter sleep properly, and resume is definitely much faster.
One thing to note: I am now running the RTM of Vista SP1 (from Microsoft's Open Licensing web site). With all the drivers prior to the 174 series (167s, 169s, 171s & 173s) I had errors in the NVIDIA control panel; the 174s seem to have fixed this. Time will tell if this is the norm as and when the full SP1 rollout begins through Windows Update.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
It means that some of the performance you see in a G94 is due to these drivers. We took a modded INF and put it on the 8800 GT, as the INF only includes the GeForce 9600 series. What you're seeing at expreview, if you read the article, is the actual performance jump given by just the drivers: it's like 4-10% on average.

Pair these new drivers with an 8800 GT running 64 SPs and put it up against a 9600 GT with the same clocks, and you get identical performance.

What does that say about the 9 series lineup?

AZN please read this:

http://en.expreview.com/2008/0...orceware-17416-whql/3/
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
It means that some of the performance you see in a G94 is due to these drivers. We took a modded INF and put it on the 8800 GT, as the INF only includes the GeForce 9600 series. What you're seeing at expreview, if you read the article, is the actual performance jump given by just the drivers: it's like 4-10% on average.

Pair these new drivers with an 8800 GT running 64 SPs and put it up against a 9600 GT with the same clocks, and you get identical performance.

What does that say about the 9 series lineup?

Notice the 1 or 2 odd fps. They're tweaked from the old drivers, much like any new drivers, but they're no magic drivers that can compensate for 40% fewer SPs.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Read that link man, it has benchmark numbers for you.

Specifically, check World in Conflict at the point where they enable AA and AF. There is a 28% improvement in framerates when using the new driver set.

This had me confused earlier when I was saying the 9600 series has great advantages when it comes to rendering with AA and AF and how it takes almost no performance hit compared to G80. Now it appears it is identical to G92 as well: G92 = G94.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
Read that link man, it has benchmark numbers for you.

Specifically, check World in Conflict at the point where they enable AA and AF. There is a 28% improvement in framerates when using the new driver set.

This had me confused earlier when I was saying the 9600 series has great advantages when it comes to rendering with AA and AF and how it takes almost no performance hit compared to G80. Now it appears it is identical to G92 as well: G92 = G94.

Didn't I say this before? If you read a few pages back I mentioned this compression technology is available on G92.

I will post it again.

http://www.pcper.com/article.php?aid=522&type=expert

Other than those feature changes, NVIDIA was eager to promote some "new" features of the GeForce 9600 GT. I say "new" like that only because these features ALREADY existed on the G92 cores of the 8800 GT and GTS; they just weren't advertised as heavily. Take this compression technology that allows more efficient transfer of data from memory to the GPU: NVIDIA is comparing it to the G80 in the graph above, not G92.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I don't know why they don't work for me. I click on both links and nothing comes up.

I hope this compression technology is available on my G84 since it's a little more advanced than G80.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
They are comparing it to an 8400 GS; is that G84?

That's a G86, I think: basically a G84 cut down to half, but they should be the same technology-wise. Did the scores improve?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yes: when enabling AA and AF (versus no AA and no AF) in World in Conflict, a 28% performance boost.

Crysis at 1280x1024 with AA enabled got a 17.56% boost.
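For reference, percentage figures like these are just before/after framerate deltas between two runs of the same scene. The fps numbers in this sketch are made up for illustration, not taken from expreview:

```python
# Percentage gain between two benchmark runs of the same scene.
def pct_gain(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps * 100.0

# Hypothetical run: 25 fps on the old driver, 32 fps on the new one.
print(f"{pct_gain(25.0, 32.0):.1f}% boost")  # 28.0% boost
```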


 