GeForce 9-series lineup revealed


Quiksilver

Diamond Member
Jul 3, 2005
4,725
0
71
Originally posted by: Rusin
Originally posted by: Quiksilver
If I were to guess I would assume the performance scale would look something like this:

9800GX2: 15 to 30 percent faster than 8800 Ultra.
9800GTX: Equal to if not better than overclocked 8800 Ultra
9800GT: Equal to if not better than overclocked 8800GTS 512
9600GT: Equal to if not better than 3870.
I think that the 9800 GX2 will be more powerful than that. I mean, 9600 GT SLI is already 15-30% faster than the 8800 Ultra in DX9 when using 1600x1200 + 4xAA/16xAF. With similar settings in DX10 we are talking about something like a 10% difference in 9600 GT SLI's favor. What kind of performance will the 9800 GX2 have when it has double the number of SPs and TMUs?

In a perfect world I could see that, but the last dual-GPU solution from nVidia wasn't that great, so I was being realistic with the minimum being only 15% faster in some games and upwards of 30% in others.

Maybe after the drivers mature some it could be 30% to 50% faster, but I just don't think it will be there on launch day.
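To put rough numbers on that kind of estimate, here is a back-of-the-envelope sketch (the fps figures are illustrative placeholders, not benchmark results):

```python
# Back-of-the-envelope dual-GPU estimate. All fps figures are illustrative
# placeholders, not measurements.

def dual_gpu_fps(single_gpu_fps, sli_efficiency):
    """fps of two GPUs, given how much of the second GPU is realized (0..1)."""
    return single_gpu_fps * (1 + sli_efficiency)

gts512_fps = 60.0   # hypothetical single G92 GTS result
ultra_fps = 66.0    # hypothetical 8800 Ultra result in the same test

for eff in (0.4, 0.6, 0.8):   # plausible driver-dependent SLI scaling range
    gx2 = dual_gpu_fps(gts512_fps, eff)
    print(f"efficiency {eff:.0%}: ~{gx2:.0f} fps, {gx2 / ultra_fps - 1:+.0%} vs Ultra")
```

Poor scaling lands right in that 15-30% window; drivers maturing and pushing efficiency up is what would get it to 30-50%.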

http://www.computerbase.de/art...rmancerating_qualitaet
(they have summed overall performance with different settings)
------
http://en.expreview.com/img/2008/02/23/9800GTX.jpg

If these specs for the 9800 GTX hold true, it would mean the 9800 GTX loses to the 8800 GTX and performs on par with the 8800 GTS 512. The difference would be that this card uses a new PCB (adding three-way SLI support).

Well, according to NordicHardware those tables would be wrong, with the GTX being 750+ on the core and the memory speeds unknown.

Personally I'm not expecting much from this release either, since it's pretty much just another refresh and not anything new. I'm just hoping ATI gets its 4000 series into the market before, or just after, nVidia.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
If the 9800GX2 is the flagship 9-series card, then it's logical to think that the 9800GTX and 9800GT are nothing more than respins of the G92 8800GTS and 8800GT. The 8800GTS is already neck and neck with the GTX; if they increase the clock speeds a little and optimize AA a bit (like they did with the 9600GT), then the new 8800GTS (9800GTX) will be faster than the 8800 Ultra, giving nVidia a chance to retire the G80 core entirely. The 8800GT will go through the same changes and end up a little faster; with the AA optimizations, the new 8800GT will be a much better card with AA enabled.

It's quite simple actually: if we know the 9800GX2 is two G92 GTSes on one card, and we also know the GX2 is the flagship card, then it's impossible to think the 9800GTX and 9800GT will be major departures from the 8800GTS and 8800GT.

It's pure, hard logic, not the baseless FUD (wishful thinking) everyone seems to like to spread in these threads.

And enough about the 9600GT; there are no revolutionary changes. NV seems to have optimized AA heavily, but other than that the 9600GT is only competitive because of its higher clocks and possibly a ROP/bandwidth bottleneck in games. Still, in shader-intensive games the 8800GT reigns supreme (well, it ALWAYS reigns supreme). Not to mention the 173 Forceware drivers featured some heavy optimizations (especially in DX10), which means most reviews are not comparing the cards fairly, since they use the old 169 drivers for the 8800GT and the new 173 for the 9600GT. I remember seeing increases in the range of 5 fps for Crysis under DX10 with these new drivers (mind you, this was going from 17 to 22, so it was a nice boost), so that's something else to consider.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
Shaders are the future, but current games aren't shader bound to the point where SPs dictate a bigger performance difference than fillrate and bandwidth.
We've been over this multiple times before and you were proven wrong. Remember the graphs I posted that demonstrated how prolific shaders are in modern games?

What it needs is more ROPs and memory bandwidth.
ROPs and memory bandwidth aren't a factor when shaders are running calculations to generate "rich" pixels.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
BFG, what cards is nvidia planning to release in Q3/Q4? Are they planning another 384 bit card, maybe a 9900gtx or something?
According to rumors it's the GT200 that is supposed to be a true high-end card rather than an SLI gimmick.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: BFG10K
BFG, what cards is nvidia planning to release in Q3/Q4? Are they planning another 384 bit card, maybe a 9900gtx or something?
According to rumors it's the GT200 that is supposed to be a true high-end card rather than an SLI gimmick.

Okay, great, I'm not alone in thinking this is an SLI gimmick. What crap. I was hoping the 9800gtx would be ownage. Now it's looking like going from an 8800gts512 --> 9800gtx512 is the same as going from an e6850 ---> e8400 (basically nothing).
 

lopri

Elite Member
Jul 27, 2002
13,221
612
126
Yup. It looks like B3D folks have already figured it out.

http://forum.beyond3d.com/showthread.php?t=46800&page=2

New Flagship from NV: 9800GX2 (dual G92)
9800 GTX / 9800 GT: new-stepping G92 with higher clocks and possibly a different memory config
GT200: the true next-gen single-GPU solution, which won't show up in the near future

Some even speculate that NV originally planned to launch the GT200, but something went wrong and it had to make do with the G92, which was supposed to be a mid-range part had the G100 (now called GT200) launched as planned.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
Shaders are the future, but current games aren't shader bound to the point where SPs dictate a bigger performance difference than fillrate and bandwidth.
We've been over this multiple times before and you were proven wrong. Remember the graphs I posted that demonstrated how prolific shaders are in modern games?

I think you have it mixed up. You were proven wrong when the GTX still beat the G92 GTS despite its higher shader clocks. And your graph, yes, it does show that shaders make a difference, but then again it was three years old, based on old cards with weak shader capabilities.

What it needs is more ROPs and memory bandwidth.
ROPs and memory bandwidth aren't a factor when shaders are running calculations to generate "rich" pixels.

You are telling me pixel fillrate doesn't matter, much like you claimed memory bandwidth doesn't matter. Is that why the 8800gtx still beats the G92?
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: jaredpace


Okay, great, I'm not alone in thinking this is an SLI gimmick. What crap. I was hoping the 9800gtx would be ownage. Now it's looking like going from an 8800gts512 --> 9800gtx512 is the same as going from an e6850 ---> e8400 (basically nothing).
To be accurate, this would be a smaller change than that Core 2 one; the E8400 at least uses less power than the E6850 thanks to the die shrink.

I'd say it's more like the Radeon X1600XT -> X1650 Pro change was.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Dadofamunky
Originally posted by: Quiksilver
.
If I were to guess I would assume the performance scale would look something like this:

9800GX2: 15 to 30 percent faster than 8800 Ultra.
9800GTX: Equal to if not better than overclocked 8800 Ultra
9800GT: Equal to if not better than overclocked 8800GTS 512
9600GT: Equal to if not better than 3870.

Man, if it shakes out that way, I can't say that's a big product refresh. The 9600GT looks like the only real winner in the lot. It could leave an opportunity for AMD to take some market share at the higher end, if only after they nail down their 3870X2 drivers. But then of course we'll see. $449 for 85-90% of 8800U performance sounds a lot better to me than the premiums NV will ask for the 9800GTX/GX2.

Of course, you'd NEVER see NV give its user base a break and cut loose their GTX inventory on sale price. I'd consider one but not at their current levels.

Of COURSE, AMD just slashed their prices on 3870s, so I overpaid $70 on my damn card just a month after I installed it. Grrrr. :disgust:

I'm not too pissed; I got mine for $225 shipped from Newegg on launch day... although I do wish that I had waited the three months and gotten that 8800gt MSI OC version with the dual-slot cooler. I just really hate these single-slot cooling designs; they're louder and don't exhaust the heat out of the case.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Rusin
Originally posted by: Quiksilver
If I were to guess I would assume the performance scale would look something like this:

9800GX2: 15 to 30 percent faster than 8800 Ultra.
9800GTX: Equal to if not better than overclocked 8800 Ultra
9800GT: Equal to if not better than overclocked 8800GTS 512
9600GT: Equal to if not better than 3870.
I think that the 9800 GX2 will be more powerful than that. I mean, 9600 GT SLI is already 15-30% faster than the 8800 Ultra in DX9 when using 1600x1200 + 4xAA/16xAF. With similar settings in DX10 we are talking about something like a 10% difference in 9600 GT SLI's favor. What kind of performance will the 9800 GX2 have when it has double the number of SPs and TMUs?

http://www.computerbase.de/art...rmancerating_qualitaet
(they have summed overall performance with different settings)
------
http://en.expreview.com/img/2008/02/23/9800GTX.jpg

If these specs for the 9800 GTX hold true, it would mean the 9800 GTX loses to the 8800 GTX and performs on par with the 8800 GTS 512. The difference would be that this card uses a new PCB (adding three-way SLI support).

bfg might go to nvidia hq and piss on their sign or something if that happens...

ok, just read the gt200 comments. So gt200 is really going to be what g100 was supposed to be??? wow... what about r700? Is that going to be a BS lineup revamp, too, or is amd at least going to give us something to sink our teeth into?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
You were proven wrong when the GTX still beat the G92 GTS despite its higher shader clocks.
But in many cases the GTS 512 MB is faster. I also posted multiple graphs in the last thread that demonstrated this fact but you continually ignored them.

And your graph, yes, it does show that shaders make a difference, but then again it was three years old, based on old cards with weak shader capabilities.
What are you talking about? The graphs demonstrated the shader/texture ratio in games and also demonstrated the percentage of games that use shaders.

They had absolutely nothing to do with a particular video card; they were all about games.

I asked you last time what part of the graphs you didn't understand but you didn't respond. Clearly you still don't understand what the graphs show, nor do you understand the significance of the ratio above.

You are telling me pixel fillrate doesn't matter, much like you claimed memory bandwidth doesn't matter.
No, I'm saying if the game is shader bound (which most are) then shader performance will be the primary bottleneck, not ROPs, memory bandwidth or fillrate.

Is that why the 8800gtx still beats the G92?
Because in some situations its more powerful shader capabilities are not enough to offset its other disadvantages, but the fact that it even wins so often despite having grossly inferior memory bandwidth to a GTX/Ultra should tell you the problem isn't usually memory bandwidth.
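To make the "shader bound" argument concrete, here is a minimal sketch of the idea (a toy model with made-up stage times; real GPUs overlap these units, so this is not actual scheduling):

```python
# Toy "slowest stage wins" frame-time model, a sketch of the bottleneck
# argument rather than how any real GPU schedules work.

def frame_time_ms(shader_ms, texture_ms, bandwidth_ms, rop_ms):
    # A frame can't finish faster than its most loaded unit allows.
    return max(shader_ms, texture_ms, bandwidth_ms, rop_ms)

# Shader-bound scene: doubling texture throughput changes nothing...
print(frame_time_ms(shader_ms=20, texture_ms=8, bandwidth_ms=10, rop_ms=6))  # 20
print(frame_time_ms(shader_ms=20, texture_ms=4, bandwidth_ms=10, rop_ms=6))  # 20
# ...while improving the shader stage is what actually moves the frame time.
print(frame_time_ms(shader_ms=12, texture_ms=8, bandwidth_ms=10, rop_ms=6))  # 12
```

Whichever term of the max() dominates is the bottleneck; improving any other term does nothing, which is the sense in which "shader bound" is being used here.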
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
You were proven wrong when the GTX still beat the G92 GTS despite its higher shader clocks.
But in many cases the GTS 512 MB is faster. I also posted multiple graphs in the last thread that demonstrated this fact but you continually ignored them.

You always seem to block out what was said. The G92 GTS does well because it does 8 textures per clock for every 16 SPs, while the G80 did 4 textures per clock.


And your graph, yes, it does show that shaders make a difference, but then again it was three years old, based on old cards with weak shader capabilities.
What are you talking about? The graphs demonstrated the shader/texture ratio in games and also demonstrated the percentage of games that use shaders.

They had absolutely nothing to do with a particular video card; they were all about games.

I asked you last time what part of the graphs you didn't understand but you didn't respond. Clearly you still don't understand what the graphs show, nor do you understand the significance of the ratio above.

I remember that graph showed us nothing other than that older games were using shader effects.

Would you mind posting the graph again?

You are telling me pixel fillrate doesn't matter, much like you claimed memory bandwidth doesn't matter.
No, I'm saying if the game is shader bound (which most are) then shader performance will be the primary bottleneck, not ROPs, memory bandwidth or fillrate.

That really depends on the game. Would you mind telling me how the G94 does so well against the G92?

Is that why the 8800gtx still beats the G92?
Because in some situations its more powerful shader capabilities are not enough to offset its other disadvantages, but the fact that it even wins so often despite having grossly inferior memory bandwidth to a GTX/Ultra should tell you the problem isn't usually memory bandwidth.

Now you are getting it. SPs alone just aren't enough. The G92 GTS sometimes wins at the lower resolutions because it has 64 TMUs vs. the 32 TMUs on the GTX. Now if it had more memory bandwidth it would be able to use all of that texture fillrate and crush the GTX.
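For reference, the paper specs behind this comparison (commonly cited numbers; note the G80's 64 filtering units can only address 32 texels per clock, hence 32 below):

```python
# Paper-spec throughput comparison (commonly cited numbers).

def gtexels_s(tex_per_clock, core_mhz):   # texture fillrate, GTexels/s
    return tex_per_clock * core_mhz / 1000.0

def gpixels_s(rops, core_mhz):            # pixel fillrate, GPixels/s
    return rops * core_mhz / 1000.0

def gb_s(bus_bits, effective_mem_mhz):    # memory bandwidth, GB/s
    return bus_bits / 8 * effective_mem_mhz / 1000.0

# 8800 GTX (G80): 575 MHz core, 32 texels/clock, 24 ROPs, 384-bit @ 1800 MHz
print(gtexels_s(32, 575), gpixels_s(24, 575), gb_s(384, 1800))   # 18.4 13.8 86.4
# 8800 GTS 512 (G92): 650 MHz core, 64 texels/clock, 16 ROPs, 256-bit @ 1940 MHz
print(gtexels_s(64, 650), gpixels_s(16, 650), gb_s(256, 1940))   # 41.6 10.4 62.08
```

On paper the GTS 512 more than doubles the GTX's texture rate while giving up roughly a quarter of its pixel fillrate and 28% of its bandwidth, which is precisely the trade being argued over here.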
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
The G92 GTS does well because it does 8 textures per clock for every 16 SPs, while the G80 did 4 textures per clock.
Which is not even a factor when using FP rendering, which pretty much every modern game uses if it's doing HDR.

And again you need to post evidence of your claims. 3DMark multitexturing tests are not evidence. They are meaningless to games.

I remember that graph showed us nothing other than that older games were using shader effects.
The graph was showing an upward trend in shader usage; if you think shaders are decreasing in modern games, you need to provide evidence of that claim. Otherwise you need to retract your fallacious reasoning.

Would you mind posting the graph again?
No, I'm not going to post it again. You need to answer the questions that were asked of you in the first thread.

Would you mind telling me how the G94 does so well against the G92?
That is currently unknown. It could be core changes, but it could also be driver changes in 17x.xx, which would require testing on old cards to confirm.

Now you are getting it. SPs alone just aren't enough. The G92 GTS sometimes wins at the lower resolutions because it has 64 TMUs vs. the 32 TMUs on the GTX.
You need to stop parroting your texturing mantra as it's quite tiresome.

- Fact 1: the shader/texturing ratio is increasing in games. That means for every texturing operation there are many more shader operations, with the trend increasing in favor of shading.

- Fact 2: shader usage is increasing in games, not decreasing.

- Fact 3: 3DMark multitexturing tests are meaningless because modern games are not texture bound or texture bottlenecked.

- Fact 4: the texturing advantage of the G9x over the G80 is not a factor in FP rendering, which again is on the rise. Most if not all commonly benchmarked games use this rendering method to provide HDR.

Now if it had more memory bandwidth it would be able to use all of that texture fillrate and crush the GTX.
Uh, no, given that texturing fillrate isn't the main issue here, especially not in games using FP rendering.
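A sketch of why the FP point matters, using the commonly cited unit counts (and assuming FP16 bilinear takes two filtering cycles, the usual figure for these parts):

```python
# Effective texels per clock given separate addressing and filtering units.

def texels_per_clock(address_units, filter_units, filter_cycles):
    # Addressing and filtering both gate throughput; the narrower one wins.
    return min(address_units, filter_units // filter_cycles)

# G80 (8800 GTX): 32 address / 64 filter. G92 (GTS 512): 64 address / 64 filter.
print(texels_per_clock(32, 64, 1), texels_per_clock(32, 64, 2))  # 32 32 -> no FP16 penalty
print(texels_per_clock(64, 64, 1), texels_per_clock(64, 64, 2))  # 64 32 -> halved under FP16
```

So per clock the G92's 2:1 texturing edge evaporates once the surface format is FP16, leaving only its higher core clock.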
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
The G92 GTS does well because it does 8 textures per clock for every 16 SPs, while the G80 did 4 textures per clock.
Which is not even a factor when using FP rendering, which pretty much every modern game uses if it's doing HDR.

And again you need to post evidence of your claims. 3DMark multitexturing tests are not evidence. They are meaningless to games.

So all games are using FP rendering in every scene? Do you have evidence of this? Are you a game developer?

Why aren't 3DMark multitexturing tests evidence?

http://images.vnu.net/gb/inqui...-dx10-hit/fillrate.jpg

I remember that graph showed us nothing other than that older games were using shader effects.
The graph was showing an upward trend in shader usage; if you think shaders are decreasing in modern games, you need to provide evidence of that claim. Otherwise you need to retract your fallacious reasoning.

You always accuse people of things that were never said. Show me where I said shaders are decreasing in modern games? You need to link me to what was said first so I can retract it.


Would you mind posting the graph again?
No, I'm not going to post it again. You need to answer the questions that were asked of you in the first thread.

How am I supposed to answer you when I don't even remember much of that graph? Either link to your claims again or it's a dead subject.

Would you mind telling me how the G94 does so well against the G92?
That is currently unknown. It could be core changes, but it could also be driver changes in 17x.xx, which would require testing on old cards to confirm.

So Nvidia wrote some magical drivers? Some uber tweaks that make huge gains? But according to you shaders are what matters. In this case it doesn't matter as much, does it?


Now you are getting it. SPs alone just aren't enough. The G92 GTS sometimes wins at the lower resolutions because it has 64 TMUs vs. the 32 TMUs on the GTX.
You need to stop parroting your texturing mantra as it's quite tiresome.

- Fact 1: the shader/texturing ratio is increasing in games. That means for every texturing operation there are many more shader operations, with the trend increasing in favor of shading.

- Fact 2: shader usage is increasing in games, not decreasing.

- Fact 3: 3DMark multitexturing tests are meaningless because modern games are not texture bound or texture bottlenecked.

- Fact 4: the texturing advantage of the G9x over the G80 is not a factor in FP rendering, which again is on the rise. Most if not all commonly benchmarked games use this rendering method to provide HDR.

Now if it had more memory bandwidth it would be able to use all of that texture fillrate and crush the GTX.
Uh, no, given that texturing fillrate isn't the main issue here, especially not in games using FP rendering.

Sorry, fillrate is still king as long as it's not bottlenecked by bandwidth or shader bound...

This is evident in the 9600gt and in the graph I linked above.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
So all games are using FP rendering in every scene?
That depends on whether there's HDR in the scene, and in today's commonly benchmarked titles there's a high chance there is. This trend will only become more common, thereby making your texturing claims even more of a non-factor.

Why aren't 3DMark multitexturing tests evidence?
Because 3DMark is meaningless to games. In order to make that result meaningful you'd have to demonstrate games are bottlenecked by texturing but so far you haven't done that at all.

You always accuse people of things that were never said. Show me where I said shaders are decreasing in modern games?
Let me quote you again:

I remember that graph showed us nothing other than that older games were using shader effects.
You made the comment about "older games", as if they're somehow not relevant to new games. So tell us then: do you think shaders are more or less prominent in modern games?

How am I supposed to answer you when I don't even remember much of that graph?
This is the last time I'm linking them for you, and if you don't address them you will be reported for trolling.

http://www.beyond3d.com/content/reviews/2/2

But according to you shaders are what matters. In this case it doesn't matter as much, does it?
If the driver improves shader performance, of course it matters. What do you suppose was the point of those thousands of shaders nVidia wrote for substitution in the GF5/GF6 series, hmm?

Why bother when according to you all they needed to do was increase the texture fillrate and bandwidth?

In fact the 5950 had much higher bandwidth and texturing throughput than the 9800, yet it still lost in the modern games of the time.

Way back in 2002 ATi had already spotted the trend toward shaders and away from texturing, which is why they adopted an 8x1 configuration for the 9700 Pro.

Sorry, fillrate is still king as long as it's not bottlenecked by bandwidth or shader bound...
But it will be bottlenecked by shaders, and that's the point. If the pixel shader is looping around and creating rich "pixels", fillrate means precisely squat, since you can't "fill" anything until they're finished.

Likewise, if they're pure ALU calculations they aren't even touching the memory.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
It's hard to isolate or even pinpoint the exact bottlenecks and then claim that it's because one card has more texturing power or shader performance, etc. It could be a combination of a lot of things.

Simply put, ATI's RV670 has more arithmetic power than the fastest G80/G92/G94; an RV670 has twice the GFLOPs of a full G94. It's very fast at pure math crunching, well suited to research work but of less use in real-world gaming scenarios, since there are drawbacks in how the ALUs are "set up" in the RV670/R600, namely its VLIW architecture and vec5-ish ALU arrangement.

Now, we all know the shader/texture ratio has been increasing. But currently, IMO, it's increasing in a linear fashion, not exponentially. People seem to forget the kind of performance hit an RV670/R600 takes when enabling 16xAF, due to its weak texturing performance against competition that pretty much overkills in that area. A good example is the 9600GT against the RV670: the 9600GT takes less of an AA/AF hit than the RV670. Its superior texturing performance and hardware AA are what allow the G94 to keep up with a full-fledged RV670 most of the time. I still haven't seen a game where the ALU/tex ratio benefits AMD/ATI's architecture (even though theirs is much higher than nVIDIA's).

I think the 8800GTS/GT are both limited primarily by bandwidth. Situations where the G92 beats a G80 should definitely be shader bound, or maybe texture bound (hardly likely, since the G80 still packs quite a bit of texturing power). If they pair the 9800GTX with more bandwidth, it will be consistently faster than the Ultra (given its leaked core/shader clocks of 675/1688).
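For reference, the back-of-the-envelope math behind that GFLOPs claim (paper specs; shown two ways since the G8x/G9x co-issued MUL is rarely usable in real shaders):

```python
# Peak-GFLOPs paper math for RV670 vs. G94.

def gflops(alus, flops_per_alu_per_clock, shader_mhz):
    return alus * flops_per_alu_per_clock * shader_mhz / 1000.0

rv670_gflops = gflops(320, 2, 775)     # HD 3870: 64 vec5 units = 320 ALUs, MADD = 2 flops
g94_madd     = gflops(64, 2, 1625)     # 9600 GT, counting the MADD only
g94_madd_mul = gflops(64, 3, 1625)     # 9600 GT, counting the co-issued MUL too

print(rv670_gflops, g94_madd, g94_madd_mul)   # 496.0 208.0 312.0
```

By that paper math the "twice the GFLOPs" claim holds against the MADD-only count; in practice the VLIW packing issue mentioned above is exactly why the RV670 struggles to turn that peak into wins.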



 

Rumple

Member
Oct 4, 2004
128
0
0
Please god let the 9800s come out before March 15th... that's when my Step-Up through EVGA expires!!! ARGH!!!
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Cookie Monster
It's hard to isolate or even pinpoint the exact bottlenecks and then claim that it's because one card has more texturing power or shader performance, etc. It could be a combination of a lot of things.

Simply put, ATI's RV670 has more arithmetic power than the fastest G80/G92/G94; an RV670 has twice the GFLOPs of a full G94. It's very fast at pure math crunching, well suited to research work but of less use in real-world gaming scenarios, since there are drawbacks in how the ALUs are "set up" in the RV670/R600, namely its VLIW architecture and vec5-ish ALU arrangement.

Now, we all know the shader/texture ratio has been increasing. But currently, IMO, it's increasing in a linear fashion, not exponentially. People seem to forget the kind of performance hit an RV670/R600 takes when enabling 16xAF, due to its weak texturing performance against competition that pretty much overkills in that area. A good example is the 9600GT against the RV670: the 9600GT takes less of an AA/AF hit than the RV670. Its superior texturing performance and hardware AA are what allow the G94 to keep up with a full-fledged RV670 most of the time. I still haven't seen a game where the ALU/tex ratio benefits AMD/ATI's architecture (even though theirs is much higher than nVIDIA's).

I think the 8800GTS/GT are both limited primarily by bandwidth. Situations where the G92 beats a G80 should definitely be shader bound, or maybe texture bound (hardly likely, since the G80 still packs quite a bit of texturing power). If they pair the 9800GTX with more bandwidth, it will be consistently faster than the Ultra (given its leaked core/shader clocks of 675/1688).

Thank you Cookie. You always explain it better than me.

I agree with why the 9600gt performs the way it does, and that's pretty much why the G92 beats the GTX when the game is shader or texture bound; it's both. But those situations only happen where memory bandwidth isn't an issue.

According to BFG, however, shaders dictate game performance and create "rich" pixels, while memory bandwidth and fillrate don't matter.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
So all games are using FP rendering in every scene?
That depends on whether there's HDR in the scene, and in today's commonly benchmarked titles there's a high chance there is. This trend will only become more common, thereby making your texturing claims even more of a non-factor.

So would that be a yes or a no for every part of a game?

Why aren't 3DMark multitexturing tests evidence?
Because 3DMark is meaningless to games. In order to make that result meaningful you'd have to demonstrate games are bottlenecked by texturing but so far you haven't done that at all.

It's meaningless to games? It's a tool to measure what the card is capable of. I wouldn't call it meaningless.

You always accuse people of things that were never said. Show me where I said shaders are decreasing in modern games?
Let me quote you again:

I remember that graph showed us nothing other than that older games were using shader effects.
You made the comment about "older games", as if they're somehow not relevant to new games. So tell us then: do you think shaders are more or less prominent in modern games?

You always blow things out of context. What I meant is that the graph you linked a few months back showed that shaders were used even in older games. I never said shaders were decreasing in modern games.


How am I supposed to answer you when I don't even remember much of that graph?
This is the last time I'm linking them for you, and if you don't address them you will be reported for trolling.

http://www.beyond3d.com/content/reviews/2/2

Exactly; that graph showed that old games used shaders. Very little, but still. That's what I meant.

Yes, please do report me. You reply to my messages; I'm replying back. If not, don't reply to my posts. Fair enough? BTW, are you a mod yet?


But according to you shaders are what matters. In this case it doesn't matter as much, does it?
If the driver improves shader performance, of course it matters. What do you suppose was the point of those thousands of shaders nVidia wrote for substitution in the GF5/GF6 series, hmm?

Why bother when according to you all they needed to do was increase the texture fillrate and bandwidth?

In fact the 5950 had much higher bandwidth and texturing throughput than the 9800, yet it still lost in the modern games of the time.

Way back in 2002 ATi had already spotted the trend toward shaders and away from texturing, which is why they adopted an 8x1 configuration for the 9700 Pro.

The GeForce FX series was a disaster. First off, they were what, 4-pipe/8-TMU cards that started with a 128-bit memory bus and later added a 256-bit one? The Radeon was an 8-pipe/8-TMU card with 256-bit memory. Do you see the advantage the Radeon had over the FX series? You are right about the FX series having higher texture fillrate, but its pixel fillrate was much lower than the 9800 Pro's, not to mention the wider bus. Let's not forget that more ROPs give higher performance at higher resolutions and help with AA.

Why are you talking about the FX vs. 9800 series again? It has no relevance to current games and GPUs.

I never said all you need to do is increase the texture fillrate. I believe that as long as a card is not being held back by shaders or bandwidth, fillrate is king, and that includes both pixel and texture fillrate.

There's no such thing as magic drivers and uber tweaks. Improvements come with time and hard work, much like anything else.

Sorry, fillrate is still king as long as it's not bottlenecked by bandwidth or shader bound...
But it will be bottlenecked by shaders, and that's the point. If the pixel shader is looping around and creating rich "pixels", fillrate means precisely squat, since you can't "fill" anything until they're finished.

Likewise, if they're pure ALU calculations they aren't even touching the memory.

Then explain the 9600gt phenomenon. You say shaders create "rich" pixels, but current games aren't shader bound on modern GPUs. Maybe on the GeForce 7, but not the GeForce 8... the 9600gt pretty much proves this. Maybe in the future the 8800gt will take a substantial lead over the 9600gt, but currently the gap is about 10-30% depending on the game.

 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
"GeForce 9800GTX will be the fastest single-core graphics card from chip maker NVIDIA, when it arrives before the end of next month. The rumors have been speaking of core frequencies all the way up to 850MHz, which was way off. According to a post over at Expreview, GeForce 9800GTX will sport no more than 675MHz core frequency, which is basically slightly faster than the GeForce 8800GTS 512MB.

This means that the shaders operate at a frequency of 1688MHz and the memory have been upgraded to 1100MHz GDDR3 (2200MHz DDR). The amount of memory is still 512MB and since this is just a new flavor or the G92 chip, it sports a 256-bit bus, which in no way will limit performance. They also confirm that this is the G92-420 chip we've been reporting about."

Not great specs, but not bad. It should OC to ~800/1900/2300. Maybe close to 1000 core with a vmod. Can anyone figure out how fast that will be based on the 9600GT's performance?
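One naive way to frame that question from the quoted specs alone (assuming performance scales linearly with clocks, which it won't wherever a game is CPU bound or limited elsewhere):

```python
# Naive scaling estimate from the quoted 9800 GTX specs.

def gb_s(bus_bits, effective_mem_mhz):
    return bus_bits / 8 * effective_mem_mhz / 1000.0

print(gb_s(256, 2200))   # 9800 GTX as quoted:  70.4 GB/s
print(gb_s(256, 1940))   # 8800 GTS 512:        62.08 GB/s

core_gain = 675 / 650 - 1    # vs. the GTS 512's core clock
mem_gain  = 2200 / 1940 - 1
print(f"core {core_gain:+.1%}, memory {mem_gain:+.1%}")   # core +3.8%, memory +13.4%
```

In other words, on stock clocks the quoted card is a GTS 512 with about 4% more core and 13% more bandwidth, so large gains should only show up where bandwidth was the limiter.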

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Shaq
"GeForce 9800GTX will be the fastest single-core graphics card from chip maker NVIDIA, when it arrives before the end of next month. The rumors have been speaking of core frequencies all the way up to 850MHz, which was way off. According to a post over at Expreview, GeForce 9800GTX will sport no more than 675MHz core frequency, which is basically slightly faster than the GeForce 8800GTS 512MB.

This means that the shaders operate at a frequency of 1688MHz and the memory have been upgraded to 1100MHz GDDR3 (2200MHz DDR). The amount of memory is still 512MB and since this is just a new flavor or the G92 chip, it sports a 256-bit bus, which in no way will limit performance. They also confirm that this is the G92-420 chip we've been reporting about."

Not great specs, but not bad. It should OC to ~800/1900/2300. Maybe close to 1000 core with a vmod. Can anyone figure out how fast that will be based on the 9600GT's performance?

hopefully A LOT faster than any of these overclocked Nvidia cards:
http://www.jmax-hardware.com/i...&limit=1&limitstart=13
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
OK... apparently nvstrap is for Win2k and XP only... no Vista or Vista 64-bit availability...
I tried running the game at 1920x1200 with everything maxed and AA disabled, and I was getting a solid 20fps with occasional spikes. Underclocking only the shaders, from 1625 to 815, changed NOTHING; I was getting the exact same framerate.

So either shader clocks don't make a difference for CoH (at least on the map I was looking at)... or the underclock didn't stick... but I did get a slightly lower test score from the timedemo... so meh.

I guess I was wrong, at least in this game; I apologize for my harsh words earlier. At least in this game the shaders make no difference.
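One way to sanity-check an experiment like this is to compare the relative clock change against the relative score change (the scores below are hypothetical stand-ins, not the actual numbers from the run above):

```python
# Compare relative clock change vs. relative score change. Scores are
# hypothetical stand-ins for a CoH timedemo run, not measured results.

baseline_clock, test_clock = 1625, 815
baseline_score, test_score = 3500, 3450

clock_delta = test_clock / baseline_clock - 1    # about -50%
score_delta = test_score / baseline_score - 1    # about -1.4%

# If halving the shader clock costs ~1% of the score, that scene is
# bottlenecked elsewhere (fillrate, bandwidth, or the CPU).
print(f"clock {clock_delta:+.1%}, score {score_delta:+.1%}")
```

By that yardstick the result above reads as "not shader bound here", with the small timedemo dip plausibly inside run-to-run noise.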
 

Quiksilver

Diamond Member
Jul 3, 2005
4,725
0
71
Just noticed NH updated their specs for the GTX (675/2200) rather than the original 750+/unknown, which now makes me disappointed. Unless they make a ton of performance tweaks to the card to make it worth it, it might make more sense just to buy one of these: http://www.gainward.net/produc...il.php?products_id=155 since it's so very close to the 9800GTX specs.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Quiksilver
Just noticed NH updated their specs for the GTX (675/2200) rather than the original 750+/unknown, which now makes me disappointed. Unless they make a ton of performance tweaks to the card to make it worth it, it might make more sense just to buy one of these: http://www.gainward.net/produc...il.php?products_id=155 since it's so very close to the 9800GTX specs.

I would love to see a 9800gtx vs. that card.

It's looking like if you were lucky enough to purchase the $254 8800GTS 512 deal at Newegg, you could be in good shape come March 11. Two of those, overclocked and running in SLI, would be about 500 bucks and pretty hard to beat, correct?
 