lopri
Elite Member
- Jul 27, 2002
COD4 certainly looks like an outlier for whatever reason.
http://www.anandtech.com/video/showdoc.aspx?i=3235&p=2
Originally posted by: dreddfunk
Originally posted by: apoppin
You are late to the party Dredd ... Keys and BFG and a few others are already *doing* just that
Originally posted by: Azn
#1 has been tested already if you read a few pages back, and it's already been proven that you don't need 112 SPs to get near-8800GT performance.
The 9600GT is proof of #1 anyway.
#2 I can't test because I don't have an 8800GT or 9600GT. Yes, I would really appreciate it if you would test it and shut BFG up once and for all.
'Late to the party', 'already been proven'? Hardly. All respect to Keys and BFG (I think it was wonderful for them to do the tests in the first place), but I don't believe their tests are conclusive in any way, which is indicated by the fact that both plan further tests.
However, and this is the point of my posting, I think testing two G80 GPUs is the wrong choice. I'm interested in their results, but their applicability to G92/G94 cards, with their different texture and memory setups, remains tentative.
The issue at hand is how a *G94* card with 64 SPs performs so close to a *G92* card with 112 SPs. The answer to that question isn't going to come from running tests on two G80 cards.
So I suggest coordinating a test between G92/G94 cards while *controlling* for variables (such as the number of shaders and texture units).
I will try to do what I can on my own, as a G92 owner, but someone with a G94 and a computer with similar specs would really need to be in the equation to *prove* anything (I'm on an e2140 @2.4, IP35-E, 3GB HP Value RAM DDR2-667 @600, BFG 8800GT OC @625, monitor up to 1920x1200).
If anyone knows of any downloadable timedemos that seem reasonable to test, I'd be more than happy to give it a go. I don't have many of the latest FPS-style games (I got out of the habit of playing them), but I'm willing to do what I can within reason (i.e. I don't have the money to go out and buy five games just to run benchmarks).
Cheers.
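For the kind of controlled G92/G94 comparison proposed above, the published reference specs already give a back-of-envelope prediction. The sketch below is mine, not from the thread; the unit counts and clocks are the commonly cited reference figures for these two cards, so treat the exact numbers as assumptions:

```python
# Rough throughput ratios for the two cards under discussion, using
# commonly published reference specs (a sketch, not a measurement).

specs = {  # card: (SPs, shader MHz, TMUs, core MHz, memory GB/s)
    "8800GT": (112, 1500, 56, 600, 57.6),
    "9600GT": (64, 1625, 32, 650, 57.6),
}

def ratios(a, b):
    """Ratio of card a's theoretical throughput to card b's, per resource."""
    (sa, ca, ta, ka, ba), (sb, cb, tb, kb, bb) = specs[a], specs[b]
    return {
        "shader":    (sa * ca) / (sb * cb),  # SP count x shader clock
        "texture":   (ta * ka) / (tb * kb),  # TMU count x core clock
        "bandwidth": ba / bb,
    }

r = ratios("8800GT", "9600GT")
# Shader and texture throughput both favor the 8800GT by roughly 1.6x,
# while memory bandwidth is identical. If games were purely shader-bound
# the 8800GT should lead by ~60%, so a much smaller real-world gap would
# point at bandwidth/ROP limits, which is what a controlled test could show.
```

The point of the exercise: whichever resource's ratio best predicts the measured frame-rate gap is the likeliest bottleneck.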
Originally posted by: BFG10K
"Why is it that Sickbeat was able to downclock his SP to 800mhz?"
See, I don't think he was. While you can clock the shaders separately, there's a limit to how far you can change them relative to the core clock, and if you go too far RivaTuner simply resets the clock to the last valid one. That's yet another possible reason why he wasn't seeing a difference.
"Try clocking your SP to 800mhz or so and do the bench again."
I can't. Anyway, I don't need to. Based on my second test, changing nothing but shader clocks (i.e. texturing was not affected), I got a 13% gain from 24% more shader power, or about a 1:2 ratio. Applying that 1:2 ratio to my first test: when I increased my SPs from 64 to 128 (a 100% increase) I got a 56% performance gain, so clearly 50% of that is from shaders while only 6% is from texturing.
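The arithmetic in that ratio argument can be laid out explicitly. This is a sketch of the post's own numbers; the function name is mine:

```python
# Sketch of the scaling argument: what fraction of a resource increase
# actually shows up as a frame-rate gain.

def scaling_efficiency(gain_pct, increase_pct):
    """Frame-rate gain observed per percent of resource added."""
    return gain_pct / increase_pct

# Second test: +24% shader clock gave +13% frame rate, i.e. roughly 1:2.
eff = scaling_efficiency(13, 24)            # ~0.54

# First test: 64 -> 128 SPs is a 100% shader increase. At a 1:2 ratio
# that predicts ~50% gain from shaders alone; the measured gain was 56%.
predicted_shader_gain = 100 * 0.5           # the post rounds 0.54 to 1:2
texture_share = 56 - predicted_shader_gain  # ~6 points left for texturing
```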
"When you disable 1/2 of the SPs on your Ultra, your FP16 texture fillrate is cut to 1/2 as well."
But I've since changed nothing but the shader clock.
"Why don't you provide proof instead by downclocking your SP?"
Err, downclocking the shaders is proof.
"Comparing the 8800GT and 9600GT in BioShock, the performance difference is there, but not the 50% reduction in frame rates you posted."
And? They're comparing two different cores, two different clock speeds, with two different drivers, and likely with different settings to mine.
"But you didn't only demonstrate 128SP to 64SP. You cut your texture fillrate by 1/2 as well."
Which again has been demonstrated to hold true by simply downclocking the shaders.
"What evidence did you bring?"
What part of my results are you having difficulty reading and/or comprehending? Let me know and I'll try to help you out.
"You didn't even have a clue your texture fillrate was cut by half when you disabled your SPs and tested your games."
Neither did you at first. In fact, you were running around telling everyone you expected a zero performance gain until I provided results to the contrary, and you flip-flopped on what you were saying before. Anyway, as my shader clock test demonstrated, the loss of texturing played a minor part in the performance hit.
"Kind of like the 3DMark multi-texture test doesn't matter?"
It doesn't. Come on, show us developer and/or IHV commentary that states modern games will be bottlenecked in the manner your multi-texturing graph demonstrates. If you can't, you need to retract your claim. I've demonstrated my side with reviewer and IHV comments that demonstrate shader bottlenecks.
"And COD4 is a primitive engine?"
Yep. Again, you don't seem to understand the basic notion of bottlenecks and why some test-beds are not useful to prove anything; you just keep running around screaming your "raw texture fillrate is king" mantra.
"Where was my comment about disabling 64 SPs having no performance impact?"
Whenever someone posted they got no performance change.
"I might have said little impact but I did not say NO impact."
Well, you were wrong there too.
"Is this why GeForce 7 gets weak performance and the X1000 series does better in this game, and my budget 8600GTS with its 128-bit bus does as well as a 7900GTX?"
I thought in the other thread you didn't want to discuss old cards? So which is it, Azn: do we talk about old cards or not?
"Why don't you tell Activision that COD4 is primitive?"
I don't need to; I can see from actually playing the game that it lacks many modern features such as FP HDR, and that in places it looks more dated than the first two games.
"Crysis isn't just shader intensive, it is video card intensive."
Riiight, and I suppose Call of Duty 4 is perfect for testing shaders? :roll:
Crysis is one of the most shader-intensive games ever made, if not the most intensive. It ships with 85,000 shaders, for heaven's sake.
Originally posted by: dreddfunk
jared - I think that's true for coming up with a reasonably founded guess about performance, but why stop there? Why not just use RivaTuner to scale back the SPs on an 8800GT and see what we come up with? The truth is, I've no idea how different the G80 architecture is from G92 at the low level of how the SPs operate.
I kind of want to get to the bottom of whether the G94's SPs are much improved, or whether shaders just don't matter as much as I thought. I know specs won't be identical outside of a testing lab, but I feel we could do a lot better than what we're doing right now, 'tis all.
I'll look for the Crysis demo. Does it have a built-in time demo?
[edited for spelling...or lack thereof]
"BFG, all you do is reply with no evidence. You just reply and say yes you did. Pathetic."
I already offered to explain my results to you, but I haven't heard back.
"While Sickbeat downclocked and even disabled 48 of his SPs and saw no noticeable difference."
We've already been over this. Repeatedly.
"You can argue all you want by yourself. I'm sick of your baiting troll tactics."
Ditto.
"Sickbeat already tested his 8800GT in COD4 by disabling 48 of his SPs and even downclocking his SP clocks. He saw no noticeable difference."
Like I said, we've been over this before. Multiple times. The problem is your refusal to accept anything that contradicts your texturing "theories".
"Crysis has a timedemo as well, since BFG thinks it's shader intensive."
I could test Crysis and more games, but why bother? You obviously don't like seeing anything that contradicts your views, so I'd simply be wasting my time.
"I have noticed Azn self-contradicting (when comparing posts from other threads) as well."
Aye, ain't that the truth.
Originally posted by: BFG10K
"I have noticed Azn self-contradicting (when comparing posts from other threads) as well."
Aye, ain't that the truth.
In the other thread now he's trying to tell me my AA settings use texture fillrate (LOL!), when ironically the fact I use AA helps his cause, because it strains memory bandwidth and ROPs, which lessens potential differences from low shader clocks.
Of course he doesn't see it that way.
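The point about AA straining memory bandwidth and ROPs rather than texturing can be illustrated with rough framebuffer arithmetic. This is an illustrative sketch only; it ignores color/Z compression, and the per-sample byte sizes are my assumptions:

```python
# Multisampling multiplies the color/depth samples the ROPs must read and
# write per pixel, so AA cost lands on bandwidth/ROPs, not on the TMUs.

def framebuffer_mb(width, height, aa_samples, bytes_per_sample=8):
    """MB of ROP traffic for one full read+write pass over the framebuffer
    (assuming 4 bytes color + 4 bytes depth/stencil per multisample)."""
    return width * height * aa_samples * bytes_per_sample / 2**20

no_aa = framebuffer_mb(1920, 1200, 1)  # ~17.6 MB touched per pass
aa4x  = framebuffer_mb(1920, 1200, 4)  # ~70.3 MB: 4x the samples to move
```

Texture fetch volume, by contrast, is unchanged by the AA setting, which is why enabling AA shifts the bottleneck away from texturing.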
Originally posted by: bryanW1995
If the 675/2200 rumors are true, then it needs a lot of AA improvements to catch up to the GTX/Ultra at high resolutions. This card could be a big flop just because it's going to cost so much more than the 8800GT for a minimal performance increase. Even Rollo hasn't been bragging about it; he's been talking about the 9800GX2 instead. That should tell us that A) the card is weak, or B) Rollo is trapped under something heavy and can't reach his keyboard. I'm going with A.
Originally posted by: keysplayr2003
Azn & BFG, you both need to calm it down.
BFG, you took that CoD4 experiment way too seriously. It was just a quick look-see. For CoD4 at 1680x1050, utilizing only 64 shaders in an 8800GT or 8800GTS 640 had no effect. That should tell you something more important than what resolution we tried. You completely missed the "little goal" of our little test. And you carry on this way as if you believe we are all a flawed species and stupid.
Not liking it.
Originally posted by: chizow
Wow, such an interesting discussion from something as innocuous as 9600GT SLI performance. Looks like I'm a bit late, but... I think Keys and Azn are onto something with the differences between G80 and G92 TMU comparisons. I've felt ROPs were the biggest determining factor in 8800 performance for some time, but I didn't realize the TMUs were directly tied to shader speeds/clusters as some of these tests seem to indicate. I was pretty convinced TMU performance was tied to core clock, but it seems core clock speeds mainly impact ROP performance.
Originally posted by: BFG10K
"BFG, you took that CoD4 experiment way too seriously."
I took it seriously because Azn believes he is right based on the results, but the results can't be used in such a fashion. It's like me saying games aren't GPU limited and then running GLQuake at 320x240 to prove it. It's great I ran the test, but it doesn't really prove anything.
"You completely missed the 'little goal' of our little test."
What was the goal, may I ask? If it was to confirm what Azn was saying, then it failed. Again, I'm not having a go at you; I'm just explaining to Azn why he can't use that as evidence.
"I was pretty convinced TMU performance was tied to core clock."
It is, because TMUs run at core clocks, not at shader clocks. However, the TMUs are tied to the SPs, so if some are disabled you also disable some TMUs.
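That cluster coupling can be put in numbers. The sketch below is mine; the per-cluster unit counts and clocks are the commonly published figures for G92/G94, so treat them as assumptions rather than thread data:

```python
# On G8x/G9x the TMUs live in the same cluster as the SPs, so disabling
# a cluster removes both its shaders and its texture units at once.

CLUSTERS = {  # card: (clusters, SPs per cluster, TMUs per cluster, core MHz)
    "8800GT (G92)": (7, 16, 8, 600),
    "9600GT (G94)": (4, 16, 8, 650),
}

def effective_units(card, enabled_clusters):
    """SP count and texel rate (GTexels/s) with only some clusters enabled."""
    _, sp, tmu, core = CLUSTERS[card]
    sps = enabled_clusters * sp
    texel_rate = enabled_clusters * tmu * core / 1000.0  # TMUs run at core clock
    return sps, texel_rate

# Disabling 3 of the 8800GT's 7 clusters to mimic a 64-SP part
sps, fill = effective_units("8800GT (G92)", 4)
# 64 SPs remain, but the texel rate drops too, which is why a
# cluster-disabling test conflates shader and texture throughput.
```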
Originally posted by: keysplayr2003
Originally posted by: BFG10K
"BFG, you took that CoD4 experiment way too seriously."
I took it seriously because Azn believes he is right based on the results, but the results can't be used in such a fashion. It's like me saying games aren't GPU limited and then running GLQuake at 320x240 to prove it. It's great I ran the test, but it doesn't really prove anything.
"You completely missed the 'little goal' of our little test."
What was the goal, may I ask? If it was to confirm what Azn was saying, then it failed. Again, I'm not having a go at you; I'm just explaining to Azn why he can't use that as evidence.
The "little goal" was to establish whether or not G94 had an improved architecture over G92. It turns out the number of shader processors is irrelevant if they aren't used. In CoD4, it is obvious that 112 shaders are not needed, and not that G94 had an improved architecture at all. That's all there really was to it. We got what we needed out of it, or at least what I needed.
Moral: just because a GPU has a billion shaders doesn't mean they all get used 100% of the time. The 9600GT's performance was misleading without that knowledge. Now that it is understood, 9600GT performance makes sense and can be justified in our minds.
And yes, your mind is surely thinking of an obvious question. What about other more shader intensive games?
Of course there are games that will need more shader power than the 9600GT has. But again, that was not our goal. Our goal was simply to see whether the 9600GT was an improved architecture over G92, or just a cut-down G92. Nothing more, nothing less.
But then, as always, it morphed into something else I guess.
If you want to conduct tests on every popular game at various resolutions while disabling shaders, that would be interesting, and I would be willing to help out.
I have 2 8800GT's coming by Saturday and a 700W SLI certified PSU by tomorrow.
After seeing 9800GTX specs, there's no more point in waiting.
Originally posted by: apoppin
SLi huh? .. i KNEW you would go for it, Keys!
awesome ... and congratulations
.. although i remain unconvinced that we have seen real GTX specifications
... and you are right, if you want to explore nvidia's multi GPU, there is no need to wait .. i am sure you got a very good deal
i'll 'call' you in May with r700 and up the ante, shall i?