Weyoun:
<< this is just in reply to RoboTech's lack of vision when it comes to getting the full list of t&l games http://www.nvidia.com/products.nsf/xformlight_titles.html >>
oh gawd, have you bothered to look at that list? That is a list of just about every OpenGL game out there. Just because there is "support" for a game doesn't mean it's going to make much of a difference. I mean, lemme tell ya, T&L makes my "Ultimate Bass Fishing" game SOOOOOOOOOOOOO much more realistic. <rolls eyes> Take another look at those titles, and you'll see that, from what I could tell, at least half of them haven't been released yet. So again, how does T&L make a difference in my life? MDK2 and Evolva. Wow. I don't even like Evolva. <-- what's with these ghey smilies anyway? The older ones were cooler....
rstokc:
<< A good comparison "PlanetHardware Video Card Shootout" >>
what made that a good comparison? Looks to me like it should've been renamed "PlanetHardware 3-game Benchmarking shootout"
<< Most people prefer the GF2 over the V5 if the V5 was better overall more people would buy it. >>
well, an innocently idealistic statement like that makes me remember that you're a kid. *Most* people haven't used both cards (yourself included) to do anything more than benchmark a few games.
Weyoun:
<< sorry to ask, but why is the lack of a second TMU the greatest downfall of the V5? >>
Two reasons: it doesn't allow single-pass multitexturing plus trilinear filtering at the same time, and if it had a second pair of TMU's, the texel fillrate would be double what it is now, and the 5500 would be a very nice balance of fillrate vs. bandwidth limitation.
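To put rough numbers on that second point (a back-of-the-envelope sketch; the clock and pipeline figures are the commonly quoted VSA-100 specs, so treat them as assumptions, not gospel):

```python
# Back-of-the-envelope texel fillrate for the Voodoo5 5500.
# Assumed specs: two VSA-100 chips at 166 MHz, two pixel pipelines
# per chip, and one TMU per pipeline as shipped.
CLOCK_MHZ = 166
PIPELINES_PER_CHIP = 2
CHIPS = 2

def texel_fillrate(tmus_per_pipe):
    """Peak texel fillrate in Mtexels/s."""
    return CLOCK_MHZ * PIPELINES_PER_CHIP * CHIPS * tmus_per_pipe

print(texel_fillrate(1))  # as shipped: 664 Mtexels/s
print(texel_fillrate(2))  # hypothetical 2nd TMU per pipe: 1328 Mtexels/s
```

Doubling the TMUs per pipeline doubles the peak texel rate, which is the whole point: the card would then chew through multitextured pixels fast enough to actually press up against its memory bandwidth.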
BFG:
<< I lost a grand total of just under 3 fps zipping the slider to -0.75. Poor me. 130 fps to 127, whatever will I do with such a horrid framerate?
Well given 3dfx are slower than everyone else to begin with, I wouldn't throw away those 3 fps with such reckless abandon >>
What kind of a sillyass comment is that? It's not a dick-measuring contest, is it? Is 127 fps a playable framerate for Quake3, yes or no?
<< wrong. UT is a perfect example.
UT is a poor example of anything. >>
You know BFG, there are people who play that game. You might be surprised. Some of us use our cards to do more than just benchmark. Now then, my POINT, oh sharp-headed one, is that UT is an example where you can see that a good "average" framerate does NOT equate to a good "minimum" framerate. The GTS was a good example of that. Under D3D, I found that the card had a pretty close "average" framerate, yet its minimum framerate was much lower than the 5500's (and its maximum framerate was much higher, obviously).
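The average-vs-minimum distinction is easy to show with a toy calculation (the frame times below are invented for illustration; only the shape of the argument matters):

```python
# Toy illustration of why a good average fps can hide a bad minimum.
# Frame times are in milliseconds and are made-up numbers.
def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) for a list of frame times."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# "Card A": steady frame delivery, never spectacular.
card_a = [12] * 90 + [14] * 10      # ~83 fps mostly, dips to ~71 fps
# "Card B": blazing most of the time, with ugly stalls.
card_b = [8] * 80 + [40] * 20       # 125 fps mostly, dips to 25 fps

avg_a, min_a = fps_stats(card_a)
avg_b, min_b = fps_stats(card_b)
print(avg_a, min_a)   # steady card: decent average, decent minimum
print(avg_b, min_b)   # spiky card: HIGHER average, far worse minimum
```

Card B wins the average-fps benchmark chart while feeling much worse to actually play, which is exactly the UT situation described above.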
<< for $500, the GF Ultra better drop to its knees and blow me.
Hmmm... that's one feature nVidia doesn't have. That's what the V5 6000 is for. I have been told that's what the wall outlet requirement is for. >>
WHOO-HOOO!!!! See, and all you peeps are busting on 3dfx for that thing. HA!!
<< By capping do you mean something like say... turning on VSYNC? And then checking to see which card has the most fluctuations? I believe that is what my whole post was about!!! >>
yes, I read that afterward. Unfortunately, I don't want to try enabling 125 Hz VSync @ 1024x768 on my monitor.
fodd3r:
<< ut is an okay benchmark --though the game sucks, .... it's basically the only one which shows the frame rate under dx and still supports much of the current feature set. >>
er...UT is a poor benchmark for anything other than games that use the Unreal engine. Its feature set is about, what, 2-3 years old now, since it's the same engine as Unreal? And the game kicks ass, but you're right, that *is* another thread.
<< or such as mdk, way too geforce centric to matter >>
it matters to peeps who play MDK2, and it's not "geforce centric", it's an OpenGL game, and nvidia has the best OGL ICD going.
<< And what about the crusher benchmarks? Are they too low weighted as well?
yes they are, because they have a few "trouble" spots, but largely consist of easier frames. a better demo would be one that's edited such that the demo only consists of "trouble" spots. >>
well, you have the start of a good point here. In Quake2, we had the crusher.dm2 and the massive1.dm2 that were great benchmarks. They were constant action, constant stress, and gave you an idea of just how low your framerate would dip. There hasn't been a demo for Quake3 which became the "de facto" standard like crusher.dm2.
When I had both the 5500 and the GTS, I recorded a demo of myself on q3t4 (the harshest map in Vanilla Q3, overall, for graphics cards) fighting 10 bots. It was a race to 50 frags, and of course, I had god mode and all weapons. It was a constant fight. There were no areas in the demo where the framerate climbed noticeably on either card.
When I benchmarked both cards on it, the 5500 and the GTS were less than 3 fps apart at 960 and 1024, about 4 fps @ 1152, and (if memory serves) about 6 or 7 fps @ 1600x1200. Relatively minor, and that was a pretty darn good representation of the *worst* case scenario. When I added my "visual config tweaks", neither the 5500 nor the GTS dropped below about 50 fps. Those tests were done with TC *enabled* on the GTS, BTW.
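fodd3r's demo-composition point is worth a quick sketch: a timedemo that's 95% easy frames dilutes the trouble spots right out of the average. (The per-frame fps numbers below are invented; the 60 fps cutoff is an arbitrary "trouble" threshold for illustration.)

```python
# Sketch: why a mostly-easy demo hides the worst-case framerate.
# 950 easy frames at 120 fps, 50 "trouble" frames at 40 fps (made up).
frame_fps = [120] * 950 + [40] * 50

# Whole-demo average: looks perfectly healthy.
full_avg = sum(frame_fps) / len(frame_fps)

# Demo "edited to only trouble spots": keep frames under 60 fps.
trouble_only = [f for f in frame_fps if f < 60]
trouble_avg = sum(trouble_only) / len(trouble_only)

print(full_avg)     # 116.0 -- what the benchmark chart reports
print(trouble_avg)  # 40.0  -- what you actually feel in a firefight
```

That's why a demo that is constant stress from start to finish, like crusher.dm2 was for Quake2 or the q3t4 bot-match demo above, tells you far more than an average over a cruise through empty hallways.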
<< will the geforces crawl?
of course they will! >>
dude, you're drunk. The only time the 32MB GTS "crawled" was when I had 1024x768xSHQ on quaver with TC off. The 64MB GTS never "crawled".