Radeon HD 2900 XT or Nvidia 8800 GTS 640MB


SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
first of all, what mod-ness?


secondly i'd like you to please show me something other than your opinion that says this [nonsense to me]:
the fact that it's a quasi-high-end card that pumps out a lot of heat. There is no denying that the 2900XT is the hottest graphics card on the market (and not in the ghetto-fabulous slang way - the thing is genuinely HOT).

i don't believe it is "the hottest" ... again ... please ... show me where you got that "fact" from

the card itself is NOT blazing hot ... it would NOT ignite a fire - NO WAY!
--you do have a good imagination, however
You're no longer a mod? :Q



Was it because of that whole debacle with Creig? You said you didn't like being a mod...was it that?

The 2900XT *is* the hottest. Please recycle my old link. I cannot bear to post it a third time to make the same point. The 8800GTX is the second hottest graphics card. Can you think of *any* graphics card that could be considered 'competition' for hotness? I really want to know what you think is hotter than a 2900XT. You said yourself you wouldn't hold your finger on one for long. :light:

If I could ignite a fire with my 8800GTS would that change your mind? The 2900XT is *hotter*. :evil:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
Originally posted by: apoppin
first of all, what mod-ness?


secondly i'd like you to please show me something other than your opinion that says this [nonsense to me]:
the fact that it's a quasi-high-end card that pumps out a lot of heat. There is no denying that the 2900XT is the hottest graphics card on the market (and not in the ghetto-fabulous slang way - the thing is genuinely HOT).

i don't believe it is "the hottest" ... again ... please ... show me where you got that "fact" from

the card itself is NOT blazing hot ... it would NOT ignite a fire - NO WAY!
--you do have a good imagination, however
You're no longer a mod? :Q



Was it because of that whole debacle with Creig? You said you didn't like being a mod...was it that?

The 2900XT *is* the hottest. Please recycle my old link. I cannot bear to post it a third time to make the same point. The 8800GTX is the second hottest graphics card. Can you think of *any* graphics card that could be considered 'competition' for hotness? I really want to know what you think is hotter than a 2900XT. You said yourself you wouldn't hold your finger on one for long. :light:

If I could ignite a fire with my 8800GTS would that change your mind? The 2900XT is *hotter*. :evil:

nope ... it has nothing to do with Creig - or any video/cpu poster -whatsoever. it had everything to do with my feeling unable to post freely. i find it impossible to moderate threads i am participating in.
-so it's "me" again


i don't think you can ignite a fire with a properly running 8800GTS or 2900XT - unless the HSF fails or you stop it; i don't think there is a GPU 'thermal-shutdown' like with CPUs ... but then it will just internally meltdown and the card goes dead ... i had that happen with my 9800xt

ANYWay, from what i have researched ... it isn't hotter; that is WHY i want to see your links and where you get that as fact.

almost forgot ... to answer your last question ... sure ... make a video of you starting a fire with your GTS
:evil:
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Yeah but how does it affect most users aside from seeing a higher number on the 'GPU thermometer'?
Higher thermals usually lead to more noise due to more cooling required.

And weren't you the guy worrying about an extra $2 a month so he could buy some coca-cola?

I don't care about power consumption per-se, what I care about is how it translates to noise.

what are you talking about? *Show me the links!* i missed the "far higher" and "significant difference" you claim for thermal differences between r600 and g80.
Click.

At load the 2900 uses 50% more power than the 8800 GTS so there's no real comparison there. That kind of power consumption visibly translates to more heat and more noise.

The 8800 Ultra is probably quite close but from the tests done online it's still quieter and vastly faster than the 2900 when running AA so if you're going to have a noisy card you may as well get that one.

It's not necessarily allowing it. It's that ATI has it written into the driver differently.
Uh, what? ATi haven't written it into the driver because there's no profile to allow those games to use AA. You can try to use Oblivion's profile but that's unusably slow, not to mention there's no guarantee the game will be glitch-free.

I don't think ATi implements AA in those games because it forces reviewers to test without it and it puts ATi in a better light. If AA was tested ATi's cards would likely tank compared to nVidia.

I also played a game called Final Fantasy XI which had numerous reports of Nvidia's drivers leaking texture memory and dropping down to 10fps after a few minutes of play. This occurred with all models of 8800 from the GTS320 to the Ultra. They all did it.
The Alt-Tab issue has now been fixed according to the driver release notes.

BFG10K - I'm confused - I thought that in DX10 mode, UT3 games couldn't do AA, period - due to a specific type of shading being used. That's what I THOUGHT anyway. If you mean DX9 mode, then you may be right, I have no clue, but last I knew, there was no AA for Bioshock, RS: Vegas and so on. Am I wrong?
You're mixing two different things, Unreal 3 and DX10; Unreal 3 games aren't necessarily DX10.

You're also mixing shading with DX10 when the issue is deferred rendering which has nothing to do with DX10. It can be done with DX10 but doesn't have to be.

R6: Vegas and MoH: Airborne are currently only DX9 and nVidia's driver can force AA there. Bioshock is DX9 and DX10 and while nVidia's driver can only force AA in DX9 mode, DX10 support is coming.
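
To make the deferred-rendering point above concrete, here is a rough sketch (not from this thread, purely illustrative) of the standard Direct3D 9 path a game uses to request hardware MSAA on its back buffer. A driver-level "force AA" profile works by overriding exactly these settings behind the game's back; a deferred renderer instead draws into off-screen G-buffer render targets that never pass through this path, which is why those titles need per-game driver workarounds. The function name and the parameter choices are my own assumptions, not anything taken from a vendor driver.

// Sketch: how a DX9 game would normally ask for 4x hardware MSAA.
// Build with a Windows SDK and link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>

bool CreateDeviceWith4xMSAA(IDirect3D9* d3d, HWND hwnd, IDirect3DDevice9** deviceOut)
{
    DWORD qualityLevels = 0;
    // Ask the driver whether 4x MSAA is supported for this back buffer format.
    if (FAILED(d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                               D3DFMT_X8R8G8B8, TRUE,
                                               D3DMULTISAMPLE_4_SAMPLES, &qualityLevels)))
        return false;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed               = TRUE;
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;    // required when multisampling the back buffer
    pp.BackBufferFormat       = D3DFMT_X8R8G8B8;
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;
    pp.MultiSampleType        = D3DMULTISAMPLE_4_SAMPLES; // the setting a "force AA" profile overrides
    pp.MultiSampleQuality     = qualityLevels ? qualityLevels - 1 : 0;

    return SUCCEEDED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                       D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                       &pp, deviceOut));
}

A deferred-rendering engine draws its geometry into several off-screen textures and composites them later, so multisampling the back buffer as above does nothing useful on its own; the driver has to recognize the specific game and apply a custom workaround, which is the per-title support being argued about here.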
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: BFG10K
Yeah but how does it affect most users aside from seeing a higher number on the 'GPU thermometer'?
Higher thermals usually lead to more noise due to more cooling required.

And weren't you the guy worrying about an extra $2 a month so he could buy some coca-cola?

I don't care about power consumption per-se, what I care about is how it translates to noise.

what are you talking about? *Show me the links!* i missed the "far higher" and "significant difference" you claim for thermal differences between r600 and g80.
Click.

At load the 2900 uses 50% more power than the 8800 GTS so there's no real comparison there. That kind of power consumption visibly translates to more heat and more noise.

The 8800 Ultra is probably quite close but from the tests done online it's still quieter and vastly faster than the 2900 when running AA so if you're going to have a noisy card you may as well get that one.

It's not necessarily allowing it. It's that ATI has it written into the driver differently.
Uh, what? ATi haven't written it into the driver because there's no profile to allow those games to use AA. You can try to use Oblivion's profile but that's unusably slow, not to mention there's no guarantee the game will be glitch-free.

I don't think ATi implements AA in those games because it forces reviewers to test without it and it puts ATi in a better light. If AA was tested ATi's cards would likely tank compared to nVidia.

I also played a game called Final Fantasy XI which had numerous reports of Nvidia's drivers leaking texture memory and dropping down to 10fps after a few minutes of play. This occurred with all models of 8800 from the GTS320 to the Ultra. They all did it.
The Alt-Tab issue has now been fixed according to the driver release notes.

BFG10K - I'm confused - I thought that in DX10 mode, UT3 games couldn't do AA, period - due to a specific type of shading being used. That's what I THOUGHT anyway. If you mean DX9 mode, then you may be right, I have no clue, but last I knew, there was no AA for Bioshock, RS: Vegas and so on. Am I wrong?
You're mixing two different things, Unreal 3 and DX10; Unreal 3 games aren't necessarily DX10.

You're also mixing shading with DX10 when the issue is deferred rendering which has nothing to do with DX10. It can be done with DX10 but doesn't have to be.

R6: Vegas and MoH: Airborne are currently only DX9 and nVidia's driver can force AA there. Bioshock is DX9 and DX10 and while nVidia's driver can only force AA in DX9 mode, DX10 support is coming.

1) AA works in games that directly support AA. The driver will not force AA into a game that does not have it coded. That's how it is. I can set 8xAA in the driver and a game like FEAR which only has an option for 4xAA will run at 8X. I can tell because of the FPS difference. Yet a game like Bioshock with no AA support cannot be forced unless, like you said, you change it to oblivion.exe and lose whatever optimizations you gain from the driver for Bioshock. They need to change how AA is enabled; you can't force AA in a game that has no AA support to begin with like you can with Nvidia cards. I do not believe they did that to force reviewers to not use AA. Look at some reviews of the card using Doom3 and Quake4 with AA. The HD2900XT beats the 8800GTS in games using that engine (OpenGL). Comes within 10fps of the GTX too. Yes with AA and yes at relatively high resolutions. It varies game to game because AMD is forced to do game-specific optimization whereas Nvidia usually doesn't need to because they have dev support working with their driver team (like Crytek for example). Advantage there obviously...

2) The issue I'm referring to was NOT the Alt-Tab "issue" because you cannot alt-tab out of Final Fantasy XI. It is designed to disallow that to happen and will kick you from the game's server and force you to sign in again. It is still a problem for people I talk to.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
At load the 2900 uses 50% more power than the 8800 GTS so there's no real comparison there. That kind of power consumption visibly translates to more heat and more noise.

The 8800 Ultra is probably quite close but from the tests done online it's still quieter and vastly faster than the 2900 when running AA so if you're going to have a noisy card you may as well get that one.

of course .. but it does not translate to a "hotter card" ... that is going to depend on the cooling solution. Water cooling would be silent. The card would be cooler to the touch.

Also, the xbit link shows the HD2900XT is 30 watts higher consumption than the GTX while AT's measurement shows 19 watts higher than the GTX [nevermind the Ultra] - perhaps because Xbit is measuring the "peak" - not the "average" load in 3D ... almost 30% less difference with AT's measurement.
... so, what is that - about 15% more power consumption than the GTX under load? Close to the Ultra.

and there is also no price comparison of the XT with the Ultra - *of course* it is faster ... if i need more AA i can get another XT or pro for about the price of an ultra
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
1) AA works in games that directly support AA. The driver will not force AA into a game that does not have it coded. That's how it is.
This is simply nonsense as driver AA has nothing to do with the game supporting AA. Absolutely nothing.

The driver will force AA into any game unless it uses a non-standard rendering path. That means GLQuake - a ten year old game that doesn't have a clue about AA - can have 8xAA if you like.

The issue here is non-standard rendering paths or in the specific case of Unreal 3 games using deferred rendering. nVidia has implemented a driver workaround that allows hardware AA to function in some of those titles.

Remember ATi's Chuck patch for Oblivion? It's exactly the same deal except now it's nVidia users enjoying AA in games ATi users can't.

"If the game supports it" is simply claptrap. I'm getting AA on my nVidia card but I wouldn't on an ATi card. That's an IQ advantage to nVidia, plain and simple.

2) The issue I'm referring to was NOT the Alt-Tab "issue" because you cannot alt-tab out of Final Fantasy XI.
The issue was referred to Alt-Tab simply because it was the fastest method to fix the problem. It's not Alt-Tab per-se but an issue with the memory manager (i.e. restarting the game would have the same effect) and that issue has been fixed.

It is still a problem for people I talk to.
On what driver? If people are still having issues I suggest you get them to try the latest 163.71 driver just released a few days ago.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Aznguy1872
Reading your guys post makes me worried. I just ordered the 2900 pro and I got a similar set up to what you listed. Should I upgrade my PSU or am I safe?
you need to upgrade your cpu cooler before messing with your psu. sell the zalman and get a tuniq or thermalright ultra 120 extreme.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: BFG10K
1) AA works in games that directly support AA. The driver will not force AA into a game that does not have it coded. That's how it is.
This is simply nonsense as driver AA has nothing to do with the game supporting AA. Absolutely nothing.

The driver will force AA into any game unless it uses a non-standard rendering path. That means GLQuake - a ten year old game that doesn't have a clue about AA - can have 8xAA if you like.

The issue here is non-standard rendering paths or in the specific case of Unreal 3 games using deferred rendering. nVidia has implemented a driver workaround that allows hardware AA to function in some of those titles.

Remember ATi's Chuck patch for Oblivion? It's exactly the same deal except now it's nVidia users enjoying AA in games ATi users can't.

"If the game supports it" is simply claptrap. I'm getting AA on my nVidia card but I wouldn't on an ATi card. That's an IQ advantage to nVidia, plain and simple.

2) The issue I'm referring to was NOT the Alt-Tab "issue" because you cannot alt-tab out of Final Fantasy XI.
The issue was referred to Alt-Tab simply because it was the fastest method to fix the problem. It's not Alt-Tab per-se but an issue with the memory manager (i.e. restarting the game would have the same effect) and that issue has been fixed.

It is still a problem for people I talk to.
On what driver? If people are still having issues I suggest you get them to try the latest 163.71 driver just released a few days ago.

on every driver including new betas. They still can't get FFXI to play properly in Vista. XP is unaffected now. Sucks for them.

And going back to the AA deal: the drivers won't force AA, simple. Don't bitch at me about it, go talk to AMD :roll:

personally I don't care. High rez + DX10 + AA is pretty slow anyway and I expect newer games would be even slower.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Look at some reviews of the card using Doom3 and Quake4 with AA.
This doesn't have much relevance to Unreal 3 games though. Those games are more shader-heavy than the Doom 3 titles of yester-year and AA would impact ATi more because their shader operations are shared with AA resolve.

I'll bet ATI is only too happy to not support AA and let reviewers run AA-less benchmarks so users can see the pretty graphs and think the card competes with a GTX.

It varies game to game because AMD is forced to do game-specific optimization whereas Nvidia usually doesn't need to because they have dev support working with their driver team (like Crytek for example).
nVidia do application specific optimizations just like ATi. Furthermore nVidia didn't get developer assistance from Unreal 3 based games, they wrote the AA into the driver behind the applications' backs.

As for end-user experience, TWIMTBP is usually a hindrance, not an advantage.

Don't bitch at me about it, go talk to AMD
You're the one that keeps responding with excuses.

High rez + DX10 + AA is pretty slow anyway and I expect newer games would be even slower.
Don't run DX10 then, run DX9 instead. On my 8800 Ultra Bioshock and MoH: Airborne run fine at 1600x1200 with 4xAA and all game details on full (maybe even 1760x1320 after the recent driver updates).

Turning off AA makes the games visibly uglier.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
of course .. but it does not translate to a "hotter card" ... that is going to depend on the cooling solution. Water cooling would be silent. The card would be cooler to the touch.
This would only be relevant if one wasn't using reference coolers.

Also, the xbit link shows the HD2900XT is 30 watts higher consumption than the GTX while AT's measurement shows 19 watts higher than the GTX [nevermind the Ultra] - perhaps because Xbit is measuring the "peak" - not the "average" load in 3D ... almost 30% less difference with AT's measurement.
Why are you comparing the 2900 to the GTX? They aren't even in the same performance class plus this thread isn't comparing said cards either.

Anand's load graphs of a 2900 vs 8800 GTS show 348W vs 287W = 61W difference.

That and XBit aren't testing total system power draw like Anand is, they're testing just the cards' loads.

if i need more AA i can get another XT or pro for about the price of an ultra
Sure but then you have to rely on Crossfire properly scaling and not having any issues, plus with two such cards you'll have a furnace in your case.

Pairing two slower cards instead of a faster single card never makes sense and this case is no exception.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I'd pick the gts 640mb, mainly because the 2900xt takes too much of a performance hit doing AA, and I like my eye candy.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
of course .. but it does not translate to a "hotter card" ... that is going to depend on the cooling solution. Water cooling would be silent. The card would be cooler to the touch.
This would only be relevant if one wasn't using reference coolers.
well, comparing the reference coolers, the 2900xt isn't a "hot card"

Also, the xbit link shows the HD2900XT is 30 watts higher consumption than the GTX while AT's measurement shows 19 watts higher than the GTX [nevermind the Ultra] - perhaps because Xbit is measuring the "peak" - not the "average" load in 3D ... almost 30% less difference with AT's measurement.
Why are you comparing the 2900 to the GTX? They aren't even in the same performance class plus this thread isn't comparing said cards either.
why are *you* comparing them?

Anand's load graphs of a 2900 vs 8800 GTS show 348W vs 287W = 61W difference.

That and XBit aren't testing total system power draw like Anand is, they're testing just the cards' loads.
They are comparing the *peak* draw ... that 'instant' of highest draw; AT compares both identical systems - the GPU is the only variable

if i need more AA i can get another XT or pro for about the price of an ultra
Sure but then you have to rely on Crossfire properly scaling and not having any issues, plus with two such cards you'll have a furnace in your case.

Pairing two slower cards instead of a faster single card never makes sense and this case is no exception.

i may or may not agree with your predetermined conclusions ... but i will try it out for myself and let you all know
... 2 2900XTs will beat a single Ultra's performance for about the same price for me .. as to your comment about "furnace in my case" ... i doubt there would be a difference heat-wise with 2 SLI'd Ultras in my case
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
a little update here which totally supports BFG10K and rips on my 'idea':



http://www.pr0ductive.com/hard...mance_update/page6.asp

the XT is Dead in the Water [pun intended] in the PT-BN DX10 demo - the AMD drivers suck to high heaven the second AA is enabled - although performance is fair without it ... and X-fire makes no difference [yet].

i guess i would be a "pioneer" [or guinea pig for AMD, depending on PoV]
-Here is their conclusion which casts *extreme doubt* on my idea:

we aren't seeing the true potential of AMD's DX10 hardware due to immature drivers, so we can't even speculate on how the Radeon HD 2900 XT performs in comparison to the GeForce 8800 from NVIDIA. Quite simply, NVIDIA's GeForce 8800 line takes the DX10 performance crown unchallenged. And as our tests show, only one DX10 app scales with CrossFire: Lost Planet. This means that AMD's driver team not only has to tweak their DX10 driver for more performance, but CrossFire needs to be implemented as well if they're going to mount an effective challenge to NVIDIA's SLI.

i think i will stay content with DX9.x + AA and wait for a genuine "fire-sale" to experience Xfire.

Finally ... here is the Wattage comparison [again]:

http://www.tweaktown.com/revie...ption_tests/index.html

Here we see "idle" is fine - very close to the GTS320-M ... but 2900xt requires a good PS as it's "peak" is extreme when pushed
--so all things being equal ... game bundle, price, PS, MB ... DX10 drivers
--i'd probably recommend that the OP go for the GTS
[i would .. or maybe take a chance on the Pro]
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: BFG10K
Look at some reviews of the card using Doom3 and Quake4 with AA.
This doesn't have much relevance to Unreal 3 games though. Those games are more shader-heavy than the Doom 3 titles of yester-year and AA would impact ATi more because their shader operations are shared with AA resolve.

I'll bet ATI is only too happy to not support AA and let reviewers run AA-less benchmarks so users can see the pretty graphs and think the card competes with a GTX.

It varies game to game because AMD is forced to do game-specific optimization whereas Nvidia usually doesn't need to because they have dev support working with their driver team (like Crytek for example).
nVidia do application specific optimizations just like ATi. Furthermore nVidia didn't get developer assistance from Unreal 3 based games, they wrote the AA into the driver behind the applications' backs.

As for end-user experience, TWIMTBP is usually a hindrance, not an advantage.

Don't bitch at me about it, go talk to AMD
You're the one that keeps responding with excuses.

High rez + DX10 + AA is pretty slow anyway and I expect newer games would be even slower.
Don't run DX10 then, run DX9 instead. On my 8800 Ultra Bioshock and MoH: Airborne run fine at 1600x1200 with 4xAA and all game details on full (maybe even 1760x1320 after the recent driver updates).

Turning off AA makes the games visibly uglier.

I'll run DX10 anyway to get the better effects. 30fps is not unplayable in any game.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
go gts. maybe if more people stick with that then the 2900 pro will get cheaper more quickly!
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: cmdrdredd
2) The issue I'm referring to was NOT the Alt-Tab "issue" because you cannot alt-tab out of Final Fantasy XI. It is designed to disallow that to happen and will kick you from the game's server and force you to sign in again. It is still a problem for people I talk to.

Wow, still harping on about problems with a crappy 6-year-old PS2 port in Vista, one that happens to work fine on the OS the crappy port was coded for when released. The fact FFXI does run fine in XP (always did, except for the small bleached-out overlay problem that was fixed with the 100-series drivers) and SE's continued stance they will not support Vista should tell you it's an application problem and not a driver/OS problem. Sorry, expecting old games to work with new hardware and OS over the span of years with little to no support from the devs is expecting too much.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
well, comparing the reference coolers, the 2900xt isn't a "hot card"
With reference coolers the load temp is higher on the 2900 than on the 8800 GTS.

why are *you* comparing them?
I'm not really, you are. Your 19W figure comes from subtracting AT's 2900 load draw from the GTX's, a comparison which isn't relevant for this thread.

They are comparing the *peak* draw ... that 'instant' of highest draw; AT compares both identical systems - the GPU is the only variable
I'll say it again: XBit measure just the GPU while Anandtech are measuring total system draw.

Also if you look at the load differences AT gets 61W while XBit "only" get 55.4W which means Anand's figures show an even larger rift between the 2900 and GTS.
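
For what it's worth, here is a quick back-of-the-envelope check of those figures (my own arithmetic, using only numbers already quoted in this thread, so treat the derived values as rough estimates rather than measurements):

// Sanity-check arithmetic on the power figures quoted above.
#include <cstdio>

int main()
{
    const double at_system_2900  = 348.0;  // W at the wall, whole system with the 2900 XT (AT)
    const double at_system_gts   = 287.0;  // W at the wall, whole system with the 8800 GTS (AT)
    const double xbit_card_delta = 55.4;   // W, card-only load difference (XBit)
    const double ratio_claim     = 1.5;    // "the 2900 uses 50% more power than the 8800 GTS"

    const double wall_delta = at_system_2900 - at_system_gts;       // 61 W

    // If both the 1.5x claim and XBit's card-only delta hold, the implied card draws are:
    const double gts_card = xbit_card_delta / (ratio_claim - 1.0);  // ~111 W
    const double xt_card  = gts_card * ratio_claim;                 // ~166 W

    printf("at-the-wall delta: %.0f W, card-only delta: %.1f W\n", wall_delta, xbit_card_delta);
    printf("implied card-only draw: 8800 GTS ~%.0f W, 2900 XT ~%.0f W\n", gts_card, xt_card);
    return 0;
}

The at-the-wall gap coming out a little larger than the card-only gap is what you'd expect, since PSU losses grow with the extra load; the two sites also aren't running identical test loads, so the numbers only line up approximately.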

i doubt there would be a difference heat-wise with sli'd 2 Ultras in my case
But two SLI'd Ultras are much faster than CF 2900 (especially with AA) so again if you're going the hot & noisy route you may as well pair the fastest cards.

Here we see "idle" is fine
It may be but we don't buy these cards to run them idle.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
I'll run DX10 anyway to get the better effects. 30fps is not unplayable in any game.
What is your fixation with DX10? How many games have you played under DX10?

I'll tell you how many I've done: zero, despite having G80 hardware since last November, and I have no immediate plans to change that.

Also MoH:Airborne doesn't even have DX10 and Bioshock only has a marginal IQ gain under DX10. But AA certainly makes a huge difference in these titles.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
and SE's continued stance they will not support Vista should tell you it's an application problem and not a driver/OS problem. Sorry, expecting old games to work with new hardware and OS over the span of years with little to no support from the devs is expecting too much.
To be fair he's saying his 2900 has no such issues in which case it's likely nVidia at fault. If ATi runs it fine then you can't really blame Vista or the developer.

I can understand where he's coming from as I have a large investment in legacy games and nVidia's support of such titles is generally lackluster compared to ATi.

I'm still waiting for Jedi Academy and United Offensive to be fixed and those games are only four years old.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
well, comparing the reference coolers, the 2900xt isn't a "hot card"
With reference coolers the load temp is higher on the 2900 than on the 8800 GTS.

why are *you* comparing them?
I'm not really, you are. Your 19W figure comes from subtracting AT's 2900 load draw from the GTX's, a comparison which isn't relevant for this thread.

They are comparing the *peak* draw ... that 'instant' of highest draw; AT compares both identical systems - the GPU is the only variable
I'll say it again: XBit measure just the GPU while Anandtech are measuring total system draw.

Also if you look at the load differences AT gets 61W while XBit "only" get 55.4W which means Anand's figures show an even larger rift between the 2900 and GTS.

i doubt there would be a difference heat-wise with sli'd 2 Ultras in my case
But two SLI'd Ultras are much faster than CF 2900 (especially with AA) so again if you're going the hot & noisy route you may as well pair the fastest cards.

Here we see "idle" is fine
It may be but we don't buy these cards to run them idle.
actually i DO run my card at idle at least 90% of its time

But ... you 'win' ... i ain't getting crossfire
- nor an ultra ... nor ultra Sli ... 2900xt is *perfect* for me at my 16x10 - for now ... just as your Ultra suits your higher resolution.

the only part of your argument i found worthwhile was CrossFire's lack of scaling in DX10 ... i have zero use for a faster GPU - in DX9 even with 4xAA now; and DX10 looks very little different than DX9
... yet your argument was sufficient ... thanks for helping me save money!
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: BFG10K
and SE's continued stance they will not support Vista should tell you it's an application problem and not a driver/OS problem. Sorry, expecting old games to work with new hardware and OS over the span of years with little to no support from the devs is expecting too much.
To be fair he's saying his 2900 has no such issues in which case it's likely nVidia at fault. If ATi runs it fine then you can't really blame Vista or the developer.

I can understand where he's coming from as I have a large investment in legacy games and nVidia's support of such titles is generally lackluster compared to ATi.

I'm still waiting for Jedi Academy and United Offensive to be fixed and those games are only four years old.

By the same token, the fact the G80 is running on hundreds of titles with little to no performance difference in XP and Vista nowadays tells me it is in fact a problem with the way the game is coded or the way that code is handled by Vista. The reality of it is that if you cut corners in your coding in an attempt to please everyone with a multi-platform title, you're going to run into problems on a new platform released years later. It's not the first time we've seen poor performance from a PC title ported from a console, and it certainly won't be the last.

Some of the performance problems can be fixed over time with updated drivers, but many times it's as simple as the devs being too lazy to update their .ini files to point to the proper code path for newer GPUs (an issue that has been fixed numerous times on NV on THEIR driver side). In the cases where there's significant problems with how old code runs on a new platform or new hardware, devs simply turn a blind eye to the problem and don't do anything about it (if they're even around still 3-4 years later, much less continuing support of their title). It's the nature of the business.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
By the same token, the fact the G80 is running on hundreds of titles with little to no performance difference in XP and Vista nowadays tells me it is in fact a problem with the way the game is coded or the way that code is handled by Vista.
How did you arrive at that conclusion? There's nothing inherently problematic with Vista that makes it run games worse, it's usually drivers at fault, especially in this case where ATi runs the game fine under Vista so it's hard to blame the OS or the developer.

I have games that function perfectly on a 7900 GTX but not on a 8800 Ultra using the same system, same OS and even the same driver.

Some of the performance problems can be fixed over time with updated drivers, but many times it's as simple as the devs being too lazy to update their .ini files to point to the proper code path for newer GPUs (an issue that has been fixed numerous times on NV on THEIR driver side).
This is all well and good but again ATi don't seem to have as many issues with backwards compatibility as nVidia does. Most Unreal 2 based games stutter on G80 hardware and most of them are TWIMTBP titles. You can't tell me TWIMTBP titles aren't following proper procedure if nVidia's "we worked closely with the developer every step of the way" claptrap is to be believed.

I had DEP BSOD problems on the 6800 Ultra and stuttering problems on a 8800 GTX/Ultra in Unreal 2 but I never had a single problem on the 9700 Pro or the X800 XL. Not one. And again this is a TWIMTBP title. There is no other reason here except for nVidia's lackluster driver support.

It's the nature of the business.
Not with ATi it isn't and people are sick of it. Go over to the nVidia forums and see how many people are tired of nVidia drivers falling over in legacy games while ATi runs them fine.
 

Oyeve

Lifer
Oct 18, 1999
21,989
850
126
Originally posted by: BFG10K
and SE's continued stance they will not support Vista should tell you it's an application problem and not a driver/OS problem. Sorry, expecting old games to work with new hardware and OS over the span of years with little to no support from the devs is expecting too much.
To be fair he's saying his 2900 has no such issues in which case it's likely nVidia at fault. If ATi runs it fine then you can't really blame Vista or the developer.

I can understand where he's coming from as I have a large investment in legacy games and nVidia's support of such titles is generally lackluster compared to ATi.

I'm still waiting for Jedi Academy and United Offensive to be fixed and those games are only four years old.

I too have ALWAYS had problems with legacy games and Nvidia. Even dating back to my Geforce 3. The Midtown Madness games would artifact like mad with an Nvidia card. Got a Radeon DDR card and the problems went away, stayed with ATI all the way up to the 9800 Pro, then a couple of years ago I went SLI and got 2 7800GT cards and they were nothing but nightmares. I had thought Nvidia would have fixed any legacy game issues but they were still there. I sold both cards and got an ATI X1950XTX and all my problems went away. Now I ordered my 2900Pro (should be here this morning) and I will never go back to Nvidia.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: BFG10K
By the same token, the fact the G80 is running on hundreds of titles with little to no performance difference in XP and Vista nowadays tells me it is in fact a problem with the way the game is coded or the way that code is handled by Vista.
How did you arrive at that conclusion? There's nothing inherently problematic with Vista that makes it run games worse, it's usually drivers at fault, especially in this case where ATi runs the game fine under Vista so it's hard to blame the OS or the developer.

I have games that function perfectly on a 7900 GTX but not on a 8800 Ultra using the same system, same OS and even the same driver.
I guess you must've missed the hot fix released by MS that explicitly deals with the differences between Vista's WDDM graphics driver vs. XP, which was also covered in Part 3 of AT's 2GB Wall article. I've posted the hot fix for those assuming it's always NV's driver at fault for their problems in cases where NV claims it's an application issue, with not a single person coming back to post their results as to whether or not the hot fix fixed their problems.

As for the 7900GTX vs. the 8800, that's fine and good but it's also been covered that the NV4X and G80 cards are not running the same driver path, even if NV has managed to shrink the drivers to the point they are packaged in a Unified Driver. It's not the same GPU, the architecture isn't the same, the driver isn't the same. Not sure why ATI performance in FFXI is better (and when I say better, I'm still talking maybe 5 FPS difference over a 7800 with frames still dropping in the teens....the game runs like a pig), my guess is that the R600 handles DX8/R300 code better than the G80 handles whatever DX8/legacy NV code under Vista. But again, I'm leaning towards an application issue considering SE's continued stance they will not support Vista, not to mention you needed to jump through ridiculous hoops (as of March) just to get the clunky, hacked up launcher and graphics engine to even run in Vista. Oh ya, did I mention the game runs fine in XP with G80s?

This is all well and good but again ATi don't seem to have as many issues with backwards compatibility as nVidia does. Most Unreal 2 based games stutter on G80 hardware and most of them are TWIMTBP titles. You can't tell me TWIMTBP titles aren't following proper procedure if nVidia's "we worked closely with the developer every step of the way" claptrap is to be believed.
And again, the fact you say most U2-based games and not all while implicitly acknowledging the games are developed by different devs with different levels of talent suggests that the problems are on the application level. TWIMTBP is marketing fluff so no, I don't put any stock in it when judging a part's merit or performance. That doesn't de-emphasize the very real limitations every dev house faces when trying to make a title compatible with multiple hardware and platform configurations.

As for ATi not seeming to have as many issues with backwards compatibility....again, completely subjective as there's numerous examples of both, if you actually look for it. Chances are though you're not seeing them because you're only dealing with your current hardware, which atm is a G80. I couldn't run BG2 on my 9700pro....didn't bother to research a fix for it because honestly, I didn't care enough to look for a fix. And doesn't ATi have problems with OpenGL? That covers just about every legacy FPS pre-DX9....but no one seems to care that much.....because well, there's newer, better titles available that run fine.

I had DEP BSOD problems on the 6800 Ultra and stuttering problems on a 8800 GTX/Ultra in Unreal 2 but I never had a single problem on the 9700 Pro or the X800 XL. Not one. And again this is a TWIMTBP title. There is no other reason here except for nVidia's lackluster driver support.
Again, when's the last time you downloaded a patch or update with Unreal 2? Tried a 2900 or 2600 with Unreal 2? Did your 7800 have any problems running the title? Sorry, you can blame NV drivers all you want, but the reality of it is that not all of your titles are going to be supported ad infinitum. Might as well complain about how you can't get Win95/98 drivers anymore for your current GPUs.

It's the nature of the business.
Not with ATi it isn't and people are sick of it. Go over to the nVidia forums and see how many people are tired of nVidia drivers falling over in legacy games while ATi runs them fine.
I've been there and it'd be naive to suggest there wouldn't be just as many complaints if ATI/AMD had a similar consolidated forum. But they don't. Again, I don't have any problems running any of the games I've played in the 10 months I've owned my GTS, save for the bleached out overlays in FFXI I've already mentioned. But I guess that's a trade-off with G80 for the best possible performance in current games with the possibility of problems with legacy titles. If you're fed up with NV drivers and willing to sacrifice some performance in current games for better support with legacy games, go with ATi. But I think the vast majority of people upgrade for better performance in current games.
 