Anand R420 review analysis


Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: Edge3D
But why get a card for $500 (half a grand!) that doesn't have its competitor's features like SM3.0? Which WILL be implemented into FarCry, Painkiller and many other games that are even already on sale floors? Hmmmm....
Seems clear to me people are buying from brand loyalty, not features AND performance. For that kind of money you could get yourself an Ultra or maybe a UExtreme with SM3.0.
I promise ye forum goers. It will be a major loss for the 800XT. You will have buyer's remorse sooner rather than later, when SM3.0 is implemented into games.

Because quite simply, we don't know how PS 3.0 will affect games, if it indeed does. If it's going to give them such a huge boost in speed, why haven't we seen any tests with it? They seemed to want to brag about it so much, why didn't they release a benchmark to show how much faster it makes the NV4x? The quality is going to be the same, so it's all about speed.

ATi has features that NV doesn't have, too. 3Dc, and Temporal FSAA? While neither of these has me excited, they are still features the NV cards do not have. Will they matter? I don't know. How long till 3Dc makes its way into a game? Again, we don't know. By the time it does, I'm sure I'll have another card anyhow.

Again, don't assume so much.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: Ackmed
Originally posted by: Edge3D
But why get a card for $500 (half a grand!) that doesn't have its competitor's features like SM3.0? Which WILL be implemented into FarCry, Painkiller and many other games that are even already on sale floors? Hmmmm....
Seems clear to me people are buying from brand loyalty, not features AND performance. For that kind of money you could get yourself an Ultra or maybe a UExtreme with SM3.0.
I promise ye forum goers. It will be a major loss for the 800XT. You will have buyer's remorse sooner rather than later, when SM3.0 is implemented into games.

Because quite simply, we don't know how PS 3.0 will affect games, if it indeed does. If it's going to give them such a huge boost in speed, why haven't we seen any tests with it? They seemed to want to brag about it so much, why didn't they release a benchmark to show how much faster it makes the NV4x? The quality is going to be the same, so it's all about speed.

ATi has features that NV doesn't have, too. 3Dc, and Temporal FSAA? While neither of these has me excited, they are still features the NV cards do not have. Will they matter? I don't know. How long till 3Dc makes its way into a game? Again, we don't know. By the time it does, I'm sure I'll have another card anyhow.

Again, don't assume so much.

You haven't seen it because DX9.0c isn't out yet. NV is working with its partners to get ALL these games to utilize SM3.0.
Lord of the Rings, Battle For Middle-earth
STALKER: Shadows of Chernobyl
Vampire: Bloodlines
Splinter Cell X
Tiger Woods 2005
Madden 2005
Driver 3
Grafan
Painkiller
FarCry
Source: FiringSquad.

I'm not assuming, because it's very clear that SM3.0 will be very useful in improving IQ and performance.
Most devs say so as well. There's a few (with relations to ATI) that say "anything that can be done in PS3 can be done in PS2 with more passes", but they fail to mention VS3 and its displacement mapping.
Seriously, if you are educated on the subject it's very clear that SM3 is important.

I'm not dropping $500 and not getting it. I am educated on the subject, and like I said I'm unbiased.

The big thing ATI needed was vastly superior performance with the x800Pro, to make it a viable product over NV's offerings.
Since all the NV cards, especially the GT, seem to be pretty much superior or on par with that card, it makes it hard to go ATI without jumping to the x800XT.

One of us will be proven correct in the SM3 debate, and the other wrong.

Do you honestly think that SM3 will not improve performance and image quality when used?
Honestly?
Like I said, we'll see, and one of us will get to say "I was right!" (what a joy, eh?) Sooner rather than later, so you won't have long to find out.
Hopefully, unless you're an ATI loyalist.. BEFORE you drop $500 on that x800XT.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I note you specifically mentioned PS3.0, or specifically omitted VS3.0. While I don't recall nV showing any demos of potential speed increases for PS3.0 over PS2.0, they did show a geometry instancing demo (VS3.0 exclusive, I believe) that seemed impressive. Geometry instancing may prove to be a very nice feature for end-users (not just marketers ).

While I agree that SM3.0 probably won't offer a huge speed or IQ increase (it's one or the other, AFAIK), geometry instancing seems promising to this (admittedly amateur) 3D fan.
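For anyone newer to the API side, the draw-call savings behind geometry instancing can be sketched in a few lines. This is a toy Python model, not real D3D code; all the names here are made up for illustration.

```python
# Toy illustration of why geometry instancing (a VS3.0-era feature) cuts
# CPU overhead: one draw call plus a per-instance data stream replaces
# N separate draw calls, each of which costs CPU state setup and bus traffic.

def draw_naive(mesh, transforms):
    """One draw call per object: N calls, N state setups."""
    calls = 0
    for t in transforms:
        # set_transform(t); draw(mesh)  # per-call CPU + AGP traffic
        calls += 1
    return calls

def draw_instanced(mesh, transforms):
    """One call submits the mesh once plus an instance buffer."""
    # upload_instance_buffer(transforms); draw(mesh, count=len(transforms))
    return 1  # a single draw call regardless of instance count

trees = [("translate", i) for i in range(500)]  # 500 identical trees
print(draw_naive("tree_mesh", trees))      # 500 calls
print(draw_instanced("tree_mesh", trees))  # 1 call
```

The win is entirely on the CPU/submission side, which is why it matters most for scenes with lots of identical small objects (trees, crates, soldiers).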
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Actually the resolution point is a good one. I frequently play above 1600 x 1200 and eventually I'll be picking up a 21" CRT with the desire to play at 2048 x 1536. I haven't heard of anything claiming nVidia's HSR schemes are disabled at resolutions >1600 x 1200, unlike ATi's which even now don't go that high.

Perhaps nVidia potentially has the advantage at >1600 x 1200? Given no review seems to go above that setting it's going to be pretty tough to get anything concrete out of them in that regard.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Edit- should note.. at that res you DEFINITELY shouldn't need any AA/AF
Maybe not AA but you definitely need AF to match the inherent sharpness that high resolutions give you. If anything AF is needed even more at higher resolutions to produce a balanced image in terms of sharpness.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
BFG, I updated my post with more info from B3D. It seems the XT will be able to deal efficiently with resolutions far beyond 20x15, but the Pro falls ever-so-slightly short of full 20x15 efficiency. The good news is that the R420 (unlike the R300 derivatives) will still trim unnecessary work up to the core's limit, rather than just bypassing HyperZ and the like. Sorry for the incomplete post.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: Pete
omitting the X in X800XT doesn't stop anyone from understanding me
In context, yes, but it seems a little close to a 9800P/XT to be saving one measly letter.
Good point. I glossed over that fact. I'll be using "X" from now on. Thank you BTW.

Agreed. But why get a card for $500 (half a grand!) that doesnt have its competitors features like SM3.0? Which WILL be implemented into FarCry, Painkiller and many other games that are even already on sale floors? Hmmmm....
Seems clear to me people are buying from brand loyalty, not features AND performance. For that kind of money you could get yourself a Ultra or maybe a UExtreme with SM3.0.
I promise ye forum goers. It will be a major loss for the 800XT. You will have buyers remorse sooner, rather than later when SM3.0 is implemented into games.

I think you're far too quick to paint prospective X800XT owners as fanboys, and to assume that SM3.0 will allow the 6800U to make up the quite large ground between it and the X800XT. I haven't read or seen anything that indicates SM3.0 will be a big leap in terms of either performance or IQ. I feel that the 6800GT's support for SM3.0 gives it an advantage over the relatively similarly-performing X800P, but I don't think SM3.0 will erase the X800XT's generally noticeable performance lead over the 6800U with AA+AF.
Well, I don't know about all of that. I mean, I didn't state whether I think SM3 will bring it up to the x800XT's level or not.
But it will improve it to some degree, and displacement mapping will give NV cards a level of IQ that ATI cards won't be able to match. From what I've read on the technology, noticeably so. Displacement mapping is a very IQ-oriented feature. That is the main biggie.

I dont follow. Please elaborate.
From the TechReport review:

Better performance at higher resolutions: Each pixel quad in the R420 has its own Z compression and hierarchical Z capability, including its own local cache. ATI has sized these caches to allow for Z-handling enhancements to operate at resolutions up to 1920x1080. (The R300's global Z cache was sized for resolutions up to 1600x1200, the RV3xx-series chips' to less.) Also, on the R420, if the screen resolution is too high for the Z cache to accommodate everything, the chip will use its available Z cache to render a portion of the screen, rather than simply turning off Z enhancements.

B3D says that the four-quad X800XT will be able to fully process up to 4MP resolutions, though, so maybe just the X800P is limited to full performance at up to 3MP (2048 * 1536 =~ 3.14MP, so a bit over the 3MP limit):

Excellent info. Thanks. I will be reviewing this.

With 4 quads enabled the Hierarchical Z-Buffer has around 4 megapixels of (lower level) Z-buffer storage capability, which should be good for over-2048 resolutions, which means that high definition resolutions like 1920x1080 are fully covered. Should the screen resolution exceed the maximum capabilities of the Hierarchical Z-Buffer, it is not disabled entirely; instead a portion of the Z-buffer is set up in the Hierarchical Z-Buffer up to its maximum storage capability, and anything that falls out of that range falls back to the early pixel-level reject, so the majority of the screen can still be captured by the Hierarchical Z-Buffer.
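The sizing arithmetic in that quote is easy to sanity-check. Here is a quick sketch (Python just for the arithmetic): the ~4 MP capacity figure comes from the B3D quote above, while the simple coverage-fraction model is my own assumption about how the partial fallback scales.

```python
# Rough check of the R420 Hi-Z sizing math: ~4 MP of hierarchical Z
# coverage (per the B3D quote), with resolutions beyond that falling
# back to per-pixel early reject for the overflow portion.

HIZ_CAPACITY = 4 * 1024 * 1024  # ~4 MP of low-level Z coverage (B3D figure)

def hiz_coverage(width, height, capacity=HIZ_CAPACITY):
    """Fraction of the screen the hierarchical Z-buffer can cover."""
    pixels = width * height
    return min(1.0, capacity / pixels)

for w, h in [(1600, 1200), (1920, 1080), (2048, 1536), (2560, 1920)]:
    print(f"{w}x{h}: {w * h / 1e6:.2f} MP, Hi-Z covers {hiz_coverage(w, h):.0%}")
```

By this arithmetic 2048x1536 (about 3.15 MP) still fits entirely under a 4 MP cap, which matches the claim that the XT handles 20x15 at full efficiency.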

About AF: I just read THG's X800 review, and Lars seems to say that the 6800 shows less texture shimmer than the X800 with AF. That's worth further investigation, and hopefully we'll see more in-depth IQ analysis in a month or two, once reviewers have some time with retail cards and updated drivers. Less texture shimmer may be worth less performance--at the very least, the 6800 gives you that option, along with SSAA.

Again, I'll be reviewing this info. I honestly couldn't give a rat's behind if anyone other than Anand's says it.
I don't find those other sites (BOTH Tom's and OCP) to have much credibility considering past events. The others are too small to not possibly be run by biased tyrants. Heck, OCP THINKS they are in Anand's league but they are smalltime still IMO.

Anyway, it's pretty much thought that so far IQ is actually in NV's favor. Not surprising.. I mean, look at ATI's XT performance. Cuts were made somewhere.. not that the tech isn't great.. but I'm sure it's a combo of driver cheats and great speedy hardware.

Man, thanks for info, seriously!
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: BFG10K
Actually the resolution point is a good one. I frequently play above 1600 x 1200 and eventually I'll be picking up a 21" CRT with the desire to play at 2048 x 1536. I haven't heard of anything claiming nVidia's HSR schemes are disabled at resolutions >1600 x 1200, unlike ATi's which even now don't go that high.

Perhaps nVidia potentially has the advantage at >1600 x 1200? Given no review seems to go above that setting it's going to be pretty tough to get anything concrete out of them in that regard.

You look like an old timer around here. Can you contact a big dog at Anand's and use yer influence to get 2048 testing done? Or should we all bog him down with requests, letting him know what the people want to see?

I'm very interested in such data, and if the x800XT runs away at 2048 then it will be my card. Like that guy said, it's only $100 more.. and why pinch pennies at that price ($400)?

But I honestly want SM3.0 and have a hard time paying more ($100 in this case) and getting less. But I would pay more and get less if it performs at a playable level where no SM3.0 hardware does the same. Hence more!
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
It seems the XT will be able to deal efficiently with resolutions far beyond 20x15,
That's good news. I haven't had a chance to read the B3D review and I didn't read your post carefully so I missed that tidbit.

Can you contact a big dog at Anands and use yer influence to get 2048 testing done? Or should we all bog him with requests letting him know the people want to see?
I think it's best that we all ask for it in the articles forum. I'm guessing the main reason why it isn't done is because only a tiny portion of the population are interested or in fact can even run such a high resolution.

In reality that's the first resolution I would test because it's the holy grail for GPU tests. Compared to 2048 x 1536 the likes of 1600 x 1200 is really just low resolution, CPU limited mush.
 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: Edge3D
You haven't seen it because DX9.0c isn't out yet. NV is working with its partners to get ALL these games to utilize SM3.0.
Lord of the Rings, Battle For Middle-earth
STALKER: Shadows of Chernobyl
Vampire: Bloodlines
Splinter Cell X
Tiger Woods 2005
Madden 2005
Driver 3
Grafan
Painkiller
FarCry
Source: FiringSquad.

I'm not assuming, because it's very clear that SM3.0 will be very useful in improving IQ and performance.
Most devs say so as well. There's a few (with relations to ATI) that say "anything that can be done in PS3 can be done in PS2 with more passes", but they fail to mention VS3 and its displacement mapping.
Seriously, if you are educated on the subject it's very clear that SM3 is important.

I'm not dropping $500 and not getting it. I am educated on the subject, and like I said I'm unbiased.

The big thing ATI needed was vastly superior performance with the x800Pro, to make it a viable product over NV's offerings.
Since all the NV cards, especially the GT, seem to be pretty much superior or on par with that card, it makes it hard to go ATI without jumping to the x800XT.

One of us will be proven correct in the SM3 debate, and the other wrong.

Do you honestly think that SM3 will not improve performance and image quality when used?
Honestly?
Like I said, we'll see, and one of us will get to say "I was right!" (what a joy, eh?) Sooner rather than later, so you won't have long to find out.
Hopefully, unless you're an ATI loyalist.. BEFORE you drop $500 on that x800XT.

As I said, we don't know if PS 3.0 makes it faster, and if so, by how much. I don't think it will look any better. Most of those games are already quite a ways into development. If NV had any screens to compare, they would be spamming them all over, but they aren't. I'm sure they have DX9.0c, and they certainly have a PS 3.0 version of FarCry, since they showed that pic in their presentation. Why not compare it with a PS 2.0 pic? Because it will look the same.

You are assuming with comments like this; "Seems clear to me people are buying from brand loyalty, not features AND performance."

Look at the various polls on forums; it's usually twice as many votes for ATi on who has the better card. Are they all ignorant? Even B3D's poll has the same ratio or more, and I think they are the most advanced forum crowd there is.

Once again, ATi has features NV doesn't have too. Not that I think they matter at this time. ATi wins in performance, and the extra "features" won't be used for some time. And for the billionth time, we don't know what kind of impact they will have. ATi has the faster card for today's games, and since that's all we can play, it's good enough for me. If PS 3.0 makes the NV4x much faster and improves image quality, I'll gladly eat my words. You can bump this post if you wish, and I will declare to all I was wrong.

"and like I said I'm unbiased."

That comment is pretty funny though.
 

fwtong

Senior member
Feb 26, 2002
695
5
81
Originally posted by: Edge3D
Originally posted by: BFG10K
Actually the resolution point is a good one. I frequently play above 1600 x 1200 and eventually I'll be picking up a 21" CRT with the desire to play at 2048 x 1536. I haven't heard of anything claiming nVidia's HSR schemes are disabled at resolutions >1600 x 1200, unlike ATi's which even now don't go that high.

Perhaps nVidia potentially has the advantage at >1600 x 1200? Given no review seems to go above that setting it's going to be pretty tough to get anything concrete out of them in that regard.

You look like an old timer around here. Can you contact a big dog at Anand's and use yer influence to get 2048 testing done? Or should we all bog him down with requests, letting him know what the people want to see?

I'm very interested in such data, and if the x800XT runs away at 2048 then it will be my card. Like that guy said, it's only $100 more.. and why pinch pennies at that price ($400)?

But I honestly want SM3.0 and have a hard time paying more ($100 in this case) and getting less. But I would pay more and get less if it performs at a playable level where no SM3.0 hardware does the same. Hence more!

If you want SM3.0, then just admit that you're biased because you obviously have your mind made up and are just trying to antagonize the ATI fanboys. Maybe you're not biased towards a particular company, but you're obviously biased towards SM3.0.

Edit:
When you quoted the FiringSquad report, you conveniently left out the next sentence: "It remains to be seen how far some of the developers on these titles will go with their 3.0 implementations, but it's an interesting development nonetheless, as the first shader model 3.0 titles will hit the market much faster than its predecessors." What I get from that is that SM3.0 games will be out, but the extent of the use of SM3.0 is not known.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: fwtong
Originally posted by: Edge3D
Originally posted by: BFG10K
Actually the resolution point is a good one. I frequently play above 1600 x 1200 and eventually I'll be picking up a 21" CRT with the desire to play at 2048 x 1536. I haven't heard of anything claiming nVidia's HSR schemes are disabled at resolutions >1600 x 1200, unlike ATi's which even now don't go that high.

Perhaps nVidia potentially has the advantage at >1600 x 1200? Given no review seems to go above that setting it's going to be pretty tough to get anything concrete out of them in that regard.

You look like an old timer around here. Can you contact a big dog at Anand's and use yer influence to get 2048 testing done? Or should we all bog him down with requests, letting him know what the people want to see?

I'm very interested in such data, and if the x800XT runs away at 2048 then it will be my card. Like that guy said, it's only $100 more.. and why pinch pennies at that price ($400)?

But I honestly want SM3.0 and have a hard time paying more ($100 in this case) and getting less. But I would pay more and get less if it performs at a playable level where no SM3.0 hardware does the same. Hence more!

If you want SM3.0, then just admit that you're biased because you obviously have your mind made up and are just trying to antagonize the ATI fanboys. Maybe you're not biased towards a particular company, but you're obviously biased towards SM3.0.

Well. That Ackmed fellow seems to want to paint me in fanboy colors, but I'm not. The proof is in the pudding: ATI just needs to deliver at 2048. Otherwise NV offers comparable performance with more of the features I want.
Is that fanboyism? And it's hilarious that Ackmed says "are the MASSES ignorant?" LOL, need I say more?
Owned by his own words.

Anyway, yes, I am "biased" towards having SM3.0 in my next $500 card. Who in their right mind, deep down, WOULDN'T want this feature?
If it's so meaningless, then why would ATI ever implement it? They should just forget about SM3.0, and DX9.0c should be erased from the earth because it is a useless feature. I mean, seriously. You have to be a fanboy to subscribe to that kind of ideology.
It's like 3dfx and 16bit. Or Intel and 32bit. Why NOT have more features for less or equal money? Silly. Just silly.

Unless, like I said, it blows it away in some res that was previously unplayable at all in a newer game like UT2K4 (2048x1536). THEN I'd forget about the SM3.0 that I hold so dear.

I'm not exactly sure who around here is an ATI fanboy as you say, but it does appear that there are some. And it is funny watching them squirm under the SM3.0 gun. Stepping out of the fanboyistic "NV vs ATI" circle, it just doesn't make sense to see people actually trying to downplay an upcoming technology that one company doesn't have.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: BFG10K
It seems the XT will be able to deal efficiently with resolutions far beyond 20x15,
That's good news. I haven't had a chance to read the B3D review and I didn't read your post carefully so I missed that tidbit.

Can you contact a big dog at Anands and use yer influence to get 2048 testing done? Or should we all bog him with requests letting him know the people want to see?
I think it's best that we all ask for it in the articles forum. I'm guessing the main reason why it isn't done is because only a tiny portion of the population are interested or in fact can even run such a high resolution.

In reality that's the first resolution I would test because it's the holy grail for GPU tests. Compared to 2048 x 1536 the likes of 1600 x 1200 is really just low resolution, CPU limited mush.

Hey, I don't see that forum. Could you point me to it, or create the post and a poll for it?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: SickBeast
Thats why I want to see which of the cards does the best at 2048 in UT2K4. Anyone got a link?

Here it is

They also have benches of the coveted 6800U Extreme or whatever it's called. Strangely enough the X800XT soundly defeats it in both UT2004 and Far Cry. I haven't read the rest of the article. Strange yet interesting. 58FPS at 1600x1200 w/ 4XAA 8XAF is incredible, but I was honestly expecting even more than that. And only 50FPS from the 6800U Extreme? Wow is all I have to say. I mean, that's just 1 resolution step above what I'm currently running with my 9700pro.

You are telling me you are running Far Cry at 1280x1024 with 4AA/8AF at 50FPS on a 9700 Pro?

YEAH RIGHT.. keep dreaming. The 9700 Pro ONLY gets 49FPS at 800x600 4AA/8AF. So the 6800UE and X800XT get FASTER frames at 1600x1200 4AA/8AF. That's 3 resolution steps higher, and still with a faster framerate. In fact, in comparison the 9700 Pro ONLY gets 34.6FPS at 1600x1200 0AA/0AF. There is absolutely no comparison - the new cards represent a monumental leap for both camps.
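The "3 resolutions higher" argument boils down to raw pixel counts. A quick back-of-the-envelope check (the ladder below is just the common 4:3 progression plus 1280x1024, nothing card-specific):

```python
# Pixel-count arithmetic behind the resolution-ladder comparison:
# equal FPS at a higher resolution implies proportionally higher
# effective fill/shading throughput.
res_ladder = [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]
base = 800 * 600
for w, h in res_ladder:
    print(f"{w}x{h}: {w * h:,} px ({w * h / base:.1f}x the pixels of 800x600)")
# 1600x1200 pushes exactly 4x the pixels of 800x600, so matching the
# 9700 Pro's 800x600 framerate at 1600x1200 is roughly a 4x leap.
```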
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
E3D,

Well, I don't know about all of that. I mean, I didn't state whether I think SM3 will bring it up to the x800XT's level or not.
But it will improve it to some degree, and displacement mapping will give NV cards a level of IQ that ATI cards won't be able to match.

I took it to mean exactly that, mainly because I'm not sure SM3.0 will allow NV40 to do anything major that R420 can't do (aside from geometry instancing, though I don't know much beyond its ability to save some CPU time and AGP bandwidth). As for displacement mapping, I *think* R420 and R300 and even Parhelia can do it....

Again, I'll be reviewing this info. I honestly couldn't give a rat's behind if anyone other than Anand's says it.
I don't find those other sites (BOTH Tom's and OCP) to have much credibility considering past events. The others are too small to not possibly be run by biased tyrants. Heck, OCP THINKS they are in Anand's league but they are smalltime still IMO.

The contempt you show for other sites is pretty surprising. I've been around here for way too long, and I don't agree with your assessment. IMO, AT hasn't been the final word on 3D for a while now--at least since Anand's 5900U review, in which he missed a few things and never (to my knowledge, and I was one of the few creating quite a fuss here) even acknowledged his errors. I'm sure he didn't have as much time to dedicate to the review as he wanted, but IMO that didn't excuse either the handful of errors in the review or the (extremely, strangely) slow (in Anand's case, non-)reaction to forum member requests at the time--especially for a site with as many hits and as much profit as this (meaning, AT has a large viewer base and relatively large income, and thus both the responsibility and the means to provide accurate info). "Biased tyrants" is an almost laughably strong term, and comical when referring to the various reviewers: Brent at H (Kyle runs the site), Lars at THG (Tom's the big cheese), or Dave at B3D. (I'm guessing you were thinking mainly of Kyle and Tom and referring mainly to THG's 5800U review and Kyle's 3DM03 stance, though.) I'm not sure one of those sites' writers has publicly purported to be in or out of AT's "league," but, IMO, AT doesn't stand at the top of 3D reviewing (for now, I prefer B3D, 3DC, and H.fr for investigative work, and TR for writing). Many sites publish good reviews (informative benchmarks with informed commentary), but only a few get things right more often and consistently offer more info to qualify as great. Anyway, I don't want to get into a ratings game, but I think you place way too much stock in AT for current 3D reviews. Derek's doing a good job so far (if I may be so bold), but he appears to be relatively inexperienced and perhaps doesn't have as many connections as compared to some other reviewers. (He's young yet. )

Anyway, it's pretty much thought that so far IQ is actually in NV's favor. Not surprising.. I mean, look at ATI's XT performance. Cuts were made somewhere.. not that the tech isn't great.. but I'm sure it's a combo of driver cheats and great speedy hardware.
nV's the one who's been caught cheating since the 5800U, not ATi, and nV's the one with quite a few IQ problems in Far Cry (which has the kind of graphics we all want, on a D3D framework that HLSL promises to make simpler/faster to create). The R420 looks to be about as large as NV40, so I imagine they contain a roughly comparable transistor count (remember nV and ATi apparently count transistors differently when giving official numbers), so I'm not sure why you think ATi is cheating when they have about the same transistors working on fewer features, particularly since they're essentially releasing a more polished second-gen GPU to nV's (heavily?) reworked part. If you think ATi must be cheating for the X800XT to outperform the 6800U, can you offer some proof or even a hypothesis?

Those last two quotes show you as rather misinformed, making the strength of your convictions all the more surprising. You registered here a few days ago; would it be wrong for me to think you've begun following 3D cards in general and web reviews/forums in particular only recently?

As for 2048 testing, didn't Derek do some for the 6800U? I know he was asking for a 20x15 and higher monitor right before that review.

As for SM3.0, it certainly looks great, but so did PS1.4, and that wasn't exactly an instant hit.

I think the Articles forum left us right around the time we were all arguing about Anand's 5900U review (good times ). AT has since switched to a comments system for each article separate from the forums. If you want 20x15 benches, let Derek know in the AT X800 review comments thread (link at the bottom of any review page).
 

Vernor

Senior member
Sep 9, 2001
875
0
0
What I get from that is that SM3.0 games will be out, but it's not known the extent of the use of SM3.0.

In the next 6-9 months?

Most likely a few lighting effects that you can enable with a checkbox.
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
Just let them buy what they want. We all know that both companies will be rich off of these cards, and we will still all buy what we want and think what we want. We aren't the marketing team for ATI or NVIDIA, and after posting a few times about our opinions, I think it's time to give it a rest.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Since I am a cheap bastid I will be waiting for the 6800 GT to come down in price
 

Avalon

Diamond Member
Jul 16, 2001
7,567
152
106
Anyway, yes, I am "biased" towards having SM3.0 in my next $500 card. Who in their right mind, deep down, WOULDN'T want this feature?

I fail to see why you keep trumpeting SM3.0 in this thread. Honestly, it really does make you look like a fanboy, at least IMO, and it seems to others as well. I'm not trying to offend, just trying to be honest. Have you seen links with benchmarks of SM3.0 usage? To my knowledge, there aren't any, and it won't be in use for at least some months. Even more, we don't know if it will noticeably help or not. I personally do not want SM3.0 right now, but I don't think that means I'm not in my right mind. You can buy a $400-$500 card for SM3.0, but I'm not. I'll wait for a good OCing or softmodding card to happen by, pay a lot less, and have the same performance as you. Hurrah for bang for the buck.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Pete
E3D,

Well, I don't know about all of that. I mean, I didn't state whether I think SM3 will bring it up to the x800XT's level or not.
But it will improve it to some degree, and displacement mapping will give NV cards a level of IQ that ATI cards won't be able to match.

I took it to mean exactly that, mainly because I'm not sure SM3.0 will allow NV40 to do anything major that R420 can't do (aside from geometry instancing, though I don't know much beyond its ability to save some CPU time and AGP bandwidth). As for displacement mapping, I *think* R420 and R300 and even Parhelia can do it....

Pete, I have little to say on the rest of your post (other than "thank you for trying to shut this guy up" ).

Geometry instancing could be useful, but it's also tougher to implement, and something that makes sense to build in when you're designing the engine. IMHO, developers would have a tough time just slapping it on top of an almost-finished game engine. Adding lighting and other 'special effects' is pretty easy; modifying the way your game deals with objects and models is usually much more complicated.

You *can* do displacement mapping in PS/VS2.0 -- but it's *easier* (and maybe considerably faster) in SM3.0, since the vertex shader programs can read the texture data directly, instead of you having to read it in via a pixel shader or by resending it over the AGP bus. I don't know about the Matrox Parhelia; I have heard they claim to support it -- but, then, didn't they also 'claim' to support DX9 and SM2.0 for a while?

I haven't seen any numbers on the kind of performance hit that the NV40 takes using displacement mapping on nontrivial objects (the demo video on NVIDIA's website only shows a flat plane and a sphere, with a fairly small displacement map), and I'm not sure it will see widespread use if only the 6800GT and 6800U can really do it effectively (remember, even a $400 card is out of most consumers' reach until the price drops quite a bit). Frankly, from what I've seen of it thus far, I'm more impressed by HDR lighting -- a PS2.0 feature.
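To make the vertex-texture-fetch point above concrete, here is a CPU-side toy sketch of what displacement mapping does per vertex: sample a height map in the vertex stage and push the vertex out along its normal. This is illustrative Python, not shader code; the sampling scheme and every name here are invented for the example.

```python
# Toy model of SM3.0-style displacement mapping: the vertex stage can
# sample a height texture directly and offset each vertex along its
# normal. Under SM2.0 the same height data would have to arrive via the
# vertex stream or a pixel-shader round trip instead.

def displace(vertices, normals, heightmap, scale=1.0):
    """Offset each vertex along its normal by a sampled height."""
    out = []
    for (x, y, z), (nx, ny, nz) in zip(vertices, normals):
        # "Sample" the height map, using the vertex's (x, y) as texcoords
        u = int(x) % len(heightmap[0])
        v = int(y) % len(heightmap)
        h = heightmap[v][u] * scale
        out.append((x + nx * h, y + ny * h, z + nz * h))
    return out

flat = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]  # a flat triangle in the z=0 plane
up = [(0, 0, 1)] * 3                      # all normals point up
hmap = [[0.0, 0.5], [0.25, 1.0]]          # tiny 2x2 height texture
print(displace(flat, up, hmap))
```

The real cost question Matthias raises (how fast NV40's vertex texture fetch is on nontrivial meshes) is exactly what this kind of sketch can't answer; it only shows why having the fetch in the vertex stage simplifies the data flow.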
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
You are telling me you are running 1280x1024 4AA/8AF with 9700Pro FAR CRY at 50FPS?

YEAH RIGHT..keep dreaming. 9700 Pro ONLY gets 49FPS at 800x600

I was actually referring to UT2004...sorry...I should have been more clear. I guess you haven't read my other posts, I've basically said that Far Cry is my only reason for upgrading my graphics card at this point in time.
 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: Edge3D

Well. That Ackmed fellow seems to want to paint me in fanboy colors, but I'm not. The proof is in the pudding: ATI just needs to deliver at 2048. Otherwise NV offers comparable performance with more of the features I want.
Is that fanboyism? And it's hilarious that Ackmed says "are the MASSES ignorant?" LOL, need I say more?
Owned by his own words.

Anyway, yes, I am "biased" towards having SM3.0 in my next $500 card. Who in their right mind, deep down, WOULDN'T want this feature?
If it's so meaningless, then why would ATI ever implement it? They should just forget about SM3.0, and DX9.0c should be erased from the earth because it is a useless feature. I mean, seriously. You have to be a fanboy to subscribe to that kind of ideology.
It's like 3dfx and 16bit. Or Intel and 32bit. Why NOT have more features for less or equal money? Silly. Just silly.

Unless, like I said it blows it away in some res that was previously unplayable at all in a newer game like UT2K4 (2048x1536). THEN I'd forget about my SM3.0 that I hold so dear.

I'm not exactly sure who around here is an ATI fanboy, as you say, but it does appear that there are some. And it is funny watching them squirm under the SM3.0 gun. It just doesn't make sense, stepping out of the fanboyistic "NV vs. ATI" circle, to see people actually trying to downplay an upcoming technology that one company doesn't have.

I didn't paint you as a fanboy; you have acted that way since you registered. I have one PC with a P4, one with an Athlon, one with a 9800XT, one with a 5900NU, one with a Creative sound card, one with a Philips one, one with Kingston RAM, one with Corsair RAM. The only thing that is the same in both PCs is WD hard drives. Oh no, I'm a WD fanboy!!!

So everyone who voted that ATi has the better card is ignorant now? Haha. Saying someone got "owned" on a forum is childish at best.

ATi didn't add it this gen because they said they don't think it's needed right now. I think they know better than you or I when it will be needed. Sure, it will be in future generations of their cards, but they don't think it's needed now. While I personally think it was a bad decision, it's what they did.

You seem to keep forgetting the added features that NV doesn't have and ATi does. I'm not downplaying PS 3.0, but how can I get excited about it when we have not seen any benefit from it? If it was so great, why hasn't there been some showing of it? It's far from 16-bit vs. 32-bit, I think. If there were some screenshots or benchmarks showing PS 3.0 being better than PS 2.0, it would be a lot easier to get excited.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Ackmed
Originally posted by: Edge3D

Well. That Ackmed fellow seems to want to paint me in fanboy colors but I'm not. Proof is in the pudding, ATI just needs to deliver at 2048. Otherwise NV offers comparable performance with more of the features I want.
Is that fanboyism? And hilarious that Ackmed says "are the MASSES ignorant?" LOL, need I say more?
Owned by his own words.

Anyway, yes, I am "biased" towards having SM3.0 in my next $500 card. Who in their right mind, deep down, WOULDN'T want this feature?
If it's so meaningless, then why would ATI ever implement it? They should just forget about SM3.0, and DX9C should be erased from the earth because it is a useless feature. I mean, seriously. You have to be a fanboy to subscribe to that kind of ideology.
It's like 3dfx and 16-bit. Or Intel and 32-bit. Why NOT have more features for less or equal money? Silly. Just silly.

Unless, like I said it blows it away in some res that was previously unplayable at all in a newer game like UT2K4 (2048x1536). THEN I'd forget about my SM3.0 that I hold so dear.

I'm not exactly sure who around here is an ATI fanboy, as you say, but it does appear that there are some. And it is funny watching them squirm under the SM3.0 gun. It just doesn't make sense, stepping out of the fanboyistic "NV vs. ATI" circle, to see people actually trying to downplay an upcoming technology that one company doesn't have.

I didn't paint you as a fanboy; you have acted that way since you registered. I have one PC with a P4, one with an Athlon, one with a 9800XT, one with a 5900NU, one with a Creative sound card, one with a Philips one, one with Kingston RAM, one with Corsair RAM. The only thing that is the same in both PCs is WD hard drives. Oh no, I'm a WD fanboy!!!

So everyone who voted that ATi has the better card is ignorant now? Haha. Saying someone got "owned" on a forum is childish at best.

ATi didn't add it this gen because they said they don't think it's needed right now. I think they know better than you or I when it will be needed. Sure, it will be in future generations of their cards, but they don't think it's needed now. While I personally think it was a bad decision, it's what they did.

You seem to keep forgetting the added features that NV doesn't have and ATi does. I'm not downplaying PS 3.0, but how can I get excited about it when we have not seen any benefit from it? If it was so great, why hasn't there been some showing of it? It's far from 16-bit vs. 32-bit, I think. If there were some screenshots or benchmarks showing PS 3.0 being better than PS 2.0, it would be a lot easier to get excited.

For the 70th time, you learning disabled troll. PS3.0 DOES NOT LOOK BETTER, IT IS FASTER.
 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: Acanthus
For the 70th time, you learning disabled troll. PS3.0 DOES NOT LOOK BETTER, IT IS FASTER.

First off, name-calling is childish.

Second, it's Edge3D that thinks it does; I've said many times it doesn't.

If you actually read posts, you would know this. From the first post on this page:

Originally posted by: Ackmed


Because quite simply, we don't know how PS 3.0 will affect games, if it indeed does. If it's going to give them such a huge boost in speed, why haven't we seen any tests with it? They seemed to want to brag about it so much, so why didn't they release a benchmark to show how much faster it makes the NV4x? The quality is going to be the same, so it's all about speed.
 

Eagle17

Member
Nov 23, 2001
114
0
0
Well, I voted for the X800XT, and for many reasons. First, it has been discussed to death how minute any SM3.0 visual changes will be. The big advantage is going to be speed, and that is where developers will focus when improving newer games.

I also note how you are very impressed by how much faster the GT is than the X800XT when playing three-year-old game engines. I am surprised you are not complaining that Anand did not benchmark Counter-Strike; at least people play that game. (I will agree that the main point of these games is to show just how horrible the ATI ICD is, and if Doom 3 is OpenGL they will have some problems.)

Question: is Doom 3 OpenGL?

If I am going to spend $400+ on a new card, I will want it to be fast in the games "I" play, which right now are Far Cry and BF Vietnam. I have had fewer problems with ATI drivers than NVIDIA drivers over the past two years, and I still use a two-year-old 9700 Pro in my main gaming rig. I do have my issues with ATI, mainly with the 64-bit drivers, but the 64-bit drivers I have for my 5900 are not very good at all either.

I am, however, going to wait for a PCI Express version of the X800XT, and maybe one with the HDTV Wonder bundled... we will see.

I prefer ATI's philosophy of building the hardware to run the software fully, as opposed to NVIDIA's build-the-software-to-run-the-hardware stance. I sometimes play not-so-famous games, and I do not want to wait for ATI to code their drivers to make them run acceptably.
 