Radeon's 3D visual quality better than the GTS'!!!


OneEng

Senior member
Oct 25, 1999
585
0
0
Coki,
Crawl back under whatever rock you came from unless you have something useful to add.

In FSAA, the V5 still reigns supreme. In 32-bit color, it would appear ATI has edged out the previous leader, nVidia.

What I found most interesting of all was that the Radeon smoothly played the nVidia special Q3 level while the GF2 could not!!!

That is impressive. Imagine if the level was specifically optimised for ATI... hmmm.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0


<< So NVidia, it now seems, has the worst 3d video quality of all the big players... >>


Oh come on! It should be obvious to anyone that this image quality difference comes down to one thing only: the poor algorithm used for S3TC compression in Nvidia's Detonator 5.xx drivers. All of the reasons why the writer chose the Radeon over the GeForce2 (sky artifacts, texture miscoloration etc.) stem solely from this. All Nvidia needs to do is tune up* the drivers' on-the-fly compression and quality improves drastically.

Don't get me wrong: the Radeon really does have an image quality advantage over the GeForce2 GTS - it has a much wider variety of supported features. EMBM, 4-matrix T&L skinning and 3D textures are just examples of its unsurpassed feature set.

* adding a "realtime S3TC compression quality" slider to the driver panel would be ideal - that way those with lesser CPUs wouldn't end up waiting 10 minutes for level loads.
 

Captain Ginyu

Member
Dec 4, 1999
98
0
0
64 meg GF2's would not lose any speed with S3TC off. Nvidia's 3d quality is better than everyone else's. 2d, heh, Matrox wins, followed by 3dfx and ATI.
 

zippy

Diamond Member
Nov 10, 1999
9,998
1
0
Captain, no, nVidia's 3d SPEED is better than the rest...image quality isn't determined by speed!
 

zippy

Diamond Member
Nov 10, 1999
9,998
1
0
Captain, true, just a reflex I s'pose. However, the Radeon DOES have better 3d and 2d than the GeForce2 GTS.

Hopefully I will order a radeon soon and I will post some screenies.

Someone already said this... for image quality in 3d nowadays:

1. Matrox
2. 3dfx
3. ATi
4. nVidia

I agree wholeheartedly.
 

Napalm

Platinum Member
Oct 12, 1999
2,050
0
0
Jukka:

C'mon - you can't have your cake and eat it too. As Zippy said, if you blame its poor image on compression then you must also accept that it is gonna get killed in speed by the Radeon if you turn it off.

Can't believe that ATI, which is just a few minutes from here, has dethroned nVidia's mighty GeForce GTS in both speed and image quality. If these suckers were anywhere near affordable I might just pick one up. Anyone think they will drop down to $100 anytime soon?

Napalm
 

MisterM

Golden Member
Oct 9, 1999
1,768
0
0
NVidia's 3D quality was very good from the TNT chipset on; that's how they got the great quality reputation they have now.

Fact is that the other vid card manufacturers have all caught up, while NVidia hasn't done anything revolutionary for some time (Matrox had Bump Mapping, so NVidia built it into their GeForce... S3 had S3TC, so NVidia built it into their drivers... Matrox had DualHead, NVidia gets some Matrox engineers and suddenly has TwinView... 3dfx had FSAA, so NVidia built it into their drivers...).

Right now NVidia's 3D quality is mediocre. The quality difference between the different vid cards is barely noticeable on screenshots, and I doubt you would really notice it in-game. The speed difference isn't that great either, at least if you're like me and playing at 800x600 or 1024x768; any of the newest vid cards are good enough for that.

IMHO you should really consider the extra features if you choose a top-of-the-line vid card:

You want raw speed, very good drivers, and you sometimes run professional 3D apps? GeForce 2 GTS is for you.

You want Dualhead and the best bang per buck? GeForce 2 MX for you.

You want great 2D, and Dualhead? G400 MAX for you.

You want good 32 bit speed at high resolution, and the best DVD playback you can get, and you don't care if the drivers are 100% perfect yet? ATI Radeon for you.

You want totally crappy drivers? Go S3!

I hope you'll see my point....
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,426
8,388
126
anyone mention that nvidia fakes trilinear filtering?
 

Captain Ginyu

Member
Dec 4, 1999
98
0
0
The G400 had good image quality in 3d if you could get the game to run; Matrox did a good job fixing the drivers. The Radeon is too late though, with the NV20 and G800 coming out. It would be nice if the G800 was king so we could all enjoy great 2d quality. (This comes from an nVidia fan)
 

pen^2

Banned
Apr 1, 2000
2,845
0
0
MisterM: you couldn't have said it better, homeboy!
i personally wish the G800 makes it in time to compete against the big boyz. ATi has done a great job puttin together such a great card that works, but seems like it's gonna be the same old story, history repeats itself i guess? when the Rage128 came out it was a short-lived king only to get blasted by the GeForce. nvidia being tight-lipped about their next gen card doesn't necessarily mean they don't have something up their sleeves... remember how they shocked the world with their GF2? sure sure it ain't all that much faster than the GF1 in real world games but it was one amazing chip you gotta admit. now if only they can come up with some way to eliminate the memory bottleneck... then again the G800 would rock... as long as matrox doesn't take too long to deliver.....
 

OneEng

Senior member
Oct 25, 1999
585
0
0
Napalm,
I hear you! My pain tolerance lies in the $130-$150 range. These $300.00 cards.....man.
 

Painman

Diamond Member
Feb 27, 2000
3,805
29
86
ATi's stumbling block has been their drivers, even way back in the Mach32 days. It's a shame, because they've made some excellent chips/boards, and Radeon is another. Let's see what they can cook up for it, I'm sure the Win2k users will be watching.

-Pain
 

dszd0g

Golden Member
Jun 14, 2000
1,226
0
0
I only notice a difference in the first two. The first one in the sky, as has been pointed out. The second one in the tiles on the wall on the right. However, in the second one the Radeon is farther back from the wall, which is probably why it looks better.

The first one I don't trust. I've taken screenshots on many machines and, in my experience, graphic glitches sometimes occur during a screenshot even if they aren't noticeable on the screen. It could have just been a bad screenshot taken while the screen was redrawing or something. I really don't know the cause, but I know it happens.

I really think someone could have created those screenshots to make either card look better. There is a lot of animation going on in most games these days, and if you catch the animation at just the right time you can find flaws in some of the animations that create a bad screenshot. I really don't trust this review.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0


<< As Zippy said, if you blame its poor image on compression then you must also accept that it is gonna get killed in speed by the Radeon if you turn it off. >>



Of course, I do accept that Radeon is faster in high 32bit resolutions than GTS. That's the whole marketing idea behind the card - unsurpassed 32bit performance.

However, as Captain Ginyu pointed out, turning texture compression off in Quake3 on a 64MB GeForce2 board results in a very minimal performance loss at most. The point of my rant was that the image quality issue of the GeForce2 can be fixed by a simple driver update, and it's really not a hardware issue as PlanetHardware's story might make some people believe.

Also, the ATI Radeon *probably* takes a bit longer to load a Q3A level because its drivers have a more accurate realtime S3TC compression algorithm. It's all about compromises. For best quality using TC, Quake3 should ship with pre-compressed textures. On the other hand, doing it in the driver means any compression method - including 3dfx's FXT1 - works with the game.
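For illustration, here's a rough Python sketch of a naive DXT1/S3TC-style block encoder - purely hypothetical, not NVIDIA's or ATI's actual driver code - showing why the compressor's quality matters so much on smooth gradients like the Q3 sky: the two endpoints are stored as RGB565 and everything else is interpolated from them, so a lazy min/max encoder collapses a 16-step gradient into just four colours (banding), while a slower endpoint search would do noticeably better.

# Hypothetical, simplified DXT1-style block compression (illustration only).
def quantize_565(rgb):
    # Endpoints are stored as 5:6:5 bits per channel, then expanded back to 8 bits.
    r, g, b = rgb
    r5, g6, b5 = r >> 3, g >> 2, b >> 3
    return (r5 << 3 | r5 >> 2, g6 << 2 | g6 >> 4, b5 << 3 | b5 >> 2)

def compress_block(pixels):
    # Naive "fast" encoder: just take the block's min/max colours as endpoints.
    # A better (slower) encoder would search for endpoints that minimize error.
    lo, hi = quantize_565(min(pixels)), quantize_565(max(pixels))
    palette = [lo, hi,
               tuple((2 * a + b) // 3 for a, b in zip(lo, hi)),
               tuple((a + 2 * b) // 3 for a, b in zip(lo, hi))]
    nearest = lambda p: min(palette, key=lambda c: sum((x - y) ** 2 for x, y in zip(c, p)))
    return [nearest(p) for p in pixels]

# A smooth 16-step sky-blue gradient ends up with only 4 distinct colours:
gradient = [(100, 140, 200 + i) for i in range(16)]
print(sorted(set(compress_block(gradient))))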
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
In 3D image quality nVidia is still king, with the possible exception of ATi; Matrox isn't worth mentioning.

It is interesting to see the same Matrox backers that want ultra high precision in 2D comparisons drop back from their stance and desire incredibly simplistic tasks for 3D. Step away from the games and load up some heavy 3D work with high levels of complexity and see what you get. I haven't had any time with the V5 to see how it fares on complex scenes, so 3dfx still has the benefit of the doubt, and unless ATi uses the same cheap low quality hack Matrox does with their Z-buffer they should be at least nVidia's equal.

Napalm-

"C'mon - you can't have your cake and eat it too. As Zippy said, if you blame its poor image on compression then you must also accept that it is gonna get killed in speed by the Radeon if you turn it off."

Since they have enabled AGP texturing, performance without texture compression is up significantly, though the Radeon would still hold an edge. I don't use it anymore unless I'm benching; the performance hit is not noticeable while playing.

MisterM-

"Fact is that the other vid card manufacturers have all caught up, while NVidia hasn't done anything revolutionary for some time"

I couldn't disagree more. First off, FSAA has been part of nVidia's feature set since at least the TNT1, though admittedly it was problematic and was disabled with the 2.xx series drivers. Check out the specs on the older nVidia boards: they have supported FSAA a lot longer than 3dfx, though an option that works without issue is another matter.

Hardware T&L and per-pixel shading offer enormous leaps in visual quality when they are used; we simply haven't seen it yet because of the slower companies. Within a month or two we should have T&L boards from all the major companies save 3dfx, and judging by this year's E3 we will start seeing these games shipping by the end of the year. nVidia's advancements have in reality been so far ahead of everyone else that they have trouble gaining support.

With the exception of the Radeon, nVidia is way ahead of the competition. Of course we must give ATi their due as they have just raised the bar even higher, but again, with certain companies being an anchor around the neck of the industry it will likely take some time before we see what the Radeon can truly offer.

While they are not great indicators of games, check out some of the technology demos from nVidia on either a GeForce/GF2 or Radeon and see what they are capable of now; the rest of the industry needs to catch up.

"Right now NVidia's 3D quality is mediocre."

Again I must disagree. Give the GF/GF2 some real graphics to display. Using something like Q3 is akin to "proving" that one card has equal 2D to another using 640x480; it isn't a good yardstick.

On your assessment about who should buy what card I would certainly agree, although I have high hopes that the Radeon will take the crown for OpenGL apps as well (not holding my breath on that one though).
 

MisterM

Golden Member
Oct 9, 1999
1,768
0
0
BenSkywalker:

What took you so long?

I have to admit that I did completely forget about per-pixel shading, a very good feature that NVidia introduced.

Hardware T&L is very nice, but IMHO Nvidia didn't innovate it, they just offered it first in the gamer market. That's not what I would call innovative. (Although you could probably say the same about most of the features from the other manufacturers, so my argument is kinda weak - point given.)

I know that NVidia was offering FSAA in drivers before 3dfx even talked about FSAA, but I think you could hardly call that feature usable.



<< "Right now NVidia's 3D quality is mediocre."

Again I must disagree.
>>



That was slightly out of context, IMHO.
Tell me a company that doesn't offer 3D quality that is as good as NVidia's now? (Besides S3)
We're not talking 3D features here, or what it could do, but how the screenshots look. (As I said before, I doubt there is a noticeable in-game difference... except maybe for the blurry textures the V5 was offering, but setting the LOD will allow for very good quality on that board too.)



<< On your assessment about who should buy what card I would certainly agree although I have high hopes that the Radeon will take the crown for OpenGL apps as well (not holding my breath on that one though) >>



I too have hopes, but IMHO it's too soon to recommend a card on "it could eventually perform well in OpenGL".
It would be really nice to see the Radeon compared with the GeForces in professional 3D apps.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
MisterM-

Heh, just had time now to respond. I apologize if I quoted you improperly; I was trying to respond to the entire line of thought.

"Tell me a company that doesn't offer 3D quality that is as good as NVidia's now? (Besides S3)
We're not talking 3D features here, or what it could do, but how the screenshots look."


All Matrox boards, every ATi board up to the Radeon (possibly including the Radeon), every 3dfx product up to the V5 (possibly including the V5). They default to a 16-bit Z-buffer - fine for games, but it causes some heavy flickering when you have two objects in close proximity viewed from differing angles. If you think pixel popping is annoying, imagine entire walls switching from flat white (inside) to brick (outside) in rapid succession.
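To put a rough number on it, here's a back-of-the-envelope Python sketch (the near/far planes are made-up, not measurements from any board): with a standard perspective depth mapping d(z) = f/(f-n) * (1 - n/z), the world-space gap between adjacent depth values grows with the square of the distance, and a 16-bit Z simply can't tell nearby surfaces apart at range - hence the z-fighting.

# Back-of-the-envelope Z-buffer precision estimate (illustration only).
def depth_resolution(z, near, far, bits):
    # World-space distance between adjacent depth-buffer values at eye distance z,
    # using the perspective depth mapping d(z) = f/(f-n) * (1 - n/z).
    step = 1.0 / (2 ** bits - 1)                   # smallest representable depth step
    slope = (far / (far - near)) * near / (z * z)  # d'(z)
    return step / slope

near, far = 1.0, 4096.0   # made-up, Quake-ish near/far planes
for z in (10.0, 100.0, 1000.0):
    print("z=%6.0f   16-bit Z: %8.3f units   32-bit Z: %10.6f units"
          % (z, depth_resolution(z, near, far, 16), depth_resolution(z, near, far, 32)))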

"I too have hopes, but IMHO it's too soon to recommend a card on 'it could eventually perform well in OpenGL'."

I absolutely agree with you on this. I just posted in another thread a recommendation they go with a GF SDR/GF2 MX for application support, mentioning that while the Radeon may be great, I wouldn't count on it (and I won't recommend them for any pro application uses until they prove themselves). ATi has some real nice hardware, and while they have demonstrated they can outrun the best of them in games, that is worlds apart from having solid NT/2K drivers, let alone the proper OpenGL support to go along with them.

I do have high hopes though; the feature support is killer and I think we can safely assume that the SDR boards will be in the $100 price range not too long after launch (I hope to see an SDR Radeon/GF2 MX price war, even though they are a very good deal as it is IMHO).
 

MisterM

Golden Member
Oct 9, 1999
1,768
0
0


<< I apologize if I quoted you improperly, I was trying to respond to the entire line of thought though. >>


No problem at all!



<< All Matrox boards, every ATi board up to the Radeon (possibly including the Radeon), every 3dfx product up to the V5 (possibly including the V5). >>



I was talking about top-of-the-line products, and I have to admit I wasn't really stating that clearly.
I agree that the lower end NVidia boards are offering very good quality, but my point was that the other companies are catching up rapidly, and may have even surpassed NVidia right now...

I always thought that the Matrox boards' 3D quality was about equal with NVidia's, although you do have a point about the 16-bit Z-buffer... I was going from the screenshots I've seen, and the Matrox was looking about as good as NVidia to me.



<< I do have high hopes though, the feature support is killer and I think we can safely assume that the SDR boards will be in the $100 price range not too long after launch (I hope to see an SDR Radeon/GF2 MX price war, even though they are a very good deal as it is IMHO). >>



I agree. I just hope Radeon SDR boards will be out soon; I want a vid card for a cheap multimedia system with a Duron, and both the GeForce MX and an SDR Radeon would be perfect... either should be able to play the occasional DVD perfectly well for a nice price, and it should be more than good enough for the games I play.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0


<< I loved my V2, it played games of the day hella fast, but the thing sure didn't have the best image quality. >>

--KarlHungus


<< every 3dfx product up to the V5 >>

--BenSkywalker

Ok, why is it that people feel that it's fair to compare a GTS to old 3dfx cards? Who gives a rat's ass how the V2 looked? That was a couple years ago. Should I be comparing my V5-5500 to my old Riva128? Well, by that logic, nVidia sucks.

So many times I see the nVidia loyalists put down the V5 because of some feature the V3 lacked.

C'mon people, does the phrase "apples to apples" ring a bell?

Btw, screenshots are not necessarily indicative of actual image quality. I've seen plenty of screenshots that look worse than the image did in-game. And I've also seen articles talking about how some video cards are more "screenshot friendly" (for lack of a better term).
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"Ok, why is it that people feel that it's fair to compare a GTS to old 3dfx cards?"

WTH are you talking about? The only thing I can think of that I said that could be confused was my statement about image quality in regard to the Z-buffer. Every consumer/gaming company with the exception of nVidia has a history of using 16-bit Z - does the V5? Compare the G400 to the TNT1 and the G400 loses in this instance, as will any board that uses a 16-bit Z.

Why did you feel the need to only defend 3dfx? I mentioned all the players, and you only stick up for one? I'm not bashing them; if all you want is extremely simplistic 3D graphics such as those in current games, then a 16-bit Z is fine. The only benefit to a 32-bit Z is higher precision in high-poly situations, which the V5 5500 with its AGP transfer rate limitations won't have to worry about. Using a 32-bit Z hurts performance, and the only benefit is one that can't be seen with games.
 