New review of Geforce3 and 2D issues.

Eug

Lifer
Mar 11, 2000
23,752
1,309
126
Don't hurt me if this has already been posted.

New Geforce3 Review

Good:


<< Overall the Geforce 3 is a must buy for those that must have the fastest PC in the world, for gaming junkies, and image quality connoisseurs. The wait for the first truly revolutionary graphics card since the Voodoo 2 is over and its name is the Geforce 3! >>

but:

<< Despite this high praise, the Geforce 3 is not perfect. DXT1 texture compression is still broken and has been in all nvidia cards since the original Geforce. This can be cured in Quake III by switching on DXT3 instead, but there are - and will be - titles where this is not possible. Also the 2D quality is not as good as the Radeon, and not even close to the Matrox G400. >>



I had hoped that these companies had worked out the problems with nVidia 2D. Unfortunately, that may not be the case, at least not with this particular card.

I guess I'll look to the Radeon II or Kyro III for my next card (one year from now?), unless other Geforce3 brands have better 2D. (I run at 1600x1200x32 @ 75 Hz.)
 

Bingo13

Golden Member
Jan 30, 2000
1,269
0
0
Eug,

Okay, I will bite on this one. This is a very subjective area, but my GF3s are better at 1600x1200 than my Radeon and almost equal to my Matrox G400. If you play with the digital vibrance and gamma settings it is hard to tell the difference at all. I would have no issue recommending the GF3 for 2D operations at this time. You never know if a tester did a clean install, properly loaded the correct monitor *.inf file, actually runs at 75 or 85 Hz, or even adjusted the monitor after a change in the video card. It all makes a difference in the quality of the video display - except for the Creative GF2 Ultra cards.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
DXT1 texture compression is still broken and has been in all nvidia cards since the original Geforce.

I'd be suspicious of any review site that posts this. Nothing's "broken"; it's simply that the number of bits nVidia chose to use is lower than what the other manufacturers use.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
The 2D is fine on the two VisionTeks I have seen, and I have heard from many others that the 2D is great. Really, I don't know what they are complaining about. The Radeon is maybe a hair better than my GeForce3 in high-res 2D; it's really not even worth talking about.
 

Eug

Lifer
Mar 11, 2000
23,752
1,309
126


<< You never know if a tester did a clean install, properly loaded the correct monitor *.inf file, actually runs at 75 or 85 Hz, or even adjusted the monitor after a change in the video card. It all makes a difference in the quality of the video display - except for the Creative GF2 Ultra cards. >>

Good point. And like I said, it may only be an issue with this particular card.


<< The 2D is fine on the two VisionTeks I have seen, and I have heard from many others that the 2D is great. Really, I don't know what they are complaining about. The Radeon is maybe a hair better than my GeForce3 in high-res 2D; it's really not even worth talking about. >>

Well, the Radeon on my Samsung 950p is acceptable, but I wouldn't want any worse. I'm not sure if it's the Radeon or my monitor, though, so maybe it would look better with a higher-end monitor (as would those GF3s, if my monitor is the problem). That said, our Matrox G450 combined with a Samsung 900NF is clearly worse, and I think that is the monitor's fault. (Tried tweaking it already.) Either way it drives me up the wall, especially since the text on my 15" laptop TFT is so damn clear.

Glad it isn't at the Geforce2 level, though.
 

vlieps

Senior member
Jun 15, 2000
276
0
0
I agree on that. I have seen a G400 and a Voodoo5 on the same monitor, and to me the V5 seemed to have better 2D.
There is not really such a big difference in 2D between modern video cards, except maybe the old NVidia cards; those were really bad.
And who wants to run their monitor at 1600x1200? You need a magnifying glass even on a 21" monitor to see anything at that res.
 

PotNoodle

Senior member
May 18, 2001
450
0
0
"I'd be suspicious of any review site that posts this. Nothing's 'broken'; it's simply that the number of bits nVidia chose to use is lower than what the other manufacturers use."

Looks like another one who has swallowed the marketing drivel.

It's broken - why would anyone actively choose a compression method that makes your IQ look worse than everyone else's? And if it wasn't broken, why would you then proceed to provide the means for a workaround using another compression scheme (with a lower compression ratio)?

The only reason this issue still exists in the GF3 is that by the time it had been discovered in the GF2, development of the GF3 was too far down the line for it to be changed. I'll wager that this will be a different situation for the NV25, though.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Looks like another one who has swallowed the marketing drivel.

Looks like another anti-nVidia troll.

It's broken - why would anyone actively choose a compression method that makes your IQ look worse than everyone else's?

For speed reasons perhaps? Why don't you ask nVidia why they did it?

And if it wasn't broken, why would you then proceed to provide the means for a workaround using another compression scheme (with a lower compression ratio)?

Uh, that question doesn't even make sense. If I make a monitor that's designed to go to 800 x 600, is it "broken" if it can't do 1024 x 768?
 

PotNoodle

Senior member
May 18, 2001
450
0
0
"Looks like another anti-nVidia troll."

Wow, you're keen on throwing that around. Are your sensibilities so delicate that anyone who says the slightest thing you perceive as being 'bad' about NVIDIA has to be labelled a troll? I guess the reviewer in question here must also be an anti-NVIDIA 'troll' for pointing this out, along with the countless others who have pointed it out?

The point is, just because NVIDIA publicly say they 'chose' to implement it in this fashion doesn't mean that it's actually correct. Of course their marketing will say that - it's their job - but do they want to admit they screwed up a tiny piece of hardware implementation that results in their compression looking worse than every other implementation on the market? Of course not!

It's not a big thing - there are hardware issues everywhere - but believing and parroting the marketing spin is a little naive in my book.

"For speed reasons perhaps? Why don't you ask nVidia why they did it?"

There would be no speed difference between using two blocks of 16 bits as opposed to one - all the other four compression schemes do this.

"Uh, that question doesn't even make sense. If I make a monitor that's designed to go to 800 x 600, is it 'broken' if it can't do 1024 x 768?"

The point I'm making is that NVIDIA have provided the 'hack' to switch DXT1 compression over to DXT3 - which results in using both 16-bit blocks rather than one, giving decompression quality the same as other cards' DXT1 quality (albeit with a compression ratio of 1:4 rather than 1:6). If NVIDIA were happy that their implementation of DXT1 was 100% to specification, with maximum quality, why provide this hack? It doesn't add up.
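To put rough numbers on those ratios, here is a minimal sketch in C using the standard S3TC block sizes (a DXT1 block holds a 4x4 texel tile in 64 bits, a DXT3 block adds a 64-bit explicit-alpha block for 128 bits total); the 256x256 texture is just a made-up example:

/* Rough sketch of S3TC sizes and the effect of the DXT1 -> DXT3 workaround.
   DXT1: 64 bits per 4x4 block (two RGB565 endpoints + sixteen 2-bit indices).
   DXT3: 128 bits per 4x4 block (64-bit alpha block + a DXT1-style colour block).
   The 6:1 figure is conventionally quoted against 24-bit RGB source, the 4:1
   figure against 32-bit RGBA source. */
#include <stdio.h>

int main(void)
{
    const int w = 256, h = 256;               /* hypothetical texture          */
    long blocks    = (long)(w / 4) * (h / 4); /* number of 4x4 tiles           */
    long src24     = (long)w * h * 3;         /* 24-bit RGB source, in bytes   */
    long src32     = (long)w * h * 4;         /* 32-bit RGBA source, in bytes  */
    long dxt1_size = blocks * 8;              /* 64 bits per block             */
    long dxt3_size = blocks * 16;             /* 128 bits per block            */

    printf("24-bit source: %ld KB\n", src24 / 1024);                            /* 192 KB      */
    printf("DXT1: %ld KB (about %ld:1)\n", dxt1_size / 1024, src24 / dxt1_size); /* 32 KB, 6:1  */
    printf("DXT3: %ld KB (about %ld:1)\n", dxt3_size / 1024, src32 / dxt3_size); /* 64 KB, 4:1  */
    return 0;
}

Whichever way the blocks are decoded, switching a texture from DXT1 to DXT3 doubles what sits in card memory and has to be read out of it.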
 

Eug

Lifer
Mar 11, 2000
23,752
1,309
126


<< And who wants to run their monitor at 1600x1200? You need a magnifying glass even on a 21" monitor to see anything at that res. >>

It's fine if you adjust the font sizes, etc. The only problem is that web designers often don't keep higher resolutions in mind, so pages may display incorrectly. Actually, if Windows programs haven't been coded properly, they may also display incorrectly. However, a correctly configured desktop with well-behaved programs will simply have text that's the same size as before, but rendered at a higher resolution.

Here's my desktop at 1600x1200: 1600

Here's that same desktop but the image has been resized to 1024x768: 1024 for those of you who have a screen size closer to that.

(Caution - big files)
 

Eug

Lifer
Mar 11, 2000
23,752
1,309
126


<< Your 2d looks pretty good. It seems to be a lot better than my MSI geforce pro card. >>

Eh? While those cards don't have the greatest reputation for 2D, you can't judge 2D quality from a screen capture, since it captures the data, not the actual analogue image. I was just demonstrating that my fonts and such weren't too small at 1600x1200.
 

Eug

Lifer
Mar 11, 2000
23,752
1,309
126


<< Yeah, I know, the words just look better than mine. Strange but true. >>

Well, maybe that's because I'm using a higher resolution. If you have text that's 4 mm high at 1280x960, it's going to look more jaggy than text that's 4 mm high at 1600x1200.
 

Martijnos

Senior member
Mar 16, 2000
252
0
0


<<

<< Yeah, I know, the words just look better than mine. Strange but true. >>

Well, maybe that's because I'm using a higher resolution. If you have text that's 4 mm high at 1280x960, it's going to look more jaggy than text that's 4 mm high at 1600x1200.
>>



Hmm, that's interesting. I think the words are just too small for me at 1280x960. I'll try a higher resolution with bigger words; maybe the words will look better then. It's quite blurry right now, so it's worth a try. I've tried a lot of resolutions, but 1600x1200 just looks terrible and very small. I'll try a larger font size.

/edit/

Nope, 1600x1200 at 100 Hz is just terrible with this card, even with a larger font size. But with the larger font size at 1280x960 I'm happier than before the adjustments. Thanks!

/edit/
 

nam ng

Banned
Oct 9, 1999
532
0
0
The monitor used in that review isn't even as good as Eug's for 2D, though Eug's monitor is a 205 MHz spec monitor, meaning it is only designed to be at its best at 1280x960x85 Hz; anything above that isn't guaranteed to be good enough.
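For anyone wondering where a figure like 205 MHz bites, here is a rough back-of-the-envelope sketch: the required pixel clock is roughly visible pixels times refresh rate plus blanking overhead. The 1.35 overhead factor below is an assumption (real monitor timings vary), so the numbers are ballpark only.

/* Ballpark pixel-clock estimate: visible pixels * refresh * blanking overhead.
   The ~1.35 blanking factor is assumed; it only illustrates why a 205 MHz
   spec gets tight at 1600x1200. */
#include <stdio.h>

static double pixel_clock_mhz(int w, int h, int hz)
{
    const double blanking = 1.35;   /* assumed blanking overhead */
    return (double)w * h * hz * blanking / 1e6;
}

int main(void)
{
    printf("1280x960  @ 85 Hz: ~%.0f MHz\n", pixel_clock_mhz(1280, 960, 85));   /* ~141 MHz */
    printf("1600x1200 @ 75 Hz: ~%.0f MHz\n", pixel_clock_mhz(1600, 1200, 75));  /* ~194 MHz */
    printf("1600x1200 @ 85 Hz: ~%.0f MHz\n", pixel_clock_mhz(1600, 1200, 85));  /* ~220 MHz */
    return 0;
}

By that estimate, 1600x1200 at 75 Hz sits just under a 205 MHz spec and 85 Hz is already over it, which fits the "not guaranteed above that" point.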
 

richleader

Golden Member
Jan 1, 2001
1,201
0
0
Potnoodle, you look pretty biased in one direction, even to those of us who stay out of the rabid arguments you start about petty issues all the time. Nvidia's DXT1 never killed anyone, and the Geforce 3 is still the best consumer-level card out there. Now, if all you do is play outdoor Quake III maps that use tiny alpha blends for the sky and it drives you bonkers - well, get a life. BFG's comments were his own and not some marketing strategy by a corporation: see some of Powervr2's posts if you want that kind of jazz.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81


<< It is still fairly blurry at high resolution however, so anyone who uses their PC for CAD in 1600x1200 or above should give the card a miss. That said, even for demanding users the image quality up to 1280 x 1024 is very good. >>



Well, it looks like it's good enough for me. I like 1152x864 to 1280x1024 resolutions on my 19"; I would not use 1600x1200 even with the best 19" monitor or card - too small for me. I guess it's a personal thing.

 

PotNoodle

Senior member
May 18, 2001
450
0
0
richleader,

"Potnoodle, you look pretty biased in one direction, even to those of us who stay out of the rabid arguments you start about petty issues all the time."

Such as which 'petty' issues were started by me?

And why biased? Because I don't happen to believe the marketing hype that is put on everything, and not just by NVIDIA? I was under the impression that it was generally accepted that PR is evil!

"Nvidia's DXT1 never killed anyone, and the Geforce 3 is still the best consumer-level card out there."

And now I say 'wow' to you! What is this level of defensiveness, where people speak out and level accusations of bias at anyone who says something that even vaguely goes against NVIDIA? Where did I comment on the performance of the GF3? What have I said that you would perceive as disparaging the GF3? I have spoken out about one issue that has thus far been present on all GeForce cards, and have voiced my opinion that it is naïve to give the almost reflexive marketing spin on the DXTC issue.

Like I said in the previous post, this isn't a big issue, but it is a mistake nonetheless - hardware issues occur all the time. Why can't the Radeon do anisotropic and trilinear filtering simultaneously? Why doesn't the KYRO II do AGP 4X when the KYRO I does? Hardware issues, most likely. To me, saying the DXTC bug isn't an issue would be like S3 turning around and saying "Yes, we meant to implement a broken T&L unit in the Savage 2000 as we felt this would result in a far better performing chip". <shrug>

God gave me a brain to think freely, sorry if that offends you; I will, however, continue to use it. Good day.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
PotNoodle:

Wow, you're keen on throwing that around. Are your sensibilities so delicate that anyone who says the slightest thing you perceive as being 'bad' about NVIDIA has to be labelled a troll?

Sorry, I had you confused with PowerVR2. His comments have been quite trollish lately. Still, I have yet to see you praise anyone but PowerVR (the company).

The point is, just because NVIDIA publicly say they 'chose' to implement it in this fashion doesn't mean that it's actually correct

nVidia never admitted to anything. The info was taken from various tech websites. Also keep in mind that S3's boards have EXACTLY the same problem. Is their hardware "defective" as well? I think not.

There would be no speed difference between using two blocks of 16 bits as opposed to one - all the other four compression schemes do this.

There is a speed difference when using fewer bits to compress textures. Fewer bits = less memory bandwidth = better performance.

If NVIDIA were happy that their implementation of DXT1 was 100% to specification, with maximum quality, why provide this hack?

Because the people using the cards were not happy with it (myself included).

Here's a good article about the issue from a website completely unrelated to nVidia. Article.
 

PotNoodle

Senior member
May 18, 2001
450
0
0
"Still, I have yet to see you praise anyone but PowerVR (the company)."

Errrr, from what I've seen of you, it's apparent that exactly the same could be said about you and NVIDIA! <shrug>

However, show me where I have praised PowerVR without warrant. When I talk of PowerVR or PowerVR products I do so as someone who has recent experience of them; if my experiences have been generally positive, which by and large they have been thus far, is it a crime to speak up about such things? Sheesh!

Unlike some, I am not hyping anyone's products to the gills; I am speaking out when I feel I know something. I can't speak about GF3 performance or compatibility because I have no personal experience of it, other than what can be read on websites etc. For anyone to claim that the GF3 has poor performance and useless features would clearly be ludicrous, as the GF3 is undeniably the performance leader in the high-end 3D class right now. But personally, as it stands, I have no interest in forking out that quantity of cash when many of the GF3's real benefits won't be seen (to any great extent) for some time, and I would far rather wait for more competition and the refresh parts, so that it's more likely the DX8 features will be taken advantage of, and that more high-end chipsets (R200, NV25, STG5000 and whatever else there may be later in the year) will promote a little more 'cutthroat' pricing. But I digress...

"nVidia never admitted to anything. The info was taken from various tech websites. Also keep in mind that S3's boards have EXACTLY the same problem. Is their hardware 'defective' as well? I think not."

Well, I have seen quotes from NVIDIA employees saying they followed the spec - sorry, no links.

As for S3, I happen to have a Savage2000 kicking around here still, and I tested it when the whole issue came to light - it did not appear to exhibit the same issues, and this was a chipset released about the same time as the GF256. I have also seen a developer on B3D state that he couldn't notice the same issue on the Savage4 - if/when the B3D boards come back, do a search and you'll prolly find it.

"There is a speed difference when using fewer bits to compress textures. Fewer bits = less memory bandwidth = better performance."

Ummmm - decompression occurs in hardware, so bandwidth is not an issue there; it is, however, an issue externally (from the memory to the chip), and so by providing the hack to go from DXT1 (compression ratio 6:1) to DXT3 (compression ratio 4:1) they are hurting performance far more, as the texture transfer size is larger.

"Because the people using the cards were not happy with it (myself included)."

In other words there is an issue!!!

"Here's a good article about the issue from a website completely unrelated to nVidia. Article."

Heheh - I'd read that website's follow-up article on the S3TC issue.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
Eug, damn that text is big!

I don't know if my eyes are weird or something, but anything less than 1600x1200 on a 21" or bigger makes my head spin; it's just too damn large.

I'd love to run this 19" at 1600x1200, but it can only handle 75 Hz at that res, and that gives me a headache as well.
 

MysticLlama

Golden Member
Sep 19, 2000
1,003
0
0
One thing to note with GF3 2D is that the signal is driven very hard.

I was talking to a tech at Raritan (they make switch boxes), and this is the problem I'm having with mine. I have a Belkin switch box and get a little bit of ghosting, so I bought the Raritan thinking that a higher-quality box would help, but at 1280x960 the GF3 drove that thing crazy, so I'm just using it at work now - and even on a TNT2 I need a filter.

So the issue of the RAMDAC driving the signal really hard on nVidia cards has been around a while, but it is very noticeable on the GF3, and could be a problem for anyone wanting to use a switch box.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
PotNoodle:

Errrr, from what I've seen of you, it's apparent that exactly the same could be said about you and NVIDIA! <shrug>

I don't think you've been here long enough if that's what you think. I've complained about nVidia on a number of issues, and I've praised other companies where credit was due (including PowerVR and the Kyro 2). I do not think the Kyro 2 sucks, but I don't believe all the hype people are giving it.

As for S3, I happen to have a Savage2000 kicking around here still, and I tested it when the whole issue came to light - it did not appear to exhibit the same issues, and this was a chipset released about the same time as the GF256.

That article described the same issue being observed with S3 boards. Anyway, you must have seen something, since Quake 3 was partially responsible: it was compressing lightmaps, which it shouldn't have been doing. The newest versions don't do this, so the bleeding-wall problems were fixed. Apparently ATi doesn't compress 128 x 128 textures, which just happen to be the exact size of the lightmaps.

Ummmm - decompression occurs in hardware, so bandwidth is not an issue there; it is, however, an issue externally (from the memory to the chip), and so by providing the hack to go from DXT1 (compression ratio 6:1) to DXT3 (compression ratio 4:1) they are hurting performance far more, as the texture transfer size is larger

That's exactly what I mean. I wasn't talking about the decompression scheme itself; I was talking about the size of the textures. If you allocate more bits to compress the textures, the size of the final compressed texture is larger, which means it eats more memory bandwidth when it's transferred from the memory to the core.

In other words there is an issue!!

There is an issue, but it wasn't a technical defect in the hardware like some people claim. It was because DXT1 was open to interpretation. By using fewer bits than the others, nVidia were getting higher performance.
 

PotNoodle

Senior member
May 18, 2001
450
0
0
"Anyway, you must have seen something, since Quake 3 was partially responsible: it was compressing lightmaps, which it shouldn't have been doing."

Quake3 takes the easiest method of compression and 'blanket' compresses everything under DXT1. Basically it doesn't really have any compression built in; all that happens is that as the level loads, every texture is compressed in software so that it resides in the 3D card's memory in the compressed format. To facilitate this, short of having a texture-to-compression-scheme map (which would be a pain to maintain), the default scheme of DXT1 is used. UT, and its second disc, is an example of how compression should be achieved, since every texture comes precompressed in the correct scheme for that particular texture - however, this is a pain for developers as the textures have to be shipped in different formats; also, those textures that use the DXT1 routine will still be interpolated at a maximum of 16 bits on GeForce cards, regardless of whether their source art was at a higher level.
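To make the distinction concrete in OpenGL terms - a sketch only, not id's or Epic's actual code; the function names and tokens come from the standard texture-compression and EXT_texture_compression_s3tc extensions - either you hand the driver raw pixels and let it compress at load time, or you upload blocks compressed offline in whichever scheme suits the texture:

/* Sketch only - not the engines' real code. Two ways to get an S3TC texture
   onto the card, using the EXT_texture_compression_s3tc tokens. */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>

/* "Blanket" route: give the driver uncompressed RGBA and ask it to compress
   to DXT1 at load time - the compress-everything-as-DXT1 model described above. */
void upload_driver_compressed(GLsizei w, GLsizei h, const void *rgba)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                 w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgba);
}

/* Precompressed route: ship blocks compressed offline and upload them directly,
   choosing DXT1 or DXT3 per texture - the per-texture model described for UT. */
void upload_precompressed(GLsizei w, GLsizei h, GLenum format,
                          GLsizei imageSize, const void *blocks)
{
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, format, w, h, 0, imageSize, blocks);
}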

"Apparently ATi doesn't compress 128 x 128 textures, which just happen to be the exact size of the lightmaps."

That's another hack.

"That's exactly what I mean. I wasn't talking about the decompression scheme itself; I was talking about the size of the textures. If you allocate more bits to compress the textures, the size of the final compressed texture is larger, which means it eats more memory bandwidth when it's transferred from the memory to the core."

You don't understand how it works. The decompression of DXT1 (and the number of bits it ends up using) has no bearing on the size of the compressed texture; regardless of whether the output is going to use one 16-bit block or two, exactly the same algorithm is used to compress the texture, hence with DXT1 a compression ratio of 6:1 is always in operation.

No extra bits are being allocated to compress the textures; it's just that more bits are being used in the decompression. I.e. in Quake3, as the level loads, the 32-bit textures are compressed in software; the size of the compressed textures under DXT1 will be the same on any hardware using S3TC.
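A tiny decode sketch may help show what is and isn't in play here. The 64-bit block is fixed by the format; precision only enters when the two RGB565 endpoints are expanded and interpolated at decode time. Expanding to 8 bits per channel before interpolating (as below) is the route most decoders were described as taking; interpolating at 16-bit precision and expanding afterwards, which is how the GeForce behaviour was commonly reported, loses gradient steps - but either way the compressed data is identical. This is a generic illustration, not NVIDIA's hardware.

/* Generic DXT1 colour decode sketch (opaque case, c0 > c1). The compressed block
   is always 64 bits; precision only matters in this endpoint expansion and
   interpolation step, which is what the whole argument is about. */
#include <stdint.h>

typedef struct { uint8_t r, g, b; } RGB8;

static RGB8 expand565(uint16_t c)            /* RGB565 -> 8 bits per channel */
{
    RGB8 o;
    o.r = (uint8_t)((((c >> 11) & 0x1F) * 255) / 31);
    o.g = (uint8_t)((((c >>  5) & 0x3F) * 255) / 63);
    o.b = (uint8_t)((( c        & 0x1F) * 255) / 31);
    return o;
}

/* Build the four palette colours for one block: the two endpoints plus two
   interpolants at 1/3 and 2/3. Doing this maths on the expanded 8-bit values
   keeps the gradient steps; doing it on the raw 16-bit values loses them. */
static void dxt1_palette(uint16_t c0, uint16_t c1, RGB8 pal[4])
{
    RGB8 a = expand565(c0), b = expand565(c1);
    pal[0] = a;
    pal[1] = b;
    pal[2].r = (uint8_t)((2 * a.r + b.r) / 3);
    pal[2].g = (uint8_t)((2 * a.g + b.g) / 3);
    pal[2].b = (uint8_t)((2 * a.b + b.b) / 3);
    pal[3].r = (uint8_t)((a.r + 2 * b.r) / 3);
    pal[3].g = (uint8_t)((a.g + 2 * b.g) / 3);
    pal[3].b = (uint8_t)((a.b + 2 * b.b) / 3);
}

Either route reads the same 8 bytes per 4x4 block from memory; only the decoded colours differ.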

"By using fewer bits than the others, nVidia were getting higher performance."

No, they weren't.
 