And still another reason that Nvidia GeForce 2 cards have below-average graphics


RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
"<< Why the F*** would anyone run an NVIDIA DEMO on a non-nvidia card? >>

"So they could see how useless their VSA tech will be with new games? Of course not! Three Defects Zealots® don't like walking around holding their tail between their legs!"

that's funny

I haven't seen too many games based on TreeMark just yet *laff*

The "Three Defects Zealots" was kinda cute. Please 'splain how I can be a 3dfx zealot if I have a GTS in my system now?

perhaps it is NOT I who is the zealot, but you who have your head buried in the sand?

 

DragonFire

Golden Member
Oct 9, 1999
1,042
0
0
I'm not even sure if I can say this in a nice way or not....but I'll try.

I think almost everyone that gave their opinion about S3TC is being stupid.

Since some of you people think you're so good, I'd like to see you idiots build a video card with great features, the FUTURE in mind, and be able to do it all over again within 6 months.

As for ATI and 3DFX fixing their driver issues, I would hope so, since 3DFX had their V4/V5 boards on paper for how many months? As for ATI, ATI who? I don't see ATI pumping out something new for 3D games like 3DFX or Nvidia.

Thinking back, as I recall the Voodoo/Voodoo2 didn't have 2D built in. 3DFX can't be that great if it can't get 2D built in....oh wait! No one needed it at the time and now it's standard.....same with 32-bit color.

For all the FPS freaks, get a life....while getting 500 FPS at 1600x1200 would be great, it's just out of arm's reach right now. Why can't people be happy playing at 1024x768x32 if 1280x1024x32 is too slow? Instead it's blamed on the card, drivers, or something else stupid.

I have a great idea for people like BFG10K: just go buy a stupid V5 6000 with 32 VSA chips (1600 watts..MUAHAHA) and a stupid GeForce2 Ultra 64 megs and run both cards so you get all the FPS you can get for every game. Perhaps someone could write a driver that would use both cards so you used the best features from both.

As for which company is better, they're all stupid! ATI needs to just drop out, and 3DFX and Nvidia should join as one (under Nvidia leadership) to come up with some state-of-the-art graphics. Just like AMD and Intel should work together to make what could be amazing CPUs......but nooo....they're all stupid, they all think they're better than each other, and by fighting each other for our money they are hurting us as a whole more than anything.

I found this whole post so stupid that I just had to say so.

If any lame people want to flame me or do some stupid quote crap, go right ahead. I don't hold your thoughts as high as most people seem to do in here.
 

shk

Banned
May 17, 2000
130
0
0
Please stop this nonsense about even daring to compare a V5 5.5K or 6K with a GF2. Even a good ol' TNT2 Ultra can outperform a V5 easily in 3DMark2K. Now, I don't care much for 3DMark stuff and I don't depend on it, but I get the idea how slow the V5 cards are.. I think.. in my opinion, speed in terms of FPS is EVERYTHING. Give up some picture quality and enjoy the pure natural or supernatural raw speed of no-frame-drop gameplay. I think speed is everything that matters, "period". I would rather have 8-bit color and 16-bit textures running beautifully at 8000 FPS, even tho normal human beings cannot tell the framerate increase difference beyond 60 FPS. hehe.. FPS matters all.....
 

daffy

Junior Member
Jul 3, 2000
10
0
0
All I can say is that I had an original Voodoo 4MB PCI; I now have a TNT2 M64 [crap I know, but video cards are expensive in Australia]. It runs OK for what I use it for. All you people should get out some more, get a life and see beyond your damn monitors.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
Good point by daffy, though if I went out more than I do, my liver would probably cease to function.
 

BlvdKing

Golden Member
Jun 7, 2000
1,173
0
0
RoboTECH,
I am glad to see that someone else with a GeForce card recognizes the texture compression problem. Texture compression is even more important for me since I have an MX; I lost 20 fps in timedemo 1 (down from 75) since I have only 32MB of SDR memory. It was well worth the hit, though. I couldn't stand seeing the sky and textures on walls, floors, ceilings, in the water - EVERYWHERE - look worse than on my wife's Voodoo 3 card. The more people that recognize the problem, the greater the chance NVidia will actually do something about it. Come on people - the texture compression problem is obvious and the drivers that NVidia releases are buggy! NVidia is not perfect; until people start recognizing the problems that their cards have (instead of blindly worshiping the company), NVidia will never have an incentive to fix the problems.
 

BlvdKing

Golden Member
Jun 7, 2000
1,173
0
0
BTW, Dragonfire is a complete idiot, I don't even know where to start ripping into his stupid post.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
shk:

"Please stop this nonsense about even daring to compare a V5 5.5K or 6K with a GF2. Even a good ol' TNT2 Ultra can outperform a V5 easily in 3DMark2K."

If that were true, it would only show what a complete waste of time 3DMark2K is. As it is, you're wrong, completely wrong. Nice uninformed post tho. Thanks for proving how dense you are. Have you used a 5500? Didn't think so. Go away.

"I get the idea how slow the V5 cards are"

they're far from slow. Not even close. I've used it and compared it directly to a 64MB GTS. Now go away.

"I think.. in my opinion, speed in terms of FPS is EVERYTHING."

so save your $$$, buy a used SDR, and run @ 512x384xUgly. Now go away.

BlvdKing:

"I am glad to see that someone else with a GeForce card recognizes the texture compression problem."

yeah. Great card, but it's not perfect, no matter what the nvidiots wanna say. I can't figure out how some of these buffoons *DARE* mention anisotropic/trilinear filtering and 32-bit quality, yet totally ignore the most OBVIOUS graphical aberration among today's 3D accelerators...the pisspoor texture compression quality.

"the texture compression problem is obvious and the drivers that NVidia releases are buggy! NVidia is not perfect; until people start recognizing the problems that their cards have (instead of blindly worshiping the company), NVidia will never have an incentive to fix the problems."

exactly! But people are too busy worshipping at the Altar of the GeForce. *shakes head* Density, pure density....

"BTW, Dragonfire is a complete idiot, I don't even know where to start ripping into his stupid post."

LOL, no kidding. I was going to do the same, but gave up since half of it made no sense anyway. At least shk's BS was readable (tho ridiculous).

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
RoboTECH-

"oh for God's sakes, it's the same damn thing, quit with your semantics argument. The Q3 engine makes full use of it. Straight up, the GeForce sucks for it. JUST ADMIT IT!! It's okay! I admitted it! I sold my 32MB GTS so I could get a 64MB one BECAUSE TC on the GTS sucks (that and the more overclockable RAM)."

S3TC under Quake3 sucks, who is arguing that point? Everyone playing UT under OpenGL running Linux seems to think it rocks.

"There is NO reason for nvidia to allow this to happen. WTF? They release 900 goddamn driver revisions a month (Derek Perez sez: "we don't leak drivers"...ha!), why the hell can't they fix this?"

This seems to be where you are having problems: what the issue is, that is what we are trying to look at. Have you ever thought of the fact that it could be a problem with S3TC and not nV's particular implementation? S3's boards have the same problem; if it is native to the compression method, then they need a rather drastic reworking of compression and possibly an entirely new method.

"WTF? 40 fps in Quaver with TC disabled on a DDR? nice. 6.18 I assume? I have a 64MB GTS and I get ~60. The 32MB board just plain died (tho it was using the 5.32). Yeah, don't forget to mention those nice teenage and low-20s framerates you get every time you enter the RL area. Ugh."

Have you tried it with the 6.18s? It doesn't have nearly the problem the older drivers did with fluctuating FPS.

"oh for god's sakes man. WTF does this have to do with the price of tea in china? Are you on dope? Lordie...this has NOTHING to do with 3d-hardware accelerated computer games (and yes, I read your long-winded intro)
<rolls eyes>"


The original statement as it was worded was wrong. I'm sorry if you don't like the truth, not much that can be done about that. You don't think the invention of 3D graphics has anything to do with 3D games? Odd take.

"the i740 was, is, and always will be a complete, utter, absolute piece of poop. Really now Benji-boy, remove your head from your hindquarters. It smells much better out here."

The i740 launched before the TNT, before the Banshee, before the G200. At the time it was by a decent margin the fastest 2D/3D accelerator out, and it also shipped with video capture, video out, etc. When the StarFighter launched it was by far the fastest 2D/3D accelerator out. It wasn't comparable to a Voodoo2, but it certainly had better 2D image quality.

"It's not Q3, or both 3dfx and ATi would have problems."

3dfx uses FXTC, not S3TC. People are still looking into what exactly ATi is doing. By figuring out why ATi doesn't have a problem, we could learn how to fix what is wrong with the nV boards. There is a reason why we look at things like this: the extremely poor results of S3TC under Quake3 using a GF-based board speak for themselves, and whining about it doesn't do much if we don't have any idea what is wrong and how it can be fixed.

"Don't try to pass the buck. My graphics company needs to get on the ball and fix this poop. I've seen a gazillion "leaked" driver sets. Not one has even bothered to address this in any way, shape or form. In fact, most of them have been miniGLs for Quake3 performance."

There is a big difference between us it seems. I want to know what the problem is; you want a reason to bash nVidia. S3's boards have the same problem, so perhaps it is S3TC itself that we should be looking at. Then again, UT shows no problems using the GF running S3TC, which indicates something wrong with on-the-fly compression; if you cared to look into things you would see that there is very little that can be modified and still be within spec.

"I don't care about the *compressed* numbers, because they mean nothing to me. I find it ironic that so many GTS owners like to pimp the "drastic image quality improvement" of 32-bit over 16-bit, or trilinear/anisotropic over bilinear, yet many ignore how butt-ass ugly the TC is. Humorous...."

Then turn it off. I have stated in many threads numerous times to disable texture compression and take the performance hit if you want better visual quality. That is how I play as I have already mentioned. I do care about visual quality, and gladly drop my resolution to utilize the highest quality overall images. Most people are not going to care, those that do can disable it.

If there is a driver workaround for problems with S3TC, then please tell us why S3 hasn't issued a fix yet. They have had years with it and still they can't fix the same problems nVidia is having with the standard that S3 invented. The GF/GF2 boards render compressed textures in an ugly manner in Quake3 - not one person is trying to argue that - so why not try to figure out what the exact issue is so we can correct it, instead of trying to bash another company? Perhaps nVidia's best bet would be to license FXTC from 3dfx to solve the problem; clearly it is better, at the very least, for Q3-engined games, which are increasing in number.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Ben:

"S3TC under Quake3 sucks, who is arguing that point? Everyone playing UT under OpenGL running Linux seems to think it rocks."

agreed, it looks darn good. But again, why can't my GTS look *good* in Q3 with TC, since everyone else's card looks good with TC?

"Have you ever thought of the fact that it could be a problem with S3TC and not nV's particular implementation?"

I couldn't care less.

FACT: TC in q3 with my GTS looks like bloody hell. Fix it. Simple as that.

If we all just say "aw shucks, it's S3's fault, nvidia doesn't need to fix it", then THEY WON'T!!!

If enough people bitch, maybe they'll do something about it. Is that such a bad thing?

"S3's boards have the same problem,"

hey, if you own an S3 board, you DESERVE to feel pain.

"if it is native to the compression method, then they need a rather drastic reworking of compression and possibly an entirely new method."

Looks like ATi and 3Dfx have already done that. If nvidia is so damn revolutionary, why can't they fix this issue?

<snip Quaver benchmarks>

"Have you tried it with the 6.18s? It doesn't have nearly the problem the older drivers did with fluctuating FPS."

yeah bro, but the 6.18's have been screwing with my system. They just don't stay stable long enough for me to be bothered. Blah.


' "oh for god's sakes man. WTF does this have to do with the price of tea in china? Are you on dope? Lordie...this has NOTHING to do with 3d-hardware accelerated computer games (and yes, I read your long-winded intro)
<rolls eyes>" '

"The original statement as it was worded was wrong. I'm sorry if you don't like the truth, not much that can be done about that. You don't think the invention of 3D graphics has anything to do with 3D games? Odd take."

Christ dude, if we're going to go back that far, let's talk about the first transistors, eh?

The "truth" is that 3Dfx was the first company to bring fast, high quality 3d-hardware accelerated graphics to personal computer gaming.

"The i740 ...<snip>...It wasn't comparable to a Voodoo2, but it certainly had better 2D image quality"

har har!! Yeah, the V2's 2d image quality was really.....nonexistent?

"3dfx uses FXTC, not S3TC. People are still looking into what exactly ATi is doing. By figuring out why ATi doesn't have a problem, we could learn how to fix what is wrong with the nV boards. There is a reason why we look at things like this: the extremely poor results of S3TC under Quake3 using a GF-based board speak for themselves, and whining about it doesn't do much if we don't have any idea what is wrong and how it can be fixed."

I'm not whining, I'm bitching. There's a difference. "Whining" is when you moan and complain about something for no apparent reason, with no end result other than to release your feelings. "Bitching" is when you moan and complain about something to bring about some logical result - in this case, I wanna make sure that nvidia is aware there is an issue, and that this nvidia customer is UNHAPPY about it.


"There is a big difference between us it seems. I want to know what the problem is; you want a reason to bash nVidia."

if they fix the TC problem, then guess what? I won't have a reason to bash them, will I?

I honestly don't care HOW or WHY there is a problem, only that they FIX it. That's their job. That's why I pay $### for their cards. To have the best piece of equipment possible. It won't be perfect, but it should only have *minor* flaws, especially in comparison to their competitors.

And while they're at it, tell them to fix the damn ugly 2d quality at hi-res in windows. yuck....

' "I don't care about the *compressed* numbers, because they mean nothing to me. I find it ironic that so many GTS owners like to pimp the "drastic image quality improvement" of 32-bit over 16-bit, or trilinear/anisotropic over bilinear, yet many ignore how butt-ass ugly the TC is. Humorous...." '

"Then turn it off."

I did. I'd rather see websites use the "real" numbers without TC tho. 16-bit w/TC off, even on the GTS, looks better than its 32-bit w/TC on.


"If there is a driver workaround for problems with S3TC, then please tell us why S3 hasn't issued a fix yet."

I didn't buy an S3. I bought an nvidia board. They need to fix it. ATi and 3Dfx both "fixed" it (by developing their own/modifying the S3TC). nvidia can release 900 driver leaks per month, why hasn't ANY of them addressed it?

"They have had years with it and still they can't fix the same problems nVidia is having with the standard that S3 invented."

so allow nvidia to live up to their reputation as "innovators". Develop/modify the TC to MAKE IT WORK.

"Perhaps nVidia's best bet would be to license FXTC from 3dfx to solve the problem; clearly it is better, at the very least, for Q3-engined games, which are increasing in number."

that is an excellent point. However, I'm pretty certain that each and every nvidia employee would rather eat cow mucus than license a technology from 3dfx. Agreed?

 

KarsinTheHutt

Golden Member
Jun 28, 2000
1,687
0
0
Who the hell cares anymore? Buy a goddamn GeForce2 GTS if you need those raw frames. Buy a Matrox if you need image quality. And get a Radeon or Voodoo if you need something in between.
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
Robo is a troll. Look at what he writes. NVidia's S3TC QUALITY IS JUST AS GOOD AS ANYONE ELSE'S IF THE GAME USES PRECOMPRESSED TEXTURES (MORE GAMES DO THAN DON'T). THEY ADOPTED S3'S TC BECAUSE IT WAS THE STANDARD AND MOST WIDELY USED. If you get a Voodoo 5 you have to live with fake trilinear filtering and blurry FSAA. If you want the Radeon you gotta deal with crappy drivers. NVidia is still on top. This isn't nvidia's fault, moron. They adopted S3TC perfectly. It's just a flaw of S3TC's on-the-fly compression. Get over it you dumbass troll.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
more wondrous didactics from the Doomguy:

"Robo is a troll."

No, I just don't bury my head in the sand like you do. I think my GTS kicks ass, but it isn't perfect.


"NVidia's S3TC QUALITY IS JUST AS GOOD AS ANYONE ELSE'S"

actually, it's a known fact that it isn't as good as FXT1. That's evident from compression ratios and resultant video quality.

"IF THE GAME USES PRECOMPRESSED TEXTURES (MORE GAMES DO THAN DON'T)."

name "them" (aside from UT in Linux)

"THEY ADOPTED S3'S TC BECAUSE IT WAS THE STANDARD AND MOST WIDELY USED."

and it obviously has issues, as EVERYONE with a GTS or GeForce will admit (except you, of course)

"If you get a Voodoo 5 you have to live with fake trilinear filtering and blurry FSAA."

really? and you know this....how? I must've gotten lucky. The V5 that I had a few weeks back had an LOD bias slider that eliminated the blurriness in FSAA.

Now you wanna talk blurry FSAA, let's talk my GTS here!

"This isn't nvidia's fault, moron. They adopted S3TC perfectly."

yes, let's make more excuses for nvidia. They see there is a problem. Regardless of whether they "adopted S3TC perfectly", S3TC is obviously pretty flawed. They should improve upon it, since it has a pretty dramatic impact on how games that use it look.

"It's just a flaw of S3TC's on-the-fly compression."

so? ATi and 3dfx saw fit to improve upon S3TC, why can't nvidia?

"Get over it you dumbass troll."

Uh, no, I won't get over it. I paid $$$ for the card, I want to see them fix something that is broken.

As it is, I just turn it off.

BTW, a "troll" in newsgroup/internet forum discussions is one who merely tries to incense the denizens of the newsgroup by posting purposely infuriating subject matter. I'm not doing that. I'm encouraging and participating in intelligent discourse with several of the members here.

What is it about what I say that aggravates you so much? That I'm not afraid to tell the truth about the hardware that I own? That I don't worship at the Altar of the GeForce? Clue in, kid. It's a video card, not a frickin' religion.

Am I aggravating your pubescent hormones there Doomguy? Don't worry. Mommy's making macaroni and cheese and chicken nuggets for dinner tonight. YIPPEE!!!!

Of course, I'm the jackass for trying to have a rational discussion with a hairy-palmed kid half my age. Oy vey.....

 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
Guess you don't realize puberty happens around 13; guess that's because you never went through it. Score one for me! I believe SOF may use precompressed textures. I wish Ben would get over here to help me!
 

KarsinTheHutt

Golden Member
Jun 28, 2000
1,687
0
0
"Guess you don't realize puberty happens around 13; guess that's because you never went through it. Score one for me!"

How original, Doomguy. Is Ben going to wave his lightsaber at me now?

shk, I thoroughly agree with you. Robotech should just give it a rest. It's only a video card.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
RoboTECH-

"agreed, it looks darn good. But again, why can't my GTS look *good* in Q3 with TC, since everyone else's card looks good with TC?"

This could be explained by business choices. Without the cross lawsuits and resultant cross licensing by nVidia and S3, we probably never would have seen S3TC enabled on any of the GF-based boards. Because of the license issues, 3dfx developed their own method of texture compression, and while I'm not yet sure what ATi has done, it appears that they at least are using a modified version of S3TC if they didn't develop their own compression method. It doesn't seem that nVidia ever had any plans on supporting S3TC under OpenGL, but since the hardware was already there (same as DXTC), and they obtained the cross licensing agreement, they enabled it.

Why the long-winded answer? It is possible that they simply can't "fix it". 3dfx developed a new solution from the ground up, and ATi also made design choices that enabled a different type, or at least a modified version, of S3TC. nVidia is using the hardware designed to handle DXTC for S3TC, and it doesn't handle on-the-fly compression well.

"If we all just say "aw shucks, it's S3's fault, nvidia doesn't need to fix it", then THEY WON'T!!!

If enough people bitch, maybe they'll do something about it. Is that such a bad thing?"


But can they? That is what I'm interested in, and why I find it more important to look at what is causing it. If they can fix it then I'll gladly start sending emails on it and start up threads griping about it; if they can't due to hardware design, then it won't do any good.

"Looks like ATi and 3Dfx have already done that. If nvidia is so damn revolutionary, why can't they fix this issue?"

3dfx pretty much had two choices: license S3TC or develop their own method of texture compression. They chose the latter and pulled it off with style, besting what was available in terms of visual quality. Without AGP texturing, having their boards in a situation where they are forced to swap from memory would be horrendous for performance, far worse than for anyone else. I'm not sure what ATi did, but like 3dfx they also deserve a big pat on the back for going above and beyond in improving on the standards that were available.

Looking at the time frame when the GF was launched, many people were saying quite loudly that 32MB was plenty of RAM and that more wouldn't be needed (ignoring TC). That was before Q3 final shipped, and the design was finalized before the Test version was out. Where should the R&D dollars go? In retrospect I think it is safe to say that more time devoted to R&D would have been well worth it when looking at TC, but at the time most people would have said it was a waste.

"yeah bro, but the 6.18's have been screwing with my system. They just don't stay stable long enough for me to be bothered. Blah."

Hmm, haven't had any problems myself, what kind of issues are you having?

"Christ dude, if we're going to go back that far, let's talk about the first transistors, eh?

The "truth" is that 3Dfx was the first company to bring fast, high quality 3d-hardware accelerated graphics to personal computer gaming."


If I recall properly, the first 3D hardware ran on vacuum tubes. Yes, 3dfx was a pioneer in PC 3D gaming, not arguing that. They weren't the first, but they were the first with a viable, worth-the-money option.

"if they fix the TC problem, then guess what? I won't have a reason to bash them, will I?"

If the problem is hardware-related due to proper support of the S3TC standard, then it wasn't nVidia's "fault". It is easy to look back and say coulda, shoulda, woulda, but who knew at the time? If nVidia can fix it then I'll join in your b!tching quite quickly though; I have a 32MB card, I need it more than you do.

"I did. I'd rather see websites use the "real" numbers without TC tho. 16-bit w/TC off, even on the GTS, looks better than its 32-bit w/TC on."

Agreed, but I think you and I both know that won't stop most people from ignoring the slower numbers and only paying attention to the fastest. When the 6.xx series drivers launched, that is one of the first things I looked for. Comparing the non-compressed numbers for the 6.xx to the compressed numbers from the 5.xx, they are fairly close.

"I didn't buy an S3. I bought an nvidia board. They need to fix it. ATi and 3Dfx both "fixed" it (by developing their own/modifying the S3TC). nvidia can release 900 driver leaks per month, why hasn't ANY of them addressed it?"

They may not be able to.

"that is an excellent point. However, I'm pretty certain that each and every nvidia employee would rather eat cow mucus than license a technology from 3dfx. Agreed?"

Agreed, though I'm not sure if there is another way, or for that matter if they could implement it in their hardware. I'm not sure if the design is close enough to utilize the current nV hardware by simply releasing a driver revision.

BTW- I don't think you talk too much, I rather enjoy discussing things point by point in long-winded threads.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
1. S3TC has six different compression algorithms, from which the compressor is supposed to choose the most suitable one for the task at hand. Thus compressing at optimal quality is very computationally intensive, but if done properly, S3TC quality is very good. A high-quality non-realtime compressor is shipped with the DirectX SDK; try it and see how long compressing a 512x512x32bit image with color gradients (the Quake III sky is like this) takes.
2. The GeForce shows precompressed textures correctly. UT under Linux is proof enough. Furthermore, I have seen several tech demos with precompressed DXTC, and compressed texture quality was indistinguishable from non-compressed.
3. DXTC and OpenGL/S3TC are technically 100% identical; only the name and implementation API differ.

If all that I've claimed here is true, the faulty part of the GeForce + Quake III combination causing bad image quality is the 32-bit RAW -> S3TC texture compressor. Nvidia should be able to fix this in a future driver release if it chooses to.

BTW, am I the only one to think this, but wouldn't it be neat if next-gen hardware supported JPEG textures? Imagine the compression ratios!
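
To make point 1 concrete, here is a minimal Python sketch of what a single DXT1/S3TC block stores and how it decodes (the function names and the sample block are my own illustration, not from any SDK). Each 4x4 block holds just two 16-bit endpoint colors plus a 2-bit index per texel, so a block can never show more than four distinct colors, which is exactly why smooth gradients like the Quake III sky band, and why the compressor's choice of endpoints matters so much:

import struct

def rgb565_to_rgb888(c):
    # Expand a packed 16-bit 5:6:5 color to 8 bits per channel,
    # replicating the high bits to fill the range.
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    # One DXT1 block is 8 bytes: two RGB565 endpoints, then 16 2-bit indices.
    c0, c1, indices = struct.unpack('<HHI', block)
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:
        # Four-color mode: the two extra palette entries are interpolated.
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:
        # Three-color mode: midpoint plus transparent black.
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    # Each of the 16 texels picks one of (at most) four palette entries.
    return [palette[(indices >> (2 * i)) & 0x3] for i in range(16)]

# A made-up block spanning a subtle sky gradient collapses to 4 colors:
print(decode_dxt1_block(struct.pack('<HHI', 0x5ACB, 0x5AAA, 0xE4E4E4E4)))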
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Doomguy:

"Guess you don't realize puberty happens around 13; guess that's because you never went through it. Score one for me!"

uh, yeah bud. You "scored" *rolls eyes and chuckles*

shk: "Robotech, you talk too much."

yes, way too much. I need some FXT1 for my posts, eh? Sorry.


Ben:

"This could be explained by business choices ....<snip>....and they obtained the cross licensing agreement, they enabled it.... it is possible that they simply can't "fix it"."

good point. It just seems odd that they could whip up FSAA in their drivers in like a month, yet haven't been able to figure this issue out for almost a year now. *shrugs shoulders*

""yeah bro, but the 6.18's have been screwing with my system. They just don't stay stable long enough for me to be bothered. Blah."

Hmm, haven't had any problems myself, what kind of issues are you having?"

aw man, weird stuff. Q3 blinks off and the system freezes, requiring I remove the frickin' plug. UT *in-game* is pitch black, despite my best efforts to up the brightness (the "windows menu" is just fine).
If I try to change the resolution without ctrl-alt-del'ing EVERYTHING except systray and explorer, I get "this program has performed an illegal operation", and until I reboot, I can't do ANYTHING in 3D, not even Fishmark! 3dMark was freezing during the adventure/action...icons would get all goofy-colored after closing some windows...problems shutting down Windows to restart (requires I hit the "restart" button on the 'puter), just a ton o' garbage. ugh...I think you might be able to add the MSI BX Master mobo to the list of boards the GTS doesn't like (along with the VIA and AMD ones).

"If nVidia can fix [the texture compression problem] then I'll join in your b!tching quite quickly though; I have a 32MB card, I need it more than you do."

D'oh!!! Yep, I had the 32MB, and ditched it right quick cuz of that.

"BTW- I don't think you talk too much, I rather enjoy discussing things point by point in long-winded threads."

I'm with ya man. I have no problems with peeps disagreeing with me, but at least make a decent point. Funny how I'm engaged in this silly little war with ole' Doomboy, and he's clamoring for you to come to his aid, yet here we're engaged in nice, rational discourse (DIS-, not INTER-, you perv!!! )

but you're wrong. I *DO* talk too much. WAAAAY too much.

jpprod:

"BTW, am I the only one to think this, but wouldn't it be neat if next-gen hardware supported JPEG textures? Imagine the compression ratios!"

hey, damn good idea man. What's the compression ratio of a JPEG, anyway? It's pretty crazy, isn't it?
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
The JPEG compression ratio at very good image quality for 32-bit RAW data is something like 1:16. Even 1:20 and higher are feasible if the image isn't a high-contrast one. I believe the problem with JPEG as a texture format is the immense amount of computation required by the DCT algorithm used to decode the image. But perhaps in the future there will be graphics chips advanced enough to do this at realtime speeds.
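
For a rough sense of scale, here's a back-of-the-envelope Python sketch using the ~1:16 JPEG figure above; DXT1's fixed 8 bytes per 4x4 block works out to 1:8 against 32-bit RAW (the function name is mine, for illustration):

def raw_bytes(width, height, bytes_per_texel=4):
    # Uncompressed footprint of a 32-bit texture.
    return width * height * bytes_per_texel

raw = raw_bytes(512, 512)   # 1 MiB for a 512x512x32bit texture
s3tc = raw // 8             # DXT1: 8 bytes per 4x4 block, fixed 1:8 ratio
jpeg = raw // 16            # the ~1:16 JPEG figure at very good quality

print(f"RAW:  {raw // 1024} KiB")
print(f"S3TC: {s3tc // 1024} KiB (fixed-rate: any block is directly addressable)")
print(f"JPEG: {jpeg // 1024} KiB (variable-rate: whole image decoded first)")

The fixed rate is the key design difference: hardware can fetch any 4x4 block of an S3TC texture directly in texture memory, while a variable-rate JPEG stream would have to be decoded in full before any texel could be sampled, which fits the decode-cost point above.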
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Jukka-



<< If all that I've claimed here is true, the faulty part of the GeForce + Quake III combination causing bad image quality is the 32-bit RAW -> S3TC texture compressor. Nvidia should be able to fix this in a future driver release if it chooses to. >>



Would they be able to do it and keep load times down? I'm curious; I'm sure that if they precompressed everything while loading every level it is possible, but what kind of time are we talking about to compress ~30MB of textures using the high-quality compressor? You must have a rough idea, I hope.



<< BTW, am I the only one to think this, but wouldn't it be neat if next-gen hardware supported JPEG textures? Imagine the compression ratios! >>



Why stop there, let's go to entirely calculated textures a la RenderMan. At some point I imagine that we will be there, but that is probably even further off than being able to use JPEGs. Have you attempted using texture compression along with any per-pixel or bump mapping effects? I am very curious as to what types of artifacts we may get with that combo, particularly with both starting to catch a bit of support. I would assume that precompressed textures would be a requirement in such a situation.


RoboTECH-



<< good point. It just seems odd that they could whip up FSAA in their drivers in like a month, yet haven't been able to figure this issue out for almost a year now. *shrugs shoulders* >>



If you look at some older nV products, they have listed FSAA support for years. The TNT1 had it as a listed feature; it was a matter of enabling a fully working implementation in the drivers. May sound odd, but their hardware has been built to support FSAA for longer than it has been for any texture compression.



<< aw man, weird stuff. Q3 blinks off and the system freezes, requiring I remove the frickin' plug. UT *in-game* is pitch black, despite my best efforts to up the brightness (the "windows menu" is just fine).
If I try to change the resolution without ctrl-alt-del'ing EVERYTHING except systray and explorer, I get "this program has performed an illegal operation", and until I reboot, I can't do ANYTHING in 3D, not even Fishmark! 3dMark was freezing during the adventure/action...icons would get all goofy-colored after closing some windows...problems shutting down Windows to restart (requires I hit the "restart" button on the 'puter), just a ton o' garbage. ugh...I think you might be able to add the MSI BX Master mobo to the list of boards the GTS doesn't like (along with the VIA and AMD ones).
>>



Are you getting rundll errors along with the other problems you listed? Sounds like ACPI/power savings issues, which the 5.xx drivers were known to have problems with. I'm running a K7M, which is one of the more "problematic" mobos when using GFs, but the 6.xx has been very solid; perhaps the "fixes" that they have built into the 6.xx "broke" some of the code from the 5.xx.

Try disabling all power saving features and see if you are still having problems. The performance boost of the 6.xxs is well worth giving it a shot, I would say. If it is an ACPI problem then I would expect them to have it resolved in the next revision.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0


<< Would they be able to do it and keep load times down? I'm curious; I'm sure that if they precompressed everything while loading every level it is possible, but what kind of time are we talking about to compress ~30MB of textures using the high-quality compressor? >>


That's the tradeoff, I'm afraid. Compressing a 512x512x32bit texture with S3TC at high quality can easily take more than a second, so level load times would easily go through the roof. For 32 megabytes and one second per texture, the compression would take 32 seconds - add that to the current Q3 level load time and you're in for quite an irritating wait.

It would be optimal if the video card driver could "cache" the compressed textures somewhere so they could be re-used, but AFAIK that would require application support.
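
That caching idea might look something like the Python sketch below - purely hypothetical (the cache directory, the digest keying, and the compress callback are all my own assumptions; no real driver or API exposes this):

import hashlib, os

CACHE_DIR = "tc_cache"  # hypothetical on-disk cache location

def get_compressed(raw_texels, compress):
    # Key the cache on a digest of the raw texel data, so the slow
    # high-quality compressor runs once per unique texture rather
    # than on every level load.
    key = hashlib.md5(raw_texels).hexdigest()
    path = os.path.join(CACHE_DIR, key + ".dxt")
    if os.path.exists(path):
        with open(path, "rb") as f:
            return f.read()          # cache hit: no compression cost
    data = compress(raw_texels)      # cache miss: pay the slow compression once
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return data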


<< Why stop there, let's go to entirely calculated textures a la RenderMan. At some point I imagine that we will be there, but that is probably even further off than being able to use JPEGs. >>


Indeed. The procedural water textures of the Unreal (especially Deus Ex) and Serious Sam engines are an example of how good mathematically formed patterns can look. And these are being transformed into traditional textures; with hardware support for procedural surfaces, speed and quality would be quite different.


<< Have you attempted using texture compression along with any per-pixel or bump mapping effects? I am very curious as to what types of artifacts we may get with that combo, particularly with both starting to catch a bit of support. I would assume that precompressed textures would be a requirement in such a situation. >>


I haven't played with DXTC, and my experience in the field of realtime 3D graphics programming is lacking, to say the least.

You should ask LeoV about this; he's quite a bit better a programmer than I am and has actual first-hand experience with GeForce/DXTC in conjunction with Dot3 bump mapping. S3TC on a bump texture will certainly introduce artifacts, but I believe the increased bump texture detail, which can be brought in if texture compression is used, makes up for them. Just a guess, but I'd say the artifacts would look considerably worse on Dot3 specular highlights than on solely diffuse-lighted surfaces.
 