What is Nvidia's problem?


crsgardner

Senior member
Apr 23, 2004
305
0
0
Personally, I'm not a fan of AA or AF. I'm running at 1600x1200: why should I care? (And for those of you saying "Well, I wish I could run at 1600x1200", you can. If you're spending the money to get decent AA and AF, you've likely got a card that's more than capable of running at 1600x1200).
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Gamingphreek
What are you talking about?!
Just for the record, I have no degrees (yeah, I'm still in high school).

I'm not holding a lack of a degree (or even a high school diploma) against you here, but seriously, you seem to have little grasp of circuit and/or microchip design.

Anyway, I'm not calling them stupid, but why don't they fix this?! This has been a problem since AA and AF came out. Nvidia has always had trouble implementing these two as effectively as ATI. If all they have to do is shrink the die and add some stuff, what is possibly so hard? Granted, this is out of my league, as I don't know what is involved in die shrinks and such (getting into Highly Technical forum territory there).
I know I'm asking for a more or less perfect video card, but what is so hard about all the stuff listed?

-Kevin

Becuase "shrink<ing> the die <and> add<ing> some stuff" is almost as hard as designing a whole new GPU. In particular, process shrinks are a HUGE effort -- even Intel and AMD, the two biggest CPU producers in the world, are having problems with 90nm CPUs that have less than *half* the transistors of the NV40 GPU. ATI is apparently shooting for 110nm on the R500 (that's what the X300 uses now, much like the 9600 was built on 130nm in the last generation). Given the problems that Intel and AMD seem to be encountering at 90nm, this is probably all they can do for now.

The problems with NVIDIA's AF implementation are not an 'error' -- they're a limitation of the way they designed the NV20, NV30, and NV40 GPUs (basically, AA and AF steal processing time from 'regular' rendering). If they could just magically flip some switch and make it run twice as fast, they would. Everything in processor design is a tradeoff -- they might be able to make a new GPU that had better AA/AF performance (and didn't sacrifice anything else), but it would likely be bigger, hotter, and significantly more expensive to make.
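For a sense of where the AF cost comes from, count texel fetches: plain trilinear filtering blends two mipmap levels of four bilinear taps each, and NxAF takes up to N trilinear-filtered probes along the stretched axis of the pixel's footprint. A simplified worst-case count (real hardware clamps and skips samples, which is exactly where the vendors' "optimizations" live):

```python
# Upper-bound texel fetches per screen pixel under a simplified model.
# Real GPUs share and skip fetches, so actual costs are lower.

BILINEAR_TAPS = 4                    # 2x2 texel footprint
TRILINEAR_TAPS = 2 * BILINEAR_TAPS   # two mip levels, blended

def af_taps(max_degree: int) -> int:
    """Worst-case texel fetches for max_degree-x anisotropic filtering."""
    return max_degree * TRILINEAR_TAPS

for degree in (1, 2, 4, 8, 16):
    print(f"{degree:2d}x AF: up to {af_taps(degree):3d} texel fetches/pixel")
```

A texture unit that can only take a few samples per clock has to burn extra clocks on those fetches -- clocks that would otherwise go to "regular" rendering.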
 

edmundoab

Diamond Member
Apr 21, 2003
3,223
0
0
www.facebook.com
Let's just say that if Nvidia had every advantage over ATI, they would have eaten ATI and be the sole producer of video cards by today, like how they bought out 3dfx.
 
Apr 14, 2004
1,599
0
0
Originally posted by: crsgardner
Personally, I'm not a fan of AA or AF. I'm running at 1600x1200: why should I care? (And for those of you saying "Well, I wish I could run at 1600x1200", you can. If you're spending the money to get decent AA and AF, you've likely got a card that's more than capable of running at 1600x1200).
This is untrue for ATI cards. You can enable 16xAF with a ~10% performance hit.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Thanks Matthias,
Sorry, I didn't mean for my lack of a degree to come off that way. I just don't know much about microprocessor design.

The second part is extremely well written, nice job!!

What, then, did ATI trade off so that this wouldn't happen?

Also, what kind of problems are Intel and AMD having? Is it just producing enough processors, or is there something else?

-Kevin
 

eastvillager

Senior member
Mar 27, 2003
519
0
0
...if you can get true trilinear out of ati, the performance difference between nvidia and ati isn't that much.

;-)


This seems like a really bad time to be complaining about nvidia, since they seem to be providing the same performance and more 'features' for less or equal money. Maybe I'm missing something, lol.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: eastvillager
...if you can get true trilinear out of ati, the performance difference between nvidia and ati isn't that much.

;-)


This seems like a really bad time to be complaining about nvidia, since they seem to be providing the same performance and more 'features' for less or equal money. Maybe I'm missing something, lol.

ati's cards 'do' full trilinear - however afaik it can only be forced on the first texture stage.. there are always "trade-offs" in design - which is why r420 supports sm2.0b and not sm3.0....

at any rate there's little to complain regarding the nv40, or r420 for that matter, other than perhaps their availability (or lack thereof).
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: CaiNaM
at any rate there's little to complain regarding the nv40, or r420 for that matter, other than perhaps their availability (or lack thereof).

You can probably complain a little about lack of feature support
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: rbV5
at any rate there's little to complain regarding the nv40, or r420 for that matter, other than perhaps their availability (or lack thereof).

You can probably complain a little about lack of feature support

why?

yea, i suppose some people could gripe about the lack of 3dc and temporal aa....

i have 2 x800s and haven't seen a game where i'm missing a feature..... 'course i suppose if i didn't want antialiasing, far cry will allow mdr (sorry guys, technically it's not hdr) lighting options sometime in the future... tho with all the big deal made over jaggies the last few years, it's hard to imagine that going without aa is really an option people will appreciate...
 

Illissius

Senior member
May 8, 2004
246
0
0
Here's an interesting review. In some of the benchmarks, the performance hit from AF was completely minimal, while in others (Far Cry and UT2004, for example), it nearly halved the speed. There was nothing in between. And it's odd that it shows up in two such different games, because afaik UT2004 uses huge textures, while Far Cry goes all out on shader effects...
Those are some seriously disturbing numbers. I'd like to know what's causing it myself. (I assume there's a thread about this at B3D, but B3D makes my head hurt on more than one level, so I'm not going to go looking for it.)
 

ponyo

Lifer
Feb 14, 2002
19,688
2,810
126
Originally posted by: rbV5
at any rate there's little to complain regarding the nv40, or r420 for that matter, other than perhaps their availability (or lack thereof).

You can probably complain a little about lack of feature support


Yeah, I'm still waiting on Nvidia to release official drivers for the 6800 series. Not to mention driver support for the video encoder chip. The built-in encoding chip was the major reason I went with Nvidia this round, along with superior multi-monitor support. It also helped that I was able to get in on the $299 BFG 6800 GT deal.

But seriously, the performance of these cards is so good and so similar that feature set and price were more important to me than raw FPS speed. But I haven't heard or seen anything about the video encoder since the launch press release. I just want some kind of demonstration that it works! :frown:
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Gamingphreek
But the benefits of low-k are enough that going to it would override the downsides.

I understand about shrinking the process, but it can't be that hard. 90nm is pretty mature considering processors are being released on it; albeit GPUs are much different from processors, they really do seem to need it.

-Kevin

I'm sure you know more about that than NVIDIA.

The increase in RAW cost for low-k is over $5 per wafer, not to mention lower yields and needing to re-tool the fab.
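And the $5 is the small part; yield is where the money moves. A toy cost model (wafer cost, die count, and yields below are invented for illustration, not actual foundry figures):

```python
# Toy cost model: what matters is the cost per *good* die.
# All numbers are invented for illustration.

def cost_per_good_die(wafer_cost: float, dies_per_wafer: int,
                      yield_frac: float) -> float:
    return wafer_cost / (dies_per_wafer * yield_frac)

baseline = cost_per_good_die(3000.0, 100, 0.60)  # standard dielectric
low_k = cost_per_good_die(3005.0, 100, 0.45)     # +$5 wafer, worse yield

print(f"baseline: ${baseline:.2f} per good die")
print(f"low-k:    ${low_k:.2f} per good die ({low_k / baseline - 1:+.0%})")
```

With numbers like these, even a modest yield drop swamps the raw wafer surcharge, never mind the cost of re-tooling.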
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: edmundoab
Let's just say that if Nvidia had every advantage over ATI, they would have eaten ATI and be the sole producer of video cards by today, like how they bought out 3dfx.

(Flamebait, but I'll bite anyways).

Uh, first of all, Nvidia didn't "buy over 3dfx" ... they acquired 3dfx's intellectual property and some key engineers, which is (in theory) the best of both worlds of a buyout/takeover/merger/etc. -- they got all of the good stuff from 3dfx and none of the bad (i.e., their debt).

3dfx wasn't a prospering company that Nvidia bought out; they were a horribly managed group of money wasters who would miss deadline after deadline, and a company that had been in decline since the Voodoo 2 came out several years before their demise. After Nvidia trumped 3dfx's VSA-100 architecture not once but twice, with the GeForce 1 and then the GF2 GTS (not to mention the GF 1 DDR in between), 3dfx was a doomed company; their saving grace, the "Rampage" chip, was too far from production to save them.

Nvidia picked apart the little meat that was left on the bone; they didn't purchase a competitor, but a fallen titan whose glory days were long past.

If Nvidia had the advantage today over ATI, they wouldn't just "buy over" ATI. ATI is not the wavering 3dfx of yore, but a competent competitor to Nvidia. If Nvidia did have a sizable advantage over ATI, then obviously they'd get many more sales, as would be the case in the reverse scenario. That goes into the 'well, duh' category.

Right now it's not really a matter of one company having an advantage over the other, either; performance is VERY similar on competing chipsets, and there are cases where one company trumps the other in certain games.



Back to the main issue,

And what is ATI doing that makes it so efficient?

I think the basic reason ATI is a bit better at AA + AF than Nvidia is that they've always done AF better (this goes back to the original Radeon days) and have done AA + AF together a bit better (from the 9700 Pro days onward), and that fundamentally, aside from doing higher levels of AA and AF and adding some new features (like TAA), Nvidia and ATI are doing AA and AF the same way they have been since day one.

There have been optimizations and improvements along the way -- more pipelines, more memory bandwidth, higher GPU clock speeds, sure -- but fundamentally, they're still doing AA and AF in a similar way to before.

ATI has always had the edge in AF. However, right now their lead is slimmer than ever (comparing the original Radeon vs. the GF 1 / GF2, the Radeon was significantly faster at AF). Nvidia has basically always had the edge with AA. Right now, their lead is even slimmer with AA than ATI's is with AF. So you have two companies that are each marginally better at one thing or the other, and the net result is that ATI does both together a bit faster due to their faster AF.

Another factor which probably helps ATI is their sizable clockspeed advantage. Kudos to Nvidia for building such an efficient design, but kudos to ATI for building a chip that runs faster, as well.


Or at least that's my wacky take on it.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Illissius
Here's an interesting review. In some of the benchmarks, the performance hit from AF was completely minimal, while in others (Far Cry and UT2004, for example), it nearly halved the speed. There was nothing in between. And it's odd that it shows up in two such different games, because afaik UT2004 uses huge textures, while Far Cry goes all out on shader effects...
Those are some seriously disturbing numbers. I'd like to know what's causing it myself. (I assume there's a thread about this at B3D, but B3D makes my head hurt on more than one level, so I'm not going to go looking for it.)

it's only logical that the impact would be more noticeable in games which make heavy use of shaders, as it takes away one of the 2 ALUs (arithmetic & logic units) per "pipe" in nv40. the ALUs are part of the shader units, so the impact will be greater in games which require a large amount of shader processing.
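a back-of-the-envelope model shows how that plays out (all numbers are invented for illustration; the real pipeline is far messier):

```python
# Crude frame-time model: assume AF borrows one of the 2 ALUs per pipe,
# so shader work runs at half rate while non-shader work is untouched.
# All numbers are invented for illustration.

def frame_time_ms(shader_ms: float, other_ms: float, af_on: bool) -> float:
    alu_share = 0.5 if af_on else 1.0  # fraction of ALUs left for shaders
    return shader_ms / alu_share + other_ms

for name, shader_ms, other_ms in [("texture-heavy game", 2.0, 10.0),
                                  ("shader-heavy game", 10.0, 2.0)]:
    off = frame_time_ms(shader_ms, other_ms, af_on=False)
    on = frame_time_ms(shader_ms, other_ms, af_on=True)
    print(f"{name}: {1000/off:.0f} fps -> {1000/on:.0f} fps with AF")
```

same af setting, but the shader-heavy game nearly halves while the texture-heavy one barely flinches - which lines up with the all-or-nothing pattern in those benchmarks.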
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: jiffylube1024
Originally posted by: edmundoab
Let's just say that if Nvidia had every advantage over ATI, they would have eaten ATI and be the sole producer of video cards by today, like how they bought out 3dfx.

(Flamebait, but I'll bite anyways).

<snip>

Or at least that's my wacky take on it.

actually your observations are pretty accurate... the downfall of 3dfx was when they purchased STB in its entirety, which caused many problems that only compounded the impact of their later engineering delays... but that's a story which belongs in another thread
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: CaiNaM
Originally posted by: jiffylube1024
Originally posted by: edmundoab
Let's just say that if Nvidia had every advantage over ATI, they would have eaten ATI and be the sole producer of video cards by today, like how they bought out 3dfx.

(Flamebait, but I'll bite anyways).

<snip>

Or at least that's my wacky take on it.

actually your observations are pretty accurate... the downfall of 3dfx was when they purchased STB in its entirety, which caused many problems that only compounded the impact of their later engineering delays... but that's a story which belongs in another thread

Yeah, although you could also take the other side of that argument and say that the merger with STB was just another example of bad management. But that debate is pointless and moot; either way, once they merged with STB, they didn't know what the heck they were doing, they PO'ed their board partners (Creative and like six billion other companies not named STB), they missed deadline after deadline, and oh yeah, the exorbitant spending of management continued (I've heard their office had an all-marble floor).
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Acanthus
Originally posted by: Gamingphreek
But the benefits of low-k are enough that going to it would override the downsides.

I understand about shrinking the process, but it can't be that hard. 90nm is pretty mature considering processors are being released on it; albeit GPUs are much different from processors, they really do seem to need it.

-Kevin

I'm sure you know more about that than NVIDIA.

The increase in RAW cost for low-k is over $5 per wafer, not to mention lower yields and needing to re-tool the fab.

I never said I know more than Nvidia. I'm just saying: why in the world would ATI have gone through with it unless the benefits outweighed the downsides? Higher clock speeds, lower power usage...

I agree with you on the second part, but why don't they do that with the fall refresh? If they did it with last year's fall refresh (i.e., 5800 -> 5900), then they can do it again.

-Kevin
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gamingphreek
Originally posted by: Acanthus
Originally posted by: Gamingphreek
But the benefits of low-k are enough that going to it would override the downsides.

I understand about shrinking the process, but it can't be that hard. 90nm is pretty mature considering processors are being released on it; albeit GPUs are much different from processors, they really do seem to need it.

-Kevin

I'm sure you know more about that than NVIDIA.

The increase in RAW cost for low-k is over $5 per wafer, not to mention lower yields and needing to re-tool the fab.

I never said I know more than Nvidia. I'm just saying: why in the world would ATI have gone through with it unless the benefits outweighed the downsides? Higher clock speeds, lower power usage...

I agree with you on the second part, but why don't they do that with the fall refresh? If they did it with last year's fall refresh (i.e., 5800 -> 5900), then they can do it again.

-Kevin

why would nv go the sm3 route and ati pass? everyone has a different opinion of what's more advantageous, what the downsides are, and how it all fits within their marketing as well as their engineering.

frankly, i don't see the "low k" process as that advantageous.. it's not like i wouldn't have bought one if it weren't low-k... same with sm3 - a good feature, but again not a "make or break"... at least not at this time.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
I see your point; however, would that help Nvidia's power usage issues and also allow them to clock the core a bit higher?

Is it possible that we will see this used in the fall refresh?

-Kevin
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
ATi's inferior AF quality seems to be ignored here -- particularly when looking at the R100 vs. the NV10, where ATi's quality was an embarrassment to primates.

ATi does less work when AF is enabled: they sample less, they ignore certain angles, they use tri/bi hacks most of the time, and they use lower levels of precision when blending their samples than nV does (ATi uses the bare minimum DX spec; nV exceeds it). Unfortunately, all of the press and fan reactions supporting lower IQ for better performance have nVidia heading in ATi's direction, continually decreasing the quality of their AF, to where now it is almost as bad as ATi's. My R9800Pro has very poor filtering at best; I've been using a rig with a GF4 as the primary display for a bit, and it is a bit of a slap in the face going back to the pseudo AF on the R9800Pro (although it is a LOT faster).

On the more general question: in order to properly do AF with less of a performance hit, you need to increase the number of samples taken per clock. This increases your transistor count a significant amount -- a 4-pixel-pipe part with a comparable feature set capable of doing full AF on everything would likely have a transistor budget close to what the 16-pipe parts have, all else being equal.
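To make "ignoring certain angles" concrete: the hardware estimates how stretched the pixel's texture footprint is and picks an AF degree from that, and an angle-dependent implementation then quietly caps the degree when the stretch axis sits off the preferred angles. A simplified sketch of the idea -- not either vendor's actual logic:

```python
import math

# Simplified AF degree selection. Real hardware works from
# texture-coordinate derivatives; this only illustrates the idea
# behind "angle-dependent" AF, not either vendor's circuit.

def af_degree(stretch: float, angle_deg: float, max_degree: int = 16,
              angle_dependent: bool = True) -> int:
    """Pick the number of AF probes for one pixel footprint.

    stretch: long axis / short axis of the footprint (1.0 = square).
    angle_deg: orientation of the long axis (0 = screen-horizontal).
    """
    degree = min(max_degree, 2 ** math.ceil(math.log2(max(stretch, 1.0))))
    if angle_dependent and degree > 2:
        # Cheapen filtering away from the 0/45/90-degree "preferred" axes.
        off_axis = min(angle_deg % 45, 45 - angle_deg % 45) / 22.5  # 0..1
        degree = max(2, int(degree * (1.0 - 0.75 * off_axis)))
    return degree

print(af_degree(16, 0.0))   # axis-aligned surface: full 16 probes
print(af_degree(16, 22.5))  # tilted surface: quietly dropped to 4
```

Fewer probes on tilted surfaces means less work per pixel, which shows up as "better AF performance" in benchmarks and as shimmering on off-angle walls in games.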
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
well, r100 is irrelevant; in the current gen we're discussing, they both use all kinds of hacks and frankly they're about even.

id agree tho, the gf4 series had the best af, and in the last gen i felt nvidia had better af than ati (tho they compromised quality compared to gf4), while ati had better aa. but this year nv's af is every bit as crappy as ati's, while nv's aa has improved to match the quality of ati's.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: BenSkywalker
ATi's inferior AF quality seems to be ignored here -- particularly when looking at the R100 vs. the NV10, where ATi's quality was an embarrassment to primates.

ATi does less work when AF is enabled: they sample less, they ignore certain angles, they use tri/bi hacks most of the time, and they use lower levels of precision when blending their samples than nV does (ATi uses the bare minimum DX spec; nV exceeds it). Unfortunately, all of the press and fan reactions supporting lower IQ for better performance have nVidia heading in ATi's direction, continually decreasing the quality of their AF, to where now it is almost as bad as ATi's. My R9800Pro has very poor filtering at best; I've been using a rig with a GF4 as the primary display for a bit, and it is a bit of a slap in the face going back to the pseudo AF on the R9800Pro (although it is a LOT faster).

On the more general question: in order to properly do AF with less of a performance hit, you need to increase the number of samples taken per clock. This increases your transistor count a significant amount -- a 4-pixel-pipe part with a comparable feature set capable of doing full AF on everything would likely have a transistor budget close to what the 16-pipe parts have, all else being equal.

AF isn't really usable on the GF4, unfortunately, and 16X ATI AF looks fine compared to 4X "GeForce 4 quality" AF.

I haven't sat there nitpicking the differences, but I do notice the difference with 8X/16X AF on my 9800 Pro. On my GF4 Ti4800 (yes, Ti4800), AF wasn't playable in pretty much any game. I remember how badly it made Halo tank, for example.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gamingphreek
I see your point however would that help Nvidias power usage issues, and also allow them to clock the core a bit higher.

Is it possible that we will see this used in the fall refresh.

-Kevin

i wouldn't expect nv to re-engineer anything.. about all i'd expect is some better clock speeds and pcie. both ati and nv should keep it simple and just get this generation's parts out in decent numbers.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Gamingphreek
Everyone knows how strenuous it is to run AA and AF on a graphics card. However, ATI seems to take much less of a performance hit than Nvidia when these are enabled. What is the main problem (or problems) holding Nvidia back from the title? And what is ATI doing that makes it so efficient?
-Kevin



In terms of AA, NV and ATi are pretty much equal in both speed and image quality. When it comes to AF though, image quality is about equal, but ATi's drivers are a good bit more mature, and their AF algorithm(s) more optimized, so that's why they have the speed advantage. NV will likely catch up in time as they go back and work out the kinks in their drivers.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
but the problem isn't the software, and it's not something drivers can change - and i really don't think calling it a "problem" is very accurate either... rather it's the hardware design. i'm not sure how driver "tweaks" will change the fact that high af (iirc, only levels above 2x) takes gpu cycles from shader processing. i suppose if significant gains are made in other areas, the impact of the cost of af could be lessened.. but again, i'm rather surprised anyone would spend so much time complaining about it, as performance isn't that bad to begin with. if you want to gripe, gripe over the continuing drop in af quality in every nvidia generation since the gf4 series...

also afaik, the algorithm is in hardware, and not programmable. i'm pretty sure nv's aa is not programmable either, while ati's is (which allows them to do "temporal" aa, as they can reprogram the aa sample positions).
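the temporal trick is just alternating sample positions between frames - a sketch of the idea (the patterns below are invented for illustration, not ati's actual sample tables):

```python
# Sketch of "temporal" AA: with programmable sample positions, alternate
# between two complementary 2x patterns every other frame, so over two
# frames the eye integrates something closer to 4x coverage.
# Sample positions are invented for illustration.

PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]  # 2x pattern, even frames
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]  # complementary pattern, odd frames

def sample_offsets(frame_index: int) -> list[tuple[float, float]]:
    """Sub-pixel sample positions to use for a given frame."""
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

for f in range(4):
    print(f"frame {f}: samples at {sample_offsets(f)}")
```

a fixed-function aa unit with hardwired sample positions simply can't do that, no matter what the drivers do.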
 