Discussion Nvidia Blackwell in Q1-2025

Page 67 - AnandTech Forums

beginner99

Diamond Member
Jun 2, 2009
5,281
1,697
136
Also, I don't really get the Dune III thing here. One scenario is a movie full of CGI pre-rendered on million-dollar clustered computing systems; the other is a real-time game played on your $1k-2k PC. Even the "highest end graphics" for consumers is nowhere near the CGI of a hundred-million-dollar movie.

The issue is Smart TV motion smoothing, which leads to a terrible experience. The first thing I do on TVs is turn that crap off. It's also known as the soap opera effect, because it makes a $200 million Hollywood movie look like a soap opera. And in essence that smoothing is nothing else than fake frames.

But I agree that I see value in simple upscaling. For people running old hardware that doesn't support any of these techs, do FSR or DLSS even have an option that is just upscaling (without RT)?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,479
8,253
136
All this AI talk actually has me wondering...

Why is anyone trying to "calculate" rays and bounces and stuff at all? Why can't you just feed an AI algo the sources of light in the scene, and the algo layers a 99% accurate lighting pass on the frame before displaying it?

I wonder if that's where this is all heading anyway...

Hell call it AI-Tracing (takes another shot, dies)
 

coercitiv

Diamond Member
Jan 24, 2014
6,956
15,595
136
I mean, all I'm getting from that thread is that DLSS4 isn't artifact-free, but it's still superior to DLSS 3 in pretty visible ways. I don't get why he's crapping on it so hard as a general product, though. Isn't DLSS4, and even DLSS3, good enough for most people? Personally I can't spot 99% of the artifacts in the DF video even with video speed at 50%. I don't usually even notice much difference between FSR3 Quality and native at 1440p, and according to most reviewers DLSS3 is superior to FSR3, with DLSS4 so far shaping up to be another improvement. I don't exactly have the sharpest eyesight/perception, but neither do many consumers, so I don't get the outrage here over upscaling artifacts. Upscaling isn't "free frames", but it is significantly more frames at a somewhat lesser image quality. Not sold on MFG/FG, but upscaling is definitely useful for consumers.
The crapping is justified once you notice the way DLSS3 is marketed as the "better than native" alternative. Even you fell into the trap of thinking there's not much difference between FSR3 Quality and native (temporal) AA. The real comparison you should be making is FSR3 Quality versus FSR Native AA, and the difference is there in loss of detail and shimmering artifacts. The same applies to DLSS3 versus DLAA, though DLSS3 admittedly fares better at 1440p than FSR.

If you don't understand the DUNE III reference, you probably haven't been paying attention to new AAA games' system requirements. Games with the latest engines and visual effects require upscaling and FG even for a 1080p60 experience, which is the config most vulnerable to loss of detail and artifacts from both the upscaler and frame gen. On the other end of the spectrum, people with $2000 cards use their PCs with huge TVs for home-theater cinematic experiences. For some, games are the new movies. There's even a game out there with a "Cinematic" performance preset.

I use FSR Quality too; there are games where the added FPS or lower power consumption matters more than the cost in IQ. That does not mean I fool myself into thinking those shimmering artifacts would still be there in a native render with good AA.

Last but not least, if you can't spot the artifacts in the DF video even at 50% speed, why are you using FSR/DLSS at all? Just lower the game's IQ settings to a medium/low blend; based on your own reporting, you're probably unable to tell the difference once things are in motion. You should also never enable RT in games, for the same reason: the massive compute cost would simply be wasted.
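For concreteness, here is a quick sketch of what "Quality" mode actually renders internally, using FSR 2's published per-axis scale factors (DLSS uses nearly identical ones for its matching presets):

```python
# Per-axis scale factors as published for FSR 2 quality presets.
PRESETS = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}

def render_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Internal resolution the GPU actually renders before upscaling."""
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

# At 1440p output, "Quality" mode renders internally at roughly 960p:
render_res(2560, 1440, "Quality")  # -> (1707, 960)
```

So a "1440p Quality" comparison against native is really a ~960p render being reconstructed, which is why the fair baseline is the upscaler's own native AA mode rather than plain native.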
 

Win2012R2

Senior member
Dec 5, 2024
647
610
96
They can do DLSS all day long for all I care, but a top-end GPU should be getting 70-80%+ real perf so it does NOT need fake frames, for a price obviously. But the deal right now is pretty poor: they did not move it to N3E only because they wanted to keep a likely 70% gross margin on the 5090, and that's the problem - they could have had a 50% margin but offered N3E.

It's rubbish really - there should be around 70 chips per wafer at 729 mm² (assuming a 27x27 mm die), that's ~$300 per chip on a $20k wafer, or just over $400 on future N2. How can it be that Nvidia can't sell THAT for $2k and still make a very, very nice gross margin? It's totally possible; of course they prefer to sell the same chips for 10x the price, which I'd say is fair enough, but on the other hand it's not OK to have an even crazier margin while giving us 2-year-old tech.
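The per-chip arithmetic above can be sketched with the standard dies-per-wafer approximation. This ignores scribe lines and defect yield, so it is a best-case estimate, not a real cost model:

```python
import math

def dies_per_wafer(wafer_mm: float, die_w_mm: float, die_h_mm: float) -> int:
    """Standard approximation: wafer area / die area, minus an edge-loss
    term for partial dies at the wafer rim. Best case: no defects, no
    scribe lines."""
    die_area = die_w_mm * die_h_mm
    gross = math.pi * (wafer_mm / 2) ** 2 / die_area
    edge_loss = math.pi * wafer_mm / math.sqrt(2 * die_area)
    return int(gross - edge_loss)

dies = dies_per_wafer(300, 27, 27)  # ~72 candidate dies on a 300 mm wafer
silicon_cost = 20_000 / dies        # ~$280 raw silicon cost at $20k/wafer
```

This matches the ~70-chip, ~$300 figure in the post; real yielded cost is higher once defect density on a 729 mm² die is factored in.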
 

gdansk

Diamond Member
Feb 8, 2011
3,768
6,020
136
It's rubbish really - there should be around 70 chips per wafer at 729 mm² (assuming a 27x27 mm die), that's ~$300 per chip on a $20k wafer, or just over $400 on future N2. How can it be that Nvidia can't sell THAT for $2k and still make a very, very nice gross margin? It's totally possible; of course they prefer to sell the same chips for 10x the price, which I'd say is fair enough, but on the other hand it's not OK to have an even crazier margin while giving us 2-year-old tech.
I know people have a very high opinion of TSMC's manufacturing excellence, but defect density may be >0.
But also remember that those frame generation technologies don't come for free, or anywhere near free, whatever people often say. There are engineers who need to eat. And the software and hardware people at the intersection of perception, rendering, ML, and performance demand quite a lot of compensation compared to most.
 

Win2012R2

Senior member
Dec 5, 2024
647
610
96
I know people have a very high opinion of TSMC's manufacturing excellence, but defect density may be >0.
But also remember that those frame generation technologies don't come for free, or anywhere near free, whatever people often say. There are engineers who need to eat. And the software and hardware people at the intersection of perception, rendering, ML, and performance demand quite a lot of compensation compared to most.
Totally agree - but a 50% gross margin should cover that nicely; it did for a very long time, and the gross margin on a top-end product like the 5090 is far more than 50%. The top product should get top tech, not 2-year-old stuff at a higher price.

I disagree with the investment into fake frames - it would have been far better if Nvidia had made chiplets work properly, or if ray tracing had finally gotten enough real hardware to work without (much) fakery, at least on a top-end card for 2 grand.

They still failed to add a super-cheap hardware decompressor to the card - even the Switch 2 is getting one, but not the 50 series? The cost of that extra chip is probably 50 cents, and no big research is needed either - it already exists in Nvidia's IP.
 

MoogleW

Member
May 1, 2022
95
44
61
All this AI talk actually has me wondering...

Why is anyone trying to "calculate" rays and bounces and stuff at all? Why can't you just feed an AI algo the sources of light in the scene, and the algo layers a 99% accurate lighting pass on the frame before displaying it?

I wonder if that's where this is all heading anyway...

Hell call it AI-Tracing (takes another shot, dies)
Grounding AI in proper physics with proper behaviour is challenging. Doing it in real time is something else entirely. So using a ground truth of the frames you are already generating seems like a good compromise. Then you can add speedups like denoising, neural radiance caching, etc. That's how I understand it.
 

MoogleW

Member
May 1, 2022
95
44
61
Totally agree - but a 50% gross margin should cover that nicely; it did for a very long time, and the gross margin on a top-end product like the 5090 is far more than 50%. The top product should get top tech, not 2-year-old stuff at a higher price.

I disagree with the investment into fake frames - it would have been far better if Nvidia had made chiplets work properly, or if ray tracing had finally gotten enough real hardware to work without (much) fakery, at least on a top-end card for 2 grand.

They still failed to add a super-cheap hardware decompressor to the card - even the Switch 2 is getting one, but not the 50 series? The cost of that extra chip is probably 50 cents, and no big research is needed either - it already exists in Nvidia's IP.
Chiplets and so on don't add performance, right? So on 4NP, Nvidia would have had to make a chip the size of GB200 and sell it for a tenth of the price. Not happening.

The only alternative for performance is TSMC N3E, which I assume came too late (Apple and Intel didn't get satisfactory results with N3B from what I see, and it costs more - would it have been worth it?), and that's likely why all 3 companies did not launch N3E-based GPUs at the same time and were all delayed. Too many coincidences for me.

As for RT, yeah, preliminary benchmarks are disappointing. I guess die-size constraints hit hard with the other changes. In a way, for the first time they had to choose which of the three pillars of RTX would give them the most returns.
 

MoogleW

Member
May 1, 2022
95
44
61
DLSS 4 FG is claimed to be more efficient, slightly faster, and to use slightly less memory. I didn't see anything about latency, either.

The testing I saw for DLSS 3 FG shows added latency almost exactly equal to one frame time, so it already does a good job of not adding any extra latency beyond the time to buffer one frame.

There is no real way to reduce latency unless they actually stop buffering a frame.

So, it will be pretty easy to tell.

Given what DF has said, I expect 2X added latency will equal one frame time (the same as DLSS 3 FG), 3X will equal that plus some additional fraction, and 4X will add a further fractional amount.
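That expectation amounts to simple arithmetic: one buffered frame of added latency at the base (pre-FG) framerate, plus some fraction of a frame for the higher MFG modes. A sketch, where the extra fractions are hypothetical placeholders rather than measured values:

```python
def fg_added_latency_ms(base_fps: float, extra_fraction: float = 0.0) -> float:
    """Added latency = one buffered base frame, plus an optional extra
    fraction of a frame for 3x/4x modes (hypothetical model, not data)."""
    frame_time_ms = 1000.0 / base_fps
    return frame_time_ms * (1.0 + extra_fraction)

fg_added_latency_ms(60)        # 2x FG at a 60 fps base: ~16.7 ms added
fg_added_latency_ms(60, 0.25)  # 3x/4x: one frame plus a fraction, ~20.8 ms
```

Note the added latency scales with the base frame time, which is why FG feels worst exactly when the base framerate is already low.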

Here is where DF discusses DLSS 4 latency:
This is with Reflex 1, from the looks of it. Could Reflex 2 be added? Would that make any difference?
 

ajsdkflsdjfio

Member
Nov 20, 2024
171
117
76
The crapping is justified once you notice the way DLSS3 is marketed as the "better than native" alternative. Even you fell into the trap of thinking there's not much difference between FSR3 Quality and native (temporal) AA. The real comparison you should be making is FSR3 Quality versus FSR Native AA, and the difference is there in loss of detail and shimmering artifacts. The same applies to DLSS3 versus DLAA, though DLSS3 admittedly fares better at 1440p than FSR.
What?? I simply stated that, in my experience, through my eyes, I didn't notice that much of a difference and that I mostly don't notice artifacts. These are my own personal experiences?? How can you try to falsify my own personal experiences and say I fell into some type of marketing trap? Maybe you have a sharper eye than me and can notice all these horrible artifacts in all of your games, but in my experience it's not that big of a deal - and arguably, to most less educated and less picky consumers than the people on these forums, it's likely not that big of a deal either. I understand FSR/DLSS isn't better than native, but I am willing to take the relatively minor image quality hit to gain more FPS. I'm not "ignoring" artifacts just because I say that in my experience I usually don't see them while playing games or viewing DLSS/FSR comparisons.

And no, like I said in my original post, Nvidia overselling their technology in typical Nvidia fashion is NOT an excuse to crap unnecessarily on anything and everything Nvidia. I understand the mindset of trying to educate more people not to fall into marketing traps, but irrationally criticizing and ranting against product A or B isn't productive for that purpose or any other.
If you don't understand the DUNE III reference, you probably haven't been paying attention to new AAA games' system requirements. Games with the latest engines and visual effects require upscaling and FG even for a 1080p60 experience, which is the config most vulnerable to loss of detail and artifacts from both the upscaler and frame gen. On the other end of the spectrum, people with $2000 cards use their PCs with huge TVs for home-theater cinematic experiences. For some, games are the new movies. There's even a game out there with a "Cinematic" performance preset.
I completely understand AAA games' increasing system requirements; I quite literally addressed it further down in the post. I agree, devs should not make upscaling or FG technologies a requirement to have playable framerates in their games. I said that basically verbatim.

The DUNE III reference he was making wasn't about a home theatre system or game rendering; he was talking about walking into a MOVIE THEATRE and finding upscaling technology in future showings of pre-rendered movies with multi-million-dollar budgets. He was using this outlandish scenario to point out why consumers might not want more upscaling technology in PC hardware, which is a stretch to say the least.
Last but not least, if you can't spot the artifacts in the DF video even at 50% speed, why are you using FSR/DLSS at all? Just lower the game's IQ settings to a medium/low blend; based on your own reporting, you're probably unable to tell the difference once things are in motion. You should also never enable RT in games, for the same reason: the massive compute cost would simply be wasted.
If lowering game IQ settings to medium/low were as effective at increasing performance AND retaining image quality as upscaling technologies, then upscaling technology would simply be useless. I also don't see how not being able to spot minute artifacts on screen for a fraction of a second is equivalent to not being able to notice the entire game being turned down from high to medium/low. Don't kid yourself, those two things are not equivalent at all.

The difference RT makes is also hugely noticeable in the look of a game. I personally don't turn it on since my hardware isn't high-end enough, and ray tracing is generally still too taxing for most people's systems. Nonetheless, RT is in fact a tangible increase in the graphics quality of most games that employ it, so yes, turning on RT with the trade-off of having to turn on DLSS/FSR isn't necessarily a horrible trade-off, especially since the effects of RT are a lot more noticeable than upscaling to many people.

Your post is like a hardened audiophile muttering about the subpar mids and whiny highs of the AirPods Pro and how anyone who likes them fell into the "Apple marketing trap". No, they didn't; the sound quality is simply "good enough" for most people, while the other features make them more convenient and enjoyable to use than your equivalently priced $250 Chinese IEMs - and that's coming from a person who bought those $250 Chinese IEMs. There is more nuance to the discussion of image quality than you might realize.
 

SiliconFly

Golden Member
Mar 10, 2023
1,925
1,281
96
Maybe someday either AMD or Intel will say "Enough!" and get a card that focuses purely on raster over x8 or x16 fake frames.
RT+DLSS+FG is gaming (with RTX cards). Try playing, say, Black Myth: Wukong without these; we'd be dead in the water. The days of pure raster are over. RT/DLSS/FSR/XeSS/FG/etc. are the future.

... those frame generation technologies don't come for free or anywhere near free ...
True. Those RT cores, Tensor cores, etc take up significant die space.

Grounding AI in proper physics with proper behaviour is challenging. Doing it in real time, something else. So using a ground truth of the frames you are already generating seems like a good compromise. Then you can add speedups like denoising, neural radiance caching, etc. That's how I understand it
True. Nvidia said the CNN-based optical flow approach has already reached its limit. But they're just getting started with transformers, and there's a lot of room for improvement. Their training data is also new; the more they train, the better it'll get.

All this AI talk actually has me wondering...

Why is anyone trying to "calculate" rays and bounces and stuff at all? Why can't you just feed an AI algo the sources of light in the scene, and the algo layers a 99% accurate lighting pass on the frame before displaying it?

I wonder if that's where this is all heading anyway...

Hell call it AI-Tracing (takes another shot, dies)
Welcome to Jensen's vision of the future. Full Multimodal Neural Rendering.

... once you notice the way DLSS3 is marketed as the "better than native" alternative. ...
I think you're mistaken. When Nvidia says DLSS is "better than native", you shouldn't lose the context. An RTX 4080 is better off with RT+Upscaling+FG than without them. That's a fact. Like I mentioned before, in many cases recent games won't even work without these. And once you turn them on, the visual quality, the performance, the gameplay - they all increase by a huge margin. And that's what they mean by "better than native".

... devs should not make upscaling or FG technologies a requirement to have playable framerates in their games. ...
I don't think the devs have a choice. When a studio picks UE5 to build a game, the rest automatically follows.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,283
5,390
136
They can do DLSS all day long for all I care, but a top-end GPU should be getting 70-80%+ real perf so it does NOT need fake frames, for a price obviously. But the deal right now is pretty poor: they did not move it to N3E only because they wanted to keep a likely 70% gross margin on the 5090, and that's the problem - they could have had a 50% margin but offered N3E.

It's rubbish really - there should be around 70 chips per wafer at 729 mm² (assuming a 27x27 mm die), that's ~$300 per chip on a $20k wafer, or just over $400 on future N2. How can it be that Nvidia can't sell THAT for $2k and still make a very, very nice gross margin? It's totally possible; of course they prefer to sell the same chips for 10x the price, which I'd say is fair enough, but on the other hand it's not OK to have an even crazier margin while giving us 2-year-old tech.

You are smarter than that. This sounds like the people bitching because the 7950X cost $70 to make. Maybe it did. But how much did it cost to develop? And then they have to make a profit on top of that.
 

Win2012R2

Senior member
Dec 5, 2024
647
610
96
This sounds like the people bitching because the 7950X cost $70 to make.
Not at all, because I am bitching from the other end of the stick

AMD priced chiplets well; my only problem with them is that they are not using the latest tech, which might cost 50% more, but for top-end products the market will easily bear it.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,283
5,390
136
RT+DLSS+FG is gaming (with RTX cards). Try playing, say, Black Myth: Wukong without these; we'd be dead in the water. The days of pure raster are over. RT/DLSS/FSR/XeSS/FG/etc. are the future.

It's the future because developers learned to use them as a crutch. If they didn't exist, developers would have come up with other ways - maybe more optimization. Just look at what they've done with the idtech engine. Crazy efficient. But you go ahead and believe what you want.
 

Win2012R2

Senior member
Dec 5, 2024
647
610
96
that's likely why all 3 companies did not launch N3E-based GPUs at the same time and were all delayed. Too many coincidences for me.
Apple is using N3E to great success for the fairly large GPUs in the M4s; it's a great process. It's probably not entirely suitable for a 600-700 mm² GPU, but N3P surely will be.

Now if we somehow knew the 50 series would be just a gap filler for a year and then the proper stuff comes out... but it does not look that way, so we'll be stuck on the same thing for another 2 years, and the way things are going, in 2026 at best we'll get N3 goodies, not even N2.
 

SiliconFly

Golden Member
Mar 10, 2023
1,925
1,281
96
It's the future because developers learned to use them as a crutch. If they didn't exist, developers would have come up with other ways - maybe more optimization. Just look at what they've done with the idtech engine. Crazy efficient. But you go ahead and believe what you want.
The latest idtech engine is awesome. But it works better with RT+DLSS 3.5+FG than without.

A side note: the latest idtech engine has severe issues with larger maps, and in many places the game crams the player into smaller spaces way too frequently (esp. during intense gunplay). idtech is terrific, I love it. But it's not the best in the world.
 

coercitiv

Diamond Member
Jan 24, 2014
6,956
15,595
136
Your post is like a hardened audiophile muttering about the subpar mids
How about you keep to proper arguments instead of poorly chosen analogies?

What?? I simply stated that, in my experience, through my eyes, I didn't notice that much of a difference and that I mostly don't notice artifacts. These are my own personal experiences?? How can you try to falsify my own personal experiences and say I fell into some type of marketing trap?
Falsify your own personal experience?! Oh dear. You're the one getting offended at someone's personal take on the flaws of DLSS and more importantly upscalers in general. You're already falsifying their personal experience because it's incompatible with your own, and accusing them of trashing the tech "unnecessarily". Upscaler flaws have been documented up the wazoo, and we accept them for what they are because they bring a good perf/iq loss ratio. That does not mean we don't get to criticize the tech, no matter the company that promotes it.

The difference RT makes is also hugely noticeable in the look of a game
Are you sure about that? Would you pass a blind test?

PS: notice the poor quality headset he's using, terrible mids and boomy lows. And the mic, don't get me started on the mic... /s
 

SiliconFly

Golden Member
Mar 10, 2023
1,925
1,281
96
Would you pass a blind test?
No one has to. Games that leverage the full power of RT tend to have distinct visuals. The following is something raster can't do:



RT is here to stay. No future AAA UE5 (or competing) titles are gonna come without RT. And with RT, the rest follows automatically (upscaling, FG, etc.). Whether we like it or not, the future of Nvidia/AMD/Intel graphics is neural rendering.
 

ajsdkflsdjfio

Member
Nov 20, 2024
171
117
76
Falsify your own personal experience?! Oh dear. You're the one getting offended at someone's personal take on the flaws of DLSS and more importantly upscalers in general. You're already falsifying their personal experience because it's incompatible with your own, and accusing them of trashing the tech "unnecessarily". Upscaler flaws have been documented up the wazoo, and we accept them for what they are because they bring a good perf/iq loss ratio. That does not mean we don't get to criticize the tech, no matter the company that promotes it.
Yes, at some point, when you extend your personal ideas to try to invalidate something that most other people objectively enjoy, you are in fact in the wrong. Unless you'd like to argue that the majority of consumers would benefit from not using upscaling technologies and making that the "norm". Even by your own admission you use FSR, meaning that in some scenarios upscaling is worth the IQ hit for the extra performance, meaning it is in fact beneficial. Just because you can spot more artifacts in upscaled gameplay does not validate the idea that upscaling is somehow useless and a detriment to consumers.
Are you sure about that? Would you pass a blind test?
I was able to guess right most of the time after staring for 30 seconds, but yeah, it's not much of a difference at all. F1 is a bad showing for ray tracing and at the same time an excellent showing for traditional lighting. Maybe my statement that ray tracing makes a massive difference in MOST games was wrong, but to me it is definitely a larger increase in image quality than upscaling is a hit to image quality. Plus there are games where ray tracing revolutionizes the look, like Cyberpunk, contrary to your idea that ray tracing should NEVER be turned on by people who aren't nitpicking every single frame.

Compare this to the same graph comparing native vs DLSS 3 quality:


Clearly, the HUB reviewer on average sees RT as beneficial to IQ while seeing DLSS as roughly net zero to image quality, meaning DLSS + RT should be better than native + no RT in most scenarios, image-quality-wise.

Obviously the results will vary by person, but this test is a lot more comprehensive than one blind test in one single game. It's interesting that DLSS is perceived to have better image quality than native in some scenarios. Obviously it's one individual's opinion, but the fact that it's not entirely lopsided or obvious even to a reviewer who is intentionally analyzing maybe shows that upscaling is not in fact useless or a horrible detriment to image quality.

Also, as a side note: being able to distinguish subtle differences between RT on and off in F1 does not suddenly make non-RT horrible compared to RT, and vice versa. It's the same with upscaling - sure, you can freeze-frame, slow down, and zoom in on artifacts, but what is the actual visual difference when a player is actually playing the game and not hounding for visual bugs?
 

SiliconFly

Golden Member
Mar 10, 2023
1,925
1,281
96
Plus there are games where ray tracing unironically revolutionizes the look, like Cyberpunk, contrary to your idea that ray tracing should NEVER be turned on by people who aren't nitpicking every single frame.
True. I would never touch any recent game without RT. The thought of bland visuals and comparatively poor frame rates is a major turn-off. Yikes! 😱

... F1 is a bad showing for ray tracing ...
Very true.
 