Discussion Nvidia Blackwell in Q1-2025

Page 65 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Jul 27, 2020
22,309
15,576
146
Not clear from that if it's talking about neural texture compression. Neural rendering might be something else.
 

Win2012R2

Senior member
Dec 5, 2024
647
610
96
It will take 18-24 months for Microsoft to get something out in a DirectX update, and AMD surely won't be playing ball, which will slow it down even more.

Still, this sort of stuff is real good - unlike fake frame generation.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,653
6,113
136
Yes, but despite all of that reasoning against it, DLSS works with some minor inconsistencies, which are most prominent with text alignment or movement.
In the same way Simpson's rule can estimate the area under a curve with greater and greater precision as delta x gets smaller, the same holds for frame prediction as delta t gets smaller. Again, the fundamental theorem of calculus is in fact correct, and as these time slices approach zero, so does the error. If the time slice is small enough and the AI prediction is good enough, the error becomes exceedingly small. nVidia of course figured this out and decided it was worth pouring tens of millions, or perhaps hundreds of millions, of dollars into developing. I'm running some rough numbers to get a feel for it. And yes, I do like to play with the numbers!
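The shrinking-error claim is easy to sanity-check numerically. A minimal sketch (a toy integral, nothing to do with what nVidia actually ships):

```python
# Simpson's rule error shrinks rapidly as the step size (delta x)
# shrinks -- the analogy being frame-prediction error shrinking as the
# time slice (delta t) shrinks.
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

exact = 2.0  # integral of sin(x) from 0 to pi
for n in (4, 16, 64):
    err = abs(simpson(math.sin, 0, math.pi, n) - exact)
    print(f"n={n:3d}  error={err:.2e}")
```

Each 4x refinement of the step cuts the error by roughly 256x, which is the "error goes to zero with the time slice" intuition in miniature.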

I'm not sure you're fully comprehending how short a time interval 3ms is in terms of human perception. For example, in the Olympics a false start is called if an athlete is shown to have a reaction time of less than 100ms. 10ms would be an order of magnitude faster than human reaction time; 3ms is a third of that. As the time interval gets smaller, the error between rendered and predicted frames becomes smaller.

The theory behind what nVidia is doing is solid. We may not like it, but at the end of the day it does produce a meaningful fps improvement with very few visual artifacts, as Linus pointed out with this short demo of the 5090.

Interesting theory crafting, but that isn't what NVidia is doing. They are still holding the current frame and the past frame, generating the in-between frames, and only after displaying the fake ones do they show you the now quite late "current" frame. The goal of this technology is smoothing. So they can't use predictions, which WILL drift further off track and need to snap back when a correct frame arrives; that would actually introduce micro-stutter. Instead they use interpolation between two known frames, so there is ALWAYS a smooth transition and no abrupt corrections. Smoothing is paramount...
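To illustrate the distinction being argued here (toy per-pixel numbers, purely illustrative, not NVidia's actual algorithm):

```python
# Interpolating between two *known* frames can never overshoot, so
# transitions stay smooth; extrapolating a past trend can overshoot,
# and then needs a visible correction when the real frame arrives.
import numpy as np

prev_frame = np.array([0.0, 0.2, 0.4])   # toy "pixels" at time t-1
curr_frame = np.array([1.0, 0.6, 0.2])   # real rendered frame at time t

# Interpolated in-between frame at t-0.5: always between the two knowns.
mid = 0.5 * (prev_frame + curr_frame)

# Naive extrapolation to t+1 continues the last trend -- it can drift
# outside the range of anything actually rendered.
extrap = curr_frame + (curr_frame - prev_frame)

print(mid)     # stays within [prev, curr] per pixel
print(extrap)  # can overshoot (e.g. 2.0 for the first pixel)
```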

You need to stop taking the word of corporate mouthpieces at face value.

This is NOT an actual FPS booster; that part is fake. It's a screen smoothing feature - the same basic thing that many TVs do. As misleading as some of the TV sellers are (cough Samsung cough), they still didn't go as far as claiming it would double (now quadruple) your FPS; they called it what it is - smoothing.

I'd have ZERO issues if NVidia added this and called it motion smoothing. The problem is, they use it to mislead people about actual frame rate. Mislead isn't a strong enough word. They use it to lie to people.

When they released the 40 series, they bombarded YT with tons of extremely misleading (lying) ads showing DLSS 3 doubling or tripling FPS with no explanation at all. Just pure deception.

Say you have a game running at 60 FPS and you turn on FG or MFG: to someone not playing, the action on screen just got smoother; to someone playing, the lag and response time just increased to be nearly as bad as running at 30 FPS.
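Back-of-the-envelope for that 60 FPS example (illustrative numbers only; real latency depends on the game, driver, and display chain):

```python
# Why 60 FPS + frame generation can *feel* closer to 30 FPS:
# interpolation must buffer the real frame for one frame interval
# before showing it, so input-to-display latency grows by roughly one
# rendered-frame time.
rendered_fps = 60
frame_time_ms = 1000 / rendered_fps              # 16.7 ms per real frame

base_latency_ms = frame_time_ms                  # simplistic 1-frame baseline
fg_latency_ms = base_latency_ms + frame_time_ms  # + 1 buffered frame

print(f"no FG: ~{base_latency_ms:.1f} ms")       # ~16.7 ms
print(f"FG:    ~{fg_latency_ms:.1f} ms")         # ~33.3 ms, i.e. 30 FPS-like
```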

This could have just been another (marginally) useful tool in the toolbox, but instead NVidia is mainly using it for deceptive marketing.
 

poke01

Diamond Member
Mar 8, 2022
3,040
4,031
106
Not clear from that if it's talking about neural texture compression. Neural rendering might be something else.
Neural rendering includes texture compression.
The applications of neural shading are vast, including radiance caching, texture compression, materials, radiance fields, and more.

 

Heartbreaker

Diamond Member
Apr 3, 2006
4,653
6,113
136
That still doesn't explain what it's actually doing. Is it compression of the shader binaries, which will then need decompression at runtime? Or is this some sort of AI driven code optimization (in which case, good luck 😬)? Very vague.

EDIT: There's a paper on the texture compression, if anyone is interested in the details: https://research.nvidia.com/labs/rtr/neural_texture_compression/

I read somewhere that this one is back to needing training in each game...
 

Meteor Late

Senior member
Dec 15, 2023
266
293
96
Interesting theory crafting, but that isn't what NVidia is doing. They are still holding the current frame and the past frame, generating the in-between frames, and only after displaying the fake ones do they show you the now quite late "current" frame. The goal of this technology is smoothing. So they can't use predictions, which WILL drift further off track and need to snap back when a correct frame arrives; that would actually introduce micro-stutter. Instead they use interpolation between two known frames, so there is ALWAYS a smooth transition and no abrupt corrections. Smoothing is paramount...

You need to stop taking the word of corporate mouthpieces at face value.

This is NOT an actual FPS booster; that part is fake. It's a screen smoothing feature - the same basic thing that many TVs do. As misleading as some of the TV sellers are (cough Samsung cough), they still didn't go as far as claiming it would double (now quadruple) your FPS; they called it what it is - smoothing.

I'd have ZERO issues if NVidia added this and called it motion smoothing. The problem is, they use it to mislead people about actual frame rate. Mislead isn't a strong enough word. They use it to lie to people.

When they released the 40 series, they bombarded YT with tons of extremely misleading (lying) ads showing DLSS 3 doubling or tripling FPS with no explanation at all. Just pure deception.

Say you have a game running at 60 FPS and you turn on FG or MFG: to someone not playing, the action on screen just got smoother; to someone playing, the lag and response time just increased to be nearly as bad as running at 30 FPS.

This could have just been another (marginally) useful tool in the toolbox, but instead NVidia is mainly using it for deceptive marketing.

This. It's a tech to reduce motion blur on sample-and-hold displays; it decreases performance (higher input latency) and attempts to increase image quality by reducing blur, provided the display is capable of a higher refresh rate than the native fps.
This tech would be very cool if it achieved this without introducing at least about 1 frame of latency (as the previous RTX 4000 FG does); according to DF, MFG introduces even more latency than 2x FG.

So you can think of this as the opposite of DLSS Super Resolution, which attempts to increase performance while reducing quality (from not noticeable to noticeable depending on the game implementation, distance to the monitor, and people's tolerance); FG, by contrast, can increase image quality by reducing blur (though it can also introduce artifacts) while decreasing performance.
 

Timmah!

Golden Member
Jul 24, 2010
1,554
904
136
Some questions for those of you with 4090's.
Are you considering upgrading to the 5090 or another GPU in the near future?
Why or why not?

It seems as though the 4090 can provide quite good frame rates at 4k in pretty much every game available today so I'm wondering what people with 4090's might be considering an upgrade?

I don't know a lot about gaming and GPU requirements so I'm curious to hear from actual owners of these super powerful GPU's.
I do, a 5090 to replace one of my 4090s. Octane Render. I am a bit "worried" by the speed-up there though; it seems most of the progress this time around went into the software side, which will show up mostly in games.
32GB of VRAM is still an enticing reason on its own. But after doubling the perf first going from 1080 to 2080Ti, then to 3090, and then doing the same by going to 4090, anything less than at least a 50 percent speed-up will be disappointing, since that would already be half of what I have been used to for the past 3 generations. In light of that, increasing the price by another 300 euros is actually a bit offensive. But let's see.
 
Reactions: Hulk and Win2012R2

CastleBravo

Member
Dec 6, 2019
174
406
136
The PS5 has been using Liquid Metal for 5 years now and no major issues occurred. But I get what the above poster is saying it’s annoying to repaste Liquid Metal, so
I think it’s matter of convenience for them rather than worrying about a future manufacturing defect.

Yeah, repasting with LM will be a pain, but that shouldn't be required unless they cheap out on the VRAM/VRM thermal pads and those need to be replaced. I don't get why they are doing LM in the first place. Sure, it's a 575W card, but it is also a huge 744mm^2 die with the thermal load spread fairly evenly, so liquid metal would have little benefit over something like PTM7950. This isn't a 400W OC 14900K with most of the heat coming from a small fraction of the already much smaller 257mm^2 die.
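Rough numbers behind that comparison (figures taken from the post; this assumes perfectly uniform heat spread, which real dies don't have):

```python
# Average heat flux: total power divided by die area. The 5090's load
# is spread over a much larger die, so its average flux is roughly
# half the 14900K's -- and the CPU's heat is concentrated on the cores
# on top of that.
gpu_flux = 575 / 744   # W/mm^2 for a 575W, 744mm^2 die (~0.77)
cpu_flux = 400 / 257   # W/mm^2 for a 400W OC, 257mm^2 die (~1.56)
print(f"5090: {gpu_flux:.2f} W/mm^2, 14900K: {cpu_flux:.2f} W/mm^2")
```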
 

gdansk

Diamond Member
Feb 8, 2011
3,768
6,020
136
FWIW the clocks for the 4090 were not 2.8GHz. Aftermarket cards clocked higher, sure, but you can't compare those since we have no info on clocks for 3rd party 5090s. The 4090's official boost clock was 2.52GHz. The 5090's is 2.41GHz (about a 5% difference). (Numbers may be slightly off, pulled from memory, but they are on NVIDIA's site.)

We don’t know how much OC headroom these parts have.
My stock 4090 FE regularly hits 2720 or so. Even the advertised boost clocks can be misleading. I think they pick safe/pessimistic figures.
 
Reactions: Tlh97 and coercitiv

gdansk

Diamond Member
Feb 8, 2011
3,768
6,020
136
There is a certain perceptual index which mere specifications cannot convey.

I think it favors DLAA personally but Nvidia argues it makes a 5070 as good of a gaming experience as a 4090. And they're the experts.
 

jpiniero

Lifer
Oct 1, 2010
15,634
6,111
136

Alienware laptop with a 5080 Laptop gets 190k in GB. Just slightly slower than the average 4070 Super. So that's 60 SMs vs 56 for the 4070S vs 58 for the 4080L (which gets 160k)

That makes it 18% faster.
 
Reactions: Win2012R2

Heartbreaker

Diamond Member
Apr 3, 2006
4,653
6,113
136

Alienware laptop with a 5080 Laptop gets 190k in GB. Just slightly slower than the average 4070 Super. So that's 60 SMs vs 56 for the 4070S vs 58 for the 4080L (which gets 160k)

That makes it 18% faster.

Not a great benchmark, but if that's where it lands I'm not surprised.

This generation is a very small upgrade as measured by traditional (non fake frame) means.
 

PJVol

Senior member
May 25, 2020
792
776
136
I haven't looked at the RTX 4000 reviews in depth, but I'm wondering if any of the reviewers called the FG frames "fake frames"?
 

SiliconFly

Golden Member
Mar 10, 2023
1,925
1,281
96
... They are still holding the current frame and past frame, generating the in-between frames and only after displaying the fake ones do they show you the now quite late "current" frame. ...
Nope. That happens only in DLSS3 & FSR. Unlike DLSS3, DLSS4 uses a transformer to predict future frames. There's no "holding" a rendered frame and then interpolating like in DLSS3. DLSS3 holding an already rendered frame and displaying it later causes lag; DLSS4's lag should be less than DLSS3's. And many here assume the transformer input is limited to a few rendered frames like in an optical flow accelerator, but it's multimodal. It is continuously fed rendered frames and possibly even user input to predict future frames with high accuracy. Also, I keep seeing posts saying raw (raster) perf is more important. It's not. It is close to impossible, and usually horrible, to play recent games without RT/DLSS/FSR/XeSS.

And when Jensen said 5070 (with DLSS4) will be very close to 4090 (with DLSS3 FG), he wasn't bluffing. DLSS4 is here to stay and it looks like it's better than DLSS3.

Note: Intel too is working on forward prediction with ExtraSS. And I'm sure AMD has something similar in the pipeline. After all, MFG is no big secret. In the future, it's gonna be even more... what Jensen keeps referring to as Neural Rendering.
 

SiliconFly

Golden Member
Mar 10, 2023
1,925
1,281
96
Hasn't it been measured already and lo it was worse?

It seems hard to mesh what Jensen is saying and some videos I have seen. But maybe those are AI generated videos...
I don't think anyone has done precise measurements yet. I remember seeing a video where one reviewer was measuring frame times. But he did mention that it's hard to say exactly which frame is predicted and which one is rendered.
 

NTMBK

Lifer
Nov 14, 2011
10,377
5,520
136
Ah nice, DirectX is getting an API for running neural nets in shaders: https://devblogs.microsoft.com/dire...rectx-cooperative-vector-support-coming-soon/

I don't think this initial Nvidia version will see widespread support- looks like you need to work with their "Slang" shader language, so you're going to need to maintain two totally different implementations of all your shaders. It's an interesting concept though, train a small net on your shader, and hopefully the net execution is faster than actually running the shader. Will probably be true for sufficiently complex shaders!

Once this gets supported on more hardware and integrated into game engines it's going to be more interesting. I can see Unreal Engine adding something based on this as a new backend for their Material compiler - user has the same interface for Material graph editing, then at cook time the engine will train a small network, and output it to use at runtime.
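As a toy illustration of the "train a small net on your shader" idea (plain NumPy, random tanh features plus a least-squares output layer; the toy_shader function is a made-up stand-in for a real material function, and real neural shading via Slang/cooperative vectors is far more involved):

```python
# Approximate a procedural "shader" with a tiny one-hidden-layer net.
# The trained net then evaluates the shader approximately -- the hope
# being that for a genuinely expensive shader, the net is cheaper.
import numpy as np

rng = np.random.default_rng(0)

def toy_shader(u):
    """Stand-in 'shader': brightness as a function of a 1-D texture coord."""
    return 0.5 + 0.5 * np.sin(6.0 * u)

# "Cook time": sample the shader to build training data.
u_train = np.linspace(0.0, 1.0, 128).reshape(-1, 1)
y_train = toy_shader(u_train)

# Tiny net: 16 fixed random tanh features, trained output layer.
W1 = rng.normal(0.0, 5.0, (1, 16))
b1 = rng.uniform(-5.0, 5.0, 16)

def features(u):
    # hidden activations plus a bias column for the output layer
    h = np.tanh(u @ W1 + b1)
    return np.hstack([h, np.ones((len(u), 1))])

# "Training": solve the output weights in closed form (least squares).
w_out, *_ = np.linalg.lstsq(features(u_train), y_train, rcond=None)

# "Runtime": the net replaces the shader evaluation.
u_test = np.linspace(0.0, 1.0, 64).reshape(-1, 1)
err = np.max(np.abs(features(u_test) @ w_out - toy_shader(u_test)))
print(f"max abs error: {err:.4f}")
```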
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,653
6,113
136
Nope. That happens only in DLSS3 & FSR. Unlike DLSS3, DLSS4 uses a transformer to predict future frames. There's no "holding" a rendered frame and then interpolating like in DLSS3. DLSS3 holding an already rendered frame and displaying it later causes lag; DLSS4's lag should be less than DLSS3's. And many here assume the transformer input is limited to a few rendered frames like in an optical flow accelerator, but it's multimodal. It is continuously fed rendered frames and possibly even user input to predict future frames with high accuracy. Also, I keep seeing posts saying raw (raster) perf is more important. It's not. It is close to impossible, and usually horrible, to play recent games without RT/DLSS/FSR/XeSS.

And when Jensen said 5070 (with DLSS4) will be very close to 4090 (with DLSS3 FG), he wasn't bluffing. DLSS4 is here to stay and it looks like it's better than DLSS3.

Note: Intel too is working on forward prediction with ExtraSS. And I'm sure AMD has something similar in the pipeline. After all, MFG is no big secret. In the future, it's gonna be even more... what Jensen keeps referring to as Neural Rendering.

You can't take the utterances of Jensen as literal truth: he's doing a marketing/sales job for his company's latest product.

According to DF's hands-on, MFG works exactly the same as it did before. A big latency hit for 2x mode as it buffers the current frame, and small incremental hits for 3x and 4x, as the generated frames are all just inserted before it finally displays the real buffered frame, after the faked ones.

Soon enough, we will see the latency tested by unbiased third parties with unrestricted access, and I am certain it will show what DF reported.

But I'm ready to completely change my point of view if the evidence shows the lag is gone.

How about you, if the lag is still there? Will you still believe the salesman despite the evidence, and struggle to come up with a new theory of why it lags exactly as if it's doing interpolation, while believing it's not interpolating?
 