Discussion Nvidia Blackwell in Q1-2025


poke01

Diamond Member
Mar 8, 2022
3,037
4,018
106
I think people need to stop giving Jensen the benefit of the doubt just because he's the CEO of the most valuable company at a time when the AI bubble is hyped like crazy. Just because he's successful doesn't mean he's automatically right. I'd argue it's more likely he's wrong, because the people who are going to be the most knowledgeable in a particular subject matter are his engineers who developed it. CEOs tend to have a broad, general understanding of stuff. This error in his statement about how DLSS 4 MFG works is likely a result of that.
I would argue Jensen and Lisa are the two most competent CEOs ever in the tech space, and both are very technical and smart.

But they are not gods, don’t take whatever they say as truth.
 
Reactions: Tlh97 and Saylick

adroc_thurston

Diamond Member
Jul 2, 2023
4,714
6,503
96
I would argue Jensen and Lisa are the two most competent CEOs ever in the tech space, and both are very technical and smart.

But they are not gods, don’t take whatever they say as truth.
Intel OGs would like to have a word with that statement.
Cool as these two are, they didn't create the very concept of Si Valley lmao.

Noyce is the most important person in the history of semiconductors and it's not even ~close~.
 

poke01

Diamond Member
Mar 8, 2022
3,037
4,018
106
Intel OGs would like to have a word with that statement.
Cool as these two are, they didn't create the very concept of Si Valley lmao.

Noyce is the most important person in the history of semiconductors and it's not even ~close~.
I obviously meant the current landscape. The current Intel co-CEOs, or the previous one, are not on par with Lisa or Jensen.

Yes, Intel OGs 100% deserve the highest praise
 

Elfear

Diamond Member
May 30, 2004
7,145
767
126
Here is a detailed, well-researched post about all the RTX 50 Blackwell improvements so far:



**Media and Display Engine Changes**

Display:

*"Blackwell has also been enhanced with PCIe Gen5 and DisplayPort 2.1b UHBR20, driving displays up to 8K 165Hz."*

Media engine encoder and decoder has been [upgraded](https://videocardz.com/newz/nvidia-geforce-rtx-50-series-adds-support-for-422-color-format-video-decoding-and-encoding):

*"The RTX 50 chips support the 4:2:2 color format often used by professional videographers and include new support for multiview-HEVC for 3D and virtual reality (VR) video and a new AV1 Ultra High-Quality Mode."*

Hardware support for 4:2:2 is new and the 5090 can decode up to 8x 4K 60 FPS streams per decoder.

5% better quality with HEVC and AV1 encoding + 2x speed for H.264 video decoding.
Has anyone heard which version of HDMI Blackwell will have?
 

insertcarehere

Senior member
Jan 17, 2013
703
694
136
I think people need to stop giving Jensen the benefit of the doubt just because he's the CEO of the most valuable company at a time when the AI bubble is hyped like crazy. Just because he's successful doesn't mean he's automatically right. I'd argue it's more likely he's wrong, because the people who are going to be the most knowledgeable in a particular subject matter are his engineers who developed it. CEOs tend to have a broad, general understanding of stuff. This error in his statement about how DLSS 4 MFG works is likely a result of that.
Jensen founded the company though, and by that nature alone he's almost certainly going to be more knowledgeable about how his products work than the average CEO. In all probability better than, say, Lisa Su, who joined as a senior exec from another company. Whether he chooses to disclose this to the public is another matter.
 

insertcarehere

Senior member
Jan 17, 2013
703
694
136
That's kinda wrong since she loves diving in the pits.
'Loves' still doesn't equate to 'required', and for much of Nvidia's history Jensen falls into the latter category, which is most likely still reflected in the org structure/culture of these respective companies.

If nothing else, the amount of stake (financial or otherwise) that JHH has in Nvidia vs Lisa in AMD should be plenty of indication by itself.
 

SiliconFly

Golden Member
Mar 10, 2023
1,925
1,280
96
That's kinda wrong since she loves diving in the pits.
I don't think it's fair to compare Lisa with Jensen. Both are very talented and charismatic. Push-ups Pat was kinda okay too. But the current Intel co-CEOs are just a total waste of O2 imho.
 

coercitiv

Diamond Member
Jan 24, 2014
6,956
15,589
136
Is predictive frame gen even feasible? When the user starts an attack, or turns the camera, or an enemy moves, or whatever, any predicted frames are going to be mispredicted. It would add stuttering, and it wouldn't reduce latency where it matters.
In theory you don't even need to predict, as you can integrate tightly with the game engine and "know" the most up-to-date positions for objects. I would argue the prediction itself is not the problem; the content of the frame is. There's no way to represent parts of the scene that are not visible yet. You can compensate and paint around a few pixels, like with object/camera movement in Reflex 2, but there's no way to compensate for the lack of a rendered corridor around the corner.
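
To make the "content of the frame" point concrete, here is a toy sketch (my own simplification in Python, nothing to do with Nvidia's actual implementation): warp the last rendered frame by a camera yaw delta and count the pixels for which the old frame simply has no data. Those are the holes you can only paint over.

```python
# Toy camera-warp example: all names and numbers here are made up for illustration.
import numpy as np

W, H, FOV_DEG = 320, 180, 90.0

def warp_by_yaw(frame: np.ndarray, yaw_delta_deg: float):
    """Shift the frame horizontally by the screen-space equivalent of a yaw change.

    Returns the warped frame plus a mask of 'holes': pixels the previous frame
    never rendered, which no amount of warping can recover.
    """
    px_per_deg = W / FOV_DEG                       # crude small-angle approximation
    shift = int(round(yaw_delta_deg * px_per_deg))
    warped = np.zeros_like(frame)
    holes = np.ones((H, W), dtype=bool)
    if shift >= 0:
        warped[:, shift:] = frame[:, :W - shift]
        holes[:, shift:] = False
    else:
        warped[:, :W + shift] = frame[:, -shift:]
        holes[:, :W + shift] = False
    return warped, holes

last_rendered = np.random.rand(H, W, 3)            # stand-in for the last real frame
warped, holes = warp_by_yaw(last_rendered, yaw_delta_deg=5.0)
print(f"{holes.mean():.1%} of the warped frame has no rendered data behind it")
```

A real implementation would use depth and per-pixel motion instead of a flat shift, but the disocclusion problem is the same: warping can only rearrange pixels that were actually rendered.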

I think people need to stop giving Jensen the benefit of the doubt just because he's the CEO of the most valuable company at time when the AI bubble is hyped like crazy. Just because he's successful doesn't mean he's automatically right. I'd argue it's more likely he's wrong because the people who are going to be the most knowledgeable in a particular subject matter are his engineers who developed it.
I would argue he's more likely to be dishonest. Let's not beat around the bush: the man is a brilliant engineer to start with, so understanding the basics of the tech is trivial for him.

Do this for me, take a toy from around the house, hold it in front of the mirror, and say the following with a straight face and convincing tone:
"4090 performance, at $549"

If you blinked or smiled, you're not modern CEO material. Try to do some pushups and maybe the board will keep you around for another 6 months.
 

Tup3x

Golden Member
Dec 31, 2016
1,180
1,249
136
DLAA doesn't get enough love.
The nice thing is that if a game supports DLSS, you can force DLAA. Very handy in games that do not officially support DLAA.

Also, I really hope that games start supporting DirectSR. While games usually support DLSS/DLAA, support for everything else is spotty... It's also important not to get stuck with an ancient version.
 
Reactions: Tlh97 and S'renne

basix

Member
Oct 4, 2024
41
75
51
Is predictive frame gen even feasible? When the user starts an attack, or turns the camera, or an enemy moves, or whatever, any predicted frames are going to be mispredicted. It would add stuttering, and it wouldn't reduce latency where it matters.
No, I don't think so. People were only entertaining it because they gave Jensen the benefit of the doubt.
At least Intel is (publicly) trying

Frame Extrapolation papers:
- Original Intel Paper (December 2023, ExtraSS): https://dl.acm.org/doi/pdf/10.1145/3610548.3618224 -> G-Buffer guided warping
- New Intel Paper (May/December 2024, GFFE): https://arxiv.org/pdf/2406.18551 -> G-Buffer free frame extrapolation

GFFE with video comparisons (ExtraSS also included):
https://poiw.github.io/publication/gffe/

Compared to ExtraSS, GFFE shows massive improvements regarding artifacts. It looks pretty decent to me, and that it works so well at 1080p/30fps is remarkable. It should work much better at e.g. 60fps or 1440p.

I mean it is all logical when you look at interpolation vs. extrapolation and the research & development effort involved:
  • Interpolation is easier than extrapolation, so do that first. Everyone in the industry knows that extrapolation would be better for the user experience, but it's just much harder to do.
  • MFG is easier than extrapolation, so do that next. MFG just requires enough compute resources (more than 2x FG), but it is not fundamentally harder than 2x FG.
  • A little frame warping (Reflex 2) on the rendered frames already extrapolates a little (a fraction of a frametime), so do that next. It is also known from VR, so there is existing R&D and experience to build on. With DLSS 4, Nvidia does MFG and frame warping at the same time, but they have the resources and know-how. I'm not sure whether Reflex 2 is compatible with FG/MFG, but if it is, frame warping will only be applied to the rendered frames. For FG nothing changes, but the input latency decreases due to the warping of the rendered frames. It does not make sense to frame warp as-is (i.e. immediately before displaying the picture) with interpolated FG, because you would basically warp from frame N to frame N+1, which you have already rendered.
  • Final goal: frame extrapolation replaces interpolation (Intel shows it is doable, but also that it is much more difficult), plus frame warping on all frames (rendered and generated). This is the holy grail of FG, because you do not add any latency, and with frame warping you get the lowest possible latency whether or not FG is used (see the rough latency sketch right after this list). The frame warping algorithm does not have to change at all; you can reuse it from prior development and simply apply it to the FG frames as well. If Intel can extrapolate a full frametime, it will also be possible to do MFG with frame extrapolation. The only prerequisite for 4x MFG vs. 2x FG is enough compute horsepower; with infinite compute resources you could also do 6/8/10/12x extrapolation. I assume some base framerate will always be required (also depending on the upsampling factor, due to convergence rate), but 30...40fps should be a reasonable minimum. More is always better, of course. But 40fps with MFG extrapolation and frame warping up to 480fps would also feel like 480fps. Extraordinary. If picture quality can hold up, I can't wait to get this into my gaming experience.
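
The latency sketch referenced in the last bullet, with toy numbers of my own (it ignores generation cost and frame-pacing overhead, so treat it as an order-of-magnitude illustration, not a measurement):

```python
# Toy latency model (my own numbers, not vendor data): compare 2x frame generation
# done by interpolation vs. extrapolation at a 40 fps base render rate.
base_fps = 40.0
render_frametime_ms = 1000.0 / base_fps          # 25 ms between real frames
display_interval_ms = render_frametime_ms / 2.0  # 2x FG -> one frame shown every 12.5 ms

# Interpolation: the generated frame sits between real frames N and N+1, so frame
# N+1 must already exist before the in-between frame is shown. That holds back
# every real frame by roughly one display interval.
interp_added_latency_ms = display_interval_ms

# Extrapolation: the generated frame is predicted forward from frame N alone,
# so real frames are shown as soon as they are ready and nothing is held back.
extrap_added_latency_ms = 0.0

print(f"2x FG by interpolation adds ~{interp_added_latency_ms:.1f} ms")
print(f"2x FG by extrapolation adds ~{extrap_added_latency_ms:.1f} ms")
```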

So from my thoughts above, what would be the next logical step for DLSS 5? Exactly: frame extrapolation. It might be RTX 60 exclusive, because, well, Nvidia. In the best case it's available from the RTX 40 series on: RTX 40 maybe only with 2x extrapolation (no MFG), and RTX 50 and RTX 60 with 4x MFG.
 
Last edited:

basix

Member
Oct 4, 2024
41
75
51
Addition to my post above:
Intel's GFFE also generates the depth buffer & motion vectors of the generated frames, so you could rather easily apply super resolution or frame warping on top of them. I would do super resolution before warping, but either way the possibility exists.
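
To illustrate the ordering I mean, here is a purely hypothetical Python sketch with stand-in functions (none of them correspond to a real Intel or Nvidia API): extrapolate at render resolution, upscale, then warp with the freshest camera input right before presenting.

```python
# Hypothetical pipeline order; every function below is a placeholder, not a real API.
import numpy as np

def extrapolate_frame(prev_frames, motion_vectors):
    """Stand-in for a GFFE-style extrapolator: predicts the next frame (plus its
    own depth and motion vectors) from past frames only."""
    predicted = prev_frames[-1].copy()             # trivial placeholder prediction
    depth = np.ones(predicted.shape[:2])
    return predicted, depth, motion_vectors

def super_resolve(frame, scale):
    """Stand-in for DLSS/XeSS-style upscaling (nearest-neighbour here)."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

def late_warp(frame, camera_delta_px):
    """Stand-in for a Reflex-2-style late warp using the newest camera pose."""
    return np.roll(frame, shift=int(camera_delta_px), axis=1)

history = [np.random.rand(270, 480, 3) for _ in range(2)]   # past rendered frames
mv = np.zeros((270, 480, 2))                                # their motion vectors

generated, gen_depth, gen_mv = extrapolate_frame(history, mv)  # 1) extrapolate
upscaled = super_resolve(generated, scale=4)                   # 2) super resolution
presented = late_warp(upscaled, camera_delta_px=8)             # 3) warp just before display
print(presented.shape)                                         # (1080, 1920, 3)
```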

Best case regarding DLSS 5 and extrapolation:
- FG interpolation gets replaced with extrapolation algorithm on all FG capable cards
- RTX 40 = 2x FG
- RTX 50 = 4x FG
- RTX 60 = 6...8x FG
 
Last edited:

poke01

Diamond Member
Mar 8, 2022
3,037
4,018
106

The design of the 5090 FE is amazing. I really want to see reviews of the FE card.

 
Last edited:

SolidQ

Golden Member
Jul 13, 2023
1,068
1,457
96
That's...not good.
I have only one answer for now


unless driver is bad

der8auer estimates the 5090 is approx. 30% faster in pure rendering (no DLSS or FG) vs the 4090, but also draws about 30% more power
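
Quick sanity check on what those estimates would imply for efficiency (der8auer's numbers, my arithmetic):

```python
# If both perf and power go up ~30%, perf/W barely moves.
perf_ratio = 1.30    # ~30% faster than the 4090 in pure rendering
power_ratio = 1.30   # ~30% higher power draw
print(f"Perf/W vs 4090: {perf_ratio / power_ratio:.2f}x")   # ~1.00x
```

In other words, roughly flat perf/W, if those early numbers hold.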
 
Last edited:

TESKATLIPOKA

Platinum Member
May 1, 2020
2,565
3,121
136
Not surprising considering the RTX 5070 Laptop has 4608 CUDA cores. It looks like GB206 has the same specs as AD106, and that one is ~190mm2.
And once more, the RTX 5070 Laptop comes with only 8GB of VRAM.

Can't say I am impressed with what Nvidia has released so far, but I will wait for reviews to see how it really performs.
If 24Gbit modules are used for the Super refresh, then it's better to wait for that, because asking $549 for a 12GB card is a lot in my opinion.
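
For reference, a small sketch of why the module size matters, assuming the 5070 keeps its announced 192-bit bus with one GDDR7 module per 32-bit channel:

```python
# Capacity math for 16Gbit vs. 24Gbit GDDR7 modules on a 192-bit bus (assumed).
bus_width_bits = 192
channels = bus_width_bits // 32                   # 6 modules, one per 32-bit channel
for module_gbit in (16, 24):
    capacity_gb = channels * module_gbit / 8      # Gbit -> GB per module, times 6
    print(f"{module_gbit}Gbit modules: {capacity_gb:.0f} GB total")
# 16Gbit -> 12 GB (the launch config), 24Gbit -> 18 GB (a possible Super refresh)
```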
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,653
6,108
136
That's...not good. No wonder the prices were lower than people were expecting, Blackwell is underwhelming.

As much as I like to see Nvidia's misleading marketing shot down, I'm thinking that's probably not a rock-solid methodology.

Capture cards can't match frame rates beyond 120, so it could be clipping out more frames on the card with the higher fake-frame count. IMO it's really hard to reverse engineer from there to know what is going on.

But this will almost certainly be the smallest real performance upgrade of any Nvidia generation, once the smoke and mirrors are shed in real reviews.

At least most of the big reviewers like GN and HWUB will focus on true frame rates in their testing.
 