Discussion: Nvidia Blackwell in Q1-2025


dr1337

Senior member
May 25, 2020
439
714
136
I mean, I'm happy for you that it's improving your experience, but it's not my experience at all.
Like, if FG could turn something with unplayable frame rates into something playable, even with a bit of image degradation, I could see the attraction. But going from one very playable FPS to another playable FPS with the additional latency and artifacts doesn't really work for me.
Personally, I think if you have 70 fps you're better off with a VRR monitor than adding generated frames.
If the generated frames aren't adding latency, then it doesn't matter, and it does always look better. I really like AFMF2 as an AMD user because it really does improve the smoothness of CPU-heavy games. If you're already at 60fps, frame gen isn't going to hurt your input lag unless it's a really bad implementation (like FSR3 frame gen in Stalker 2, for example). And visually speaking, it really does look better. I'd trade 1ms of input lag any day for better motion quality. As long as the game maintains its near-60fps input delay, having a 120fps image doesn't feel horrible.

And personally, I much prefer input response over performance. I'm pretty sensitive to frame rates, and my eyes can pretty handily tell the difference between 120 and 60 fps, like it's 30 vs 60. But as with all settings tweaks, if the trade-off is a net gain, then it's worth it, is what I'm trying to say.

And I don't want to come off as supporting the idea of frame gen and fake frames as how a game/GPU is supposed to be made. But in reality, if unoptimized games are being made, then frame gen tech can really help those games specifically. I spent a lot of time in my life playing games like ArmA 3 in 2014-2016 at below 60fps, because it was always an unoptimized game. These days my 5800X3D can do 70-90fps on average, so turning on frame gen doesn't remotely hurt, while my eyes get a perfectly smooth image.
 

WelshBloke

Lifer
Jan 12, 2005
32,128
10,148
136
If the generated frames aren't adding latency, then it doesn't matter, and it does always look better. I really like AFMF2 as an AMD user because it really does improve the smoothness of CPU-heavy games. If you're already at 60fps, frame gen isn't going to hurt your input lag unless it's a really bad implementation (like FSR3 frame gen in Stalker 2, for example). And visually speaking, it really does look better. I'd trade 1ms of input lag any day for better motion quality. As long as the game maintains its near-60fps input delay, having a 120fps image doesn't feel horrible.

And personally, I much prefer input response over performance. I'm pretty sensitive to frame rates, and my eyes can pretty handily tell the difference between 120 and 60 fps, like it's 30 vs 60. But as with all settings tweaks, if the trade-off is a net gain, then it's worth it, is what I'm trying to say.

And I don't want to come off as supporting the idea of frame gen and fake frames as how a game/GPU is supposed to be made. But in reality, if unoptimized games are being made, then frame gen tech can really help those games specifically. I spent a lot of time in my life playing games like ArmA 3 in 2014-2016 at below 60fps, because it was always an unoptimized game. These days my 5800X3D can do 70-90fps on average, so turning on frame gen doesn't remotely hurt, while my eyes get a perfectly smooth image.
I can only go on my experience, but every time I've used frame generation it's made playing the game feel worse in some way - disconnected and floaty rather than sharp and involved.
That's apart from things like WH3 where that sort of thing doesn't matter. But then the additional fake frames don't really help there either!
 

Win2012R2

Senior member
Dec 5, 2024
647
609
96
The one you mentioned yourself, with 60-70% margins.
Inflated (over-)market pricing means (way) above market price, i.e. if Micron sold Nvidia GDDR6 for 10 bucks when the same part goes for 2-3 on the open market, that would be inflated. There is no public market as such for GDDR7, just like there wasn't for GDDR6X at the start, but we know it's new, fast, and right now used exclusively by one company: Nvidia.

Everybody in semi wants (needs, even!) 50%+ margins, and this is a brand-new product with lots of R&D put into it, so a 60% margin won't be "inflated". It looks like Nvidia also got exclusivity, which always comes for $$$.

It could easily be $300; that's not implausible in my view.
 

Hitman928

Diamond Member
Apr 15, 2012
6,524
11,796
136
If the generated frames aren't adding latency, then it doesn't matter, and it does always look better. I really like AFMF2 as an AMD user because it really does improve the smoothness of CPU-heavy games. If you're already at 60fps, frame gen isn't going to hurt your input lag unless it's a really bad implementation (like FSR3 frame gen in Stalker 2, for example). And visually speaking, it really does look better. I'd trade 1ms of input lag any day for better motion quality. As long as the game maintains its near-60fps input delay, having a 120fps image doesn't feel horrible.

And personally, I much prefer input response over performance. I'm pretty sensitive to frame rates, and my eyes can pretty handily tell the difference between 120 and 60 fps, like it's 30 vs 60. But as with all settings tweaks, if the trade-off is a net gain, then it's worth it, is what I'm trying to say.

And I don't want to come off as supporting the idea of frame gen and fake frames as how a game/GPU is supposed to be made. But in reality, if unoptimized games are being made, then frame gen tech can really help those games specifically. I spent a lot of time in my life playing games like ArmA 3 in 2014-2016 at below 60fps, because it was always an unoptimized game. These days my 5800X3D can do 70-90fps on average, so turning on frame gen doesn't remotely hurt, while my eyes get a perfectly smooth image.

FG always adds latency. To get only 1 ms of added delay, you’d need to be up in the 1000 fps range to begin with.
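
The arithmetic behind that, as a minimal sketch (it assumes interpolation holds exactly one rendered frame, which is roughly how these schemes work):

```python
# Minimum added delay from interpolation-based frame generation:
# about one native frame time, since the newest real frame must be
# held while the in-between frame is generated and shown.
def min_added_delay_ms(native_fps: float) -> float:
    return 1000.0 / native_fps

print(min_added_delay_ms(60))    # ~16.7 ms at a 60 fps baseline
print(min_added_delay_ms(120))   # ~8.3 ms at 120 fps
print(min_added_delay_ms(1000))  # 1.0 ms -- hence "the 1000 fps range"
```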
 

Golgatha

Lifer
Jul 18, 2003
12,310
790
126
Inflated (over-)market pricing means (way) above market price, i.e. if Micron sold Nvidia GDDR6 for 10 bucks when the same part goes for 2-3 on the open market, that would be inflated. There is no public market as such for GDDR7, just like there wasn't for GDDR6X at the start, but we know it's new, fast, and right now used exclusively by one company: Nvidia.

Everybody in semi wants (needs, even!) 50%+ margins, and this is a brand-new product with lots of R&D put into it, so a 60% margin won't be "inflated". It looks like Nvidia also got exclusivity, which always comes for $$$.

It could easily be $300; that's not implausible in my view.
It's funny that when costs come down, e.g. when GDDR6 reaches full production and lands on the open market, we never see the price of GPUs come down.
 

steen2

Junior Member
Aug 21, 2024
9
22
41
There's a push-back ready for the lower-tier SKU VRAM bellyaching: DLSS4 (TNN/MFG) reduces VRAM usage, so 8-16GB is "fine". Just wait for the <=12GB SKUs.
I'd like to know who the genius was who thought this was a good idea.
Tag team effort from technical marketing.
 

dr1337

Senior member
May 25, 2020
439
714
136
FG always adds latency. To get only 1 ms of added delay, you’d need to be up in the 1000 fps range to begin with.
Adrenalin could be lying to me, but when a game is at an avg of 60fps with AFMF2 on my 6900 XT, it isn't more than a few ms of latency. It definitely goes up when the native fps dips, but it also seems pretty reflective of how it feels.

Like, I truly know how ArmA 3 feels at 30fps, and when there's a ton of AI on screen, fps will dip even on my modern rig. AFMF might show 60fps and maybe look smoother, but it feels totally like 30fps, which is reflected in the 99% minimum fps readout. But otherwise, when ArmA 3 is running well, frame generation does an acceptable job (to me) of turning 70-90fps into 100-120 from a visual standpoint.
 

Win2012R2

Senior member
Dec 5, 2024
647
609
96
It's funny that when costs come down, e.g. when GDDR6 reaches full production and lands on the open market, we never see the price of GPUs come down.
Large-scale manufacturers would not buy on the spot market; they'd have long-running contracts, so if they buy at the wrong time (like a couple of years ago) they're stuck with higher-than-spot-market costs.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,653
6,107
136
If the generated frames aren't adding latency, then it doesn't matter, and it does always look better. I really like AFMF2 as an AMD user because it really does improve the smoothness of CPU-heavy games. If you're already at 60fps, frame gen isn't going to hurt your input lag unless it's a really bad implementation (like FSR3 frame gen in Stalker 2, for example). And visually speaking, it really does look better. I'd trade 1ms of input lag any day for better motion quality. As long as the game maintains its near-60fps input delay, having a 120fps image doesn't feel horrible.

Where are you getting 1ms?

16.6ms added delay minimum for adding fake frames at 60 FPS (1/60 s).

RF = Real Frame, FF = Fake Frame.

Without FG: RF1 (0ms) - RF2 (16.6ms) - RF3 (33.3ms)...

With FG: RF1 (0ms) - RF2 arrives at 16.6ms but is held - FF1 (25ms) - RF2 (33.3ms; the held frame can now be shown, while RF3 has just arrived and is held) - FF2 (41.6ms) - RF3 (50ms) ...


You are constantly (at least) one frame behind, because each real frame is held that long to create the in-between frames. That's why most decent examinations say 60 FPS native is the minimum, and 80-100 FPS is better.
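
The same timeline as a runnable sketch (my own illustration; it assumes the fake frame is shown halfway through the held interval):

```python
# Presentation timeline at 60 fps native with 2x interpolation.
# RF = real frame, FF = fake (generated) frame.
FRAME_MS = 1000 / 60  # ~16.6 ms between rendered frames

def timeline(n_real_frames: int):
    events = []
    for i in range(n_real_frames):
        rendered_at = i * FRAME_MS
        if i == 0:
            events.append((rendered_at, "RF1"))
        else:
            # Each later real frame is held one frame time; the fake
            # frame goes out halfway through the held interval.
            events.append((rendered_at + FRAME_MS / 2, f"FF{i}"))
            events.append((rendered_at + FRAME_MS, f"RF{i + 1}"))
    return events

for t, name in timeline(3):
    print(f"{name} shown at {t:.1f} ms")
# RF1 at 0.0, FF1 at 25.0, RF2 at 33.3, FF2 at 41.7, RF3 at 50.0:
# every real frame after the first is ~16.6 ms later than without FG.
```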
 

MrTeal

Diamond Member
Dec 7, 2003
3,748
2,136
136
Where are you getting 1ms?

16.6ms added delay minimum for adding fake frames at 60 FPS (1/60 s).

RF = Real Frame, FF = Fake Frame.

Without FG: RF1 (0ms) - RF2 (16.6ms) - RF3 (33.3ms)...

With FG: RF1 (0ms) - RF2 arrives at 16.6ms but is held - FF1 (25ms) - RF2 (33.3ms; the held frame can now be shown, while RF3 has just arrived and is held) - FF2 (41.6ms) - RF3 (50ms) ...


You are constantly (at least) one frame behind, because each real frame is held that long to create the in-between frames. That's why most decent examinations say 60 FPS native is the minimum, and 80-100 FPS is better.
Probably from testing like this.

Additional frames add a couple milliseconds in latency each.
 

blckgrffn

Diamond Member
May 1, 2003
9,501
3,816
136
www.teamjuchems.com
Adrenalin could be lying to me, but when a game is at an avg of 60fps with AFMF2 on my 6900 XT, it isn't more than a few ms of latency. It definitely goes up when the native fps dips, but it also seems pretty reflective of how it feels.

Like, I truly know how ArmA 3 feels at 30fps, and when there's a ton of AI on screen, fps will dip even on my modern rig. AFMF might show 60fps and maybe look smoother, but it feels totally like 30fps, which is reflected in the 99% minimum fps readout. But otherwise, when ArmA 3 is running well, frame generation does an acceptable job (to me) of turning 70-90fps into 100-120 from a visual standpoint.

I am using AFMF2 + Adrenalin in Wonderlands, plus a 154 fps frame cap in game so that the GPU "only" has to run at 77 FPS (which it can do handily), and it is super smooth in a super-twitchy looter shooter. The only frame dips I get are when the CPU chokes, or when (I think this is on purpose?) it spawns a legendary loot drop and the game does this big "kabang" effect to really trigger the dopamine. From what I can tell, Adrenalin is doing some nifty work syncing output for a very tight response feel, while the frame gen keeps the GPU from catching fire.

I am all for real frames too, but it seems like a cocktail of software - not even counting FSR - can impact the efficiency of your PC and the feel of a game.
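
For anyone wanting to copy the cap trick, the arithmetic is trivial (a back-of-the-envelope sketch; the 154/77 split assumes 2x frame generation):

```python
# With 2x frame generation, the GPU only renders half of the displayed
# frame rate; the driver interpolates the rest.
def gpu_render_fps(display_cap_fps: float, fg_factor: int = 2) -> float:
    return display_cap_fps / fg_factor

print(gpu_render_fps(154))  # 77.0 -- what the GPU actually renders
```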
 

coercitiv

Diamond Member
Jan 24, 2014
6,956
15,589
136
Adrenalin could be lying to me, but when a game is at an avg of 60fps with AFMF2 on my 6900 XT, it isn't more than a few ms of latency.
That extra one frame of latency is inescapable, though you might end up with less if you have Anti-Lag enabled. Both AMD and Nvidia deal with the excess latency by dynamically adjusting frame timing to reduce perceived latency. The catch here is that one can reduce latency independently of enabling frame generation, so arguing that FG does not add latency is misleading at best (not your argument - I understand your PoV - but rather any marketing claim).

Both you and @blckgrffn are using frame generation the way it's meant to be used though, with a 60FPS+ baseline to ensure a tolerable latency penalty and also minimize visible artifacts. You are using it to make the best of your powerful hardware and get the smoothest motion possible. Frame generation is essentially a "win more" technology for people who already pass a hardware check (fast refresh monitor + powerful GPU relative to game requirements). This is what @WelshBloke decries when arguing it does nothing to improve performance for the folks in need of raw FPS, and this is also why the term "fake frames" was quickly coined to criticize early marketing efforts that presented FG as capable of drastically improving low FPS scenarios.

Here's a segment from a video review which highlights the reasoning above when applied to Nvidia's MFG. The reviewer in question is someone who loves competitive shooters and fast response monitors, so he's at the very least reasonably sensitive to the metrics involved. Watch as his perspective changes when comparing 75FPS+ baseline for MFG with 50FPS+ baseline. (timestamped video, 2 minute watch)


I will quote one of my older posts which contains screenshots from CP2077 and Alan Wake in the DF video about the 5080. You can see how the relative latency cost of 2x FG goes up as baseline FPS goes down, making this tech a poor choice for people with underpowered hardware. The numbers in the screenshot are for total system latency, not just frame latency.
The added latency is a partial sum of a rendered frame + generated frames, assuming the card can generate the inserted frames much faster than it can render another full frame. All of this is offset by the rest of the system latency and also Reflex. If Reflex is not available to enable independently of DLSS/FG, then the comparison can be made to seem much more favorable for frame generation in terms of latency cost.

Here's some screenshots for FPS+latency from the DF video so you can see the scaling, though keep in mind the latency numbers fluctuate quite a bit. A screenshot with equal numbers for 3x MFG and 4X MFG does not mean the 4th generated frame adds zero latency.
[Screenshots: FPS and latency figures from the DF video]
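
To make that scaling explicit, here's a rough model (my own sketch, not DF's methodology; it treats frame-generation cost as negligible and lumps the rest of the pipeline into a fixed base latency):

```python
# Frame generation holds one rendered frame, so the added latency is
# roughly one native frame time on top of whatever the rest of the
# system (input sampling, render queue, display) already costs.
def total_latency_ms(baseline_fps: float, base_latency_ms: float = 30.0) -> float:
    held_frame_ms = 1000.0 / baseline_fps
    return base_latency_ms + held_frame_ms

for fps in (100, 75, 50, 30):
    print(f"{fps:>3} fps baseline -> ~{total_latency_ms(fps):.0f} ms total")
# The absolute penalty grows as baseline FPS drops, which is why FG is
# a poor fit for underpowered hardware.
```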

Frame generation got its bad name because it was introduced by Nvidia as a performance-enhancing mechanic (and AMD obviously joined the choir). Had it been presented as the natural continuation of Vsync and VRR, everyone would have cheered. But we can't put Vsync or VRR on FPS charts, so things went out of whack very fast.
 

basix

Member
Oct 4, 2024
41
75
51
FG is just not the same as more performance (and everyone knows it, hence the "Fake Frames" tag). It is motion smoothing. With the introduction of Smooth Motion, even Nvidia agrees with that. That's why Smooth Motion and AFMF (AMD Fluid Motion Frames) are the correct wording for it, and the same framing should also be applied to DLSS/FSR FG.

But Nvidia in particular markets FG as a performance-enhancing feature. They should abandon that as quickly as possible, because it is simply misleading and results in misinformation. As good as FG is (I use it regularly; I like the "motion smoothing"), Nvidia's marketing worsens the customer perception of a good feature. Engineers craft a decent and useful technology, and marketing undermines it. Shouldn't marketing be doing the opposite?
 

CakeMonster

Golden Member
Nov 22, 2012
1,575
754
136
Funnily, despite all this smoothing marketing, Nvidia has not yet given us interpolation for video. I know this is personal preference and depends on what you're sensitive to, but I think VSR is a smudgy mess, and I'd much rather have AI interpolation for smoother video playback on YouTube or local video files than upscaling. You can do this with SVP (RIFE) now, and it's surprisingly good. For video it's just a binary question of whether you think it's good enough or not, since you're not dealing with latency, so it should be much less controversial to offer than interpolation for games.
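
If you want to try offline interpolation without SVP, ffmpeg's built-in minterpolate filter is a crude non-AI baseline (a minimal sketch with placeholder file names; nowhere near RIFE quality, just an illustration):

```python
import subprocess

# Motion-compensated interpolation of a 24 fps clip up to 60 fps using
# ffmpeg's minterpolate filter ("mci" = motion-compensated interpolation).
# File names are placeholders; requires ffmpeg on PATH.
subprocess.run([
    "ffmpeg", "-i", "input_24fps.mp4",
    "-vf", "minterpolate=fps=60:mi_mode=mci",
    "output_60fps.mp4",
], check=True)
```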
 

dr1337

Senior member
May 25, 2020
439
714
136
For video it's just a binary question of whether you think it's good enough or not, since you're not dealing with latency, so it should be much less controversial to offer than interpolation for games.
It really is different when you're directly in control of the media. Maybe some day AI will be good enough to fill all those voids, but as is, interpolating video just isn't remotely the same as real-time interpolation of new things that are constantly happening. That's part of why having a high base frame rate for frame gen is so important from the get-go. There really isn't enough information in 30fps-tier video to create convincing interpolated frames.

And another problem: imagine watching a video of someone reacting to a film that was only ever 24fps. Should the AI artificially smooth that frame rate too, or not? Interpolation starts to get annoying when you consider modern media and even artist intention. A lot of modern content is actually mixed in frame rate, and when you apply a straight-up amplification algo you can lose information. This is something you never get in real-time rendered content like a videogame.

I've seen a lot of TVs with interpolation and I've hated pretty much all of them. It makes everything look like a soap opera. But at the same time, I've never played a game as if I were filming a bad TV show, so the actual feel of being in control vs. being on a ride is quite different IMO.
 

CakeMonster

Golden Member
Nov 22, 2012
1,575
754
136
Sure, I implied that it's both personal preference and a case-by-case question of when it's preferable to use video interpolation. I'd just like to see those options to process video offered, though.

There's a variety of reasons why people don't like it. I'm one of those who want more video to be shot at higher frame rates, as I think it's kind of silly not to make greater use of technological advancements when producing new material, similar to higher resolutions. At the same time, I of course respect the right of artists and directors to choose whatever expression they prefer, either generally or for a specific work of art. Publish in 24 or 120fps, or 480i NTSC, or black and white, or whatever you prefer.

The 'soap opera effect' is a term I don't like because it's too vague. It might refer to the interpolation on a low-end TV from 2014, or to a state-of-the-art AI interpolation model from this year, or to video shot originally at 60 or 120fps. I think it's unreasonable when applied to the latter, but I respect those who grew up on film at 24fps and just like the effect. Taste is just taste, even if it's fueled by nostalgia, so I can understand if they want to watch even brand-new material in 24fps. I don't agree, but I understand; whatever floats your boat.

But like MFG, there is a proper (although admittedly limited) use case for video interpolation, and the option of native support on newer graphics cards is a reasonable request IMO. AI models can also improve a lot on it; who knows, in the future it might even train an on-the-fly model on a specific input file to improve the quality (this might also work for upscaling).

Thanks for coming to my TED talk (I like this topic because I've been fascinated by it for a long time, and advances are finally being made, like MFG and RIFE, which I referenced in the previous post).
 

xpea

Senior member
Feb 14, 2014
458
156
116
What is inflated market pricing for GDDR7? It's not even listed on DRAMexchange! Whatever Nvidia pays is market pricing.

I don't know if it's 300 bucks, but it is certainly WAY more expensive than GDDR6. It's totally plausible that $10 per GB is real, since it is not at "commodity" level yet; maybe next year.
NVDA should pay a bit less than $70 for 32GB of GDDR7 at 28Gbps.
 

Win2012R2

Senior member
Dec 5, 2024
647
609
96
NVDA should pay a bit less than $70 for 32GB of GDDR7 at 28Gbps.
The average session price shown by DRAMexchange for GDDR6 (last gen, introduced Aug 2018, almost 7 years ago) was $2.30 per GB, so $73.60 for 32 GB.

I am not privy to their contracts, but double that price should be a bare minimum for this new gen, which seems to be time-exclusive to Nvidia.
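
The arithmetic, for what it's worth (a sketch using only the numbers above, which are themselves estimates):

```python
# GDDR6 reference point from DRAMexchange, per the post above.
gddr6_per_gb = 2.3   # USD per GB, average session price
capacity_gb = 32

gddr6_cost = gddr6_per_gb * capacity_gb
print(gddr6_cost)      # 73.6 -- 32 GB at GDDR6 pricing
print(gddr6_cost * 2)  # 147.2 -- the "double that" floor guessed for GDDR7
```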
 

CP5670

Diamond Member
Jun 24, 2004
5,633
733
126
Sure, I implied that it's both personal preference and a case-by-case question of when it's preferable to use video interpolation. I'd just like to see those options to process video offered, though.

There's a variety of reasons why people don't like it. I'm one of those who want more video to be shot at higher frame rates, as I think it's kind of silly not to make greater use of technological advancements when producing new material, similar to higher resolutions. At the same time, I of course respect the right of artists and directors to choose whatever expression they prefer, either generally or for a specific work of art. Publish in 24 or 120fps, or 480i NTSC, or black and white, or whatever you prefer.

The 'soap opera effect' is a term I don't like because it's too vague. It might refer to the interpolation on a low-end TV from 2014, or to a state-of-the-art AI interpolation model from this year, or to video shot originally at 60 or 120fps. I think it's unreasonable when applied to the latter, but I respect those who grew up on film at 24fps and just like the effect. Taste is just taste, even if it's fueled by nostalgia, so I can understand if they want to watch even brand-new material in 24fps. I don't agree, but I understand; whatever floats your boat.
I was reading more about why people prefer 24fps for movies. People think 60fps feels more like a live play than a movie and takes you out of the movie world. I have never seen a 60fps or 120fps movie outside of game cutscenes, but live TV is often 60fps. I'm surprised there aren't more 60fps or 120fps movies; it's something you'd expect to see in a Nolan movie.
 

lightmanek

Senior member
Feb 19, 2017
489
1,141
136
I was reading more about why people prefer 24fps for movies. People think 60fps feels more like a live play than a movie and takes you out of the movie world. I have never seen a 60fps or 120fps movie outside of game cutscenes, but live TV is often 60fps. I'm surprised there aren't more 60fps or 120fps movies; it's something you'd expect to see in a Nolan movie.
There are 48FPS cinema screenings of certain movies, so yes, they do exist.
 