coercitiv
Diamond Member
The one you mentioned yourself, with 60-70% margins.
What is inflated market pricing for GDDR7?
If the generated frames aren't adding latency, then it doesn't matter and it does always look better. I really like AFMF2 as an AMD user because it really does improve the smoothness of CPU-heavy games. If you're already at 60fps, frame gen isn't going to hurt your input lag unless it's a really bad implementation (like FSR3 frame gen in Stalker 2, for example). And visually speaking it really does look better. I'd trade 1ms of input lag any day for better motion quality. As long as the game maintains its near-60fps input delay, having a 120fps image doesn't feel horrible.
I mean I'm happy for you that it's improving your experience, but it's not my experience at all.
Like, if FG could turn something with unplayable frame rates into something playable, even with a bit of image degradation, I could see the attraction. But going from a very playable amount of FPS to another playable amount of FPS, with that additional latency and those artifacts, doesn't really work for me.
Personally I think if you have 70 fps you're better off with a VRR monitor than adding generated frames.
I can only go on my experience, but every time I've used frame generation it's made playing the game feel worse in some way. Like disconnected and floaty rather than sharp and involved.
If the generated frames aren't adding latency, then it doesn't matter and it does always look better. I really like AFMF2 as an AMD user because it really does improve the smoothness of CPU-heavy games. If you're already at 60fps, frame gen isn't going to hurt your input lag unless it's a really bad implementation (like FSR3 frame gen in Stalker 2, for example). And visually speaking it really does look better. I'd trade 1ms of input lag any day for better motion quality. As long as the game maintains its near-60fps input delay, having a 120fps image doesn't feel horrible.
And personally I much prefer input response over raw performance. My eyes are pretty sensitive to frame rates, and I can tell the difference between 120 and 60 fps about as easily as between 30 and 60. But as with all settings tweaks, what I'm trying to say is that if the trade-off is a net gain, then it's worth it.
And I don't want to come off like I support the idea that frame gen and fake frames are how a game/GPU is supposed to be made. But in reality, if unoptimized games are being made, then frame gen tech can really help those games specifically. I spent a lot of time playing games like ArmA 3 in 2014-2016 at below 60fps, because it was always an unoptimized game. These days my 5800X3D can do 70-90fps on average, so turning on frame gen doesn't remotely hurt, while my eyes get to see a perfectly smooth image.
Inflated (over) market pricing means (way) above market price, i.e. if Micron sold Nvidia GDDR6 for 10 bucks when the same part is 2-3 on the open market now, that would be inflated. There is no public market as such for GDDR7, just like there wasn't for GDDR6X at the start, but we know it's new, fast, and used exclusively right now by one company: Nvidia.
The one you mentioned yourself, with 60-70% margins.
Then Nvidia places their 60-70 percent on top of that. I hope by the time the 6000 generation hits, the retailers are going to expect 60-70 percent as well 😛
The one you mentioned yourself, with 60-70% margins.
It's funny that when costs come down, e.g. once GDDR6 and the like reach full production and open-market availability, we never see the price of GPUs come down.
Inflated (over) market pricing means (way) above market price, i.e. if Micron sold Nvidia GDDR6 for 10 bucks when the same part is 2-3 on the open market now, that would be inflated. There is no public market as such for GDDR7, just like there wasn't for GDDR6X at the start, but we know it's new, fast, and used exclusively right now by one company: Nvidia.
Everybody in semi wants (needs, even!) 50%+ margins, and this is a brand-new product with lots of R&D put into it, so a 60% margin won't be "inflated" - and it looks like Nvidia got exclusivity as well, which always comes for $$$.
It could easily be $300; that's not implausible in my view.
That's an interesting saying... cheers!
Negotiating with these people is often a dream in the morning and a nightmare in the afternoon.
There's a push-back ready for the lower-tier SKU VRAM bellyaching: DLSS4 (TNN/MFG) reduces VRAM usage, so 8-16GB is "fine". Just wait for the <=12GB SKUs.
Edward Snowden slams Nvidia's RTX 50-series 'F-tier value,' whistleblows on lackluster VRAM capacity
Blackwell consumer GPUs offer 'F-tier value for S-tier prices,' moans the naturalized Russian. (www.tomshardware.com)
Tag team effort from technical marketing.
I'd like to know who was the genius who thought that this was a good idea.
Adrenalin could be lying to me, but when a game is at an average of 60fps, with AFMF2 on my 6900 XT, it isn't more than a few ms of latency. It definitely goes up when the native fps dips, but it also seems pretty reflective of how it feels.
FG always adds latency. To get only 1 ms of added delay, you'd need to be up in the 1000 fps range to begin with.
Large-scale manufacturers would not buy on the spot market; they'd have contracts spanning a long time, so if they buy at the wrong time (like a couple of years ago) they are stuck with a higher-than-spot-market cost.
It's funny that when costs come down, e.g. once GDDR6 and the like reach full production and open-market availability, we never see the price of GPUs come down.
If the generated frames aren't adding latency, then it doesn't matter and it does always look better. [...] I'd trade 1ms of input lag any day for better motion quality.
Probably from testing like this.
Where are you getting 1ms?
16.6ms added delay minimum for adding fake frames at 60 FPS (1/60).
RF = Real Frame, FF = Fake Frame.
Without FG: RF1 (0ms) - RF2 (16.6ms) - RF3 (33.3ms)...
With FG: RF1 (0ms) - [16.6ms: RF2 arrives and is held] - FF1 (25ms) - RF2 (33.3ms: the previously held frame can now be shown; at the same time RF3 has arrived and is held) - FF2 (41.6ms) - RF3 (50ms) ...
You are constantly one frame behind (minimum), because each real frame is held that long to create the new in-between frames. That's why most decent examinations say 60 FPS is the minimum base frame rate, and 80-100 FPS is better.
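If anyone wants to play with those numbers, here's a minimal sketch (plain Python, my own toy model of the hold described above, not anything from AMD or Nvidia):

```python
# Toy model of 2x frame interpolation: the most recent real frame is held for
# one full frame interval so the in-between frame can be built from the *next*
# real frame. That hold is the minimum added latency.
def added_hold_ms(base_fps: float) -> float:
    """Minimum extra delay per real frame, in milliseconds."""
    return 1000.0 / base_fps

for fps in (30, 60, 80, 100, 120, 1000):
    print(f"{fps:>4} fps base -> real frames shown ~{added_hold_ms(fps):5.1f} ms late")
```

Which is also where the "1000 fps for 1 ms" point above comes from: the hold only shrinks to about 1 ms once the base frame rate is up around 1000 fps.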
Adrenalin could be lying to me, but when a game is at an average of 60fps, with AFMF2 on my 6900 XT, it isn't more than a few ms of latency. It definitely goes up when the native fps dips, but it also seems pretty reflective of how it feels.
Like, I truly know how ArmA 3 feels at 30fps, and when there's a ton of AI on screen, even on my modern rig the fps will dip. AFMF might be showing 60fps, and it maybe looks smoother, but it feels exactly like 30fps, which is reflected by the 99th-percentile minimum fps readout. But otherwise, when ArmA 3 is running well, frame generation does an acceptable job (to me) of turning 70-90fps into 100-120 from a visual standpoint.
That extra one frame of latency is inescapable, though you might end up with less if you have Anti-Lag enabled. Both AMD and Nvidia deal with the excess latency by dynamically adjusting frame timing to reduce perceived latency. The catch here is that one can reduce latency independently of enabling frame generation, so arguing that FG does not add latency is misleading at best (not your argument, I understand your PoV, but rather any marketing claim).
Adrenalin could be lying to me, but when a game is at an average of 60fps, with AFMF2 on my 6900 XT, it isn't more than a few ms of latency.
The added latency is a partial sum of a rendered frame + generated frames, assuming the card can generate the inserted frames much faster than it can render another full frame. All of this is offset by the rest of the system latency and also Reflex. If Reflex is not available to enable independently of DLSS/FG, then the comparison can be made to seem much more favorable for frame generation in terms of latency cost.
Here are some screenshots of FPS + latency from the DF video so you can see the scaling, though keep in mind the latency numbers fluctuate quite a bit. A screenshot with equal numbers for 3x MFG and 4x MFG does not mean the 4th generated frame adds zero latency.
[Attachments 115856 and 115865: FPS and latency screenshots from the DF video]
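To put rough numbers on that "partial sum" idea, here's a toy sketch in Python. The base system latency, per-generated-frame cost, and Reflex saving below are made-up placeholders for illustration, not measurements from the DF video:

```python
# Toy "partial sum" of frame-gen latency at a 60 fps base: one held real frame,
# plus a small cost per generated frame, on top of base system latency, minus
# whatever Reflex claws back. Every constant here is an illustrative guess.
BASE_FPS = 60
SYSTEM_MS = 35.0         # click-to-photon latency before frame gen (guess)
GEN_COST_MS = 1.0        # time to synthesize one extra frame (guess)
REFLEX_SAVING_MS = 10.0  # latency recovered by Reflex / Anti-Lag (guess)

def total_latency_ms(gen_factor: int) -> float:
    held_frame_ms = 1000.0 / BASE_FPS if gen_factor > 1 else 0.0
    generated_ms = (gen_factor - 1) * GEN_COST_MS
    return SYSTEM_MS + held_frame_ms + generated_ms - REFLEX_SAVING_MS

for factor in (1, 2, 3, 4):
    print(f"{factor}x -> ~{total_latency_ms(factor):.1f} ms")
```

The only point of the sketch is that going from 3x to 4x adds a small increment on top of the one-frame hold, so two screenshots showing roughly equal numbers mostly reflect run-to-run fluctuation, not a free fourth frame.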
It really is different when you're directly in control of the media. Maybe some day AI will be good enough to fill all those voids, but as it is, just interpolating video isn't remotely the same as real-time interpolation of new things that are constantly happening. That's part of why having a high frame rate is so important for frame gen from the get-go: there really isn't enough information in 30fps-tier video to create convincing new frames.
For video it's just a binary question of whether you think it's good enough or not, since you're not dealing with latency, so it should be much less controversial to offer compared to interpolation for games.
NVDA should pay a bit less than $70 for 32GB of 28Gbps GDDR7.
What is inflated market pricing for GDDR7? It's not even listed on DRAMexchange! Whatever Nvidia pays is market pricing.
I don't know if it's 300 bucks, but it is certainly WAY more expensive than GDDR6 - totally plausible that $10 per GB is real, since it is not at "commodity" level yet (maybe next year).
The average session price shown by DRAMexchange for GDDR6 (the previous gen, introduced Aug 2018 - almost 7 years ago) was $2.3 per GB, so $73.6 for 32 GB.
NVDA should pay a bit less than $70 for 32GB of 28Gbps GDDR7.
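The arithmetic behind those figures is just capacity times price per GB; here's a quick sketch using the numbers quoted in this thread (the $10/GB figure is the speculated early GDDR7 price, not a published one):

```python
# BOM cost of a 32 GB VRAM pool under the two per-GB prices mentioned above.
capacity_gb = 32
prices_per_gb = {
    "GDDR6 (DRAMexchange session average)": 2.30,
    "GDDR7 (speculated early price)": 10.00,
}
for label, price in prices_per_gb.items():
    print(f"{label}: {capacity_gb} GB x ${price:.2f}/GB = ${capacity_gb * price:.2f}")
```

That lines up with the $73.6 figure above and the roughly "300 bucks" estimate earlier in the thread.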
I was reading more about why people prefer 24fps for movies. People think 60fps feels more like a live play than a movie and takes you out of the movie world. I have never seen a 60fps or 120fps movie outside game cutscenes, but live TV is often 60fps. Still, I'm surprised there are not more 60fps or 120fps movies. It's something you would expect to see in a Nolan movie.
Sure, I implied that it was both personal preference and a case-by-case question of when it's preferable to use video interpolation. I would just like to see those options for processing video offered, though.
There's a variety of reasons why people don't like it. I'm one of those who want more video to be shot at higher frame rates, as I think it's kind of silly not to make greater use of technological advancements when producing new material now, similar to higher resolutions. At the same time I of course respect artists' and directors' freedom to choose whatever expression they prefer, either generally or for a specific work of art. Publish in 24 or 120fps, or 480i NTSC, or black and white, or whatever you prefer.
The "soap opera effect" is a term I don't like because it's too vague. It might refer to the interpolation of a low-end TV from 2014, or it might refer to a state-of-the-art AI interpolation model from this year, or it might refer to video shot natively at 60 or 120fps. I think it's unreasonable when applied to the latter, but I respect those who grew up on 24fps film and just like the effect. Taste is just taste, even if it's fueled by nostalgia, so I can understand if they want to watch even brand-new material in 24fps. I don't agree, but I understand; whatever floats your boat.
There are 48FPS cinema screenings of certain movies, so yes, they do exist.