The crapping is justified once you notice the way DLSS3 is marketed as the "better than native" alternative. Even you fell into the trap of thinking there's not much difference between FSR3 Quality and native (temporal) AA. The real comparison you should be making is FSR3 Quality versus FSR Native AA, and the difference is there in loss of detail and shimmering artifacts. The same applies to DLSS3 versus DLAA, though DLSS3 admittedly fares better at 1440p than FSR.
What?? I simply stated that, in my experience and through my own eyes, I didn't notice that much of a difference and that I mostly don't notice artifacts. These are my own personal experiences?? How can you try to falsify my own personal experiences and claim I fell into some kind of marketing trap? Maybe you have a sharper eye than me and can notice all these horrible artifacts in all of your games, but in my experience it's not that big of a deal, and arguably for most consumers, who are less educated and less picky than people on these forums, it's likely not a big deal either. I understand FSR/DLSS isn't better than native, but I am willing to take the relatively minor image-quality hit to gain more FPS. I'm not "ignoring" artifacts just because I say that in my experience I usually don't see them while playing games or viewing DLSS/FSR comparisons.
And no, like I said in my original post, Nvidia upselling its technology in typical Nvidia fashion is NOT an excuse to crap unnecessarily on anything and everything Nvidia. I understand the mindset of trying to educate more people so they don't fall into marketing traps, but irrationally criticizing and ranting against product A or B isn't productive for that purpose or any other.
If you don't understand the DUNE III reference, you probably haven't been paying attention to new AAA games' system requirements. Games with the latest engines and visual effects require upscaling and FG even for a 1080p60 experience, which is the config most vulnerable to loss of detail and artifacts from both the upscaler and frame gen. On the other end of the spectrum, people with $2000 cards use their PCs with huge TVs for home theater cinematic experiences. For some, games are the new movies. There's even a game out there with a "Cinematic" performance preset.
I completely understand the increasing system requirements of AAA games; I quite literally addressed it further down in my post. I agree that devs should not make upscaling or FG technologies a requirement for playable framerates in their games. I said basically that verbatim.
The DUNE III reference he was making wasn't about a home theatre system or game rendering; he was talking about walking into a MOVIE THEATRE and finding upscaling technology in future showings of pre-rendered movies with multi-million-dollar budgets. He was using this outlandish scenario to argue why consumers might not want more upscaling technology in PC hardware, which is a stretch to say the least.
Last but not least, if you can't spot the artifacts in the DF video even at 50% speed, why are you using FSR/DLSS at all? Just lower game IQ settings to a medium/low blend; based on your own reporting, you're probably unable to tell the difference once things are in motion. You should also never enable RT in games for the same reason, since the massive compute cost would simply be wasted.
If lowering game IQ settings to medium/low were as effective at increasing performance AND retaining image quality as upscaling technologies are, then upscaling technology would simply be useless. I also don't see how not being able to spot minute artifacts on screen for a fraction of a second is equivalent to not being able to notice the entire game being turned down from high to medium/low. Don't kid yourself, those two things are not equivalent at all.
The difference RT makes to the look of a game is also hugely noticeable. I personally don't turn it on since my hardware isn't high-end enough, and ray tracing is generally still too taxing for most people's systems. Nonetheless, RT is in fact a tangible increase in the graphics quality of most games that employ it, so turning on RT with the trade-off of having to enable DLSS/FSR isn't necessarily a horrible deal, especially since the effects of RT are far more noticeable than upscaling to many people.
Your post is like a hardened audiophile muttering about the subpar mids and whiny highs of the AirPods Pro and how anyone who likes them fell into the "Apple marketing trap". No, they didn't; the sound quality is simply "good enough" for most people, while the earbuds offer other features that make them more convenient and enjoyable to use than your equivalently priced $250 Chinese IEMs, and that's coming from a person who bought those $250 Chinese IEMs. There is more nuance to the discussion of image quality than you might realize.