Let me try to explain why I think Nvidia is pushing ray tracing and other graphical enhancements.
About a year ago I asked a question on this forum about 4K resolution and where we go from there.
It's obvious 8K is a long way off, and 5K would not sell video cards.
What's the next best thing to sell video cards? Better visuals.
In my opinion, within the next year or so we will have a card that can play games at 4K, ultra settings, at 100+ fps, priced under $450. It will be a 7 nm upper-mid-range card as fast as a 2080 Ti.
I think by that time, real-time ray tracing will have become more mainstream, and a GTX 3060 will be pushing these visuals at 1440p.
You see, Nvidia and AMD need to sell you video cards; if they give you a card that runs 4K at 100 fps at high settings, they have nothing left to sell you. I think by the time a high-end video card can push 4K with real-time ray tracing at 100 fps, we should be able to push 8K with no ray tracing.
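For what it's worth, here's the back-of-the-envelope math behind that 4K-to-8K comparison (my own rough numbers, assuming raster cost scales roughly linearly with pixel count, which is obviously a simplification):

```python
# Rough pixel-throughput comparison between 4K and 8K.
# Assumption: rendering cost scales ~linearly with pixel count
# (ignores memory bandwidth, geometry, and other bottlenecks).
RES_4K = 3840 * 2160   # 8,294,400 pixels
RES_8K = 7680 * 4320   # 33,177,600 pixels

print(RES_8K / RES_4K)         # 4.0 -- 8K pushes 4x the pixels of 4K
print(RES_4K * 100 / RES_8K)   # 25.0 -- 4K@100fps throughput is ~8K@25fps
```

So whatever extra headroom a future card spends on ray tracing at 4K is in the same ballpark as the raw raster work 8K would ask for.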
Bottom line: these real-time, movie-like visuals had to be implemented sometime, and both companies need a reason to sell us fancy new expensive cards. I think this is the time, because I don't think we will see 8K monitors under $1,000 for quite a while.
Just my 2 cents,
Well yes, they're always going to push for more graphical stuff. This is not some sudden new change, so I'm not sure why you feel compelled to defend Nvidia like they're doing something really unique or that people aren't aware of what is going on.
The reason there's currently a backlash is that we don't have any real performance figures yet, while the cards also increased pretty dramatically in price. Things are the same as they've ever been, and it boils down quite simply: higher prices = people complaining. Simple as that. Even with good performance, people will grumble when price increases are substantial. For some, these price increases put the cards out of their market, which is why there's some extra grumbling: even if the performance is worth it, they're not gonna be able to buy.
We have plenty of history with stop-gap resolution pushes (we had QHD between Full HD/1080p and 4K, we had 720p between 1080p and old 640x480 VGA; heck, we had 1600x1200, 1680x1050, 1280x1024, and a whole bunch of other resolutions and aspect ratios before the industry seemed to settle on 1080p, 4K, 16:9/16:10, and then 21:9), so acting like they can't figure out what to do between 4K and 8K is silly. They'll do plenty of things while 4K is the dominant resolution target, until they start transitioning to 8K, and then 16K beyond that. Plus, let's not forget that for years we've had the ability to render at higher than native resolution and downscale. They could start trying to push multi-monitor setups again too. And we have VR and AR headsets. It'll be 3-4 years before we see real, sustained 4K 120 fps at ultra settings in games that aren't years old, and by that time we'll have stuff like 6K gaming monitors and 8K UHDTVs, and you'll be going "what'll we do after 8K, it'll be so long before 16K is here?!?"
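To be clear about what I mean by render-high-and-downscale: it's just supersampling, where each output pixel averages a block of higher-resolution samples. A minimal NumPy sketch with made-up toy dimensions (real drivers use fancier filters than a plain box average):

```python
import numpy as np

def downscale_2x(frame):
    """Box-filter a 2x-supersampled frame down to native resolution.
    frame: (H, W, 3) array rendered at 2x the target resolution."""
    h, w, c = frame.shape
    # Average each 2x2 block of supersampled pixels into one output pixel.
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Toy example: "render" a 4x4 frame, downscale to native 2x2.
hi_res = np.arange(4 * 4 * 3, dtype=float).reshape(4, 4, 3)
lo_res = downscale_2x(hi_res)
print(lo_res.shape)  # (2, 2, 3)
```

That's the whole trick: you pay 4x the shading cost and get smoother edges at native resolution, which is exactly the kind of knob that keeps selling faster cards without a new monitor standard.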
Actually, your argument is also flawed in that a lot of the talk so far is about this stuff enabling higher performance, rather than really pushing beyond native-4K rendering quality. They talked about ray-traced lighting offering similar quality with less work, and DLSS seems aimed at upscaling and offering tailored AA (to help alleviate the AA performance penalty while improving how it handles specific games, as that's been an issue and is why developers tried all those other AA modes).
I have strong doubts that pushing real ray tracing in anything but very limited games will be feasible on these cards, even with the ray-tracing hardware. And even then, unless it's something like a simple 2D game, framerates will probably be pretty low, and the results will still look really weird (the graphics will be excessively shiny, and/or they'll be glorified tech demos with little real gameplay).
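To put rough numbers on that skepticism (my own assumptions: 1 primary ray per pixel, 2 bounce rays each, no shadow or AO rays; real path tracers need far more):

```python
# Rough ray-budget arithmetic for 4K at 60 fps.
# Assumptions (mine, not Nvidia's): 1 sample per pixel, 2 bounces,
# no shadow/ambient-occlusion rays.
W, H, FPS = 3840, 2160, 60
primary_rays = W * H * FPS            # primary rays per second at 4K60
total_rays = primary_rays * (1 + 2)   # add 2 bounce rays per primary

print(f"{primary_rays / 1e6:.0f}M primary rays/sec")   # 498M primary rays/sec
print(f"{total_rays / 1e9:.2f}B total rays/sec")       # 1.49B total rays/sec
```

Even taking the marketing figure of roughly 10 gigarays/sec for the top Turing card at face value, that's only on the order of 20 rays per 4K pixel per frame at 60 fps before any shading work, which is why hybrid rendering with a handful of ray-traced effects is the realistic ceiling here.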
We've been hearing about "real-time movie visuals" for decades (I recall people hyping the PS2 that way, and the same with digitized sprites during the SNES era, with stuff like Mortal Kombat and Donkey Kong Country, and of course CD add-ons turning games into movies, like the Sega CD). This absolutely isn't accomplishing that promise either (though it is an advancement, to be sure). I mean, movies are going to use Nvidia's render boxes to accomplish far more advanced rendering, so they're moving the goalposts for what "real-time movie visuals" even means at the same time. In fact, you're missing that the reason they're pushing this is that the pro/HPC/etc. industry wants this stuff and has the money to pay for it, while the tech is finally at a point where it's more feasible. So they're making this stuff for those markets and then letting consumers buy it as a way to defray the cost. The focus of these features is not consumers, though.
My point being, there's tons of opportunity for them to keep selling newer graphics cards. In fact, we'll definitely run up against serious physical limits of chip production before we really hit the ceiling on the graphics processing we'll need (and once we start hitting the limitations of our eyes, we'll have people augmenting their vision to see higher resolutions, or bypassing the eyes entirely to max out the brain's image-rendering capabilities).