FYI, power consumption is directly at odds with RT in games. It may not seem that way because we tend to compare uncapped performance between raster and RT-enabled settings, but enabling RT in a controlled performance environment can reveal a big power penalty.
Case in point: I ran Cyberpunk 2077 with a 75 FPS cap, then switched on ray tracing. Reported GPU power jumped from 130-150W to 250-270W for the same net result of 75 FPS. Obviously, this wasn't the RT computation alone, but also the raster work being pushed harder to compensate. To illustrate: at a fixed 75 FPS you have a ~13.3 ms frame time budget. If RT comes in "stealing" 4 ms of that, the GPU must finish its raster work in ~9.3 ms instead, which is the equivalent of pushing ~107 FPS of raster. Energy usage goes up a lot even if the RT computation itself cost zero power.
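To make that math concrete, here's a quick back-of-the-envelope calculation. The 4 ms RT cost is just an illustrative number I'm assuming, not a measurement:

```python
# How hard the raster pipeline has to run when RT eats part of a fixed frame budget.
fps_cap = 75
frame_budget_ms = 1000 / fps_cap                  # ~13.3 ms per frame at 75 FPS
rt_cost_ms = 4.0                                  # assumed RT time per frame (illustrative)
raster_window_ms = frame_budget_ms - rt_cost_ms   # ~9.3 ms left for raster work

equivalent_raster_fps = 1000 / raster_window_ms   # ~107 FPS worth of raster throughput
print(f"Frame budget:              {frame_budget_ms:.1f} ms")
print(f"Raster window with RT on:  {raster_window_ms:.1f} ms")
print(f"Equivalent raster FPS:     {equivalent_raster_fps:.0f}")
```

So even with a hard FPS cap, the GPU behaves as if it were rendering a much higher frame rate, and clocks/voltage (and therefore power) scale accordingly.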
I think people should take a moment and be honest with themselves about the performance/power balance they consider acceptable. Just like with CPUs recently, it's not just the vendor that dictates power, but also the user. If you think of all the levers you have available to customize your experience (power caps, FPS caps of all flavors, detail settings, DLSS/FSR, RT, etc.), I would argue that power consumption is ultimately under your control as long as the vendor designed a good product. The only real problem is cost; that parameter is very resistant to user customization.