I don't see why. Not every game uses RT. Just because their marketing wants to push RT does not mean it is an essential part of the experience. I have never once played a game with RT, and my guess is that most other people haven't either, since only a small handful of cards can handle it.
The number of people who have $1000 video cards is vanishingly small.
I don't care if you test with it on or off, but you can't test it one way and then make claims about the other.
I mostly agree with what you wrote, and feel the same way. But I think you missed the point of the testing, and why it was done.
Major tech sites have conducted polls, and they all show similar results: less than 20% of respondents care about RT, and even fewer had actively used it within a week of the survey. That agrees with both of our sentiments. I have little use for RT beyond reflections, and only if I'm playing a game like Cyberpunk or Spider-Man that offers the feature.
Now, the reason for the testing is that it is intended as consumer advocacy: letting their readers and viewers know that even though Nvidia GPUs have the fastest hardware ray tracing each generation, buying a card without enough VRAM can make the feature basically useless. That helps shoppers avoid picking the card in their price range that is most likely to age poorly. And it isn't just applicable to RT, either. Even without it, games are using more VRAM.
The other factor is that games with high quality assets, which take up a lot of data, simply look better, and we get the presentation the devs intended. The kicker is that there is very little performance penalty for such a major visual element. All you need is a big enough frame buffer.
The rest of the discussion is just some of us pointing out that Nvidia's showcase tech being hampered by cheaping out on VRAM is pretty damned rich.