It's funny how we had 20 pages of complaining that Nvidia didn't show any non-RTX benchmarks, and how they could have just shown one slide.
Now that a slide with a bunch of games has been released, we'll get 20 more pages complaining that they're cherry-picked and not the games you wanted to see.
Damned if they do, damned if they don't.
Back when Nvidia launched Pascal they also had a bunch of comparisons, but they used common and popular titles (Assassin's Creed, GTA V, Titanfall, BF4, SW Battlefront, Thief, RotTR, and The Witcher 3).
With Turing they are using tech demos (Infiltrator), obscure Chinese MMOs (JX3), a bunch of games where Nvidia felt the need to point out that HDR was turned on (FF XV, Hitman 2, Shadow of War, ME: Andromeda), and then a handful of more relevant games (PUBG, ARK, Shadow of the Tomb Raider, and Wolfenstein II). Unfortunately, of the more relevant games, two are multiplayer titles that tend to be difficult to benchmark reliably (PUBG and ARK), and one is currently unreleased and thus doesn't have final drivers for either GPU (Shadow of the Tomb Raider).
If that's not cherry picking, then what else would you call it?
Also, for what it's worth, I never really cared about Nvidia not providing any benchmarks during the presentation; I did care that they didn't provide any indication of non-RTX performance. A simple "Turing is X% faster in non-RTX games" statement would have been fine with me (and they did actually provide statements like that for the Pascal presentation).
Are they replacing each game's built-in AA method with DLSS? Because then it wouldn't be an apples-to-apples comparison, although if you can swap, say, 8x MSAA for DLSS and get the same result at twice the speed, then yes, that would be great. I kinda find SMAA good enough though. I'm a peasant!
For the Infiltrator demo at least, the comparison appears to be between DLSS and TAA (that's what they were demoing, anyway).