The market for ray tracing is practically nonexistent, since hardly any games support it and nobody wants to play at 1080p. So that argument doesn't hold up.
Well, the Steam hardware survey has 1080p as the most common screen resolution at 62% of users, and 4K at 1.3%, so like I said before, we're talking about quite small markets here. Ray tracing is one of those things where you have to actually create the demand in the first place; that's part of innovating. It's painful and awkward to get started, because devs won't build features that gamers can't use, and hardware vendors don't want to build hardware for which there are no games. So either you stifle innovation and we never get anything new again, or the hardware vendors take a leap of faith, build something, and go through the awkward phase where few games support the tech (and generally support it badly) until eventually it becomes ubiquitous. I think I've mentioned this before in this thread: it's happened many times throughout PC graphics history with all sorts of things, like tessellation and Shader Model 2.0.
You don't have to buy into the tech, but what I would say is that these awkward phases have always existed and always will. All Nvidia can really do is work as closely with dev studios as possible, and they're already doing that.
For the vast majority of gamers, who are running 1080p at 60 Hz, it's more attractive to have a video card that offers an increase in graphical fidelity rather than tons of extra power they simply don't need.
I wonder about the veracity of this statement, and whether there's a disconnect between the tech community and the broader gaming market. I've noticed (even years ago, when high-refresh monitors first started appearing) that in these circles the sentiment seemed to lean towards high resolution over high refresh. But I do question whether that actually applies to the broader PC gaming market.
I'm aware of market research data from Digitimes that seems to indicate the growth rate in annual sales of high-refresh monitors has doubled in each of the last two years, and that manufacturers in general are now heavily targeting this segment; if you look at their new releases, they're predominantly high-refresh options. Based on Steam survey data, 1080p isn't just the largest slice of the monitor market, it's also growing far faster than the other resolutions (4K and 1440p both actually went down slightly in the latest survey). Anecdotally, in other communities I interact with (outside of tech circles) there seems to be much higher interest in high refresh than in high resolution.
In general I've always had the feeling that tech forums lean more towards high resolution over high refresh than the broader PC gaming market does. I actually wonder whether a high-refresh-vs-high-resolution poll run in this subforum, the CPU subforum, and the gaming subforum would produce significantly different results.
I'd be willing to bet that a large portion of high-refresh monitors are going to competitive gamers, typically the people who see the most benefit from high-refresh displays because they need every edge they can get, and the amount of competitive gaming is on the rise with esports, streaming, and whatnot. It's also something that's somewhat limited by connector bandwidth, and the push for high resolutions like 4K is what enabled the bandwidth we need for high refresh rates: the original 4K monitors and TVs ran over multiple video connectors, or ran at 30 Hz, because there simply wasn't enough bandwidth available. As connector standards increased bandwidth for 4K, smaller resolutions like 1080p could be run at higher refresh rates. You'll notice it wasn't a push for faster refresh rates itself that made those connectors available; that only happened when 4K came along. We've had the panel technology (TN) to make 240 Hz monitors for a long time now: today's 240 Hz panels are 1 ms panels, and we've had 1 ms panels for god knows how long.
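To put rough numbers on the bandwidth point, here's a back-of-the-envelope sketch (my own arithmetic, ignoring blanking intervals and line-coding overhead like 8b/10b, so real links need a bit more headroom):

```python
# Rough uncompressed video bandwidth: width * height * refresh * bits per pixel.
# Ignores blanking intervals and link encoding overhead, so these are lower bounds.

def gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate uncompressed video bandwidth in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K    @  60 Hz: {gbps(3840, 2160, 60):.1f} Gbit/s")    # ~11.9
print(f"1080p @ 240 Hz: {gbps(1920, 1080, 240):.1f} Gbit/s")   # ~11.9
print(f"1080p @  60 Hz: {gbps(1920, 1080, 60):.1f} Gbit/s")    # ~3.0
```

The raw pixel rate of 4K at 60 Hz and 1080p at 240 Hz is identical, which is exactly why a connector sized for 4K60 is what makes 240 Hz at 1080p possible.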
Most online tech/gaming communities are for people who are tech heads, and most of us have extremely high-end PCs because that's what we're into; if you averaged the specs of the people who post on these forums, they'd be way above average. For some perspective: consoles, which are hardware aimed at a casual audience, target 30 fps for almost all AAA games, because the console makers know that market prefers the extra pretty graphics over a smoother experience. Only in games where graphics don't really matter do they target 60 fps.
That also rings somewhat true of people who like good-quality monitors, because moving to anything faster than 60 Hz basically forces you onto a TN panel, and many people don't want to move away from true 10-bit colour to 6-bit + 2-bit dithering, because it looks awful; thanks to TN viewing angles you're also limited in display size to 27" at a push. The reason I guessed competitive gamers make up a lot of the people using these monitors is that they're generally function over form: they'll take a hit in colour accuracy if it means a faster panel that lets them compete better, in the same way they'll turn down graphics settings for higher frame rates. I think it's also generally true that the competitive games are the simpler ones, like CS:GO and the MOBAs, which run on very fast engines. You can get 240 fps out of CS easily; you cannot get that out of many other games, simply because we don't have CPUs that can run modern AAA games that fast. They seem to top out around 140-160 fps when not GPU-limited on mainstream high-end CPUs, and as always with CPU bottlenecking you're kinda stuffed, because most performance settings in games load on the GPU, not the CPU.
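For anyone unfamiliar with what "6-bit + 2-bit dithering" means: the panel can only display 64 physical levels per channel, so it alternates between adjacent levels across frames (FRC, frame rate control) so the time-average approximates a higher bit depth. A toy sketch of the idea (my own illustration, not any vendor's actual algorithm):

```python
# Toy illustration of temporal dithering (FRC): approximating an 8-bit
# target value on a 6-bit panel by alternating between adjacent levels.

def frc_frames(value_8bit, num_frames=4):
    """Return the 6-bit level (0-63) shown on each of `num_frames` frames."""
    base = value_8bit >> 2         # nearest 6-bit level at or below the target
    remainder = value_8bit & 0b11  # 0-3: how many of every 4 frames to bump up
    # Show (base + 1) on `remainder` frames and `base` on the rest; the
    # time-average over 4 frames then equals value_8bit / 4 exactly.
    return [base + 1 if f < remainder else base for f in range(num_frames)]

frames = frc_frames(130)            # 8-bit target sitting between levels 32 and 33
print(frames)                       # [33, 33, 32, 32]
print(sum(frames) / len(frames))    # 32.5 == 130 / 4
```

The flicker between levels is what people object to when they say it looks worse than a panel with true native bit depth.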
The broad gaming market doesn't want super expensive stuff either.
Their overall volume is going to decrease with RTX, but it'll be more than made up for by the price increase.
7nm stuff isn't going to be cheaper either. Maybe they'll keep the price the same and actually increase perf/$ that gen.
Probably true. Jumping from 12/14nm down to 7nm is a big leap as well: the number is nearly half, and since density scales with the square of the linear shrink, that means something like 3-4x more transistors in the same die area. Crazy.
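The naive arithmetic behind that estimate, assuming node names reflected real feature sizes (they don't these days, they're mostly marketing labels, so actual density gains come in under the ideal figure):

```python
# Idealized density scaling: if every linear feature shrank from 14 nm to 7 nm,
# area per transistor would shrink with the square of the ratio.
old_nm, new_nm = 14, 7
ideal_density_gain = (old_nm / new_nm) ** 2
print(ideal_density_gain)  # 4.0 -- so "3-4x more transistors" is the right ballpark
```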