01:57PM EDT - 'Designed for crazy amounts of overclocking'
01:57PM EDT - 'And it's just so quiet' ... claimed 1/5 the audio level of a 1080 Ti at max overclock
Heh...
We'll see. There were rumors saying they'd OC to 2.5GHz "easily," but then we also heard that Pascal would easily hit 2.0GHz, and I don't think it could sustain that without a really good water cooling setup. I'm doubtful about 2.5GHz at all: that's roughly a 700MHz gap over the official boost clock (around 1.8GHz, per an Ars article), and unless the chip goes crazy on power and thermals above that, I can't see them leaving that much headroom on the table unless they simply had to in order to stay near Pascal on power use.
Stoked for ray tracing. Despite the hype factor, it really is the holy grail of computer graphics, and we've been steadily marching toward that reality. Since I have a 1080 Ti I will probably skip the first generation of RT hardware, but it's a chicken-and-egg problem: the sooner they get RT hardware into the world, the sooner devs who aren't tightly partnered will start baking it in.
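For anyone who's only seen the marketing slides: the core primitive underneath all of this is just shooting rays and intersecting them with geometry, billions of times per frame, plus extra bounces for shadows and reflections. That's the workload the new RT cores are meant to accelerate. A minimal sketch in Python of one ray against one sphere (purely illustrative, not how RTX actually implements anything):

import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return distance along the ray to the nearest hit, or None for a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic).
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed normalized, so a = 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# One ray from the origin straight down -z at a unit sphere 5 units away.
hit = ray_sphere_intersect((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # 4.0 -- the ray hits the near surface of the sphere

A real renderer does this against millions of triangles per ray (via an acceleration structure, not brute force), which is why dedicated hardware matters.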
This is relatively 'AMD-like' of nVidia: pushing new features that require developer work rather than making big gains in today's games (presumably; there could be magic they didn't mention, but I doubt it). It smells like AMD's DX12 move, and AMD's earlier tessellation move. And of course the so-far-failed-on-PC move to the new Vega pipeline features.
So I think the RT stuff will be awesome, especially once it's deeply built in rather than an add-on effect in a few games. But it also presents an opportunity for competitors to come in and play the old nVidia strategy: double down on what makes today's games go fast, because day-1 benchmarks stick around forever. Given how long the less dramatic switch to DX12 has taken, I'm not convinced RT adoption will be much faster. We're looking at a few years before it approaches anything like mainstream.
They've both done this type of thing. It actually reminds me of SM3.0 and Nvidia making a big fuss about HDR bloom. Few games used it, and it carried a decent performance hit for what, in my experience, amounted to washed-out colors for a moment whenever the lighting changed drastically (in Oblivion you'd walk out of a cave into sunlight and get a harsh bloom for a second). But eventually it just became a normal part of game engine lighting. Nvidia was first to some newer features with the 8000 series, I believe, then AMD was with their 5000 series (Eyefinity, first to SM5.0?), and then Nvidia was first to move to a compute-heavy card with Fermi. Then AMD did Mantle. They go back and forth, and it seems to be a crapshoot whether whatever new thing they're touting lives up to the hype.
We'll see on this ray-tracing stuff. It definitely has potential, but I have a hunch it's going to lead to a split. By that I mean: if you want real-time rendering and control, you're going to pay for it, while most people start adopting streaming, where servers render at higher resolution with max eye candy, encode it, and send it downstream. It'll look like movie-quality graphics, and most people don't notice video artifacts and the like anyway. I even think it might move toward some pre-rendering: their systems calculate a lot of looks ahead of time and ship them to you, so much of the information is local and doesn't need to be streamed, but you also don't need to render everything in real time. You'd get the benefits of higher resolution and a realistic lighting model, just not fully real time.
I'm not saying this happens within a couple of years, but I do think we'll see the shift fairly quickly (within a decade) as the big cloud companies move in (especially Microsoft, who already has the infrastructure, plus the Xbox and a PC install base). The momentum is heading that direction, and I think it'll pick up steam once the big players start rolling out their streaming services in full force.
"ray tracing" stuff is very interesting but it's for the future and probably the next gen is the one that will be good enough to fully take advantage, also without the weight of a main console and without competitors being compatible with the same it's difficult to see it becoming a normal thing, so... we will see... still, exciting products, but pricing looks very wrong, specially if it's not massively faster in current games.
I don't think we'll see consoles adopt it much at all. Unless it really does simplify things for developers that much (and even then I'm not sure), I think the 2020 Xbox will be the last one to really push real-time local rendering on the system itself. AMD hasn't shown any ray-tracing hardware yet, and they're likely already mostly locked in on that chip's design. It would just take too much power, and I think that power budget is better spent in other ways. Instead, I think we'll see Microsoft implement it in their streaming setup, where they can buy those render boxes from Nvidia, do the processing server-side, and stream full-eye-candy rendering to people at whatever resolution their internet can handle.
No, that won't happen right away, but I think it'll be much more feasible there, and happen quicker than people expect. By around 2025, I think consoles will mostly, if not entirely, be streaming boxes, at least for big-name AAA gaming (Xbox/PlayStation/EA/etc.). There will still be niche boxes, maybe focused on indie games (think the SNES Classic, but for retro-style games: they don't need a ton of horsepower, but that crowd will want minimum latency as they play bullet hells and crazy platformers and the like).
Oh, I think there will be one exception: Nintendo. They'll hold on for probably another 5-10 years. By that I mean, five years after everyone else moves, they'll start pushing streaming for some but not all games, and within ten years of the rest of the industry moving that way, Nintendo follows suit (unless they get bought by someone who forces a quicker transition).