Unless gaming benchmarks are 50% better than the 1080 Ti with ray tracing turned off, this is a hard pass for me at this price.
So in June Huang said no new GPUs 'for a long time'. Two months later they launch a new generation, with multiple dies ready right from the start. Isn't that illegal?
I think we are all waiting for these answers. I can afford it, but I'm not dumb when it comes to spending money. So many questions...
How much does RT slow down the frame rate?
Does leaving RT on affect games that don't "do" RT?
How fast is the new GPU in "conventional" mode?
How much power does the RT part use and will we save power by turning it off?
Try this: read the names as Titan (2080 Ti), 2080 Ti (2080) and 2080 (2070) and you end up in a very different place. Now I'm not saying that a big performance boost is going to happen, but the pricing issue is one of marketing perception more than anything.
I come from a time when, if some cool new technology was coming out, we'd go to tech forums to discuss it. Not run around waving flags screaming for our company of choice. This is the biggest new technology in this space in decades, and people seem to be utterly dismissive of it. If I can get a couple of people thinking along the right lines then it is worth my time.
Well yeah, we want improved price-to-performance in games that actually exist, not a degradation. New gens are supposed to improve price-to-performance (even if overall price sometimes increases). If the 2080 Ti is over 71% faster than the 1080 Ti then we will get that, but although they discussed new SMs, most of us are rightly not expecting 71%+ from 21% more shaders.
If we apply the price increase multiplier that the 2080 Ti got over the 1080 Ti for each flagship from the 8800 Ultra onwards (285, 580, 780 Ti, 980 Ti, 1080 Ti) then the 2080 Ti should cost $21,000.
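For anyone who wants to check that figure, here's a rough back-of-envelope sketch. The launch prices are my assumptions ($699 1080 Ti, $1,199 2080 Ti Founders Edition, $829 8800 Ultra), so treat the output as ballpark rather than gospel:

```python
# Back-of-envelope: compound the 2080 Ti / 1080 Ti price jump across the six
# flagship transitions from the 8800 Ultra onward (285, 580, 780 Ti, 980 Ti,
# 1080 Ti, 2080 Ti). Launch prices are assumptions, not exact MSRPs.
price_1080ti = 699      # USD at launch (assumed)
price_2080ti = 1199     # USD, Founders Edition (assumed)
price_8800_ultra = 829  # USD at launch (assumed)

multiplier = price_2080ti / price_1080ti   # ~1.72x per generation
generations = 6                            # 8800 Ultra -> ... -> 2080 Ti

hypothetical = price_8800_ultra * multiplier ** generations
print(f"Per-gen multiplier: {multiplier:.2f}x")
print(f"Hypothetical flagship price: ${hypothetical:,.0f}")
# Prints roughly $21,000, which is where that number comes from.
```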
It's good that ray tracing tech is here, but this is Gen 1. Fermi can play DX12 games; you want to look up the benchmarks for a GTX 580 in DX12? It's not pretty. It's very possible the RTX 2080 Ti will be too weak by the time ray tracing becomes mainstream and games are actually built around it from the ground up. I'd say it is all but certain the 2070 will be a joke in those games.
We're getting a 71% price increase, and looking at (barring an SM revolution) somewhere in the neighborhood of 15-25% more performance in games that actually exist, plus new graphics tech that will be a tacked-on extra with a huge performance hit. It is disappointing unless you are price inelastic, period.
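To put rough numbers on that, here's a quick sketch. The 71% comes from assumed $699 vs $1,199 launch prices, and the 15-25% is the speculation above, not a benchmark:

```python
# Rough price-to-performance math, using assumed launch prices ($699 1080 Ti,
# $1,199 2080 Ti FE) and a speculated performance range, not benchmarks.
price_increase = 1199 / 699                # ~1.72x
for perf_gain in (1.15, 1.25, price_increase):
    perf_per_dollar = perf_gain / price_increase
    print(f"{(perf_gain - 1) * 100:.0f}% faster -> "
          f"{(perf_per_dollar - 1) * 100:+.0f}% perf per dollar vs the 1080 Ti")
# +15-25% performance works out to roughly 27-33% *worse* perf per dollar;
# it only breaks even at about +72%.
```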
The pricing issue is one of exactly what I said it was: price-to-performance. Not marketing drivel.
Why are you picking battles?
Is Newegg already sold out of RTX preorders, or are they just not accepting them yet? I don't care either way, just wondering.
I checked last night; they were all out. The Nvidia website still showed them in stock for 10/22.
+1 for the 1080 Ti. I need to upgrade my GTX 980 SLI computer.
My apologies, you were kind of coming across as a raging fanboy there; I did not realize you had benchmarks to compare. Could you please link them so we can stop all of the other idiot fanboys from talking about performance versus price without knowing what it actually is?
My apologies again, the way you were talking came off like you had absolutely no clue what the actual performance was and you were just being a dimwitted cheerleader. Very sorry about that confusion.
Misinformation and lies are what I'm picking on, while trying to discuss things that actually have merit.
Buyers looking at $1200 GPUs aren't worried about bang/buck, just the biggest bang...
Very true, though my guess is the 3080 Ti is not that far off if NVIDIA is really going to make a big push on this RT stuff, be it real or not.
This is exactly the type of consumer talk worth discussing here.
You can't on one hand proclaim you want to discuss the tech, and on the other hand say we cannot discuss possible performance that is directly a result of the tech.
Which architectural changes to the SMs do you foresee allowing the 21% more shaders to offer at least 72% more performance?
I'll wave the flag for price-to-performance increases with new generations every time.
And where is your evidence for speculation such as the 2070 running better with RT on than off?
The issue here is that game developers have to build ray tracing around Nvidia's cores to take advantage of it. Of course, if AMD implements its own cores for ray tracing, that now doubles the work to render lighting, something that is already done without ray tracing.
Another preview of the 2080 Ti in Tomb Raider and Battlefield V.
Gotta say BF V looks awesome, but performance is still an unknown concern.
https://www.pcgamesn.com/nvidia-rtx-2080-ti-hands-on
I think that @PeterScott
I saw the first hour and a half or so of the reveal, and I thought that it was some really amazing tech that NV has delivered. I really don't know how AMD can survive in the GPU space, if they don't already have similar tech in the pipeline. You can only survive on price cuts for so long, you need new technology too.
But I'm concerned about AMD's longevity in the GPU space, should this new "RTX" tech catch on.
This thread is a great example of how terrible these forums have become.
We finally get real time global illumination in gaming with the hardware to push it- something we have been begging for since the dawn of real time 3D, it's finally here- and people are dropping into their hive mind fanboy idiocy to champion/vilify their respective party. Pathetic.
First off- everyone bashing the expected performance- please make sure you go on record right now stating how much slower the 2070 is going to be compared to the Vega 56- or even the Vega 64 if we want to use the inflated FE pricing. Please make sure to state for the record, clearly, how these cards are vastly inferior in terms of price versus performance compared to the competition.
Very important- if you are bashing these parts for their cost versus performance, go on record stating how much slower it is going to be compared to the competition.
Now this forum for many months turned into a cesspool of gushing over async compute and how it was going to change the industry- something that offers a very small performance improvement in certain situations with driver overhead. Months and months we saw people going off over this. Something nobody even tried to claim would give us any visual benefit whatsoever.
If we have people in this thread that believe that performance, only performance, and always performance is the only thing that matters I'm going to go ahead and save you a ton of cash. 1024x768 all settings on lowest- if you have anything over a 270 you should be good to go for almost every game for years to come. What's more- it isn't like we are seeing big improvements in gameplay lately, so just go ahead and play through all the best games of the last twenty years and stop even coming to threads discussing PC graphics hardware. You save money and people interested in technology advancing can actually have a reasonable discussion without wading through the ignorant crap being spewed by the insane Luddite mentality.
OK, so now we should have everyone complaining about the performance, which we haven't seen yet, on record with how much slower the 2070 is going to be compared to a Vega, and we have the people who don't value improved IQ taken care of, so which groups do we have left?
The team green boys- these new features are taking up huge chunks of the die. These inflated costs are entirely due to the fact that they are offering real time global illumination. The fact they are doing it in as small of a space as they are is mind blowing- but it is a *huge* chunk of space. These parts are going to offer a very small performance increase over the prior generation compared to what you are used to seeing. If you people are going to try and defend it from that angle- make your calls now on performance and be prepared to be *very* disappointed.
"PC gaming is going to die because people are getting priced out" idiots- seriously, put the crack pipe down. Pull up the Steam user charts and look at what games people are playing and on what monitors- a 1050 is going to keep the masses happy for years- the people who are even thinking about these new parts are a minuscule subset of the market and for us, is the price really going to push us out? 1998 V2 SLI was all the rage, adjusted for inflation that would ring in between $900 and $1000. Really all that different? Really?
This won't see broad scale adoption... if anyone is actually claiming this, you don't understand what is being discussed. All of the other options people are comparing this to, GameWorks, PhysX, tessellation, all of those require extra work from developers. Some of it is cut and paste from libraries, some of it is quite a bit more involved, but it is extra work. Global illumination isn't. If your engine is set up to utilize it (which all of the major engines will be), you literally just turn it on. That's it. This is *LESS* complex than supporting multiple resolutions. Seriously. If you don't have to worry about legacy parts this is *SIGNIFICANTLY* less work than *CURRENT* solutions. Looks much better, much less work. Devs won't support this... why?
The 'this has been done before' crowd: not even remotely close to being true. All of the prior attempts were for a ray-traced render engine setup, that is, handling all of your rendering through ray tracing. That has some huge drawbacks and is simply way too slow and limited to work properly given computational limits. This is adding ray-traced lighting to a rasterized rendering pipeline. It gives you the benefits of global illumination without removing the massive benefits of rasterization. No, nothing like this has ever been done in hardware before, and it isn't quite 'as done' as some of you are thinking. Yes, there have been engines that used shaders to ray trace certain elements; this is full global illumination using ray tracing, a very, very different thing.
The AI hardware on these parts is used to reconstruct the actual lighting from the very loose approximations the ray calculations produce. The effectiveness of this method is actually shockingly good for real-time purposes, and it's the only way it could reasonably be done.
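If it helps to see the idea in miniature, here is a toy sketch of 'trace a few rays, then clean it up'. It is plain NumPy/SciPy with a Gaussian blur standing in for the denoiser; NVIDIA's actual approach is a trained network running on the tensor cores that also uses guides like normals, depth, and motion vectors, so treat this purely as an illustration of the concept:

```python
# Toy illustration of "trace a few rays, then denoise". This is NOT NVIDIA's
# pipeline; it just shows recovering smooth lighting from a noisy estimate.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Pretend this is the true, smooth global illumination for a 256x256 frame.
y, x = np.mgrid[0:256, 0:256] / 255.0
true_light = 0.3 + 0.7 * np.exp(-((x - 0.6) ** 2 + (y - 0.4) ** 2) / 0.05)

# A one-ray-per-pixel Monte Carlo estimate is the truth plus a lot of variance.
noisy_estimate = true_light + rng.normal(scale=0.25, size=true_light.shape)

# Stand-in "denoiser": a plain Gaussian blur. The real thing is a trained
# network guided by extra per-pixel data, not a simple filter.
denoised = gaussian_filter(noisy_estimate, sigma=3)

def rms_error(img):
    return np.sqrt(np.mean((img - true_light) ** 2))

print(f"RMS error, raw noisy estimate: {rms_error(noisy_estimate):.3f}")
print(f"RMS error, after denoising:    {rms_error(denoised):.3f}")
```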
This is a big point some people may not want to hear: if AMD doesn't follow within the next couple of years they are out of the graphics business.
In computer graphics this is the biggest game changer we have seen since the Voodoo 1.
That isn't hyperbolic. Pull yourself out of fanboy team red/green muck for an hour and go check out what people who work with visualization are saying. This is *HUGE* and AMD is going to follow suit or cease to be a factor. Does anyone really think Microsoft is going to launch their next console without this? We saw it with the last generation, Sony forcing AMD to change some hardware around that worked out very well for all involved- it will happen again with the next gen of consoles. They aren't shipping without this technology.
The question is when we are going to see AMD's response, and I think we know the answer: probably not until 7nm. No matter which 'side' you are on, trust me when I say you want to see this technology succeed. For those truly rabid for team red, push them to pull a stunt like nVidia did with tessellation. They were late to the game and then smoked AMD; that is the best option for AMD going forward.
This *IS* the future so many of us have been waiting for for decades. The idiocy involved in this thread would be hysterical to read through if it wasn't so sad.