Nvidia RTX 2080 Ti, 2080 (2070 review is now live!) information thread. Reviews and prices


happy medium

Lifer
Jun 8, 2003
14,387
480
126
You're being far, far too optimistic. As it has been pointed out, only the 2080 Ti even offers acceptable (and barely at that) levels of RT performance. Look back at previous generations and it's pretty easy to see how slow the uptake is on the high-end. The 980 Ti makes a pretty good point of comparison because it came out a little over 3 years ago and has about equal performance to a 1070.

If you check the Steam hardware survey, as of June 2018 (the 980 Ti came out in June 2015) there were only 8.5% of users who had something at 980 Ti level or better. It's even worse in this case since the 980 Ti came out at $650, whereas the 2080 Ti is about twice that price. I have no idea to what extent that slows adoption even more, but it's pretty obvious that it isn't going to speed it up.

I think it's cool that NVidia is trying this out, but you're just deluding yourself if you think that it's going to be mainstream and widely accessible in the next 5 years. Even 10 years is probably being generous. I remember when people thought that the Nintendo 64 would be capable of real time ray tracing, so this pipe dream of soon to be ubiquitous ray tracing is hardly new.
I would guess 3 years is when RT will become mainstream.
7nm Turing next year with another 35% performance gain, and then 18 months after that, in 2021, another 35% performance gain.
By 2021 a 5nm RTX 4060 clocking in at 3 GHz will easily handle 4K RT graphics with DLSS.

AMD will have an RTX/DLSS solution in that timeframe too,
as will the consoles.
 

Mopetar

Diamond Member
Jan 31, 2011
8,002
6,443
136
I would guess 3 years is when RT will become mainstream.
7nm Turing next year with another 35% performance gain, and then 18 months after that, in 2021, another 35% performance gain.

Way too optimistic. Look at the numbers I gave earlier where only 8.5% of gamers have the top level of performance that you could get 3 years ago and tell me that it's going to be better for ray tracing. It doesn't matter if AMD comes out with some version of it, the performance isn't good enough on the high-end NVidia cards and AMD doesn't come close to touching those right now.

A 35% performance gain means that the 3080 will be able to do adequate levels of RT at 1080p, and another 35% on top of that would mean that the 4070 will be able to do adequate RT at 1080p. The problem is that when you look at the historical numbers, only 8.5% of gamers will have a 4070 or better, and that's assuming the old prices for that level. Those prices went up quite significantly.
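To put rough numbers on the compounding (everything here is illustrative; none of these are confirmed products):

# Two hypothetical 35% generational gains stacked on a 2080 Ti baseline.
# The card names are placeholders for "one and two generations out".
baseline = 1.00              # RTX 2080 Ti RT performance, normalized
next_gen = baseline * 1.35   # the hoped-for "7nm Turing" part
gen_after = next_gen * 1.35  # the generation after that
print(f"one gen out: {next_gen:.2f}x, two gens out: {gen_after:.2f}x")
# -> one gen out: 1.35x, two gens out: 1.82x

Call it roughly 1.8x over today's 2080 Ti even if both jumps happen.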

8.5% is not mainstream. VR had even more hype behind it, as well as multiple companies supplying it, and it's pretty much faded, just like motion controls before it. Here's a list of Oculus Rift games according to Wikipedia. The Vive has a slightly better list, but if you look at it, pretty much all of the games came out over a year ago. Sony seems to have a few titles in the pipeline, but most of them look like ports. Regardless, there are no killer apps.

And even if you get 8.5%, that's 8.5% that can play at what most would consider barely reasonable settings and frame rates. People who are used to 144 Hz (or higher) or 4K aren't going to go back. That's always been the story with ray tracing, though. It's like when Intel had a big showing about ray tracing a decade ago and had a fully ray traced demo of Quake Wars. Only it was in 720p and the frame rates were significantly lower than you could achieve if you used conventional methods.

If AMD does become competitive again, NVidia would drop this immediately as the performance is nowhere near good enough at mainstream price points and the amount of silicon dedicated to implementing these features could be much better spent in other ways.
 
Reactions: beginner99

happy medium

Lifer
Jun 8, 2003
14,387
480
126
If AMD does become competitive again, NVidia would drop this immediately as the performance is nowhere near good enough at mainstream price points and the amount of silicon dedicated to implementing these features could be much better spent in other ways.

Where do you come up with this stuff?
Nvidia will not waste a billion dollars on RT and cancel it because AMD becomes competitive.

What games are you basing your evidence on?
There are none.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
Where do you come up with this stuff?
Nvidia will not waste a billion dollars on RT and cancel it because AMD becomes competitive.

What games are you basing your evidence on?
There are none.
Who said they spent anywhere near that? These are probably broken Tesla chips out of the factory-floor reject bin.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
It seems like a lot of people don't understand the concept of the hybrid RT that NV is pushing.

People keep acting like this will just be full RT rendering in gaming, and thus have low frame rates and low resolution.

It's really going to be mostly conventional rendering, with some RT thrown in where it will look really good.
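A minimal sketch of what one hybrid frame looks like, in Python; all of the pass names and return values are made up for illustration, this isn't any engine's actual API:

# Illustrative structure of a hybrid-rendered frame: most of the image is
# rasterized as usual, ray tracing handles only a couple of effects, and the
# two are composited at the end. Everything here is a stand-in.

def rasterize(scene):
    # Conventional raster pass: geometry, materials, depth. Does most of the image.
    return {"color": "rasterized scene", "depth": "z-buffer"}

def trace_reflections(scene, g_buffer):
    # Ray tracing used only for select effects (reflections, shadows, GI).
    # Few rays per pixel, so the raw result is noisy.
    return "noisy reflections"

def denoise(rt_output):
    # Placeholder for the denoising filter that cleans up the sparse RT result.
    return rt_output.replace("noisy", "denoised")

def composite(g_buffer, reflections):
    # The RT output is layered onto the mostly-conventional image.
    return {**g_buffer, "reflections": reflections}

def render_frame(scene):
    g_buffer = rasterize(scene)
    reflections = denoise(trace_reflections(scene, g_buffer))
    return composite(g_buffer, reflections)

print(render_frame("example scene"))

The point of the split is that the expensive full-scene ray tracing never happens; the RT pass only supplements the rasterized image.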
 
Reactions: Muhammed

coercitiv

Diamond Member
Jan 24, 2014
6,383
12,798
136
It seems like a lot of people don't understand the concept of the hybrid RT that NV is pushing.

People keep acting like this will just be full RT rendering in gaming, and thus have low frame rates and low resolution.
People do understand the concept of hybrid RT, since it's the hybrid FPS numbers they are discussing.

They also understand that it takes 50% of the die area in Turing to even enable hybrid rendering at 1080p, meaning full RT rendering is decisively out of the question when half of your silicon budget barely manages global illumination and reflections.
 
Reactions: beginner99

coercitiv

Diamond Member
Jan 24, 2014
6,383
12,798
136
Gamers Nexus made public their affiliate sales numbers in the US since the RTX launch. While not representative of consumers in general, they are arguably relevant for enthusiasts.

 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
Interesting that the Perlmutter supercomputer will be featuring a new Nvidia GPU called "Volta-Next." I wonder if this means Volta and Turing will be developed in tandem, or perhaps Nvidia will have two separate lines from now on, one with ray tracing cores for graphics and one with more CUDA cores/tensor cores for AI/compute?
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
Great! Now we need to test which of the two offers the greater competitive advantage in a competitive first-person shooter: 60 FPS with RTX on or 144 FPS with RTX off.

NV could have simply avoided all this ridicule and actually gotten praised for RT by pricing the cards in a sane manner.

It's the epitome of "there are no bad products, just bad pricing".
 

coercitiv

Diamond Member
Jan 24, 2014
6,383
12,798
136
Great! Now we need to test which of the two offers the greater competitive advantage in a competitive first-person shooter: 60 FPS with RTX on or 144 FPS with RTX off.
Depends on whether you're assessing enemy movement and positions through mirrors, windows, or puddles. Ray-traced combat will gain far more depth than traditional raster tactics, with an emphasis on real-time information and measurement. It takes one wrong move to bend light around you in a way that will enable an RTX warrior to instantly assess your position, vectorized movement, and combat potential. The age of the fast and brutal digital warrior is rapidly fading away.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
It takes one wrong move to bend light around you in a way that will enable an RTX warrior to instantly assess your position

Ignoring your sarcasm: if the game's audio system isn't crap and you have decent headphones, you can easily "detect" players who aren't actually visible without ray tracing. The only thing it would help with is campers, but they would just learn not to camp in such spots, so it's not really all that useful.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,755
751
136

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Glad Intel is entering dGPUs, just to bring another product cycle besides AMD and nVidia into the mix. Because this is what you get in that situation. DXR will be awesome in a couple of generations when it's usable.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
43 pages of nonsense only to conclude that those who were sensible and rational were 100% correct, whereas the naysayers were smoking hopium or some other exotic admixture. This, along with many other life experiences, is a reminder to never waste time arguing w/ people who exhibit certain characteristics. The RTX 20 series is a disaster, a rip-off, and a beta-level feature set not ready for prime time. While offline rendering will see a boon from this feature, it looks completely like they chucked this over the wall to GeForce and hacked together a marketing gimmick to lure in people who like giving away money to subsidize the Quadro dev cost. It is now proven and evident exactly why Nvidia was hush-hush about the performance numbers and outright shady. It is now proven and evident that this card and RTX are a joke.

Good thing the intelligent and informed people of the Gamers Nexus critical variety by and large stuck w/ the 1080 Ti instead of this trash heap.
Really not much more to say, as informed and critical people sticking to facts and reason have already said it.
Nvidia is doing exactly what Intel did leading up to AMD cutting them off at the knees.

They ramped prices into la-la land and delivered a turd, setting up a huge opening for their competitor to take market share.
Lastly, as I outlined earlier: all the RT cores do is a basic intersection test. The bulk of the work is in refreshing the object model (the acceleration structure), which is done on the CUDA cores and takes a lot of computational power. On top of that, denoising is not done in parallel but at the end of the render cycle, extending the time it takes to render a frame and tanking FPS big-time. There's nothing redeeming about this feature in a real-time pipeline, in an industry that has trended towards higher resolutions and FPS. It's a turd being subsidized by consumers who don't care that it's a turd so long as it's the priciest turd and they can brag about it... and frankly it's a match made in heaven.
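To make the FPS impact concrete, here's a toy frame-time model under the assumption described above, namely that the BVH refresh, tracing and denoise all get added end-to-end onto the frame; every number is invented for illustration, not a measurement of any real card:

# Toy serialized frame-time budget. All values are made-up milliseconds.
raster_ms  = 8.0   # conventional rendering work
bvh_ms     = 2.0   # refreshing the acceleration structure on the shader cores
trace_ms   = 5.0   # intersection work on the RT units
denoise_ms = 3.0   # denoising pass tacked onto the end of the frame

rtx_off_ms = raster_ms
rtx_on_ms  = raster_ms + bvh_ms + trace_ms + denoise_ms

print(f"RTX off: {1000 / rtx_off_ms:.0f} fps")  # ~125 fps
print(f"RTX on:  {1000 / rtx_on_ms:.0f} fps")   # ~56 fps

Whatever the real numbers end up being, anything that sits serially in the frame comes straight out of your frame rate.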
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
I get what you're saying: over the whole frame time, lower refresh rates would produce on average a more accurate colour relative to the input frame. I just don't see why that would be a criterion someone would care about in a fast-moving game. Even assuming that it takes the full GTG time, the faster refresh rate should produce more accurate colours relative to the scene.

Static images shouldn't matter, but take an extreme example: a fast-moving 2x2 pixel black square moving from one side of the monitor to the other on a white background, at 1 pixel per ms. Say it takes 4 ms to go to black, and another 4 ms back to white. Let's also pick 100 Hz and 200 Hz to make the numbers easier. At t=0, the four pixels on the far left of the monitor turn on, and at t=4 ms they're fully black. At this point, both refresh rates are the same, and at t=4 ms the squares are fully black. Now at t=5 ms on the 200 Hz monitor the first square will start to fade out and the pixels 5 over will start to turn black. On the 100 Hz monitor, the square at the left side of the screen will still be fully black. At t=9 ms, the 200 Hz monitor has a fully black square starting 5 pixels from the edge, and the 100 Hz one still has it at the edge; the leftmost square on the 200 Hz monitor has finished fading out. At t=10 ms, the 100 Hz monitor starts fading out the left-edge square and turning on one 10 pixels in, and the 200 Hz monitor fades out the square at position 5 and starts turning on the same one at position 10.

You will definitely get ghosting, but the colour is better relative to the actual scene. Sure, at the instant of t=10 ms, the square on the left edge of the screen at 100 Hz is nice and black and has been for a while, as is proper for its input signal, but the colour is wrong and it shouldn't be black; that square's been gone for 8 ms. At 200 Hz the pixels might not stay at the colour as long as at 100 Hz, but that's because they're rendering the scene more accurately.

Likewise, if you're just looking at one pixel and it's a stable red at t=0 when the refresh happens, then if the colour changes to orange at t=8 ms both refresh rates would behave the same; they'll start changing it to orange at t=10 ms. If the switch to orange had happened at t=4 ms, though, the 200 Hz monitor would start changing to the correct colour at t=5 ms instead of t=10 ms, and would be fully settled long before the 100 Hz one even started switching.

If you were comparing two monitors with different response times, like a fast TN vs. an IPS, I could get on board with that, but outside of overclocking a panel faster than it's capable of, I don't see how running a monitor a lot slower than it's capable of produces more accurate colours relative to the actual scene, which is what really matters.

It's not just relative to the input frame, it's absolute. Take my example one step further and make your refresh rate even faster. Let's say you have an old 8 ms g2g IPS panel and you try to drive it at 165 Hz, so roughly 6 ms per refresh; that means the pixels cannot even reach the right colour before they switch again. Then imagine an input that flicks between black and white every refresh: with 8 ms g2g (and probably something more like 12 ms b2w), what your display would show is just grey all the time. You'd never achieve black or white, it'd just pulse between varying shades of grey without ever reaching either of them. The more you dial up the speed, the less accurate the colours become, and the more you dial down the speed, the more accurate they become; you spend more of every second at an accurate colour.

I know what you're getting at, but I think you're confusing this with latency here. Sure, if you refresh more times per second, there's less time between what is "real" in the game world (that is to say, what exists right now in memory) and your perception of it; that's always true. It's one part of why gamers like faster refreshes, because it reduces the lag between your mouse and other inputs and what your eyes see. My old 12 ms IPS is terrible for that; you can quite distinctly feel it, just down to pixel response time.

That's one of the things you get as a benefit of high refresh rates, I don't deny that, but one of the trade-offs is that your display spends less and less of every second displaying accurate colours.

You could think of it another way. Every second your display refreshes a number of times, let's say 100 Hz for easy numbers again. So 100 frames means 100 colour transitions. Each transition takes as long as the monitor's pixel response time (well... averages, with g2g and all that), but for the sake of simplicity we'll just use the g2g value. So if your monitor takes 4 ms to reach the right colour and you do 100 of those a second, then you have 4 x 100 = 400 ms of that 1 second not displaying the right colours. If you do 200 Hz at 4 ms, then you spend 800 ms of that 1 second displaying the wrong colours. You want that number to be as low as possible; if you're spending 80% of your time displaying the wrong colours, that's very bad.
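Same arithmetic in a few lines of Python, under the simplifying assumption that every refresh forces a full g2g transition (the values are illustrative):

# Fraction of each second spent mid-transition for a fixed g2g time,
# assuming (worst case) that the pixel changes colour every refresh.
g2g_ms = 4.0
for refresh_hz in (100, 165, 200):
    frame_ms = 1000.0 / refresh_hz
    in_transition_ms = min(g2g_ms, frame_ms) * refresh_hz  # ms per second spent transitioning
    print(f"{refresh_hz:>3} Hz: {frame_ms:.1f} ms/refresh, "
          f"{in_transition_ms:.0f} ms of every second in transition "
          f"({in_transition_ms / 10:.0f}%)")

The min() is just there for the case where the refresh interval is shorter than the g2g time and the pixel never settles at all.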

Where you draw the line is personal preference, and a lot of gamers don't even understand the quality difference between TN and IPS; they'll just get the cheaper TNs, which is fair enough. But people who appreciate colours generally go for IPS and spend the extra cash to get it, and it's just a bit silly to run those panels at a rate that drives down the colour accuracy. If you don't care about colour and you want to focus on speed, then TN is a better deal, in large part just because it's an awful lot cheaper.
 
Reactions: ub4ty

ub4ty

Senior member
Jun 21, 2017
749
898
96
It's not just relative to the input frame, it's absolute. Take my example one step further and make your refresh rate even faster. Let's say you have an old 8 ms g2g IPS panel and you try to drive it at 165 Hz, so roughly 6 ms per refresh; that means the pixels cannot even reach the right colour before they switch again. Then imagine an input that flicks between black and white every refresh: with 8 ms g2g (and probably something more like 12 ms b2w), what your display would show is just grey all the time. You'd never achieve black or white, it'd just pulse between varying shades of grey without ever reaching either of them. The more you dial up the speed, the less accurate the colours become, and the more you dial down the speed, the more accurate they become; you spend more of every second at an accurate colour.

I know what you're getting at, but I think you're confusing this with latency here. Sure, if you refresh more times per second, there's less time between what is "real" in the game world (that is to say, what exists right now in memory) and your perception of it; that's always true. It's one part of why gamers like faster refreshes, because it reduces the lag between your mouse and other inputs and what your eyes see. My old 12 ms IPS is terrible for that; you can quite distinctly feel it, just down to pixel response time.

That's one of the things you get as a benefit of high refresh rates, I don't deny that, but one of the trade-offs is that your display spends less and less of every second displaying accurate colours.

You could think of it another way. Every second your display refreshes a number of times, let's say 100 Hz for easy numbers again. So 100 frames means 100 colour transitions. Each transition takes as long as the monitor's pixel response time (well... averages, with g2g and all that), but for the sake of simplicity we'll just use the g2g value. So if your monitor takes 4 ms to reach the right colour and you do 100 of those a second, then you have 4 x 100 = 400 ms of that 1 second not displaying the right colours. If you do 200 Hz at 4 ms, then you spend 800 ms of that 1 second displaying the wrong colours. You want that number to be as low as possible; if you're spending 80% of your time displaying the wrong colours, that's very bad.

Where you draw the line is personal preference, and a lot of gamers don't even understand the quality difference between TN and IPS; they'll just get the cheaper TNs, which is fair enough. But people who appreciate colours generally go for IPS and spend the extra cash to get it, and it's just a bit silly to run those panels at a rate that drives down the colour accuracy. If you don't care about colour and you want to focus on speed, then TN is a better deal, in large part just because it's an awful lot cheaper.
Furthermore:
Looking at your data, you might be wondering how you compare to the average human reaction time. Here it is! The average reaction time for humans is 0.25 seconds to a visual stimulus, 0.17 seconds for an audio stimulus, and 0.15 seconds for a touch stimulus.

No one is doing anything extra amazing reacting to visual stimuli at 144 fps (7 ms) versus 60 fps (16 ms), because their biology (it takes you ~250 ms to respond) is the bottleneck. Then there's the guy w/ 10 ms lower latency than you from his ISP. The more you try to inform a person who has no understanding... the more they fight you. It's all so tiresome. Ignorance is bliss, and a number of people embrace that. It makes them happy to get excited over marketing campaigns. They feel like they are part of something bigger than themselves riding the hype train. It makes them happy to spend tons of money even if it is just a marketing gimmick, as long as they are convinced there's some grand value in it. I've learned to just let them be and be happy.
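For what it's worth, the arithmetic behind the 144 vs 60 fps point (frame times are just 1000/fps; the reaction-time figures are the ones quoted above):

# Frame interval vs. the quoted average human visual reaction time (~250 ms).
reaction_ms = {"visual": 250, "audio": 170, "touch": 150}

for fps in (60, 144):
    frame_ms = 1000.0 / fps
    share = frame_ms / reaction_ms["visual"] * 100
    print(f"{fps:>3} fps -> {frame_ms:.1f} ms per frame, "
          f"about {share:.0f}% of one visual reaction")

Roughly a 10 ms difference between the two frame intervals, the same order as the ISP example.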
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
Like I said when the initial RT performance was revealed: when you start at barely playable, you aren't going to be able to "optimize" and somehow get more performance when you increase resolution. It just doesn't work that way, and certainly not with RT.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Nvidia shares down 17% lol

successful launch then.
> mfw you get intel'd
They deserve it. A lot of old guard tech companies have lost their minds lately and have engaged in absolute slaughter of their consumer base.
This at a time w/ so much global economic turmoil and uncertainty. Pro timing. I look forward to true competition in GPUs in the coming years from Intel, AMD, Nvidia, and potentially newcomers. Until then, they can keep these flaming turds and pound salt.
Maybe the 1% of gamers who fell for it will be enough to keep the lights on.
 
Reactions: psolord

amenx

Diamond Member
Dec 17, 2004
4,005
2,275
136
I doubt many bought RTX cards for the RT features. It was pretty clear from day one these cards were miserable at it, aside from the dearth of game titles. Those who bought 2080 Tis just wanted something faster than the last fastest card (the 1080 Ti).
 