Nvidia RTX 2080 Ti, 2080 (2070 review is now live!) information thread. Reviews and prices


Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
Where do you come up with this stuff?
Nvidia will not waste a billion dollars on RT and cancel it because AMD becomes competitive.

History. Look back at the transition from Kepler to Maxwell, where Nvidia cut out a lot of their compute capability (look at the double-precision rates), in part because AMD had picked up a lot of market share since launching the 4000 series. Nvidia had been investing heavily in compute, but AMD had a much leaner architecture that let them gain some ground.

It's actually really funny, because right when Nvidia decided to segment compute into a separate card, AMD launched their new architecture that included it in order to catch up with what Nvidia was doing with CUDA. This is why AMD now has the reputation for power-hungry cards, whereas it was formerly Nvidia that was getting dinged for Fermi (endearingly called Thermi) being a power hog.

Now, Nvidia didn't waste that investment, since they just cut the technology from their mainstream cards. You can still buy cards (Volta) that are just oozing compute power, but they largely trimmed it from their mainstream cards starting with Maxwell, and it was really smart of them to do so given how dominant they've been since. AMD bet on compute being big, but there are only a handful of games where that bet has paid off for them.

If AMD manages to come out with a competitive new architecture, RTX is going to end up relegated to the high-end Titan-style cards so that transistors can be thrown at practical performance. At the end of the day, that's what sells cards to the most consumers.
 
Reactions: happy medium

happy medium

Lifer
Jun 8, 2003
14,387
480
126
"AMD will definitely respond to DirectX Raytracing" - David Wang"

AMD looks to have their own RT solution. Seems RT will be around a while.
https://www.overclock3d.net/news/gp..._respond_to_directx_raytracing_-_david_wang/1


"AMD Announces Real-time Ray Tracing Support for ProRender and Radeon GPU Profiler 1.2"

Microsoft has built RT support into DirectX 12, but no one is going to use it .....
Just the two largest GPU makers and the largest game developers.

How can anyone think this RT is just a gimmick?

We should be thanking companies like Nvidia for trying to give us MORE than console graphics, but instead people are crying about it.
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
6,387
12,812
136
We should be thanking companies like Nvidia for trying to give us MORE than console graphics, but instead people are crying about it.
They are not giving, they are selling. Just like companies watch their bottom line, consumers want value for their money as well. It's supply and demand, not supply and awe. Nvidia isn't a startup seeking funding on Kickstarter, and the RTX lineup isn't a form of loan or stock investment; it's a bunch of finished products that need to offer better value than their predecessors in order to warrant consumer investment.

Kepler, Maxwell, Pascal - they all dominated the gaming market because they were heavily optimized to deliver gaming performance from day one. People were thanking AMD for giving us Mantle as a revolutionary step towards DX12 while spending their money on Nvidia cards. Having my gratitude does not equal having my money as well.

Face reality already: RT is the future, but this future starts with professional rendering. Professionals will be both grateful and happy to pay, because unlike gamers they will get excellent value in return.
 
Reactions: Elfear and Mopetar

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
Here's a fully-reflective floor from Unreal:

This was possible 20 years ago, on a software-rendered Pentium 166MMX.

Now we have small patches of reflective mud reducing a $1200 graphics card to 25% of its performance, and this special mud requires the failure that is DX12 running on metrosexual OS(tm).

Something's very wrong here, people.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
Furthermore:
Looking at your data you might be thinking how you compare to the human average reaction time. Here it is! The average reaction time for humans is 0.25 seconds to a visual stimulus, 0.17 for an audio stimulus, and 0.15 seconds for a touch stimulus.

No one is doing anything more amazing at 144 fps (7 ms) than they are at 60 fps (16 ms), because their biology (roughly 250 ms to respond to a visual stimulus) is the bottleneck. Then there's the guy with 10 ms lower latency than you from his ISP. The more you try to inform a person who has no understanding, the more they fight you. It's all so tiresome. Ignorance is bliss and a number of people embrace that. It makes them happy to get excited over marketing campaigns. They feel like they are part of something bigger than themselves riding the hype train. It makes them happy to spend tons of money even if it is just a marketing gimmick, as long as they are convinced there's some grand value in it. I've learned to just let them be and be happy.

Yeah, human response time is interesting, and what you're saying is right, but there's one case where I think it's more complex, and that's compounding of reaction time. It's most noticeable to me when aiming in an FPS. Have you ever tried aiming with a mouse with vsync on (no adaptive refresh) and a forced delay on the output? Aiming becomes much harder because you go to aim but get the reaction late, so you're off a bit, then you adjust your aim but don't see that update until a bit later, then you need to readjust, and you're in this nasty feedback loop where you're constantly chasing your own input.

I think that matters a lot more: you view the output from the screen and that informs your input to the keyboard and mouse, but that's not a discrete thing that happens once, it's a continual loop. When you enter a motivated state, say a bad guy runs around the corner and you decide to shoot him in the head, you have a discrete block of time in which you need to move your crosshair over his head and then shoot, but during that time you're actually in a rapid loop of moving your mouse across the mousepad and constantly micro-adjusting. I think the latency there compounds in some sense, and that can lead to quite staggering differences in the total time it takes to aim. The reaction to the stimulus of someone appearing in the first place is more or less one discrete event, and the time it takes for a pixel to flip is an almost negligible difference, which I think is essentially what you're saying, and I definitely agree with that.

It'd actually be interesting to see some data on this: how long it takes people to land an accurate headshot with artificial delay added to their mouse movements. I think even the 12 ms or so of pixel response I had on my old IPS panel is physically noticeable when you're doing smooth, elongated mouse movements.
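For what it's worth, here's a rough sketch of that "chasing your own input" loop as a toy simulation. It's not measured data, just a proportional-correction model where the player reacts to a stale view of the crosshair; the gain, step time, and tolerance are all made-up numbers.

# Toy model of "chasing your own input": each step the player corrects a fraction
# of the error they *see*, but what they see lags behind the true crosshair
# position by a few display intervals. All constants are made up for illustration.

def settle_time_ms(lag_steps, gain=0.6, step_ms=8, target=100.0,
                   tol=2.0, hold=3, max_steps=500):
    history = [0.0]                      # true crosshair position at each step
    within = 0
    for step in range(1, max_steps + 1):
        seen = history[max(0, len(history) - 1 - lag_steps)]  # stale view of the crosshair
        new_pos = history[-1] + gain * (target - seen)        # correction based on stale error
        history.append(new_pos)
        within = within + 1 if abs(target - new_pos) <= tol else 0
        if within >= hold:               # crosshair held on target long enough to fire
            return step * step_ms
    return None                          # never settled: keeps oscillating around the target

for lag in (0, 1, 2, 4):                 # extra display latency, in 8 ms steps
    t = settle_time_ms(lag)
    print(f"{lag * 8:>2} ms of extra lag ->",
          f"on target after {t} ms" if t is not None else "never settles")

With enough lag the loop stops converging at all for the same correction gain, which is basically that feeling of constantly chasing your own input.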
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
From TPU.

The remaining settings add a little bit of "movie-like" fidelity for people who like these effects

So they are making a movie. And movies are rendered using Ray Tracing techniques.

For an action-intensive game like Battlefield, I can't see the benefits of ray tracing at all, even if it offered the performance people wanted. How many people would even notice it anyway?

For slower paced cinematic games I think it makes a lot of sense. Perhaps that's why most games are going that route?
 
Reactions: coercitiv

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
Here's a fully-reflective floor from Unreal:

This was possible 20 years ago, on a software-rendered Pentium 166MMX.

Now we have small patches of reflective mud reducing a $1200 graphics card to 25% of its performance, and this special mud requires the failure that is DX12 running on metrosexual OS(tm).

Something's very wrong here, people.

Seems like the problem is we have a ton of processing power that keeps going up, but devs get worse and worse every generation at optimizing performance? Is that what you are saying?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
There's a certain delicious irony in the fact that, yet again, the best way to experience BF5 is DX11: https://www.computerbase.de/2018-11...#abschnitt_day1patch_verbessert_directx12pfad

When an AAA developer like DICE, backed by EA's funding, fails with DX12 in two games (along with most other DX12 games from other developers), it's time to admit low-level APIs have failed and stop wasting time and resources on them.

And given RTX runs on top of DX12, it's a failure on top of a failure.

Another example from 2000, Pentium2-300 minimum:

It's similar to hardware PhysX where simple waving banners or moving debris could tank your framerate. Meanwhile back in 2001, Red Faction let you dig your own tunnels in real-time on a Pentium2-400. I've yet to see a single PhysX game have anything like this, hardware or otherwise.

Seems like the problem is we have a ton of processing power that keeps going up, but devs get worse and worse every generation at optimizing performance? Is that what you are saying?
No, I'm saying in some cases we're absolutely going in the wrong direction. Larrabee, hardware PhysX, DX12, and now RTX. They're solutions looking for problems which don't even end up being solutions at all.
 
Reactions: psolord

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Another example from 2000, Pentium2-300 minimum:

Maybe ray tracing should be done on the CPU, leaving the GPU to handle the rest of the 3D work and putting those unused CPU cores to use. CPUs are actually not that terrible at it.
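To be clear about what that would mean: the core arithmetic of a ray tracer is nothing exotic for a CPU; the hard part is doing it for millions of rays per frame through big acceleration structures. A toy, purely illustrative ray-sphere intersection test (nothing to do with DXR's actual API) looks like this:

# Toy illustration of the math a CPU runs per ray: a single ray-sphere
# intersection test. Purely illustrative; real engines trace millions of rays
# per frame through acceleration structures, which is where CPUs fall behind.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None for a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

# One camera ray pointed straight at a unit sphere 5 units down the z axis.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0

Scaling that kind of test up to a full frame at 60 fps is exactly where dedicated hardware, or a lot of genuinely idle cores, starts to matter.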

This may be an example where competition doesn't lead to a good outcome. Nvidia wants a piece of Intel's pie, so they are pushing ray tracing. Less work for the CPU means more GPU demand.

Again, Moore's Law's slow death is going to force a change in this line of thinking, or no one will benefit.

No, I'm saying in some cases we're absolutely going in the wrong direction. Larrabee, hardware PhysX, DX12, and now RTX. They're solutions looking for problems which don't even end up being solutions at all.

We had nearly free, enormous gains for decades. Now that's gone, so everyone is frantically trying to find a way to keep up the pace of advancement. It will not come without sacrifices, unfortunately.

The difference between RTX and older changes like hardware T&L and programmable vertex shaders is that in the old days there were enormous gains to be had from increasing TDPs and moving to a new process, both of which are becoming precious resources.
 
Reactions: Headfoot

Flash831

Member
Aug 10, 2015
60
3
71
I wonder if Nvidia's decision to go for an ultra-large chip (the 2080 Ti is ~754mm^2) is going to become a headwind for Nvidia going forward.
While AMD is lacking the RTX cores/tech, they are now going to 7nm with a much smaller chip to begin with. Nvidia is not going to be able to die-shrink Turing until 7nm yields have matured considerably, which opens up an opportunity for AMD. I doubt Nvidia will release 7nm cards until 2020.

Interesting times ahead boys and girls!
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
I wonder if Nvidia's decision to go for an ultra-large chip (the 2080 Ti is ~754mm^2) is going to become a headwind for Nvidia going forward.
While AMD is lacking the RTX cores/tech, they are now going to 7nm with a much smaller chip to begin with. Nvidia is not going to be able to die-shrink Turing until 7nm yields have matured considerably, which opens up an opportunity for AMD. I doubt Nvidia will release 7nm cards until 2020.

Interesting times ahead boys and girls!

Nvidia is reportedly supplying a next-generation GPU called "Volta Next", along with AMD 7nm CPUs, for the Perlmutter supercomputer. The die doesn't have to be that big if the first next-generation GPU is just CUDA cores plus tensor cores.
 

coercitiv

Diamond Member
Jan 24, 2014
6,387
12,812
136
Maybe ray tracing should be done on the CPU, leaving the GPU to handle the rest of the 3D work and putting those unused CPU cores to use. CPUs are actually not that terrible at it.
Do you honestly think ~100mm2 worth of general compute silicon can come anywhere close to the RT performance of more than 300mm2 of specialized logic?

If anything, the fact that it takes so much die area for even minimal RT usage in games proves just how compute intensive this endeavor will be. Gamers Nexus just did a nice analysis on RTX in BF5 and they outlined quite a few cases where rendering quality is problematic, one example being a moving water surface, in which case the current hardware is just overwhelmed.

I must admit though, the thought of RT running on less specialized hardware is very tempting, as it eliminates one key problem with what we're seeing now: huge silicon budget for RT.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
Furthermore:
Looking at your data you might be thinking how you compare to the human average reaction time. Here it is! The average reaction time for humans is 0.25 seconds to a visual stimulus, 0.17 for an audio stimulus, and 0.15 seconds for a touch stimulus.

No one is doing anything more amazing at 144 fps (7 ms) than they are at 60 fps (16 ms), because their biology (roughly 250 ms to respond to a visual stimulus) is the bottleneck. Then there's the guy with 10 ms lower latency than you from his ISP. The more you try to inform a person who has no understanding, the more they fight you. It's all so tiresome. Ignorance is bliss and a number of people embrace that. It makes them happy to get excited over marketing campaigns. They feel like they are part of something bigger than themselves riding the hype train. It makes them happy to spend tons of money even if it is just a marketing gimmick, as long as they are convinced there's some grand value in it. I've learned to just let them be and be happy.

I'm already having this discussion with someone else in another thread. All things equal, the higher refresh rate will show a frame earlier, meaning you can react earlier. Yeah, diminishing returns: if you are bad at a game, no display will change that, the same way I won't magically run faster by buying the super-duper jogging shoe, but for an experienced jogger it could be worth it. So again, all other things equal, the difference will matter. It's simply statistics: having the lower latency will improve your score (KDR, whatever) over time (not in a single fight!) by a certain amount. I noticed this myself when moving from a crappy mouse to a better mouse and from 60 Hz to 144 Hz with blur reduction - clearly visible in the stats. The difference might be small, but it is there, and it's up to the user to decide if it's worth their money.

I mean, this could easily be tested with a reaction test: one person gets to see the trigger 7 ms earlier than the other, and the one pressing first wins. That person won't win every single time, but play long enough and a trend will emerge.
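A quick back-of-the-envelope version of that test (the reaction-time distribution below is an assumption, not measured data) shows the kind of trend you'd expect: a few milliseconds of head start nudges the win rate just a bit above a coin flip.

# Monte Carlo sketch of the reaction duel: both players draw a reaction time
# from the same distribution, but one sees the trigger a few ms earlier.
# Mean and standard deviation are assumptions for illustration only.
import random

def win_rate(head_start_ms=7.0, mean_ms=250.0, sd_ms=30.0, trials=100_000):
    wins = 0
    for _ in range(trials):
        a = random.gauss(mean_ms, sd_ms) - head_start_ms  # sees the frame earlier
        b = random.gauss(mean_ms, sd_ms)
        wins += a < b                                     # whoever presses first wins
    return wins / trials

print(f"7 ms head start: {win_rate(7.0):.3f}")  # a bit above 0.50
print(f"0 ms head start: {win_rate(0.0):.3f}")  # roughly 0.50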

And network latency should be compensated for by the game to a certain degree, and the issues are plainly visible in many games. Ever wondered why you get killed after running around a corner, when the opponent theoretically shouldn't be able to see you anymore? That's because he lags: on his screen he still saw you and still hit you, but to you it looks like he's using "smart bullets" that shoot around corners.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Maybe ray tracing should be done on the CPU, leaving the GPU to handle the rest of the 3D work and putting those unused CPU cores to use. CPUs are actually not that terrible at it.

This may be an example where competition doesn't lead to a good outcome. Nvidia wants a piece of Intel's pie, so they are pushing ray tracing. Less work for the CPU means more GPU demand.

Again, Moore's Law's slow death is going to force a change in this line of thinking, or no one will benefit.



We had nearly free, enormous gains for decades. Now that's gone, so everyone is frantically trying to find a way to keep up the pace of advancement. It will not come without sacrifices, unfortunately.

The difference between RTX and older changes like hardware T&L and programmable vertex shaders is that in the old days there were enormous gains to be had from increasing TDPs and moving to a new process, both of which are becoming precious resources.

What unused cores? Games like Battlefield will use all 8 threads on an i7. There really isn't any CPU headroom available. Not to mention CPUs would be significantly slower than a GPU at it.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
A secondary card dedicated to RTX would make more sense. Buyers who want it can purchase one, and the main GPU doesn't need silicon wasted on it. It's pretty, but it's getting turned off for performance on my system. If DICE were to implement multi-GPU in DX12 I would give it another go. No way am I dropping resolution for it.
 

jpiniero

Lifer
Oct 1, 2010
14,830
5,442
136
I'm still not convinced that the RT cores are the reason for Turing's bloat, as opposed to the integer or tensor cores.
 

coercitiv

Diamond Member
Jan 24, 2014
6,387
12,812
136
I'm still not convinced that the RT cores are the reason for Turing's bloat, as opposed to the integer or tensor cores.
Tensor cores should be counted together with the RT cores; neither is worth much without the other, since the tensor cores are meant to denoise the ray-traced output.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Yeah, human response time is interesting, and what you're saying is right, but there's one case where I think it's more complex, and that's compounding of reaction time. It's most noticeable to me when aiming in an FPS. Have you ever tried aiming with a mouse with vsync on (no adaptive refresh) and a forced delay on the output? Aiming becomes much harder because you go to aim but get the reaction late, so you're off a bit, then you adjust your aim but don't see that update until a bit later, then you need to readjust, and you're in this nasty feedback loop where you're constantly chasing your own input.

I think that matters a lot more: you view the output from the screen and that informs your input to the keyboard and mouse, but that's not a discrete thing that happens once, it's a continual loop. When you enter a motivated state, say a bad guy runs around the corner and you decide to shoot him in the head, you have a discrete block of time in which you need to move your crosshair over his head and then shoot, but during that time you're actually in a rapid loop of moving your mouse across the mousepad and constantly micro-adjusting. I think the latency there compounds in some sense, and that can lead to quite staggering differences in the total time it takes to aim. The reaction to the stimulus of someone appearing in the first place is more or less one discrete event, and the time it takes for a pixel to flip is an almost negligible difference, which I think is essentially what you're saying, and I definitely agree with that.

It'd actually be interesting to see some data on this: how long it takes people to land an accurate headshot with artificial delay added to their mouse movements. I think even the 12 ms or so of pixel response I had on my old IPS panel is physically noticeable when you're doing smooth, elongated mouse movements.

Over-exaggeration. If it takes you 250 ms to respond to a visual stimulus, 8 ms (60 Hz to 120 Hz) isn't going to make that much of a difference in the grand scheme of things. You're jumping through too many hoops while avoiding the elephant in the room that's plain in the numbers. You're better off improving your APM and reaction time than investing in overpriced hardware.

I've been building and gaming for as long as I can remember, playing a range of different games including a slew of FPS titles. Myself and a good number of others rank pretty high on stats while running bargain-basement hardware. It's more about your skill as a human being than what hardware you're running on. You can't buy your way to victory; you can only use hardware to mitigate some portion of your lack of skill. Other than that, it's just a feel-good experience. No one honestly cares about your stats. There is no prize or cookie for getting the best stats on some random CS:GO server.

As for that, the majority of people are on 1050 Tis or less; only 1-3% are on high-end monitors/GPUs. So you're talking into the wind, IMO. It's great that you decided to spend lots of money on hardware, but please don't try to make it out to be something it's not. Game developers cater to the mass market to stay in business, not to the 1-3% who own high-end equipment. You have little to no edge with such hardware.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
I'm already having this discussion with someone else in another thread. All things equal, the higher refresh rate will show a frame earlier, meaning you can react earlier. Yeah, diminishing returns: if you are bad at a game, no display will change that, the same way I won't magically run faster by buying the super-duper jogging shoe, but for an experienced jogger it could be worth it. So again, all other things equal, the difference will matter. It's simply statistics: having the lower latency will improve your score (KDR, whatever) over time (not in a single fight!) by a certain amount. I noticed this myself when moving from a crappy mouse to a better mouse and from 60 Hz to 144 Hz with blur reduction - clearly visible in the stats. The difference might be small, but it is there, and it's up to the user to decide if it's worth their money.

I mean, this could easily be tested with a reaction test: one person gets to see the trigger 7 ms earlier than the other, and the one pressing first wins. That person won't win every single time, but play long enough and a trend will emerge.

And network latency should be compensated for by the game to a certain degree, and the issues are plainly visible in many games. Ever wondered why you get killed after running around a corner, when the opponent theoretically shouldn't be able to see you anymore? That's because he lags: on his screen he still saw you and still hit you, but to you it looks like he's using "smart bullets" that shoot around corners.

The majority of your performance comes down to you - something people who like blowing tons of money on things don't like to hear. There's a bit of an elitist attitude that has crept into gaming whereby people like to believe you're not a gamer or an enthusiast, and won't get good stats, if you don't have hardware that only 1-3% of gamers own. It's marketing, and you took the bait. One player has 20 ms of latency, another has 80 ms. What happens to all of your thousand-dollar gimmicks in that case? It's a game after all... Jesus.

There's 10-30 ms of latency in keyboards and mice. Add even more if you are hooked into a cheap USB hub. Add more if wireless. There's latency in the teens of milliseconds all over the place, and it's insignificant in the grand scheme of things. You can probe this with a USB protocol analyzer.

No one is making money off of winning except for pros. You can buy a 9900K, SLI two 2080 Tis, and a 144 Hz monitor, and I guarantee you I can find a slew of people with a 1050 Ti and an ancient quad core on DDR3 who can kick your butt. If you wanna blow tons of money on hardware, by all means do it. It's your money.

But don't pretend it makes you an elite gaming enthusiast. You're just another random IP playing vidya along with the rest of the world, likely getting dominated by someone on poverty-tier hardware who logs more hours and thus has more skill. Also, let's not forget the age factor when it comes to things like reaction time. Gaming is gaming; it's for fun. If you're a pro, you run the cutting-edge hardware because you're sponsored - you're a billboard for the masses to buy similar wares. If you don't realize this yet in life, it's time to wake up.

As a casual gamer, you play to have fun. You're sometimes top ranked on a server, sometimes last ranked. It's insignificant.

The end of this debate.
 
Last edited: