NVIDIA RTX 2080 Ti / 2080 / 2070 (2070 review is now live!) information thread: reviews and prices


badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
This is a hot card. Easily hits like 57 C under water. My 1080 Ti used to never get past 50 C.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
It almost feels like they should have left the Tensor cores out and just had ray tracing as a feature. That way, the space for the Tensor cores could have been used to increase performance with more SMs, ROPs, caches, etc., which would also have boosted ray-tracing performance.

Maybe there's a reason why they did not do so. Moore's Law is getting shaky nowadays, but we're still getting scaling, so more transistors per unit area. What's really slowing down is performance gains and, worse, power reductions. High-end CPU and GPU designs are now entirely power and thermal limited.

So what do you do? You add accelerators to take advantage of the abundant area. Power limits are "solved" since accelerators only run sometimes, not always, which allows that part of the chip to power down.
Aren't the Tensor cores obligatory for practical use of the RT cores? Right now they can't do full ray tracing with the available hardware and must fudge (de-noise) the output. It works quite well, but it's still not fully accurate.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
I don't think it's a matter of taste. FXAA lowers overall IQ: edges are one thing people focus on, but the IQ of the rest of the scene also matters, and FXAA indiscriminately lowers the quality of textures in the middle of polygons as well as blurring the edges.

His point was that this doesn't matter for competitive shooters, as you won't focus on such details at all. What matters is fps and an anti-blur display (strobing). To show what this means, look at this test: on my display I can read the street names with zero issues. On a normal display all you see is bluuuur.
 
Reactions: psolord

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Aren't the Tensor cores obligatory for practical use of the RT cores? Right now they can't do full ray tracing with the available hardware and must fudge (de-noise) the output. It works quite well, but it's still not fully accurate.

Yes. Even beyond that, it's likely (as DLSS shows) that people will find ways to put the Tensor cores to good use in at least some games: they have reasonably wide applicability, and NV has massive internal software development muscle.

The real-time ray tracing stuff is much more specialised, I think?
 

coercitiv

Diamond Member
Jan 24, 2014
6,393
12,826
136
When you bring out the shill card, you probably have no arguments.
No shill card was brought out.

It was a joke about the oft-used "AMD cards/tech will get better in time", to which the mandatory reply on this forum was "NVIDIA cards give you the performance and features right now, no need for pointless waiting and hoping". While I must admit I was myself too optimistic about DX12 (by now I had hoped it would be much more widely used in AAA titles), it's always a good idea to look back and learn from the arguments used by both "camps". The truth is somewhere in the middle, isn't it?

Why not just tell me why drivers won't improve Turing performance over time?
Because I don't know whether there's any (meaningful) performance boost left to unlock. What I do know, though, is that one does not pay a premium over Pascal only to receive a fully unlocked product months before the 7nm generation launches.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
The problem is getting developer support... and that's going to be tough without the consoles supporting HW RT. And if the consoles support HW RT, that obviously means AMD will too.

This introduces an important point: dev support will typically be minimal until the consoles adopt it, and we know from history that, to keep console prices reasonable, each console generation borrows tech from the PC mid-range of the prior generation. Performance-wise that compares badly to the biggest and best on PC, typically something like 1/3 or 1/4 the speed. RT is only just possible at 1080p on the very best cards now, so consoles won't have it for at least one more generation (a console generation being 5-6 years).

We'll see devs support it before then, but it'll mostly be where Nvidia pays to ship engineers out to studios to help them implement it at low cost to the devs. Nvidia can only afford to do that so much, and only the handful of devs who want to be seen as the most technically proficient (like the Metro guys) will do it absent any other benefits.

But anyway, as I've mentioned here before, that's just how these features evolve. It's chicken/egg: it's awkward and slow-performing at first, and early adopters pay through the nose for it, but once it's all established, runs smoothly, and looks the bomb, we'll all be glad we evolved through that phase. The same was true for DX9 shader adoption, for tessellation, and for plenty of other things throughout the history of GPU evolution. I'm still on the fence with the 2080 Ti. I actually just bought the new TR, and performance at 4K maxed isn't great on my 1080; I've had to drop KCD to 2560x1440 to get it playable, so it's tempting to go that route anyway. No real promising news from AMD at this point.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Aren't the Tensor cores obligatory for practical use of the RT cores? Right now they can't do full ray tracing with the available hardware and must fudge (de-noise) the output. It works quite well, but it's still not fully accurate.

Argh. You are right. Thanks.
 

jpiniero

Lifer
Oct 1, 2010
14,835
5,452
136
This introduces an important point: dev support will typically be minimal until the consoles adopt it, and we know from history that, to keep console prices reasonable, each console generation borrows tech from the PC mid-range of the prior generation. Performance-wise that compares badly to the biggest and best on PC, typically something like 1/3 or 1/4 the speed. RT is only just possible at 1080p on the very best cards now, so consoles won't have it for at least one more generation (a console generation being 5-6 years).

Neither console is expected to be released until 2020 at the very earliest, though.
 

SirDinadan

Member
Jul 11, 2016
108
64
71
boostclock.com
Aren't the Tensor cores obligatory for practical use of the RT cores? Right now they can't do full ray tracing with the available hardware and must fudge (de-noise) the output. It works quite well, but it's still not fully accurate.
How do you define "fully accurate", and why do full ray tracing when you can be smarter? There's no analytical solution to the full light transport equation except for some very special corner cases. In commercial 3D render engines, you let the render converge to a good-enough state that contains so little noise that the human eye can't spot it.
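For reference, the "full light transport equation" here is the rendering equation; the hemisphere integral below is what a path tracer approximates with random samples:

$$L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i$$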

The trick with denoising is that you can get away with fewer samples and let the AI figure out the rest. Of course, 3D render engines and games will have different denoisers, as their time budgets to get a frame ready are not in the same order of magnitude: games can only afford 1-2 samples per pixel, so the denoiser has to work with a much noisier input.
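To make the sample-count point concrete, here's a toy Monte Carlo sketch in Python/NumPy, with a made-up random integrand standing in for a real path tracer: the noise of a pixel estimate falls only as 1/sqrt(N), so dropping from thousands of samples to 1-2 spp makes the image dramatically noisier, and that gap is exactly what the denoiser has to close.

```python
import numpy as np

rng = np.random.default_rng(0)

def render_pixel(spp):
    # One "sample" = one random light-path contribution (hypothetical
    # integrand; a real path tracer would trace rays here).
    return rng.exponential(scale=1.0, size=spp).mean()

# Measure the noise of the estimate across many renders of the same pixel.
for spp in (1, 2, 16, 256, 4096):
    estimates = [render_pixel(spp) for _ in range(2000)]
    print(f"{spp:5d} spp -> noise (std dev) ~ {np.std(estimates):.4f}")
# The std dev shrinks as 1/sqrt(spp): 4096 spp is ~64x cleaner than 1 spp.
```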

DLSS is a stopgap to give the Tensor cores something useful to do until hybrid raster/ray-tracing pipelines are used in games. And the first few effects that use RT cores, found in BF5 and SotTR, are just eye candy: more accurate shadows and reflections. Metro will up the ante, as the game will use it to mimic global illumination.
 
Reactions: PrincessFrosty

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
He has a 2080 Ti, I'm guessing.
Has anyone here got a 2080 or 2080 Ti?
Yes, I have a 2080 Ti with an EK full-cover block. I have it overclocked to 2085 MHz on the core and 7699 MHz on the memory with EVGA's 130% BIOS. Today it got as hot as 61 C playing Destiny 2.

I have never had a GPU pass 50 C under water before, no matter how overclocked it was.
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
This is a hot card. Easily hits like 57 C under water. My 1080 Ti used to never get past 50 C.

Shouldn't be surprising. This die's power consumption is through the roof: probably the highest of any SKU Nvidia has made, Titan, Quadro, or GeForce, since Fermi.
 
Mar 11, 2004
23,175
5,641
146

That makes DLSS seem quite unimpressive, and I'm blown away that it's being touted as much as it is, if this is really the intent for it. I'd think Nvidia would want to focus on how they can simply outdo AMD in brute-force rendering (but then if they tout the 4K rendering capability and talk that up alongside the BFG Displays, it really makes them seem like a joke: sure, if you can drop $5k-10k you can really show those consoles who's boss!).

And if I were Nvidia I'd have been touting it to developers, not consumers, for the time being. Let it develop and start to actually show tangible benefits. But then they had to make a case for all the extra die space devoted to this stuff; otherwise they risk the RTX features being like Fermi. Actually, this reminds me quite a bit of Fermi: exceptional performance, but kinda hot, power hungry, and expensive. They had to make the case for Fermi's compute capability to justify it, and the same was true of Vega and earlier pre-Polaris GCN cards, which were pretty large, ran a bit hot, and used quite a bit of power.

Plus, I wonder what its perf/W looks like when you factor in all the supercomputer analysis; I'm guessing that would cause a nosedive. And if it keeps trailing native rendering, it'll be a pretty big dud except perhaps in the window between jumps in rendering resolution (say, going from 4K to 8K, where we have 4K rendering down pretty well but can't yet do 8K properly). This might be great for mobile and other power-constrained situations (say, for instance, consoles...) where you can try to get improved graphics at a lower render resolution, but for high-end desktop GPUs it's not impressive. Plus, if it breaks other "cheats" (like, say, checkerboarding) or screws up other post-processing effects, it will be even more limited.

Which raises a question: have we seen Turing fully loaded, doing ray tracing, DLSS, and rendering all at once? Because if not, its power use is going to climb further (and/or it could potentially lead to throttling issues).
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
That makes DLSS seem quite unimpressive, and I'm blown away that it's being touted as much as it is, if this is really the intent for it. I'd think Nvidia would want to focus on how they can simply outdo AMD in brute-force rendering (but then if they tout the 4K rendering capability and talk that up alongside the BFG Displays, it really makes them seem like a joke: sure, if you can drop $5k-10k you can really show those consoles who's boss!).

In the past, something such as this would have been called a cheat. Times have changed, I guess, and now it's called innovation... In the end, the jury is still out on DLSS.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
In the past, something such as this would have been called a cheat. Times have changed, I guess, and now it's called innovation...
Yep, very true. A "feature" that intentionally drops the resolution by design. Remember when AMD were slammed for putting adjustable tessellation in their driver? But here, dropping the whole resolution automatically is OK because Jensen "OMG 10 gigarays" Huang said so.

This card is another 5800 Ultra / Fermi: hot, power-hungry, overpriced, and under-performing, with failed promises.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
No, not cheating. The graphics card is still outputting a native resolution image in the end. If you ran this on a conventional gfx card it'd be crushingly slow.

If the neutral net can do the image reconstruction well enough it'll be a really good feature. If it can't then it won't.

We'll find out at some stage.

Since it's a bit subjective, I anticipate daft, bitter fights about it.
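For a sense of what "neural net image reconstruction" means in practice, here's a minimal learned-upscaler sketch in Python/PyTorch. This is a generic super-resolution toy, not NVIDIA's actual DLSS network (which also uses temporal data and runs on the Tensor cores):

```python
import torch
import torch.nn as nn

class TinySR(nn.Module):
    """Toy super-resolution net: conv features -> PixelShuffle upsample."""
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=5, padding=2), nn.ReLU(),
            # 3 * scale^2 channels get rearranged into a scale-x larger image
            nn.Conv2d(32, 3 * scale * scale, kernel_size=5, padding=2),
            nn.PixelShuffle(scale),
        )

    def forward(self, x):
        return self.body(x)

lr = torch.rand(1, 3, 540, 960)   # stand-in for a low-res frame
hr = TinySR(scale=2)(lr)          # -> torch.Size([1, 3, 1080, 1920])
print(hr.shape)
```

Whether the reconstruction is "well enough" then comes down to what such a net was trained on, which is why per-game training and the occasional weird artifact are both to be expected.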
 

coercitiv

Diamond Member
Jan 24, 2014
6,393
12,826
136
No, not cheating. The graphics card is still outputting a native resolution image in the end. If you ran this on a conventional gfx card it'd be crushingly slow.
If you ran this on a conventional gfx card, you would use 1800p TAA vs. "4K" DLSS and get similar image quality at equivalent performance, as Hardware Unboxed showed.
But dig a little deeper, and at least using the Infiltrator demo, DLSS is pretty similar in terms of both visual quality and performance to running the demo at 1800p and then upscaling the image to 4K. Again, it's just one demo, but poring over the footage and performance data really tempered my expectations for what to expect when DLSS comes to real world games.
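The back-of-envelope math behind that comparison (a quick sketch from the standard resolution figures, not from the article itself): 1800p shades roughly 30% fewer pixels than native 4K, which is about where DLSS's performance win sits.

```python
# Shaded-pixel counts for the two contenders.
native_4k = 3840 * 2160   # 8,294,400 pixels per frame
res_1800p = 3200 * 1800   # 5,760,000 pixels per frame
print(res_1800p / native_4k)  # ~0.69 -> ~31% less shading work per frame
```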

PS: on a side note, it's funny to see reviewers inadvertently telling gamers they did not need to play their games at native 4K, since they can get subjectively equivalent image quality at faster framerates by lowering the render resolution.
 

coercitiv

Diamond Member
Jan 24, 2014
6,393
12,826
136
That last statement is, I'd think, good reason to reserve judgement about that article.
Yeah, let's ignore screenshots and demo recordings and reserve judgement about the article. In fact, let's reserve judgement about all articles that examine DLSS.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
How do you define "fully accurate", and why do full ray tracing when you can be smarter? There's no analytical solution to the full light transport equation except for some very special corner cases. In commercial 3D render engines, you let the render converge to a good-enough state that contains so little noise that the human eye can't spot it.

The trick with denoising is that you can get away with fewer samples and let the AI figure out the rest. Of course, 3D render engines and games will have different denoisers, as their time budgets to get a frame ready are not in the same order of magnitude: games can only afford 1-2 samples per pixel, so the denoiser has to work with a much noisier input.

DLSS is a stopgap to give the Tensor cores something useful to do until hybrid raster/ray-tracing pipelines are used in games. And the first few effects that use RT cores, found in BF5 and SotTR, are just eye candy: more accurate shadows and reflections. Metro will up the ante, as the game will use it to mimic global illumination.

I think you're right about DLSS. The Tensor cores make up something like a third of the compute transistors on the chip, so not using them is a massive waste, and DLSS gives them useful work until they can be used for RT or, more likely, GI. I'm actually more interested in GI; accurate reflections are something we've fudged for years, and I believe the fakes are close enough in most cases that you can only tell if you stop playing the game and analyse the scene for inaccuracies. The benefit will ultimately be devs not having to put in as much effort faking the effects once the hybrid pipeline is ubiquitous; in the short term it will be more work, as people have to do both during the transition. The good thing about GI is that you can load it onto the Tensor cores, take a bunch of other lighting work off the raster cores, and actually boost traditional performance with better visuals. Assuming there are enough Tensor cores to keep up.

My concern still strongly remains: are AMD going to do 100% traditional rasterization and BTFO these cards because they're not wasting (in some people's minds) that large part of the chip on mostly unused features? It's a lot of space. On the other hand, do we even need that much power? The 2080 Ti, despite "wasting" that space, is a great 4K card. Maybe the only thing AMD will be able to do is match that performance at something like 50% of the cost.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
If you ran this on a conventional gfx card, you would use 1800p TAA vs. "4K" DLSS and get similar image quality at equivalent performance, as Hardware Unboxed showed.

PS: on a side note, it's funny to see reviewers inadvertently telling gamers they did not need to play their games at native 4K, since they can get subjectively equivalent image quality at faster framerates by lowering the render resolution.

Back in the day, when we were arguing over 1024 and 1280 and other small resolutions, people would really slam upscaling for looking awful; I think that was off the back of the CRT vs LCD arguments. Anyway, scaling did look rubbish then, but now that we're talking about much higher starting resolutions, like taking 2560x1440 up to 4K, it doesn't look as bad. You get that kind of blur effect from the upscaling, and it looks an awful lot like FXAA to me. So partly this could be a side effect of us being used to lower image quality from the ubiquitous post-processing AA these days, or maybe the upscaling techniques are better, or maybe starting from a much higher resolution helps, or some combination of all those things.

I've been forced to play KCD (great game, btw) at 2560x1440 because I just can't get 4K above 30 fps no matter what I disable, but 2560x1440 upscaled to 4K actually doesn't look too shabby; I'm starting to think DLSS might look about equivalent. It probably just corrects for a few of the rendering artefacts you get when you upscale, and I think those artefacts get less noticeable the higher the resolution you start with.
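For the curious, 2560x1440 -> 3840x2160 is exactly 1.5x per axis, so most output pixels fall between source pixels and get blended from their neighbours; that blending is the FXAA-like softness described above. A minimal grayscale bilinear upscale in Python/NumPy (illustrative only; real GPU scalers use fancier filters):

```python
import numpy as np

def bilinear_upscale(img, dst_h, dst_w):
    """Naive bilinear upscale of a 2-D (grayscale) image."""
    src_h, src_w = img.shape
    ys = np.linspace(0, src_h - 1, dst_h)  # source row coord per output row
    xs = np.linspace(0, src_w - 1, dst_w)  # source col coord per output col
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, src_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, src_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy       # the blend is the blur

frame = np.random.rand(1440, 2560)         # stand-in for a 1440p frame
print(bilinear_upscale(frame, 2160, 3840).shape, 3840 / 2560)  # (2160, 3840) 1.5
```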
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Yeah, let's ignore screenshots and demo recordings and reserve judgement about the article. In fact, let's reserve judgement about all articles that examine DLSS.

Let's not.

Can we, though, be a bit careful when people basically propose there's no point rendering at 4K native? That would have non-trivial implications if it held up.

For one thing, you'd have to test scaling DLSS down a resolution peg as well.

I think we know roughly what to expect from DLSS, as neural nets are quite well understood: they'll work very well in general (they're very good at image reconstruction) but throw up a few really weird artifacts at times. Doing stuff like installing texture packs, mods, etc. could turn it really odd.
 

SirDinadan

Member
Jul 11, 2016
108
64
71
boostclock.com
My concern still strongly remains: are AMD going to do 100% traditional rasterization
I wonder when they will share some details about how they plan to challenge the RTX line-up and tech. Although one can bad-mouth NVIDIA for the underwhelming RTX adoption in current games, 3D render engines will get updates to utilize the RT cores pretty soon; even CPU-only engines will incorporate the tech.
 