HDR needs tons of memory bandwidth, that's how the new cards pull ahead.
Curious, how does HDR gimp the GPU?
The 2080 Ti will be 3-5% faster over the 2080 than the 1080 Ti was over the 1080.
Actually, it's 37% more bandwidth, far less than the delta between the original 1080 and the Ti.
The shader delta is a bit higher than it was between the 1080 and the Ti.
The TDP (power limit) delta is much smaller, only 19%.
So with all that in mind, it doesn't make much sense that it would be as much faster as claimed in the video.
That said, I don't think the 2080 in particular will be power limited when the RT/Tensor cores are not being utilized, hence its boost clock could potentially be uncapped (which would be great for overclockers).
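For context, here's a minimal sketch of that arithmetic, assuming the commonly published reference specs (bandwidth in GB/s, CUDA core counts, board TDP in W); these numbers are my assumption and may differ slightly from whatever source the percentages above came from:

```python
# Rough sanity check of the spec deltas discussed above, using commonly
# cited reference specs (assumed values, not taken from the post).
specs = {
    "GTX 1080":    {"bw": 320, "cores": 2560, "tdp": 180},
    "GTX 1080 Ti": {"bw": 484, "cores": 3584, "tdp": 250},
    "RTX 2080":    {"bw": 448, "cores": 2944, "tdp": 215},
    "RTX 2080 Ti": {"bw": 616, "cores": 4352, "tdp": 250},
}

def delta(base, upper, key):
    """Percentage increase of `upper` over `base` for one spec column."""
    return 100.0 * (specs[upper][key] - specs[base][key]) / specs[base][key]

for base, upper in [("GTX 1080", "GTX 1080 Ti"), ("RTX 2080", "RTX 2080 Ti")]:
    print(f"{upper} vs {base}: "
          f"bandwidth +{delta(base, upper, 'bw'):.0f}%, "
          f"shaders +{delta(base, upper, 'cores'):.0f}%, "
          f"TDP +{delta(base, upper, 'tdp'):.0f}%")
```

With those assumed specs the 2080 Ti comes out around 38% more bandwidth, 48% more shaders, and a mid-teens TDP delta over the 2080, which is in the same ballpark as the figures quoted above.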
They are saying here that going from the 1080 Ti to the 2080 Ti is about a 35-45% performance increase in regular games. I didn't watch the whole thing, just skipped around.
https://youtu.be/YNnDRtZ_ODM
I'd be interested in getting one, but I hear for 4k it would be nice to get a 32. I think I could fit a 32 on my desk, but if it has those side panels too, maybe not. There's also a 32 or 34 model by Asus, or maybe it's still Acer, that should be out in 2019 or 2020.
The XB273K supports HDR10 and has DisplayHDR 400 certification. Acer released their X27 monitor, which is about $2000 and has DisplayHDR 1000 certification, at the same time as the Asus PG27UQ. The XB273K may be a more interesting monitor, especially if it doesn't have a fan.
More than the RTX cores (in very demanding games I am not one to shy away from turning down shadows or lighting), it's the Tensor cores and DLSS that are most interesting to me. But I really wonder what those DLSS slides are comparing. Is the 2080 doing 4k + DLSS compared to the 1080 doing 4k + SSAA or MSAA? Or is the 2080 even rendering at a lower resolution than 4k and using DLSS to mimic it? If it's either of those, is it a fair comparison? DLSS is the wildcard left for me.
I almost sprayed my coffee out my nose, and now have a weird taste in my mouth.
I am trying not to get my hopes up that this might be SSAA quality at essentially no overhead, like they seem to imply. But that would be a killer feature for many, especially on the lower end cards like the 2080/2070.
There is a performance hit (which varies) when enabling HDR on Pascal cards.
Some people are assuming that Turing cards will have no performance hit (or at least a much smaller one), so the performance difference looks larger with HDR enabled on both, but that won't reflect the performance difference in SDR.
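To make that concrete, here's a minimal back-of-the-envelope sketch; the frame rates and HDR penalties below are made-up numbers purely for illustration:

```python
# Illustration (made-up numbers) of how a Pascal-only HDR penalty can
# inflate the apparent Turing advantage when both cards are benchmarked
# with HDR enabled.
pascal_sdr, turing_sdr = 100.0, 130.0        # hypothetical SDR frame rates
pascal_hdr_hit, turing_hdr_hit = 0.10, 0.0   # assumed HDR penalties

pascal_hdr = pascal_sdr * (1 - pascal_hdr_hit)
turing_hdr = turing_sdr * (1 - turing_hdr_hit)

print(f"SDR delta: +{100 * (turing_sdr / pascal_sdr - 1):.0f}%")  # +30%
print(f"HDR delta: +{100 * (turing_hdr / pascal_hdr - 1):.0f}%")  # +44%
```

Same cards, but the HDR-on comparison makes the newer one look noticeably further ahead than it would be in SDR.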
I almost sprayed my coffee out my nose, and now have a weird taste in my mouth.
This isn't meant as an attack on you at all, but this is where we're moving to. We have informed people making reasoned arguments while calling the $800 and $600 options lower end. The market shift with this generation is remarkable.
Is that right? I've never noticed a performance difference with it on or off. I haven't done extensive testing, of course, and I always use vsync on a 60 Hz HDR TV, so there's that.
For me, it's a huge stretch to the 2070, even at the promised eventual $500 pricing. I'd be happier if the 2060 could do DLSS but no RTX..., though I half suspect the GTX 2060 might just be a rebranded 1070, and the GTX 2050 a rebranded 1060. It looks like we have a long wait to find out.
Yea, I'm guessing RTX optimisations, and it's also a good idea not to launch it at the same time as Red Dead 2.
BF5 has been delayed by 1 month - this may bode well for further RTX optimisations at launch.
As far as I know this is the only site that has tested it extensively and only done so with the GTX 1080 and at 4k -
https://www.computerbase.de/2018-07...force/2/#diagramm-call-of-duty-wwii-3840-2160
As you can see, it varies game to game, some with negligible impact, some with a pretty large impact. I suspect this means it might vary scene to scene as well (as in where they test in the game).
I'm cautiously optimistic on DLSS too, depending on whether it 'just works' on every title or if it basically depends on nVidia pushing a well-trained NN on a game-by-game basis. But like you say, nearly zero overhead for very high quality AA would be an incredible feature, especially at 1440p and 4k, where you can still see aliasing in motion but no card is fast enough to crank the AA.
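If DLSS does turn out to render at a lower internal resolution and upscale (one of the scenarios questioned earlier), the potential shading savings are easy to estimate. The 1440p internal resolution here is my assumption for illustration, not a confirmed DLSS detail:

```python
# Rough pixel-count arithmetic for a "render lower, upscale to 4k" scenario.
# The 1440p internal resolution is assumed, not confirmed.
native_4k = 3840 * 2160
internal  = 2560 * 1440

print(f"Shaded pixels at native 4k: {native_4k:,}")
print(f"Shaded pixels at 1440p:     {internal:,}")
print(f"Ratio: {native_4k / internal:.2f}x fewer pixels shaded")  # ~2.25x
```

That kind of ratio is where a "nearly free" AA-plus-upscale path could plausibly get its headroom, assuming the reconstruction quality holds up.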
What worries me is that AMD optimized titles may not be eligible for NN training.
So by those graphs, you can get 100+ fps at 4k in Battlefield 5 vs 67 fps at 4k on the 1080 Ti? Strange how some games are so close in fps while in others you see a 30-40 fps improvement.
If that's legit, perhaps the 2080 will also outperform the 1080ti.
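For what it's worth, here's the quick arithmetic on those Battlefield V numbers, taking the 67 and 100 fps readings at face value; it would put the gain near 50%, above the 35-45% range mentioned earlier:

```python
# Quick check of the Battlefield V figures read off the graphs.
old_fps, new_fps = 67.0, 100.0
print(f"Speedup: +{100 * (new_fps / old_fps - 1):.0f}%")            # ~+49%
print(f"Frame time: {1000 / old_fps:.1f} ms -> {1000 / new_fps:.1f} ms")
```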
Yea, I noticed that was off too. I'm pretty sure with max settings at 4k you can see around 60+ fps in Witcher 3 on the 1080 Ti, going by last year's benchmark results.
There's something seriously wrong with those numbers for Witcher 3.
They claim a 44 fps average with max settings at 4k on a 1080 Ti -- that is barely above a 980 Ti, which averages around 37 fps at 4k max.
In fact, a 1080 Ti should be averaging around 65 fps with everything maxed out, and a Titan V should be getting around 85 fps.
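As a rough sanity check on that, here's a minimal sketch; the ~1.75x scaling factor for a 1080 Ti over a 980 Ti in GPU-bound 4k workloads is my own ballpark assumption, not a measured figure:

```python
# Back-of-the-envelope check on the Witcher 3 claim. The ~1.75x scaling
# factor for a 1080 Ti over a 980 Ti at 4k is an assumed ballpark value.
fps_980ti = 37.0
scaling_1080ti = 1.75
print(f"Expected 1080 Ti: ~{fps_980ti * scaling_1080ti:.0f} fps")  # ~65 fps
print("Claimed 1080 Ti:   44 fps  <- well below expectation")
```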