HDR support, displays and video cards, oh my.


Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
The way you wrote it ("The only way I can get HDR working right now...") strongly implied that you had tried and couldn't get it working on your PC - especially in a thread specifically about PC HDR support.
I'm only using my Samsung 4k TV's built in apps, because I can't get it working on my PC because my cards+software don't support it.
 

simas

Senior member
Oct 16, 2005
412
107
116
VirtualLarry - so how do you like the 40 inch TV/monitor so far? The Amazon reviews were nasty, complaining that it is not suitable for computer use. Thank you, as I am considering something similar.
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
I don't know, it seems like 10 bit color falls into the same category as color space. We can display sRGB without banding at 8 bits, and that color space and bit depth are already more than the human eye can discern.

I'd be glad to be proven wrong but the color fidelity stuff sounds comparable to high sample rate and high bit depth audio. Yeah, 192 kHz and 24 bit "more closely" approximates an analog audio waveform than 44.1 kHz / 16 bit, but to your actual senses, it makes no difference. You can't hear the information contained in the higher sample rate because all that buys you is ultra high frequency information which is above human hearing range, and 44.1 kHz captures all frequencies you can hear. The increased bit depth serves to decrease quantization noise, which is already lower than the threshold of human hearing at 16 bit. There's a reason SACD and DVD-Audio never took off.

The real improvement is going to come from display tech allowing for brighter whites and darker blacks.

LOL, the human eye can catch all color. This is the same bullshit as the human eye can't see over 60hz, now that we have 144hz monitors the difference is clear as night and day. In fact if we get 244hz monitors we are probably going to be able to notice the difference.
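To put numbers on the audio analogy quoted above: the standard ideal-quantizer approximation puts N-bit PCM's signal-to-noise ratio at about 6.02N + 1.76 dB. A minimal sketch:

```python
# Theoretical SNR of ideal N-bit PCM (full-scale sine, uniform quantizer):
# SNR ≈ 6.02 * N + 1.76 dB
def pcm_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

print(f"16-bit: {pcm_snr_db(16):.1f} dB")  # ~98 dB of dynamic range
print(f"24-bit: {pcm_snr_db(24):.1f} dB")  # ~146 dB, beyond any playback chain
```

Roughly 98 dB already spans the gap between a quiet room and painfully loud, which is the quoted poster's point about 16 bit being enough for listening.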
 

freeskier93

Senior member
Apr 17, 2015
487
19
81
That's not true at all; banding can easily be seen in 8 bit content if you go much higher than 100 nits. HDR content is mastered or generally displayed at 1000+ nits, which is why 10 bit depth is important. Content mastered and displayed with a 1000 nit ceiling would have all sorts of banding issues if it were 8 bit.



It's not quite the same. Your statements about 8 bit color are true... if you assume a maximum light output of 100 nits. Imagine if your statement about 16bit sound were true, but only up to about 55 dB. If people wanted audio mastered for 90dB, 16bit wouldn't cut it anymore. Same thing with 8bit vs 10bit. 8bit is fine up to 100 nits, but HDR content is made to be displayed with a range up to 1000 nits.
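A quick way to see the 100 nit vs 1000 nit point is to compute the worst-case luminance jump between adjacent code values. The gamma-2.2 curve here is my simplifying assumption (real HDR uses the PQ curve, which is designed around exactly this problem), so treat it as a sketch, not mastering math:

```python
# Worst-case luminance step between adjacent code values on a gamma-2.2
# display at a given peak brightness. Larger steps = more visible banding.
def max_step_nits(bits: int, peak_nits: float, gamma: float = 2.2) -> float:
    levels = 2 ** bits
    lum = [peak_nits * (i / (levels - 1)) ** gamma for i in range(levels)]
    return max(b - a for a, b in zip(lum, lum[1:]))

for bits, peak in [(8, 100), (8, 1000), (10, 1000)]:
    print(f"{bits}-bit @ {peak} nits: {max_step_nits(bits, peak):.2f} nits/step")
```

Near white, an 8 bit ramp at 1000 nits takes multi-nit jumps, and going to 10 bit cuts the step size by roughly 4x.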



Darker blacks more than anything. This is the main reason that plasma was always superior to LCD, and now OLED is.

The issue is actually twofold, because an increased color space also decreases the resolution of the colors that can be displayed.

If you look at the Rtings reviews you'll find that a lot of 10-bit panels still show banding too. For me there's no middle ground on color banding: either you see it or you don't. So why the hell spend more money on some fancy 10 bit display if you still see banding?

The ultimate display will be 12-bit OLED with rec.2020 coverage, but it's going to be a while until we get there. Currently, I think the absolute best bang for the buck is a good quality VA panel with local dimming, bit depth doesn't really matter. Love my Vizio M series. Contrast isn't as good as OLED but damn does it still look good, especially given the price.
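The gamut/bit-depth tradeoff mentioned above can be sketched numerically. The coverage figures below are commonly cited approximations of how much of the CIE 1931 visible area each gamut spans; they're rounded assumptions on my part:

```python
# Spreading the same number of code values over a wider gamut coarsens the
# color steps; the extra code values of 10-bit buy that resolution back.
REC709_COVERAGE = 0.359   # approx. share of CIE 1931 visible area (assumed)
REC2020_COVERAGE = 0.758  # approx. share of CIE 1931 visible area (assumed)

def code_density(bits: int, coverage: float) -> float:
    """Code values per channel pair, per unit of gamut area covered."""
    return (2 ** bits) ** 2 / coverage

d709_8 = code_density(8, REC709_COVERAGE)
d2020_8 = code_density(8, REC2020_COVERAGE)
d2020_10 = code_density(10, REC2020_COVERAGE)

print(f"8-bit rec.709  : {d709_8:,.0f}")
print(f"8-bit rec.2020 : {d2020_8:,.0f}  (coarser: same codes, ~2x the gamut)")
print(f"10-bit rec.2020: {d2020_10:,.0f}  (finer than 8-bit rec.709 again)")
```

Same idea in one line: stretch 8 bits over twice the gamut and the steps get about twice as coarse; 10 bit more than recovers the difference.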
 

repoman0

Diamond Member
Jun 17, 2010
4,536
3,442
136
LOL, the human eye can catch all color. This is the same bullshit as the human eye can't see over 60hz, now that we have 144hz monitors the difference is clear as night and day. In fact if we get 244hz monitors we are probably going to be able to notice the difference.

The human eye has a finite ability to discern between nearby regions of the CIE chromaticity diagram, and much of the green region looks the same to most people, so much of the expanded gamut is pointless. It probably does change things having that different green primary available, but the number of additional colors isn't as drastic as it appears.

There's plenty of research on the subject if you want to actually learn subtleties beyond "the human eye can catch all color" which is obviously false. It makes no sense because color isn't even a physical property of the world, it's your brain assigning a visual cue to some mixture of light wavelengths. For example, purple light does not exist, but your brain perceives purple when wavelengths representing red and blue mix.

It's nowhere near the same as the 60 Hz issue and there was plenty of similar visual processing research available during that argument to support the fact that higher refresh rate would be perceptible, so I have no idea why you brought it up.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Ahh, OLED is one of the few technologies that is more or less natively capable of displaying HDR, especially given each pixel is individually illuminated. It's the gold standard right now. The cheaper HDR TV option will be IPS or VA with local dimming.

Are there fundamentally different versions of OLED? Otherwise I don't see how they will be used for PC monitors, due to the image retention problems they exhibit. We never saw plasma PC monitors for basically the same reason. Here is an example of the problem from an rtings.com review:

https://youtu.be/xz8O3sUH7xc?t=2m29s

It's bizarre that this site recommends OLEDs as great PC monitors all over the site (though they do mention the image retention problem), but then you watch the video review above and they specifically say not to use this display as a monitor.

As a fervent believer in and user of plasma TVs, I am well accustomed to the burn-in issue. If I play video games for a while with static parts of the screen, I will still see those objects for a while after switching content. How well will OLEDs perform in gaming if it takes only 5 minutes to reach the level of retention you see in the video above?

I have friends that use S6 Galaxys, which use OLED, for work and depending on the color of the background you can see the GPS UI burned into the screen. It seems to be a fundamental problem with the technology. How is this going to be eliminated to make the tech usable for PC monitors?
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
What parts of "HDR" really mean anything? Does anybody know? For typical use, 10-bit color doesn't make a difference as far as I know. It seems like what we really need is new display tech with deep blacks and very bright whites, not wider color gamut or 10 bit color.
You can definitely notice the wider color gamut, especially in greens and blues; trees and leaves in particular look much more like they do in real life.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,446
10,113
126
VirtualLarry - so how do you like the 40 inch TV/monitor so far? The Amazon reviews were nasty, complaining that it is not suitable for computer use. Thank you, as I am considering something similar.

There were precious few reviews of that Avera TV when I bought it, just a small handful between Amazon and Newegg.

I'm using a Sapphire Radeon RX 460 4GB Nitro card, in a Skylake i5 rig, with Windows 10, connected via HDMI2.0.

Text isn't great, there's sort of a sharpness "ringing" around the text.

I think I have it set to VSR 5120x2880, and then the Windows scaling factor set to 125%. Then it's decent.
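For what it's worth, the arithmetic behind that setup, assuming the standard 5120x2880 VSR mode (4x the 4K panel), works out like this:

```python
# VSR renders the desktop oversized and downsamples to the panel; Windows
# scaling then shrinks the logical workspace. Assumed 5120x2880 VSR mode.
vsr_w, vsr_h = 5120, 2880
scale = 1.25  # 125% Windows scaling
print(f"Effective workspace: {vsr_w / scale:.0f}x{vsr_h / scale:.0f}")  # 4096x2304
```

So text is supersampled from the 5K render, but UI elements end up sized as they would be on a 4096x2304 desktop, which is why it reads better than 100% scaling.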

I've got the TV set on "Power Saver", which doesn't allow adjustments to the picture.
 

simas

Senior member
Oct 16, 2005
412
107
116
Thank you Larry - you've given me your setup but not your impressions. What do you think about it so far (completely subjective)?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,446
10,113
126
Well, like I said, text isn't great, at 100% scaling. At 125% it's readable.

YouTube 4K nature videos look INCREDIBLE. Gaming is pretty decent too, at least, Skyrim (vanilla). I need to download the free high-res texture pack DLC, as you can see the textures at 4K res.
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
Are there fundamentally different versions of OLED? Otherwise I don't see how they will be used for PC monitors due to the image retention problems they exhibit. We never saw plasma pc monitors for basically the same reason. Here is an example of the problems from a rtings.com review:

I use my LG OLED exclusively as a monitor in my living room. I had a plasma previously, and image retention/burn-in was painfully obvious on that screen. On the OLED, I haven't noticed it, ever. I'll have to look this weekend and see if I can spot it, but switching from browser/desktop into movies I don't see any residuals like I used to on plasma.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I use my LG OLED exclusively as a monitor in my living room. I had a plasma previously, and image retention/burn-in was painfully obvious on that screen. On the OLED, I haven't noticed it, ever. I'll have to look this weekend and see if I can spot it, but switching from browser/desktop into movies I don't see any residuals like I used to on plasma.
Image retention is one thing - which might be annoying, but that's about it - but burn-in is a serious issue with OLED that no one is close to solving. At my job, we have various Samsung phones and tablets on display, running a demo loop. The loop is constant, running whenever the device isn't used for ~30 seconds, and has zero static elements. The devices automatically sleep outside of opening hours. And even with all that, there is serious burn-in on anything more than a year old. Galaxy S6, Tab S2, they've all got horrible, very visible burn-in. Sure, they're essentially displaying images for 12-ish hours a day, six days a week. But a PC monitor used for work will see as much in a year and a half, if not sooner - and with A LOT more static elements on screen. Like the Windows taskbar? You'd better like it, because it's going to be there for as long as the display is in use.
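The usage comparison above roughly checks out, under my assumed schedules (demo units ~12 h/day, 6 days/week, as described; an office monitor ~8 h/day, 5 days/week):

```python
# Rough screen-on-hours comparison: store demo units vs. an office monitor.
demo_hours_per_year = 12 * 6 * 52       # 12 h/day, 6 days/week
office_hours = 8 * 5 * 52 * 1.5         # a year and a half of office use
print(demo_hours_per_year)  # 3744
print(office_hours)         # 3120.0
```

So a year and a half of ordinary office use is in the same ballpark as one year of the demo-loop duty cycle that produced visible burn-in.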
 

VirtualLarry

No Lifer
Aug 25, 2001
56,446
10,113
126
Heck, my older Westinghouse 32" LCD HDTV (CCFL backlight) has burn-in. Or maybe just image retention, but it's been there for quite some time. Heavy contrast (white on black) seems to really cause it.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
The issue is actually twofold, because an increased color space also decreases the resolution of the colors that can be displayed.

If you look at the Rtings reviews you'll find that a lot of 10-bit panels still show banding too. For me there's no middle ground on color banding: either you see it or you don't. So why the hell spend more money on some fancy 10 bit display if you still see banding?

The ultimate display will be 12-bit OLED with rec.2020 coverage, but it's going to be a while until we get there. Currently, I think the absolute best bang for the buck is a good quality VA panel with local dimming, bit depth doesn't really matter. Love my Vizio M series. Contrast isn't as good as OLED but damn does it still look good, especially given the price.

The banding present in 10bit displays is ultimately down to white-balance and gamma accuracy. The white-balance charts posted on rtings, for example, are just 10 pt readings. While this seems like it makes sense, since most sets only have 10 pt WB controls, it doesn't mean that a television's WB controls can perfectly tune a set's greyscale. If you do 20 pt readings, in most cases the dE will be higher and the chart will not be as smooth; this is what shows up in 10bit banding tests. This can of course be corrected with a 3D LUT or just better WB controls.

Several of the Sony sets actually exhibit perfect gradients (the x930D, for example), since Sony is generally known for superior video processing.
10 bit processing is pretty new for televisions, and I imagine it will improve in the coming years.

Make no mistake though, the banding on 10bit is not nearly as bad as with 8bit, and that difference becomes much more apparent when you are talking about the much larger luminance range of HDR.
 