HDR support, displays and video cards, oh my.

VirtualLarry

No Lifer
Aug 25, 2001
56,447
10,117
126
I recently picked up an Avera 40" 4K UHD HDR TV, on sale BF week at Newegg, fairly inexpensively.

Anyways, it supports HDR, and so does my RX 460 4GB Nitro card.

Running Win10 1607 64-bit as the OS.

YouTube 4K nature videos look *spectacular*, and I think HDR is likely working.

There was a Newegg review on that TV, by someone with a GTX 1070, who was complaining that the colors were only so-so and that HDR seemed like a marketing gimmick. He also mentioned his HDMI dynamic range setting in the drivers was "limited".

So, I'm guessing that HDR wasn't working for him.

Is there a list of what card / output / display / driver combinations enable HDR throughout the pipeline?

Is there any way to objectively tell that HDR is actually working?
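One partial check I can think of is at least verifying that the source file or stream is actually HDR-encoded before blaming the rest of the chain. A minimal sketch, assuming ffprobe (from FFmpeg) is on the PATH; the filename is just a placeholder:

```python
# Check whether a video file carries HDR metadata (HDR10 uses the SMPTE ST 2084 / PQ
# transfer function; HLG uses ARIB STD-B67). Anything else is effectively SDR.
import json
import subprocess

def probe_hdr(path):
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True).stdout
    stream = json.loads(out)["streams"][0]
    transfer = stream.get("color_transfer", "unknown")
    primaries = stream.get("color_primaries", "unknown")
    is_hdr = transfer in ("smpte2084", "arib-std-b67")
    print(f"transfer={transfer}, primaries={primaries}, HDR source={is_hdr}")
    return is_hdr

probe_hdr("sample_4k_clip.mkv")  # hypothetical file name
```

That only covers the content end, of course; it says nothing about whether the GPU, driver, and TV are actually passing HDR through.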
 

David_k

Member
Apr 25, 2016
70
1
41
In most cases with this kind of generic display, not all of the specs are really true. Also, displays should indicate on screen when they enter HDR mode.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Yeah, I'm not sure that TV actually displays in HDR; it probably just says HDR because it can accept an HDR input without issue. A lot of TVs are marketed that way.
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
You have to have a combination of several things for HDR: an HDR-capable GPU (with the proper drivers), an HDR monitor, HDR content, and an HDR-capable port. If you run over HDMI 1.2, for example, you won't get HDR even when everything else is HDR-capable. If the video/game isn't HDR, you don't get any HDR benefit from your hardware, and so on.

Then some HDR content might be better than others; certain games will take proper advantage of HDR, while others won't. Right now, for average use, HDR is pretty much meaningless; it's just too small a slice of the market. I suspect it's going to be at least two years before it's a standard thing!
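The weakest-link idea can be boiled down to a toy sketch like the one below; every flag here is a hypothetical placeholder you'd fill in for your own setup, not something detected automatically:

```python
# HDR only reaches the screen if every stage of the chain supports it.
PIPELINE = {
    "GPU supports HDR10 output": True,           # e.g. Polaris / GTX 10-series
    "Driver has HDR output enabled": True,
    "Link is HDMI 2.0a+ or DP 1.4": True,        # older HDMI can't carry HDR metadata
    "Display accepts an HDR10 signal": True,
    "Content is mastered in HDR": False,         # SDR source = no HDR benefit
}

missing = [stage for stage, ok in PIPELINE.items() if not ok]
print("HDR end to end" if not missing else "Blocked at: " + ", ".join(missing))
```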
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
The only way I can get HDR working right now is via the built-in streaming apps or a 4K UHD Blu-ray player; HDR support is still nascent on the PC side, unfortunately. I can't get HDR YouTube to work anywhere personally, even via the built-in app on my 4K HDR TV.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
The only way I can get HDR working right now is via the built-in streaming apps or a 4K UHD Blu-ray player; HDR support is still nascent on the PC side, unfortunately. I can't get HDR YouTube to work anywhere personally, even via the built-in app on my 4K HDR TV.
Considering the PC in your sig has R9 290s, that's not a surprise - they don't support HDR in any form.

As for how Windows treats it, that's an excellent question. After all, Windows lacks any kind of colour management, so why should this be different in terms of HDR (which requires a wide colour gamut)?
 

repoman0

Diamond Member
Jun 17, 2010
4,538
3,447
136
What parts of "HDR" really mean anything? Does anybody know? For typical use, 10-bit color doesn't make a difference as far as I know. It seems like what we really need is new display tech with deep blacks and very bright whites, not wider color gamut or 10 bit color.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
What parts of "HDR" really mean anything? Does anybody know? For typical use, 10-bit color doesn't make a difference as far as I know. It seems like what we really need is new display tech with deep blacks and very bright whites, not wider color gamut or 10 bit color.

We need OLED to be mainstream. I mean, the $5,000 Dell is asking for a 30-inch OLED monitor with those specs, though... Worth it?
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
Can't really answer that directly, but CAN say that my 55" 1080p LG OLED is fantastic. True black is just awesome compared to the dark gray you get with LCD.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Considering the PC in your sig has R9 290s, that's not a surprise - they don't support HDR in any form.

As for how Windows treats it, that's an excellent question. After all, Windows lacks any kind of colour management, so why should this be different in terms of HDR (which requires a wide colour gamut)?
I'm not using my PC for 4k at all.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
You aren't watching YouTube videos in HDR. Only the Chromecast Ultra or a Samsung TV can display YouTube HDR videos for the time being. I also have not found an Avera TV that supports HDR10 or Dolby Vision, which means that whatever it's calling "HDR" isn't the real thing.
 
Last edited:

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
You aren't watching YouTube videos in HDR. Only the Chromecast Ultra or a Samsung TV can display YouTube HDR videos for the time being. I also have not found an Avera TV that supports HDR10 or Dolby Vision, which means that whatever it's calling "HDR" isn't the real thing.

The YouTube platform supports HDR and 8K. The source needs to support it too, though. There are many HDR demos on YouTube already.
 

repoman0

Diamond Member
Jun 17, 2010
4,538
3,447
136

So basically a combination of high peak brightness and a larger color space.

I think a larger color space is overblown. It extends the saturation the display is capable of producing, sure, but it's mostly biased toward the green direction. The human eye isn't particularly good at discerning differences between the greens in the upper part of the CIE chromaticity diagram, and it looks like it barely extends the red portion. With a display capable of producing perfect sRGB and also capable of producing the larger REC2020 space, I have my doubts that you would be able to tell much, if any, of a difference in the final output, given the same content mastered for each. Your eye can only tell so much of a difference in color content. It's a really interesting topic, actually -- I took a digital image processing course in grad school and it really opened my eyes to how complex the mathematics of color are:

https://en.wikipedia.org/wiki/Color_vision

To me, it looks like the color space stuff is an easy marketing gimmick, and the contrast ratio / brightness part of the spec is what will actually make the image look different. Since display companies have been touting nonsense like 5000000:1 contrast ratios for a while now by using fake dynamic numbers, they need some way to sell an actual high contrast ratio, and this is it.
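To put rough numbers on the gamut-size part of that, here's a quick sketch comparing the areas of the sRGB, DCI-P3 and Rec.2020 triangles in CIE 1931 xy space, using the published primary coordinates (with the caveat that xy area is a crude, non-perceptual metric, which is really my point):

```python
# Compare gamut triangle areas in CIE 1931 xy space via the shoelace formula.
def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

GAMUTS = {
    "sRGB/Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":       [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

base = triangle_area(GAMUTS["sRGB/Rec.709"])
for name, prims in GAMUTS.items():
    area = triangle_area(prims)
    print(f"{name:13s} area={area:.4f}  ({area / base:.2f}x sRGB)")
```

Rec.2020 works out to roughly 1.9x the xy area of sRGB (DCI-P3 to roughly 1.4x), but how much of that extra area the eye actually appreciates is a separate question.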
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
So basically a combination of high peak brightness and a larger color space.

I think a larger color space is overblown. It extends the saturation the display is capable of producing, sure, but it's mostly biased toward the green direction. The human eye isn't particularly good at discerning differences between the greens in the upper part of the CIE chromaticity diagram, and it looks like it barely extends the red portion. With a display capable of producing perfect sRGB and also capable of producing the larger REC2020 space, I have my doubts that you would be able to tell much, if any, of a difference in the final output, given the same content mastered for each. Your eye can only tell so much of a difference in color content. It's a really interesting topic, actually -- I took a digital image processing course in grad school and it really opened my eyes to how complex the mathematics of color are:

https://en.wikipedia.org/wiki/Color_vision

To me, it looks like the color space stuff is an easy marketing gimmick, and the contrast ratio / brightness part of the spec is what will actually make the image look different. Since display companies have been touting nonsense like 5000000:1 contrast ratios for a while now by using fake dynamic numbers, they need some way to sell an actual high contrast ratio, and this is it.

10-bit will also show a visual improvement over 8-bit. All in all, HDR is a move in the right direction; they just need to settle on a standard. I'm not buying a new TV until they do.
 

repoman0

Diamond Member
Jun 17, 2010
4,538
3,447
136
10-bit will also show a visual improvement over 8-bit. All in all, HDR is a move in the right direction; they just need to settle on a standard. I'm not buying a new TV until they do.

I don't know, seems like 10 bit color falls into the same category as color space. We can display sRGB without banding with 8 bits and that color space and bit depth is already more than the human eye can discern.

I'd be glad to be proven wrong but the color fidelity stuff sounds comparable to high sample rate and high bit depth audio. Yeah, 192 kHz and 24 bit "more closely" approximates an analog audio waveform than 44.1 kHz / 16 bit, but to your actual senses, it makes no difference. You can't hear the information contained in the higher sample rate because all that buys you is ultra high frequency information which is above human hearing range, and 44.1 kHz captures all frequencies you can hear. The increased bit depth serves to decrease quantization noise, which is already lower than the threshold of human hearing at 16 bit. There's a reason SACD and DVD-Audio never took off.
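For what it's worth, the textbook figures back that up: the ideal quantization SNR for N-bit PCM is roughly 6.02*N + 1.76 dB, so even 16-bit already exceeds what a normal playback chain and listening level can expose.

```python
# Theoretical quantization SNR (dynamic range) for ideal N-bit PCM audio.
for bits in (16, 24):
    snr = 6.02 * bits + 1.76
    print(f"{bits}-bit PCM: ~{snr:.1f} dB")
# 16-bit: ~98 dB, 24-bit: ~146 dB
```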

The real improvement is going to come from display tech allowing for brighter whites and darker blacks.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I'm not using my PC for 4k at all.
The way you wrote it ("The only way I can get HDR working right now...") strongly implied that you had tried and couldn't get it working on your PC - especially in a thread specifically about PC HDR support.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
The YouTube platform supports HDR and 8K. The source needs to support it too, though. There are many HDR demos on YouTube already.

There are, but only the Chromecast Ultra and Samsung TV YouTube apps are currently capable of displaying them in HDR mode.
 
Last edited:

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I don't know, seems like 10 bit color falls into the same category as color space. We can display sRGB without banding with 8 bits and that color space and bit depth is already more than the human eye can discern.

That's not true at all; banding in gradients can easily be seen in 8-bit content if you go much higher than 100 nits. HDR content is mastered for, and generally displayed at, 1000+ nits, which is why the 10-bit gradient is important. Content mastered and displayed with a 1000-nit ceiling would have all sorts of banding issues if it were 8-bit.

The increased bit depth serves to decrease quantization noise, which is already lower than the threshold of human hearing at 16 bit. There's a reason SACD and DVD-Audio never took off.

It's not quite the same. Your statements about 8-bit color are true... if you assume a maximum light output of 100 nits. Imagine if your statement about 16-bit sound were true, but only up to about 55 dB: if people wanted audio mastered for 90 dB, 16-bit wouldn't cut it anymore. Same thing with 8-bit vs 10-bit. 8-bit is fine up to 100 nits, but HDR content is made to be displayed with a range up to 1000 nits.
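To make that concrete, here's a rough sketch of the luminance step between adjacent code values near a 200-nit tone, comparing 8-bit and 10-bit quantization of the same SMPTE ST 2084 (PQ) signal; the 200-nit target and full-range coding are just illustrative assumptions:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(v):
    """Map a normalized PQ code value (0..1) to absolute luminance in nits."""
    vp = v ** (1 / M2)
    return 10000 * (max(vp - C1, 0) / (C2 - C3 * vp)) ** (1 / M1)

for bits in (8, 10):
    levels = 2 ** bits - 1
    # nearest code value to a ~200-nit mid-to-bright tone
    code = min(range(levels + 1), key=lambda c: abs(pq_eotf(c / levels) - 200))
    step = pq_eotf((code + 1) / levels) - pq_eotf(code / levels)
    print(f"{bits}-bit: one code step near 200 nits is about {step:.1f} nits "
          f"({100 * step / 200:.1f}% of the background)")
```

With 8 bits the step comes out to several nits, a few percent of the background level, which is easily visible as banding; 10 bits brings it down to roughly 1%, around where banding stops being obvious.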

The real improvement is going to come from display tech allowing for brighter whites and darker blacks.

Darker blacks more than anything. This is the main reason that plasma was always superior to LCD, and now OLED is.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
So basically a combination of high peak brightness and a larger color space.

I think a larger color space is overblown. It extends the saturation the display is capable of producing, sure, but it's mostly biased toward the green direction. The human eye isn't particularly good at discerning differences between the greens in the upper part of the CIE chromaticity diagram,

But we're nowhere near the upper part of the CIE range with DCI-P3, which is why the impact is very noticeable.

and it looks like it barely extends the red portion. With a display capable of producing perfect sRGB and also capable of producing the larger REC2020 space, I have my doubts that you would be able to tell much, if any, of a difference in the final output, given the same content mastered for each. Your eye can only tell so much of a difference in color content. It's a really interesting topic, actually -- I took a digital image processing course in grad school and it really opened my eyes to how complex the mathematics of color are:

https://en.wikipedia.org/wiki/Color_vision


I think you need to watch some UHD content. Grand Tour on Amazon for instance; you will immediately notice the wider gamut.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
We need OLED to be mainstream. I mean, the $5,000 Dell is asking for a 30-inch OLED monitor with those specs, though... Worth it?

Exactly. I would pick OLED over HDR any day. Will probably upgrade my TV once the OLED ones from LG become more reasonably priced. I hate watching space movies on my LED TV. Argh...
 

repoman0

Diamond Member
Jun 17, 2010
4,538
3,447
136
That's not true at all; banding in gradients can easily be seen in 8-bit content if you go much higher than 100 nits. HDR content is mastered for, and generally displayed at, 1000+ nits, which is why the 10-bit gradient is important. Content mastered and displayed with a 1000-nit ceiling would have all sorts of banding issues if it were 8-bit.

Fair enough, you are correct that I was thinking in terms of standard 100 nit sRGB brightness. The overall gradient has a wider dynamic range as contrast ratio increases.

I think you need to watch some UHD content. Grand Tour on Amazon for instance; you will immediately notice the wider gamut.

I'd be glad to try -- my monitor is capable of 10-bit and full Adobe RGB, I'm just not sure how to make the expanded gamut actually work with my video card, UHD content, etc., if it's even possible ...
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
Exactly. I would pick OLED over HDR any day. Will probably upgrade my TV once the OLED ones from LG become more reasonably priced. I hate watching space movies on my LED TV. Argh...

Ahh, OLED is one of the few technologies that is more or less natively capable of displaying HDR, especially given that each pixel is individually illuminated. It's the gold standard right now. The cheaper HDR TV option will be IPS or VA with local dimming.

BTW, AMD demo'd Polaris doing HDR during CES. Keep in mind that HDR will look much, much more stunning in person. The fact that you can actually see the difference in a non-HDR video is amazing.

https://youtu.be/hvD37UUcdIo?t=3m51s
and
https://www.youtube.com/watch?v=MnvctltAKLE
 