Sony announced that its Ultra app will support 4K HDR streaming only if you have a Kaby Lake Intel CPU; if not, no HDR from the app. The way I read it, it won't work without one even if you have a GPU with the proper HDMI version to pass the metadata on to the TV.
According to WCCFTech, Nvidia will support HDR gaming only if you use GameStream from a Pascal GPU to an Nvidia Shield hooked up to an HDR TV, since as of yet there are no HDR monitors available. They say nothing about running HDMI directly to the TV, or through a compatible AVR, to get HDR to your set. Is this really how it'll work? You need a $200 box in addition to your $400+ GPU just to get HDR to your TV, when HDMI 2.0a can handle the metadata and color information perfectly fine on its own? That seems utterly ridiculous. What about AMD? Nvidia claims they are working on getting some titles updated for HDR, but does that mean those games won't be in HDR at all if you have an AMD GPU? Is there some sort of standard on the PC for implementing HDR into a game that works with either brand?
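For what it's worth, the closest thing I've found to a vendor-neutral answer is on the API side: Windows 10's DXGI exposes an HDR10 color space and mastering metadata that a game can set on its swap chain, and the GPU driver is then supposed to carry that to the display regardless of brand. Below is a rough sketch of what that looks like, based on my own reading of the DXGI headers and nothing Nvidia or AMD have confirmed; the EnableHdr10 function name and the mastering numbers are just made up for illustration.

```cpp
// A minimal sketch, not a confirmed recipe: 'swapChain' is assumed to be an
// IDXGISwapChain4 that was already created with a 10-bit back buffer
// (DXGI_FORMAT_R10G10B10A2_UNORM) on Windows 10.
#include <dxgi1_5.h>

HRESULT EnableHdr10(IDXGISwapChain4* swapChain)
{
    // HDR10 = Rec.2020 primaries + SMPTE ST 2084 (PQ) transfer curve.
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

    // Ask the driver (Nvidia or AMD alike) whether the attached display can take it.
    UINT support = 0;
    HRESULT hr = swapChain->CheckColorSpaceSupport(hdr10, &support);
    if (FAILED(hr) || !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return E_FAIL; // this display/driver path can't present HDR10

    // Switch the swap chain into the HDR10 color space.
    hr = swapChain->SetColorSpace1(hdr10);
    if (FAILED(hr))
        return hr;

    // Pass mastering metadata for the OS/driver to carry to the display over
    // HDMI 2.0a / DisplayPort. The numbers below are placeholder mastering values.
    DXGI_HDR_METADATA_HDR10 meta = {};
    meta.RedPrimary[0]   = 35400; meta.RedPrimary[1]   = 14600; // Rec.2020 red,
    meta.GreenPrimary[0] =  8500; meta.GreenPrimary[1] = 39850; // green, blue and
    meta.BluePrimary[0]  =  6550; meta.BluePrimary[1]  =  2300; // D65 white point,
    meta.WhitePoint[0]   = 15635; meta.WhitePoint[1]   = 16450; // in 0.00002 units
    meta.MaxMasteringLuminance     = 1000; // nits
    meta.MinMasteringLuminance     = 500;  // 0.0001-nit units (0.05 nit)
    meta.MaxContentLightLevel      = 1000;
    meta.MaxFrameAverageLightLevel = 400;
    return swapChain->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_HDR10, sizeof(meta), &meta);
}
```

If that's really all it takes, then nothing about it is tied to a Shield box, which is exactly why the GameStream-only story doesn't add up to me.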
Further, Nvidia claims they are working to get Netflix streaming in HDR on the PC via Microsoft's PlayReady 3.0. From what I've read, PlayReady 3.0 is hardware-based DRM. Do all the components in the system have to support it for HDR streaming to pass through? Meaning if you use an older CPU but have a new GTX 1080, you can't get HDR because you need a new CPU?
All this makes it seem like they are trying to fail. Hardware locking, requiring extra hardware just to get the signal to your TV for games, hardware-based DRM potentially locking you out of content entirely. To me it seems like it'll be one big disaster that makes adopting a game-changing technology almost impossible on the PC. I'm trying to figure all this out, and my reading today has made it look like a huge failure right out of the gate.