For some basic explanations below, I'll reference nits - one candela per square meter. The candela (roughly one candlepower) is approximately the luminous intensity of a common tallow candle, and is defined as one lumen per steradian. Technically, by the older definition, it is the luminous intensity of 1/600,000 of a square meter (1.667 x 10^-6 m^2) of a blackbody at the freezing point of platinum.
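Just to tie those units together (standard SI relations, nothing specific to this discussion):

$$1\,\text{nit} = 1\,\frac{\text{cd}}{\text{m}^2},\qquad 1\,\text{cd} = 1\,\frac{\text{lm}}{\text{sr}},\qquad \frac{1}{600{,}000}\,\text{m}^2 \approx 1.667\times10^{-6}\,\text{m}^2$$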
In the world of RGB-based color display information, 100 nits of brightness is usually reserved for full white (255-255-255). Colors are not as bright as a full white signal. Most modern TVs operate at about 500 nits - simply amplifying the brightness of the signal provided (100 nits x 5). Likewise, computer displays usually operate between 150 and 300 nits. Again, the color information provided still sits within the confines of 100 nits for maximum white: 0 nits = black (no display can do absolute blacks unless it is off, not even OLED), 100 nits = white.
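As a rough illustration of that mapping (my own sketch, assuming a simple gamma-2.2 display calibrated so full white sits at the 100-nit reference level):

```python
def sdr_code_to_nits(code_value, display_peak_nits=100.0, gamma=2.2):
    """Map an 8-bit SDR code value (0-255) to an approximate luminance in nits.

    Assumes a plain gamma-2.2 display whose full white (255) is set to
    display_peak_nits. A 500-nit TV just raises display_peak_nits; the
    signal itself still only describes fractions of full white.
    """
    normalized = code_value / 255.0
    return display_peak_nits * (normalized ** gamma)

print(sdr_code_to_nits(255))                         # 100.0 - reference white
print(sdr_code_to_nits(128))                         # ~22 nits - mid gray
print(sdr_code_to_nits(255, display_peak_nits=500))  # 500.0 - same white, just amplified
```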
Is the highlighted part above an indication that 4K, 96 Hz, HDR is possible? Or is 10-bit the first step in improving image quality in games, and HDR the second? I guess the reason I'm confused is, wasn't there a Half-Life 2 patch that showed off the use of HDR years ago? Why is this being brought to market in hardware now?
HDR is metadata transmitted in the stream on top of the 10-bit signal. DP 1.3 should have no issues being compatible with this (it already exceeds the HDMI 2.0a specification in terms of bandwidth and is claimed to be compatible).
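For a sense of what that metadata actually carries, here's a rough sketch of HDR10-style static metadata (mastering display color volume plus content light levels); the field names and luminance values below are my own illustrative labels, not any particular API:

```python
# Hypothetical container for HDR10-style static metadata - illustrative only.
hdr_static_metadata = {
    # Mastering display color volume (SMPTE ST 2086):
    "display_primaries": {
        "red":   (0.708, 0.292),   # BT.2020 primaries as (x, y) chromaticities
        "green": (0.170, 0.797),
        "blue":  (0.131, 0.046),
    },
    "white_point": (0.3127, 0.3290),        # D65
    "max_mastering_luminance": 1000.0,      # nits - peak of the mastering display
    "min_mastering_luminance": 0.0001,      # nits - its black level
    # Content light levels:
    "max_content_light_level": 800.0,       # MaxCLL: brightest single pixel in the content
    "max_frame_avg_light_level": 180.0,     # MaxFALL: brightest frame-average luminance
}
```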
The difference between HDR in games and HDR in displays comes entirely down to perception vs. reality. Half-Life 2 utilizes HDR rendering to simulate the effect of human vision when exposed to very bright lights in otherwise dark settings. It actively adjusts the brightness of the entire scene based upon whatever the brightest rendered source is (the sun, a lamp, etc.). You can use any display for this, since the peak brightness of your display is not manipulated in any way by the signal - it still never exceeds 100 nits for maximum white level. This results in other colors and dark areas often looking washed out.
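Very roughly, that kind of "perceptual" HDR works like the sketch below (my own simplified illustration of exposure-based adaptation, not Valve's actual code):

```python
import numpy as np

def tonemap_to_sdr(scene_luminance, adapted_peak):
    """Compress an internally HDR-rendered scene into the 0-1 SDR output range.

    scene_luminance: per-pixel luminance from the renderer (can be arbitrarily large).
    adapted_peak: the bright source the virtual 'eye' is currently adapted to
                  (the sun, a lamp, etc.), updated gradually from frame to frame.
    Everything is scaled relative to that peak, so adapting to a bright sun
    crushes dim areas toward black, while adapting to a dark room brings dim
    detail back - but the display's white never exceeds "1.0" (~100 nits).
    """
    exposure = 1.0 / max(adapted_peak, 1e-6)
    return np.clip(scene_luminance * exposure, 0.0, 1.0)
```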
HDR displays actually extend beyond the normal maximum brightness level encoded (100 nits) - using that metadata - to provide a full spectrum of brightness for whites and colors. Under the standards for HDR put forth by the UHD Alliance at CES, displays must achieve more than 1,000 nits peak brightness and less than 0.05 nits black level (LCDs) - OR - more than 540 nits peak brightness and less than 0.0005 nits black level (OLEDs). By expanding the available brightness information in the video stream, the display can go far, far beyond the 100-nit barrier. This not only makes the difference between the brightest and darkest parts of the display much more extreme (a flashlight in the dark), but also allows all colors to be displayed at previously unseen levels (outside of real life).
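To make the "beyond 100 nits" part concrete: HDR10 carries luminance with the SMPTE ST 2084 "PQ" curve, which maps a 10-bit code value to an absolute brightness up to 10,000 nits. A quick sketch of that EOTF (constants are from the standard; the example values are just for illustration):

```python
def pq_to_nits(code_value):
    """Convert a normalized PQ code value (0.0-1.0) to absolute luminance in nits."""
    m1 = 2610 / 16384         # 0.1593017578125
    m2 = 2523 / 4096 * 128    # 78.84375
    c1 = 3424 / 4096          # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875

    e = code_value ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000.0 * y        # the curve tops out at 10,000 nits

print(pq_to_nits(0.0))    # 0 nits
print(pq_to_nits(0.5))    # ~92 nits - roughly the old SDR white level
print(pq_to_nits(0.75))   # ~1,000 nits - HDR highlight territory
print(pq_to_nits(1.0))    # 10,000 nits - curve maximum
```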
When games start using actual HDR metadata (beyond 100 nits)... every gamer is going to want an HDR-compliant display.
That could be the saving grace for gaming. I had not even thought of variable frame rates. But here's hoping that all monitors have *Sync technologies as a standard feature in 1-2 years.
I still fear that 4K120 10-bit is 5-8 years away given how long it takes between releases of new interfaces.
VR in 10-bit is still pretty much a lost cause until then, given its resolution and frame rate requirements for a good experience.
4K120 is here now in the form of the new 30" Dell 4K OLED. With the advent of USB 3.1 Type-C supporting DP over USB, I don't think we'll have to wait long. The displays will be here long before mainstream GPUs will be able to handle the latest games at that resolution and refresh rate. Since A-Sync is part of DP 1.3, I can only assume that anyone making a "gaming monitor" will include it. I guess we'll see.