Freesync 2 (HDR) vs Gsync


DooKey

Golden Member
Nov 9, 2005
1,811
458
136
Every HDR display has a built-in tone mapping module, but it doesn't know how to apply the tone map properly. I have a ROG Swift PG27UQ for testing HDR, and the results are very bad. This is why I want to know the monitor's characteristics, to allow a usable tone mapping before the HDR transport and not lose a lot of detail to the built-in hardware. I can do this for FreeSync 2, and I want to do this for G-Sync as well, but I can't, so my G-Sync customers will get worse image quality and I can't help them.

Interesting. Are you in contact with Nvidia about this? If so, what's their response?
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
Consoles and current PC games don't need that when dealing with HDTVs or HDR monitors; they work flawlessly. So what changed?

What sort of difference are we talking about?

So FreeSync 2 displays have built-in hardware just like G-Sync now? Is it proprietary as well?

Remember, he's a console developer. He said he can't use G-Sync HDR properly inside a game engine because of the lack of a tone-mapping library. It seems like you're not happy with the current situation. Now, here:

https://gpuopen.com/amd-gpu-services-5-1-1/

AGS 5.1 is also the first version to support upcoming FreeSync™ 2 displays, allowing your application to drive them more efficiently. FreeSync 2 moves the tone-mapping part of the HDR presentation to the GPU. It previously would have been handled by the display, potentially increasing latency, and so moving it to the GPU is a key benefit of FreeSync 2 for your games. AGS 5.1 gives you the control you need to implement that.

What's new in AGS 5.1.1 since version 5.0.6

AMD GPU Services (AGS) Library

Therefore you need libraries to use HDR inside a game engine, whether it's AMD or Nvidia. I'm just saying: no one here is going to downplay Nvidia G-Sync. Sooner or later Nvidia will release a new SDK for G-Sync HDR.
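To make the idea in that GPUOpen quote concrete, here is a minimal C++ sketch of the flow it describes. The names (DisplayCaps, queryDisplayCaps) and the panel numbers are hypothetical stand-ins, not the real AGS 5.1 API:

[CODE]
// Hypothetical sketch -- not the real AGS API. It only illustrates the
// FreeSync 2 flow: query the panel's characteristics, tone map in the
// engine, and skip the display's own tone-mapping pass.
#include <cstdio>

struct DisplayCaps {        // stand-in for whatever the vendor query returns
    float minLuminance;     // panel black level, in nits
    float maxLuminance;     // panel peak brightness, in nits
    float redX, redY;       // colour primaries (CIE 1931 xy)
    float greenX, greenY;
    float blueX, blueY;
};

// Stand-in for the vendor call that reads the monitor's characteristics.
DisplayCaps queryDisplayCaps() {
    // Example numbers for a 600-nit wide-gamut panel.
    return {0.05f, 600.0f, 0.68f, 0.32f, 0.265f, 0.69f, 0.15f, 0.06f};
}

int main() {
    DisplayCaps caps = queryDisplayCaps();
    // Knowing the real panel range, the engine can tone map scene
    // luminance directly into [minLuminance, maxLuminance] and present
    // the frame without a second tone-mapping pass inside the monitor.
    std::printf("Engine tone maps for a %.2f-%.0f nit panel\n",
                caps.minLuminance, caps.maxLuminance);
    return 0;
}
[/CODE]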
 
Reactions: Krteq

Muhammed

Senior member
Jul 8, 2009
453
199
116
Therefore you need libraries to use HDR inside a game engine, whether it's AMD or Nvidia. I'm just saying: no one here is going to downplay Nvidia G-Sync. Sooner or later Nvidia will release a new SDK for HDR.
The part you quoted speaks about latency, not IQ. Two different things. It's also a fact that consoles and current PC games don't need any libraries to handle HDR on current HDTVs and monitors. We have so many of those around already. No tone mapping is done through a library when outputting HDR on current displays.
 

Krteq

Senior member
May 22, 2015
993
672
136
Consoles and current PC games don't need that when dealing with HDTVs or HDR monitors; they work flawlessly. So what changed?
...
What sort of difference are we talking about?
...
So FreeSync 2 displays have built-in hardware just like G-Sync now? Is it proprietary as well?
...
The part you quoted speaks about latency, not IQ. Two different things. It's also a fact that consoles and current PC games don't need any libraries to handle HDR on current HDTVs and monitors. We have so many of those around already. No tone mapping is done through a library when outputting HDR on current displays.
You still don't know what's going on, do you?

If you don't trust zlatan, please trust Nvidia. Even they talk about the need for tone mapping for correct HDR function (in the current HDR pipeline).

Better Gaming Experience by NVIDIA with Ansel, Highlights and HDR_en (PDF) - Start at page 57

And... YES, current console and PC games need built-in support for HDR
 
Reactions: VirtualLarry

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Tone mapping happens within the game. It has nothing to do with the GPU. And how to trigger HDR on an Nvidia GPU should be common knowledge for a guy who claims to have one of those very rare G-Sync HDR monitors...
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
And... YES, current console and PC games need built-in support for HDR
We are not talking about built-in HDR support here. Tone mapping to a specific display's characteristics is not needed in the current roster of console and PC games; you don't see the developers of Horizon Zero Dawn sampling every HDR HDTV in existence to enable HDR on them, it just works according to the HDR10 standard. I certainly didn't need a library when I played Far Cry 5 on PC in HDR on my HDR 4K TV. It just worked flawlessly out of the box.

If you don't trust zlatan, please trust Nvidia. Even they talk about the need for tone mapping for correct HDR function (in the current HDR pipeline).
I don't surrender my mind brainlessly to any passing info on the Internet, thank you very much. That link talks about tone mapping HDR to LDR displays, and tone mapping HDR to suboptimal (1000 nits or less) HDR displays according to HDR standards. Again, no mention of needing specific libraries to achieve the effect.

AMD talks about (potentially proprietary) specific hardware in the FreeSync 2 display to eliminate certain steps and improve latency; they never mentioned improving the image quality of the HDR implementation over other solutions.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
We are not talking about built-in HDR support here. Tone mapping to a specific display's characteristics is not needed in the current roster of console and PC games; you don't see the developers of Horizon Zero Dawn sampling every HDR HDTV in existence to enable HDR on them, it just works according to the HDR10 standard. I certainly didn't need a library when I played Far Cry 5 on PC in HDR on my HDR 4K TV. It just worked flawlessly out of the box.


I don't surrender my mind brainlessly to any passing info on the Internet, thank you very much. That link talks about tone mapping HDR to LDR displays, and tone mapping HDR to suboptimal (1000 nits or less) HDR displays according to HDR standards. Again, no mention of needing specific libraries to achieve the effect.

AMD talks about (potentially proprietary) specific hardware in the FreeSync 2 display to eliminate certain steps and improve latency; they never mentioned improving the image quality of the HDR implementation over other solutions.

First, you don't seem to realize that zlatan is an actual dev. It's entertaining. Second, your anecdote is for a non-G-Sync display. You keep invoking "current displays", but you fail to realize that G-Sync isn't like any other display. All non-G-Sync monitors and TVs work by exchanging capabilities via EDID. That is something G-Sync doesn't do, because Nvidia had to circumvent all of the standards literally everyone else uses; HDR10, HDR10+, HLG, and DV weren't written with the standards-breaking G-Sync in mind.

Furthermore, you don't seem to understand that tone mapping can be done at the source or at the destination. TVs do a pretty good job, but projectors don't; theoretically it should be the same, but players like the Oppo UDP-203 and 205 do the tone mapping better than projectors can. Additionally, you talk like 1000 nits is the only luminance output in the HDR standards. I haven't seen any computer monitors that are going to support 4000. How do you think a master at 4,000 or 10,000 nits is handled by a 1,000-nit display? Ah, tone mapping. Tone mapping is the process of converting a higher dynamic range to a lower one; it is not limited to conversions between HDR-capable displays and ones that are not.

One last thing: can you post up your display's calibration graphs, since you talk about "flawless"? What process did you use to calibrate your HDR display? What colorimeter and/or spectrophotometer did you use? What software?
 
Reactions: Krteq and kawi6rr

Muhammed

Senior member
Jul 8, 2009
453
199
116
Additionally, you talk like 1000 nits is the only luminance output in the HDR standards. I haven't seen any computer monitors that are going to support 4000. How do you think a master at 4,000 or 10,000 nits is handled by a 1,000-nit display? Ah, tone mapping. Tone mapping is the process of converting a higher dynamic range to a lower one; it is not limited to conversions between HDR-capable displays and ones that are not.
Sigh, LEARN to read:
and tone mapping HDR to suboptimal (1000 nits or less) HDR displays according to HDR standards.
Second, your anecdote is for a non-G-Sync display. You keep invoking "current displays", but you fail to realize that G-Sync isn't like any other display. All non-G-Sync monitors and TVs work by exchanging capabilities via EDID.
That's exactly why a G-Sync display won't need a different library to handle these things; it will literally work just like any HDTV or monitor. In fact, this fallacy, which isn't supported by any credible source, doesn't make a dime's worth of sense, even according to AMD, who use this library to reduce latency, not to introduce superior IQ. That's my whole point, which you and others fail to understand just because of a personal preference for AMD?

One last thing, can you post up our display's calibration graphs since you talk about flawless? What process did you use to calibrate your HDR display? What colorimeter and/or spectrophotometer did you use? What software?
I didn't even invest a second in calibrating it; it worked out of the box.
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
You questioned his job (zlatan) and denied any issue. He clearly said:

Every HDR display has a built-in tone mapping module, but it doesn't know how to apply the tone map properly. I have a ROG Swift PG27UQ for testing HDR, and the results are very bad. This is why I want to know the monitor's characteristics, to allow a usable tone mapping before the HDR transport and not lose a lot of detail to the built-in hardware. I can do this for FreeSync 2, and I want to do this for G-Sync as well, but I can't, so my G-Sync customers will get worse image quality and I can't help them.

Then please help me: where can I find the tone mapping library for G-Sync?

He meant G-Sync HDR.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I thought the whole point of HDR was that you didn't need tone mapping: if you are running SDR and can't support the range, then you have a tone map in your game to get decent-looking results; if you've got HDR and can support every colour perfectly, there's no need for a tone map.
 

Krteq

Senior member
May 22, 2015
993
672
136
Well, that's the theory, but reality is quite different.

There is no display panel capable of the required brightness/luminance, etc., so there is still a need for tone mapping and other techniques.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
Well, that was kind of a half-assed test. I know I've seen monitors with both G-Sync and FreeSync versions, maybe not FreeSync 2 though. The biggest issue has been that some FreeSync implementations have pretty terrible sync ranges, and they tend to be on lower-range monitors. No doubt G-Sync adds some cost, but a single sample of each, without controlling for features or quality, hardly justifies saying it's $200 for G-Sync.

Still, I just wish Nvidia would support it. I hate having my GPU choice locked in by my monitor.
 
Reactions: Krteq

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
I thought the whole point of HDR was that you didn't need tone mapping: if you are running SDR and can't support the range, then you have a tone map in your game to get decent-looking results; if you've got HDR and can support every colour perfectly, there's no need for a tone map.
HDR monitors support 10 or 12 bits per channel (compared to 8 for SDR). GPUs are capable of rendering internally at 16 or even 32 bits per channel. Thus you still need tone mapping.
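A toy C++ illustration of that gap (the plain Reinhard curve here is just an example operator, not any particular engine's code): the renderer produces unbounded float luminance, which has to be squashed into the finite range a 10-bit output signal can encode.

[CODE]
// Minimal sketch of why a 16/32-bit float render still needs tone mapping
// before a 10-bit HDR output: scene luminance is unbounded, the display
// signal is not. Plain Reinhard operator used purely as an example.
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Map an unbounded scene luminance (arbitrary units) into [0, 1).
float reinhard(float l) { return l / (1.0f + l); }

// Quantize the normalized value to a 10-bit code (0..1023).
uint16_t to10bit(float v) {
    return static_cast<uint16_t>(std::clamp(v, 0.0f, 1.0f) * 1023.0f + 0.5f);
}

int main() {
    // Highlights the renderer can represent but no panel can show directly.
    for (float scene : {0.1f, 1.0f, 10.0f, 100.0f}) {
        std::printf("scene %.1f -> 10-bit code %u\n",
                    scene, to10bit(reinhard(scene)));
    }
    return 0;
}
[/CODE]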
 
Reactions: Krteq

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
HDR monitors support 10 or 12 bits per channel (compared to 8 for SDR). GPUs are capable of rendering internally at 16 or even 32 bits per channel. Thus you still need tone mapping.

This guy gets it. Whatever the content might be, whether rendered by a GPU or mastered on a UHD disc, consumer display devices are not capable of the full color space, peak luminance, or contrast ratios demanded by the source. They all have trade-offs unless you are on a reference mastering display. Those trade-offs are handled by tone mapping.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
And even if you comply with a standard, the physical characteristics of each screen are unique, so you have to either 1) accept that the image will look different on different monitors to a greater degree, or 2) build a system that accounts for the individual variation of the particular unit. Historically 1 has been the approach, but in some respects we are pushing towards 2, where the end device "knows" itself and can therefore adjust accordingly.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Consoles and current PC games don't need that when dealing with HDTVs or HDR monitors; they work flawlessly. So what changed?
It works for sure, but is the result good enough?
What sort of difference are we talking about?
Mostly crushed blacks, or badly saturated whites...

So FreeSync 2 displays have built-in hardware just like G-Sync now? Is it proprietary as well?
No, this is not hardware. It's just software that allows the devs to create a correct image for any display inside the game engine.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Interesting. Are you in contact with Nvidia about this? If so, what's their response?
They are pretty much aware of it. But AMD has a working solution now, and they are still thinking about something similar. That's good after all, because this is just software; but they are painfully slow nowadays when they need to react to real problems.
 
Reactions: Krteq

zlatan

Senior member
Mar 15, 2011
580
291
136
Tone mapping happens within the game. It has nothing to do with the GPU. And how to trigger HDR on an Nvidia GPU should be common knowledge for a guy who claims to have one of those very rare G-Sync HDR monitors...
Normally within the game, and then in the monitor. Now we can cut out the monitor part, because it will just give us worse image quality. This should be a must-have technique today, because we not only get better latency but can also create better image quality. I can tune my tone map if there is a library that allows me to query the monitor's characteristics. And then I don't need anything else. The monitor can output my image and be done. Best quality with the lowest possible latency.
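As an illustration of what that tuning could look like (a sketch under assumed numbers, not zlatan's actual code): once the engine knows the panel's peak luminance, an extended Reinhard curve can be scaled so the brightest scene value lands exactly at the panel's peak, leaving nothing for the monitor's own tone mapper to redo.

[CODE]
// Hedged sketch: tune the tone curve to a queried panel peak. Extended
// Reinhard with a white point is used as a stand-in operator; the 600-nit
// panel and 4000-nit scene peak are assumed example values.
#include <cstdio>

// Extended Reinhard: maps luminance l so that l == white hits 1.0 exactly.
float reinhardExtended(float l, float white) {
    return l * (1.0f + l / (white * white)) / (1.0f + l);
}

int main() {
    float panelPeakNits = 600.0f;   // assumed value from a display query
    float scenePeakNits = 4000.0f;  // brightest value the engine produced

    // Tune the curve so scenePeakNits maps to the panel's full range.
    float white = scenePeakNits / panelPeakNits;
    for (float nits : {100.0f, 600.0f, 1000.0f, 4000.0f}) {
        float l = nits / panelPeakNits;
        std::printf("%.0f scene nits -> %.0f nits on panel\n",
                    nits, reinhardExtended(l, white) * panelPeakNits);
    }
    return 0;
}
[/CODE]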
 
Reactions: Krteq

zlatan

Senior member
Mar 15, 2011
580
291
136
AMD talks about (potentially proprietary) specific hardware in the FreeSync 2 display to eliminate certain steps and improve latency; they never mentioned improving the image quality of the HDR implementation over other solutions.
They don't have specific hardware. FreeSync 2 is just a specification; the monitor manufacturers just need to follow it, nothing else. With this, the game engine can calculate a correctly tone-mapped image, and the monitor doesn't need to touch that result with its built-in tone mapping. This will improve quality and decrease latency.
 
Reactions: Despoiler and Krteq

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
They don't have specific hardware. FreeSync 2 is just a specification; the monitor manufacturers just need to follow it, nothing else. With this, the game engine can calculate a correctly tone-mapped image, and the monitor doesn't need to touch that result with its built-in tone mapping. This will improve quality and decrease latency.
How do current-gen HDR TVs do it?
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
It works for sure, but is the result good enough?
Yes, extremely good actually.
Mostly crushed blacks, or badly saturated whites...
Hmm, that doesn't sound like a huge IQ difference to begin with. Most problems in an HDR implementation will be the maximum nits reachable by the panel (is it 1000 nits or less?) and the color space, as well as the contrast: whether it can maintain decent blacks (OLED, full-array backlight, QLED, etc.) or not. So your limitations are mostly hardware, not software.
How do current-gen HDR TVs do it?
Again, as I said, G-Sync and consoles work through the HDR10 standards. They don't do it through extra APIs or libraries.
FreeSync 2 proposes to bypass the HDR10 standard, achieving a claimed lower latency. But it needs special coding and attention to achieve that; it's not automatic. To quote Tom's Hardware on this:

" The question of why not simply use the HDR10 or Dolby Vision transport spaces is already answered, then—they’d require another tone mapping step. David Glen, senior fellow architect at AMD, said that HDR10 and Dolby Vision were designed for 10 or more years of growth. Therefore, even the best HDR displays available today fall well short of what those transport spaces allow. That’s why the display normally has to tone map again, adding the extra input lag FreeSync 2 looks to squeeze out.

Sounds like a lot of work, right? Every FreeSync 2-compatible monitor needs to be characterized, to start. Then, on the software side, games and video players must be enabled through an API provided by AMD. There’s a lot of coordination that needs to happen between game developers, AMD, and display vendors, so it remains to be seen how enthusiastically AMD’s partners embrace FreeSync 2, particularly because the technology is going to be proprietary for now. "

http://www.tomshardware.com/news/amd-freesync-2-hdr-lfc,33248.html
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
A non-developer advising a console developer/programmer. It's awesome...

Don't forget he hasn't even calibrated his own display, then tells everyone that crushed blacks and badly saturated whites are no big deal for IQ. Then he proceeds to say it's not a software limitation, when software is precisely what deals with hardware limitations.

As far as AMD's approach goes, neither HDR10 nor DV was created with gaming in mind. That leaves it up to the display manufacturer to deal with the inherent latencies of the panel and content transport. Normally I'm all for standards, but in this case AMD's approach seems better for gamers. In the future I believe there should be a standardization of how the HDR gaming mode is handled in displays with the various HDR formats.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
Don't forget he hasn't even calibrated his own display, then tells everyone that crushed blacks and badly saturated whites are no big deal for IQ. Then he proceeds to say it's not a software limitation, when software is precisely what deals with hardware limitations.
Yeah, coming from someone who has actually used and tested HDR, not someone who heard about it from his favorite company and is willing to believe fantastical legends about it (magic Vega drivers, anyone?).

Once again, your biggest issue running HDR games is your panel's HDR capabilities: the nits, the contrast, the color space, and the measure of true black it can support. HDR10 runs console games just fine on 4K TVs. zlatan claims there is an IQ difference when activating FreeSync 2, a claim even AMD didn't make; I still take issue with that claim, and I argue it has no basis in reality. A software solution will not enhance the quality of a hardware-limited HDR panel. If it can't do proper blacks (because it lacks a full-array backlight) or can't properly brighten the scene (because it's not 600 nits or more), then FreeSync 2 will not fix that. It MIGHT reduce latency, but nothing more.

Also, it's funny when people scream standards from the top of mountains and loathe proprietary stuff, but when their favorite company steers away from standards into the proprietary, they are all about embracing it. Hypocrisy has no bounds indeed.
 