[TechPowerUp article] FreeSync explained in more detail


dacostafilipe

Senior member
Oct 10, 2013
772
244
116
This is also one of the reasons why the monitor needs enough memory to hold an entire frame of information (the other is presumably to calculate overdrive between two images).

DP is an extremely fast connection with a packet-based protocol. If the LCD controllers accepted the needed commands (which they don't today!), it would be possible to do all the needed calculations on the GPU. In that case, you don't need a large amount of memory on the controller.

The idea should be that the GPU drives the LCD, and not the other way around! Adding special logic to the LCD board is not the solution.
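
To put a number on the "enough memory to hold an entire frame" point above, here is a rough back-of-the-envelope sketch (the 1080p resolution and 24-bit depth are illustrative assumptions, not figures from the thread):

```python
# Approximate memory a monitor-side controller needs to buffer one full
# frame. Resolution and bit depth below are illustrative assumptions.
def frame_buffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    return width * height * bits_per_pixel // 8

size = frame_buffer_bytes(1920, 1080, 24)
print(f"{size} bytes (~{size / 2**20:.1f} MiB)")  # 6220800 bytes (~5.9 MiB)
```

So even one buffered 1080p frame costs the controller several megabytes of memory, which is the cost the GPU-driven approach would avoid.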
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Really, this G-sync V Free-sync research and competition is just great for us all in the longer term.
Whichever technology gets market acceptance we win.
Them trying to one-up each other is the best guarantor we have of better hardware and more cool stuff to fool with.
It looks like Mantle and G-Sync right now are leading the pack.:thumbsup:
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Geez,

HD3D hasn't been abandoned; it just has an incredibly small adoption rate. The last few games I remember with native HD3D support were Crysis 2, Deus Ex, Dirt 3/Showdown, Two Worlds 2, Sniper Elite V2, and Crysis 3. Those work out of the box and don't need special software.

But don't let facts stop you.

From what I understand (and feel free to correct me): if AMD gets VESA to add the needed tech to DP, wouldn't any monitor from then on need to support the tech in order to be DP certified?
 

Mand

Senior member
Jan 13, 2014
664
0
0
According to AMD the reason G-sync exists in the first place is because NVidia GPU's don't have the monitor sync tech integrated like AMD GPU's do.

If this is true, NVidia GPU's can't get Freesync, just G-sync.

But keep in mind, that's what AMD is thinking. I don't think there was any official word from Nvidia on whether they support the tech or not.

It's not true. Variable refresh is how G-Sync works, and it is supported by Kepler over DP 1.2.

The thing people still, it seems, continue to miss is that it requires new hardware in the display to support variable refresh. Making it a VESA standard to be supported by the DP connection doesn't help, because the panels themselves, as currently designed, are not capable of understanding it. Even if that were adopted, it does not force display manufacturers to include the necessary interface control in the display itself that translates the instructions from the GPU to the panel. That DP will "enable" the usage does not mean that somehow, magically, the panels we've all been using that are designed at fixed refresh rates will somehow be able to change refresh rate on the fly.
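
As a concrete illustration of the fixed-versus-variable refresh difference (the numbers here are illustrative, not from either vendor): with v-sync on a fixed 60 Hz panel, a frame that misses a refresh deadline waits for the next vblank, while a variable-refresh panel can scan out as soon as the frame is ready.

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60 Hz panel, ~16.7 ms per refresh

def fixed_vsync_present_ms(render_ms: float) -> float:
    # With v-sync, the frame is shown at the first vblank after it finishes,
    # so a 17 ms frame waits until the 33.3 ms mark.
    return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def variable_refresh_present_ms(render_ms: float) -> float:
    # With variable refresh, the panel refreshes when the frame is ready.
    return render_ms

for render in (10.0, 17.0, 25.0):
    print(f"render {render:5.1f} ms -> fixed {fixed_vsync_present_ms(render):5.1f} ms, "
          f"variable {variable_refresh_present_ms(render):5.1f} ms")
```

The point of the paragraph above is that the second function requires a panel, and controller hardware, that can actually vary the refresh interval on the fly.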

This is not something that can be solved just on the GPU side, it requires new hardware in the display. An AMD executive acknowledged this in follow-up interviews.

G-Sync is not "dead for sure." It's the only thing that exists, because FreeSync doesn't. FreeSync is an utter hoax, a cheap, deceptive shot designed to steal some thunder from G-Sync's announcements at CES. And it seems to have fooled a lot of people, but fortunately people who have actually followed up have nearly completely debunked FreeSync's claims - both that it's free, and that it's simply a matter of updating the DisplayPort spec.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
lol. Yeah, of course they wrap it in all kinds of marketing nonsense, but you have to give them credit for showing that using the wheel is useful.
AMD has been sitting on this tech for years and didn't use it; how stupid is that?

But G-Sync, in its current form, is now dead for sure, except for the few early adopters in 2014.

AMD's cards and drivers have been ready, but it looks more like it's taken this long to get the DisplayPort spec to include this functionality so it can be exploited in a desktop environment. AMD has the patents, so clearly they had the foresight that this could be useful. What isn't clear is whether Nvidia actually came up with the novel application of AMD's idea, i.e. AMD was thinking laptop power saving, but Nvidia saw that it could be used to solve v-sync tearing, etc. I really hope Anandtech digs into this. We need the white papers on both techs to see how they work, but the timeline of events prior to these announcements could make one heck of a story.
 

Mand

Senior member
Jan 13, 2014
664
0
0
AMD's cards and drivers have been ready, but it looks more like it's taken this long to get the DisplayPort spec to include this functionality so it can be exploited in a desktop environment. AMD has the patents, so clearly they had the foresight that this could be useful. What isn't clear is whether Nvidia actually came up with the novel application of AMD's idea, i.e. AMD was thinking laptop power saving, but Nvidia saw that it could be used to solve v-sync tearing, etc. I really hope Anandtech digs into this. We need the white papers on both techs to see how they work, but the timeline of events prior to these announcements could make one heck of a story.

It has nothing to do with the DP spec, and everything to do with new hardware controllers in displays. AMD itself has admitted that even if DP 1.3 does what they want it to, that new display hardware will still be required.

And, not coincidentally, that's exactly what G-Sync is doing. The difference is that Nvidia is working directly with display manufacturers to get it working and has put an enormous amount of engineering time into doing so, while AMD has done nothing and plans to do nothing, per their announcement. They're just going to hope that display manufacturers do it on their own.

But don't take my word for it, here's one of the display manufacturers directly:

http://overlordforum.com/topic/603-nvidia-g-sync/page-5

So here is where GSYNC for Tempests stand: in queue.

Since Nvidia handles all the board design for all the OEMs on the planet, for any and all panels they request, Nvidia is a bit overwhelmed at the moment. I was told yesterday that Nvidia only has so much "bandwidth" (person hours) for GSYNC design and those engineers are working their tails off trying to get all the boards done as soon as possible.

What does this mean for our requested panel design? We are not in process yet, but their engineers want to get ours out. However, the larger OEMs are first to be served, which makes sense since Overlord is very small compared to all the others. For now it seems only TN-related panels are being designed.

It is good news that Nvidia's engineers want to tackle our panel and OC PCBs - it is somewhat bad news that there isn't enough hours in the day to get everything done! I was told our panel will be designed, but not when. There also seems to be some discussion as to what extent the overclock on the panel would be set and how that all would work. That discussion is for a later time once the engineers take a look at our gerbers and decide how to best tackle GSYNC and our panel.

Overall, this means once the design is complete we would offer the same deal going for the ASUS 248 panel (and others that will be coming in the next 6 months with GSYNC) - a pre-done panel with GSYNC, a mod service, and a kit (that is the plan at this moment). Of course, all of this can change at any time since we are at the mercy of Nvidia and their schedule.
GSYNC does require an entirely new PCB with display port only. There is no module that clicks into any existing PCB since the entire board, with the module, must be tuned to the specific panel being used.

This is the cause for the delay. Nvidia has to hand tune every PCB (input and in some cases the timing controller) to match the panel every OEM wants to use.

Again, we are hoping to have something within the next few months to test, but aren't holding our breath. The goal is to have the new Gsync version available, with a mod service, and a standalone kit. How all this will work is still up in the air as it depends on PCB design, sizing, etc.
Bolded for emphasis. These technical hurdles are not just going to magically vanish with a DP spec update.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
lol. Yeah, of course they wrap it in all kinds of marketing nonsense, but you have to give them credit for showing that using the wheel is useful.

Yeah, thanks Nvidia for bringing this up. And thanks for making it a paid feature when it could be 100% free.

AMD has been sitting on this tech for years and didn't use it; how stupid is that?

Same with OpenCL drivers. What's the point of that much compute potential in your cards when your drivers make it impossible to use?

But G-Sync, in its current form, is now dead for sure, except for the few early adopters in 2014.

Nvidia just tried to pull an Apple here. I'm guessing that they didn't think that VESA would actually include VBLANK features in the 1.3 spec and hurried the heck out of the G-Sync release.
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
Making it a VESA standard to be supported by the DP connection doesn't help, because the panels themselves, as currently designed, are not capable of understanding it. Even if that were adopted, it does not force display manufacturers to include the necessary interface control in the display itself that translates the instructions from the GPU to the panel. That DP will "enable" the usage does not mean that somehow, magically, the panels we've all been using that are designed at fixed refresh rates will somehow be able to change refresh rate on the fly.

You don't see the big picture.

It's not about having a 100% support, it's about having the choice to integrate it if the market requires such a solution. It's about reducing costs by producing a single controller but only enabling the feature on the top models. It's about having a solution that works on all GPUs.

We should be happy for this!
 

Mand

Senior member
Jan 13, 2014
664
0
0
You don't see the big picture.

It's not about having a 100% support, it's about having the choice to integrate it if the market requires such a solution. It's about reducing costs by producing a single controller but only enabling the feature on the top models. It's about having a solution that works on all GPUs.

We should be happy for this!

We should be happy for it if it would actually exist, which it won't. Not without a lot of hardware development on the part of display manufacturers.

Which won't be free.

To the people who still think DP 1.3 will magically solve the hardware problem, if this needs an as-yet-undetermined change to the DP 1.3 spec to work, why does it work now on DP 1.2?
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
We should be happy for it if it would actually exist, which it won't.

There's no reason to think that it will not happen. Judging by hardware.fr's article, it's actually moving in the right direction.

Nvidia is roughly a year ahead, and if you want an x-Sync screen this year, you should only look into G-Sync.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
G-Sync is not "dead for sure." It's the only thing that exists, because FreeSync doesn't. FreeSync is an utter hoax, a cheap, deceptive shot designed to steal some thunder from G-Sync's announcements at CES. And it seems to have fooled a lot of people, but fortunately people who have actually followed up have nearly completely debunked FreeSync's claims - both that it's free, and that it's simply a matter of updating the DisplayPort spec.

:thumbsup:
I almost fell out of my chair when I read that. "G-Sync is dead for sure," because the fairy power of the FreeSync vaporware has already made it obsolete. Man, AMD is good. They win without ever having to do anything. Without any alternative in sight, with absolutely nothing, they somehow beat G-Sync. Or at least that's what some, already drunk on the Kool-Aid, declare.

Wouldn't it just make more sense to reserve such opinions, and most importantly such declarations, until this other "alternative" manifests itself, so that it can be proven one way or the other? Is it proper these days to make such bold claims with nothing to go on, when FreeSync is nowhere in sight and has never been put up against anything (much less G-Sync) to even be compared? How can one declare such extremes in this case and ever be taken seriously again? I wouldn't consider it rational at all. At the very least, I would say it's really strange to make such statements at this point.

A level-headed person who hasn't sided with one brand over another would at least want to see both methods in action before beginning to make those kinds of calls. FreeSync truly is nowhere in sight. But even if it came out tomorrow, most rational people would want to see how it compares to G-Sync and measure them up. This is just the proper, normal way to do things. You cannot ever expect to be taken seriously if you're spouting baseless nonsense, can you? And right now we have no way to evaluate FreeSync at all. And we won't for some time. For a long time... if ever at all.

That's really all I wanted to say at this point. You cannot proclaim this huge doom for G-Sync without ever seeing how FreeSync fares against it. And I find it so very strange that so many are hating on G-Sync while praising AMD's vaporware product like this. I mean, at the very least, Nvidia is trying to push us in a better direction. No one would be thinking about any of this if it wasn't for them. If FreeSync turns out fantastic, we'll have G-Sync to thank; it will have pushed us into a better future. One way or the other, I think it will. Disregard it all you want; it is obviously already making an impact.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Of course, another feature of G-sync is to adjust color and gamma settings so color looks the same to us. Apparently variable refresh rates cause the color to appear different at different times.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Geez,

HD3D hasn't been abandoned; it just has an incredibly small adoption rate. The last few games I remember with native HD3D support were Crysis 2, Deus Ex, Dirt 3/Showdown, Two Worlds 2, Sniper Elite V2, and Crysis 3. Those work out of the box and don't need special software.

But don't let facts stop you.

From what I understand (and feel free to correct me): if AMD gets VESA to add the needed tech to DP, wouldn't any monitor from then on need to support the tech in order to be DP certified?
The problem with HD3D is that there are 0, absolutely 0 monitors being made that support Active Stereoscopic HD3D. The only monitors left are the passive ones, which result in 1920x590 resolutions and super blurry text.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
The problem with HD3D is that there are 0, absolutely 0 monitors being made that support Active Stereoscopic HD3D. The only monitors left are the passive ones, which result in 1920x590 resolutions and super blurry text.

Er, it works with any 3D display as far as I know: TV, monitor, or projector. It works on my second monitor, a Samsung S27A950D.

This is off topic anyway. Just saying that this may come to market after all. A lot of AMD's management has been changed for the better, so I'm hoping for the best.
 

Spidre

Member
Nov 6, 2013
146
0
0
This is a lot of very interesting discussion. There's one inconsistency I keep seeing, though. AMD is getting praised for trying to have an open standard, as it should, and Nvidia is being shunned for having a proprietary one. Yet Mantle is being praised left, right, and center, while Nvidia's work with OpenGL is being ignored, largely, in fact, by the same people.

Regardless, I believe there will be an open standard available sometime in 2015. I doubt any GPU manufacturer will have anything to do with it; it will probably be the monitor manufacturers doing it for added features. G-Sync will probably exist for a few years, eventually turning into an added extra that monitors can tack on as another feature point.

I myself got turned off when I heard that G-Sync and LightBoost will not be usable at the same time. I'm going to be buying a ROG anyway, so I will at least have the option to toggle them depending on the game.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
It has nothing to do with the DP spec, and everything to do with new hardware controllers in displays.

To the people who still think DP 1.3 will magically solve the hardware problem, if this needs an as-yet-undetermined change to the DP 1.3 spec to work, why does it work now on DP 1.2?

I've already explained this ages ago in this thread, but here we go again.

AMD can showcase FreeSync because those laptops have DP 1.2 inputs and an eDP (embedded DisplayPort) output to the panel. The connection from the GPU to the panel is end-to-end DisplayPort, and it bypasses most of what the controller/scaler board does. That's how it works.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Maybe because the DP 1.3 spec isn't finished?

Which is one reason why I say it doesn't exist. How can it, if the DP 1.3 spec isn't finished?

This is putting aside the major technical challenges of the required hardware modifications, which AMD is just hoping that people might make, somehow, by advocating for DP 1.3.

Whereas Nvidia engineers are doing the gruntwork of getting it to actually be in things we can buy. There's NO comparison.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I thought the intention was for the dynamic refresh rate tech to be an optional part of the DP 1.3 spec. It's not currently in there at all; they are trying to change the spec a few months from ratification. The assumption here is that the monitor companies take this demo, ratify it as an optional part of the spec, and then develop hardware on that basis.

That puts FreeSync at least a year away, really. I am also not convinced about how FreeSync actually works. My look into how DRR works in the eDP spec told me that the rate of change was about once a second, not every frame. Worse, it is initiated using the info section that comes after the completion of the vblank signal, which allows the GPU to tell the monitor to change the refresh/resolution etc. The monitor is also required to support seamless refresh rate adjustment (not common at all), and then the GPU is required to hold vblank for as long as the time predicted from the previous frame's info.

So unless AMD is intending to improve on eDP's dynamic refresh, rather than just adding it to DP 1.3, I have an issue with the way it works; it's not going to be even remotely the same thing.
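
To make that hold-the-vblank mechanism concrete, here is a toy model of the scheme as described above (my own sketch of that description, not spec text; the panel's refresh range limits are assumed):

```python
# Toy model of GPU-driven vblank holding: the GPU extends vblank until the
# next frame is ready, but only within the panel's supported refresh range.
# Both range limits below are illustrative assumptions.
MIN_INTERVAL_MS = 1000 / 144  # fastest assumed panel refresh (~6.9 ms)
MAX_INTERVAL_MS = 1000 / 30   # slowest: panel must self-refresh by this point

def held_vblank_interval_ms(frame_ready_ms: float) -> float:
    # Clamp the desired interval to what the panel can actually sustain.
    return min(max(frame_ready_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

print(held_vblank_interval_ms(20.0))  # within range: refresh exactly when ready
print(held_vblank_interval_ms(50.0))  # frame too slow: clamped to the 30 Hz floor
```

The seamless-adjustment requirement in the post above is exactly what this clamp glosses over: a real panel has to transition between these intervals every frame without visible artifacts.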
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Er, it works with any 3D display as far as I know: TV, monitor, or projector. It works on my second monitor, a Samsung S27A950D.

This is off topic anyway. Just saying that this may come to market after all. A lot of AMD's management has been changed for the better, so I'm hoping for the best.
All those Samsung monitors are no longer made.

All those TVs are not full 1920x1080p active displays. They are upscaled images, whether it is left/right at half resolution, or 720p.

AMD has a tendency to provide features, but leave it up to the manufacturers to decide if they want to use it or not.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
According to AMD the reason G-sync exists in the first place is because NVidia GPU's don't have the monitor sync tech integrated like AMD GPU's do.

If this is true, NVidia GPU's can't get Freesync, just G-sync.

But keep in mind, that's what AMD is thinking. I don't think there was any official word from Nvidia on whether they support the tech or not.


That was simply hypothesizing on the AMD rep's part. He was asked why Nvidia uses additional hardware if none is needed, and he said that he didn't know, unless their hardware didn't support the eDP standard (paraphrased).
 

Mand

Senior member
Jan 13, 2014
664
0
0
That was simply hypothesizing on the AMD rep's part. He was asked why Nvidia uses additional hardware if none is needed, and he said that he didn't know, unless their hardware didn't support the eDP standard (paraphrased).

Nobody's desktop displays support the eDP standard, and there's a lot more that G-Sync is doing than what FreeSync is proposing.
 