VESA Adopts Adaptive-Sync


parvadomus

Senior member
Dec 11, 2012
685
14
81
I'd like to know how you think you know more than Nvidia about why Nvidia is doing what they're doing. Quoting doesn't equal knowledge, but the person being quoted sure does.

Nvidia is trying to differentiate its products with all kinds of proprietary stuff, as it always has (to milk consumers as much as possible). This time they failed miserably with a technology that was too easy to implement, and in fact was already implemented in notebooks. Now G-Sync is dead :'(

Enough with the inflammatory replies. If you can't discuss the topic without throwing gas on the fire, leave the thread.

-Elfear
 
Last edited by a moderator:

96Firebird

Diamond Member
Nov 8, 2010
5,714
316
126
Nvidia is trying to differentiate its products with all kinds of proprietary stuff, as it always has (to milk consumers as much as possible). This time they failed miserably with a technology that was too easy to implement, and in fact was already implemented in notebooks. Now G-Sync is dead :'(

We get it, you think GSync is dead. So far you've added nothing to the discussion. Please let the educated people discuss the technologies.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Nvidia is trying to differentiate its products with all kinds of proprietary stuff, as it always has (to milk consumers as much as possible). This time they failed miserably with a technology that was too easy to implement, and in fact was already implemented in notebooks. Now G-Sync is dead :'(

If it was so much easier to implement than G-Sync, why did Nvidia get there so much faster?

Nvidia said they looked at this, and specifically chose not to do it due to display hardware reasons.

I can't believe you can proclaim that they "failed miserably" now that we've been presented with something that someday, maybe, might do something somewhat similar, once someone, somehow, figures out how to make it. And even then only maybe, because we'd have to evaluate them side by side for performance, which we can't do when only one of them exists and the other doesn't.

Are you kidding me?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
If it was so much easier to implement than G-Sync, why did Nvidia get there so much faster?

Nvidia said they looked at this, and specifically chose not to do it due to display hardware reasons.

I can't believe you can proclaim that they "failed miserably" now that we've been presented with something that someday, maybe, might do something somewhat similar, once someone, somehow, figures out how to make it. And even then only maybe, because we'd have to evaluate them side by side for performance, which we can't do when only one of them exists and the other doesn't.

Are you kidding me?

So the real question is: did they look at it and think there's nothing in it for them? Nvidia likes to go the proprietary route when it's possible or makes economic sense for them.

Me, I couldn't care less what others think. I'll make my own choices based on what I think is best for me... no company loyalties for me.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
^ Absolutely, this is a great point. Regarding both an Oculus implementation and any other implementation, I'm waiting to see it in action and at what price. If it's significantly cheaper than G-Sync it will likely be worse in application; if it is as high quality as G-Sync it will likely command a similar (probably slightly smaller) premium. If it ends up working as well and being cheap, I'll be happy and flabbergasted simultaneously.

The trouble with G-Sync is that it isn't compatible with low persistence. Even if you strobe at variable rates, you run into the issue that strobing is annoying at low refresh rates, and G-Sync (and presumably Adaptive-Sync) is most useful at rates lower than you want to strobe at. Low persistence is going to be more important for presence in VR.
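A rough back-of-the-envelope sketch (my own illustration with assumed numbers, not something from the thread) of the tension being described: on a sample-and-hold panel, persistence equals the whole refresh period, while a strobed backlight flashes once per refresh, so at the low refresh rates where variable sync helps most the strobe frequency drops into clearly visible flicker territory.

```python
# Illustrative only: persistence and backlight flash rate for sample-and-hold vs. strobed panels.
def persistence_ms(refresh_hz, strobe_pulse_ms=None):
    """Sample-and-hold shows each frame for the full refresh period;
    a strobed backlight shows it only for the pulse width."""
    period_ms = 1000.0 / refresh_hz
    return period_ms if strobe_pulse_ms is None else strobe_pulse_ms

for hz in (144, 60, 40, 30):
    hold = persistence_ms(hz)
    strobed = persistence_ms(hz, strobe_pulse_ms=2.0)  # assumed ~2 ms strobe pulse
    print(f"{hz:>3} Hz: hold persistence {hold:5.1f} ms, "
          f"strobed persistence {strobed:.1f} ms, but the backlight flashes at only {hz} Hz")
```

At 120-144 Hz the strobe fuses into steady light; at the 30-40 Hz frame rates where G-Sync and Adaptive-Sync earn their keep, a per-frame strobe would flicker badly, which is the incompatibility the post is pointing at.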
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
Tough few weeks for Nvidia. First the Titan Z gets gut-punched by the 295X2 and seemingly delayed indefinitely, and now G-Sync takes one to the gonads.

This is a good thing all round. What G-Sync and this non-proprietary Adaptive-Sync provide is a big positive. Tearing sucks and V-Sync is not good for online FPS games, so having this come as a standard in all monitors and available for all video card brands is a lot better for gamers and consumers than G-Sync would have been.

The only ones crying will be Nvidia and their shareholders. I think most saw this coming; there was nothing about the nature of what G-Sync did that could be made exclusive. There still isn't even an off-the-shelf monitor from any manufacturer that supports G-Sync. You can buy the kit and void your monitor's warranty, or get a third party to install it for you, for the one monitor the G-Sync module currently supports.

Once I heard the G-Sync module needs to be tuned on a monitor-by-monitor basis, it became apparent it would be nothing but headaches and endless waits for a decent screen to support it. Now that G-Sync is going to be rendered obsolete and Adaptive-Sync will be widely available as an incorporated standard, we should start to see some real support and variety in available screens.
 

gorobei

Diamond Member
Jan 7, 2007
3,777
1,226
136
Actually, one of the real benefits of Adaptive-Sync will be for HTPCs and movie viewing. Adaptive-Sync can go down to 24 Hz on what would normally be a 60 Hz monitor, thus eliminating the need for 3:2 pulldown. Before this you would need a 120 Hz monitor to avoid pulldown.

Assuming video player software takes advantage of this, it could even make older 25 fps PAL content easier to view.
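A small sketch (my own illustration, not from the post) of why 24 fps content judders on a fixed 60 Hz display but not with a refresh rate that can drop to 24 Hz: on the fixed display each film frame must occupy a whole number of refresh cycles, so frames alternate between 2 and 3 cycles (the 3:2 pulldown cadence), while with adaptive refresh every frame can be held for exactly 1/24 s.

```python
# Illustrative only: on-screen duration of 24 fps frames on a fixed 60 Hz display
# versus a display whose refresh rate can follow the content.
def frame_durations_ms(fps, refresh_hz, frames=6):
    """How long each source frame stays on a fixed-refresh screen."""
    refresh_ms = 1000.0 / refresh_hz
    durations, owed = [], 0.0
    for _ in range(frames):
        owed += refresh_hz / fps          # refresh cycles this frame should get (2.5 for 24-on-60)
        cycles = round(owed)              # but it can only get a whole number of cycles
        owed -= cycles
        durations.append(round(cycles * refresh_ms, 1))
    return durations

print("24 fps on fixed 60 Hz    :", frame_durations_ms(24, 60))   # alternates ~33.3 / 50.0 ms (judder)
print("24 fps with adaptive sync:", [round(1000 / 24, 1)] * 6)    # every frame held ~41.7 ms
```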
 

Mand

Senior member
Jan 13, 2014
664
0
0
and now G-Sync takes one to the gonads.

Sigh. How can people still react like this? This kick to the gonads is vapor. There is nothing there. There won't be for at least a year, and there's no guarantee it will be equal to or better than G-Sync, or that it will cost the same or less.

Why are you so eager to jump on an injury to G-Sync based on something that might or might not happen sometime next year? A year is an eternity in the tech world. And that's a year that G-Sync is going to be on the market, available, and in ever-increasing quantity.

And yet, somehow, it's dead. Well folks, guess the fancy Haswell Refresh is dead, because Broadwell! Wait, that's dead too, because Skylake!
 

showb1z

Senior member
Dec 30, 2010
462
53
91
So is Adaptive-Sync a requirement in DisplayPort 1.2a, or is it optional like Ryan said in the article? I can't find a source for that statement in the press release.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
If it was so much easier to implement than G-Sync, why did Nvidia get there so much faster?

Nvidia said they looked at this, and specifically chose not to do it due to display hardware reasons.

I can't believe you can proclaim that they "failed miserably" now that we've been presented with something that someday, maybe, might do something somewhat similar, once someone, somehow, figures out how to make it. And even then only maybe, because we'd have to evaluate them side by side for performance, which we can't do when only one of them exists and the other doesn't.

Are you kidding me?

Nvidia just sells an existing notebook feature on the desktop instead of promoting a new standard. They didn't get anywhere faster.
It's just like Mantle and DX12. The difference is that Mantle is free and G-Sync is not.
 

Gunbuster

Diamond Member
Oct 9, 1999
6,852
23
81
Just look at HDMI 2.0 for how the industry moves on TCONs in displays. Even if an OEM decided to start designing an Adaptive-Sync monitor today, it's a loooong way away. I'd guess that's why Nvidia decided to go their own way on the hardware.

How long has HDMI 2.0 been official? 5 months or so? Can you point me to a mainstream TV supporting it?
 

Mand

Senior member
Jan 13, 2014
664
0
0
Nvidia just sells an existing notebook feature on the desktop instead of promoting a new standard.

Frame-by-frame variable refresh is not an existing notebook feature. For you to call it one demonstrates just how little you understand what G-Sync does, and what the FreeSync demo at CES showed.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I couldn't find a straight answer on how often the refresh rate can change on AMD hardware under eDP. Intel did have a presentation which said once a second. From the details I found, I assumed it was possible to do it frame by frame, since hopefully that is what AMD demoed at CES. It's just another detail in what is a fog of information after 5 months.
 
Last edited:

Mand

Senior member
Jan 13, 2014
664
0
0
I couldn't find a straight answer on how often the refresh rate can change on AMD hardware under eDP. Intel did have a presentation which said once a second. From the details I found, I assumed it was possible to do it frame by frame, since hopefully that is what AMD demoed at CES. It's just another detail in what is a fog of information after 5 months.


That's not what AMD demoed at CES. They had two displays, one going at 60 Hz and the other going at 50 Hz. There was no variation of the VBLANK interval in real time whatsoever.

You assumed it's possible to do it frame by frame because that's what AMD said it should be able to do somehow, eventually. And they then neglected to provide any details whatsoever about how that would happen, even up to and including this recent announcement and FAQ. All we have to go on is that assumption, and apparently that's enough for people to declare AMD the winner and G-Sync dead.
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
I couldn't find a straight answer on how often the refresh rate can change on AMD hardware under eDP. Intel did have a presentation which said once a second. From the details I found, I assumed it was possible to do it frame by frame, since hopefully that is what AMD demoed at CES. It's just another detail in what is a fog of information after 5 months.

I did some research a while ago for a now-closed FreeSync thread:

With PSR (Panel Self Refresh) you can interrupt the current image hold and take the frame from the DP input instead of the buffer.

The VESA specs say that you can control PSR by using the MSA/SDP of the DP protocol. This SDP is auxiliary data that is sent with the frame information.

This is taken from the VESA specs; I don't know its implementation in the hardware.
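A purely conceptual sketch (my own toy model, not the VESA spec or any real driver code) of the mechanism described above: the panel keeps redisplaying a frame held in its own buffer, and a new frame arriving over DP (flagged via the SDP auxiliary data in the real spec) interrupts that hold.

```python
# Toy model only: a PSR-style panel that self-refreshes from its local buffer
# until the source delivers a new frame over DP.
class PanelWithPSR:
    def __init__(self):
        self.remote_buffer = None   # last frame captured in the panel's own memory

    def refresh(self, dp_frame=None):
        """One panel refresh: scan out a fresh DP frame if the source sent one,
        otherwise self-refresh from the buffered copy."""
        if dp_frame is not None:
            self.remote_buffer = dp_frame
            return f"scan out new frame {dp_frame}"
        return f"self-refresh from buffered frame {self.remote_buffer}"

panel = PanelWithPSR()
print(panel.refresh(dp_frame="A"))  # source pushes frame A
print(panel.refresh())              # nothing new: panel holds A by itself
print(panel.refresh(dp_frame="B"))  # a new frame interrupts the hold
```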
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Let's try this one more time.

This isn't Freesync. This isn't AMD. This is VESA.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
GCN 1.0 supports DP1.2, too.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/8

Uh, what? AMD announced that only GCN 1.1 hardware will support FreeSync. Hawaii was released in October, at the same time as G-Sync, which is supported down to the GTX 680...

GK104 had its tape-out nearly 2 years before Hawaii...



They don't need to debug anything. Either GCN 1.0 supports the DP features or it doesn't.

We don't know why they are only supporting it with GCN 1.1. It could be hardware, or it could be that they aren't interested in investing in backwards compatibility. It's a new standard, after all. You can't expect older hardware to support it; it would be nice, but you can't really expect it. As far as G-Sync goes, that's different. That's a paid exclusive feature. You would expect more support.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
As far as we can tell, VESA just copied two techniques from the eDP spec to be optional in the DP 1.2a spec at AMD's request. Without some details on how it actually works, we don't know whether it's a competitor for G-Sync or not. Based on what we know, I think it's pretty hard to call G-Sync dead: G-Sync is actually out, while Adaptive-Sync is a spec that only VESA members have seen, hasn't yet been implemented, and we aren't even sure it is what's needed, since neither AMD nor VESA has provided any details. If it were as good or better (G-Sync has overhead due to its polling, so it can be beaten in efficiency), would they not say so? The lack of information 5 months after a working demo, the now very wide delivery timescale, no partner announcements, no product details: it all looks like vapour right now. It's a bit too early to get excited about the tech at this stage. It can't kill G-Sync until it at least exists.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If it was so much easier to implement than G-Sync, why did Nvidia get there so much faster?

Nvidia said they looked at this, and specifically chose not to do it due to display hardware reasons.

I can't believe you can proclaim that they "failed miserably" now that we've been presented with something that someday, maybe, might do something somewhat similar, once someone, somehow, figures out how to make it. And even then only maybe, because we'd have to evaluate them side by side for performance, which we can't do when only one of them exists and the other doesn't.

Are you kidding me?

All it took was a simple DP update that added functionality already there in eDP to the desktop and discrete monitors. It will be here in 6-12 months as a standard, normal, everyday, no-big-deal feature.

It's a cool feature, but it obviously isn't rocket science to implement. G-Sync and Adaptive-Sync do the same thing, just under different names.
 
Last edited:

Mand

Senior member
Jan 13, 2014
664
0
0
All it took was a simple DP update that added functionality already there in eDP to the desktop and discrete monitors. It will be here in 6-12 months as a standard, normal, everyday, no-big-deal feature.

It's a cool feature, but it obviously isn't rocket science to implement. G-Sync and Adaptive-Sync do the same thing, just under different names.

No functionality has currently been demonstrated in eDP that supports frame-by-frame extension of the VBLANK interval.

You can't say it isn't rocket science to implement when you don't understand what was and was not shown.
 

Mand

Senior member
Jan 13, 2014
664
0
0
The lack of information 5 months after a working demo

That was --NOT-- a working demo. That was a demo of two static refresh rates, one normal, and one changed to a different static refresh rate by a change to the VBLANK interval.

Variable refresh was --NOT-- demonstrated at the CES demo. It was not a working demo by any measure, despite AMD's ability to convince people it was. It was not a demo showing off variable refresh; it was a demo showing how much better things look when frame rate matches refresh rate. That is ALL. And that's a very, very big distinction that apparently is lost on all of the "GSYNC IS DEAD" crowd.
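For readers keeping score on the mechanism under dispute, here is a minimal timing sketch (my own illustration with made-up render times, not anything AMD or Nvidia has published) of the difference: switching to a different static refresh rate just picks a new fixed scanout period, whereas frame-by-frame variable refresh holds the VBLANK interval open until the GPU actually has a new frame, so each frame is shown the moment it is ready.

```python
import math

# Illustrative only: when frames reach the screen with a fixed 60 Hz refresh (V-Sync on)
# versus a variable refresh that extends VBLANK until the next frame is done.
render_times_ms = [13.0, 22.0, 31.0, 17.0, 25.0]   # hypothetical per-frame GPU render times

def fixed_refresh(render_ms, refresh_hz=60):
    """Each finished frame waits for the next fixed scanout tick."""
    period = 1000.0 / refresh_hz
    t, shown = 0.0, []
    for r in render_ms:
        t += r
        shown.append(round(math.ceil(t / period) * period, 1))
    return shown

def variable_refresh(render_ms):
    """VBLANK is held open; scanout starts as soon as each frame is ready."""
    t, shown = 0.0, []
    for r in render_ms:
        t += r
        shown.append(round(t, 1))
    return shown

print("fixed 60 Hz      :", fixed_refresh(render_times_ms))
print("variable refresh :", variable_refresh(render_times_ms))
```

The fixed-rate column quantizes every frame to the next 16.7 ms tick, which is where stutter comes from when render times don't line up with the refresh; the variable column simply tracks the render times. A demo of two panels each running at its own static rate exercises neither behaviour, which is the distinction being drawn above.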
 