Freesync monitors to start releasing in November


geoxile

Senior member
Sep 23, 2014
327
25
91
Providing support for an unknown feature set that (possibly) directly competes with an in-house-developed, first-to-market product does not make Nvidia more competitive. What it does do is lend value to a competing feature set that appears to be supported only by Nvidia's sole direct competitor in the space.

This is not good business, and it will not be good business unless FreeSync becomes the de facto implementation.

And regardless of your statement, all indications are that active sync was developed by AMD and gifted to VESA (likely in an effort to undermine an existing similar implementation created by a competitor - which is good business!). This makes it AMD's standard by virtue of creation, and by virtue of AMD being the only graphics manufacturer actually utilizing the process.

I really don't even know what to make of this mess.

Let's make this clear: adaptive sync, being part of the DP spec, means any VESA member can get access to the info. It is in no way unknown to IHVs. What is unknown is how FreeSync, AMD's support for adaptive sync, functions. Nvidia does NOT have to follow AMD's implementation; they can (and in any case, will) come up with their own implementation to support adaptive sync.

What isn't good business is giving your main competitor an exclusive feature by failing to support third-party monitors. When adaptive sync monitors launch, AMD cards will gain an added selling point over Nvidia cards, exactly like what happened when G-Sync monitors launched.

"Active sync"? Please, if you don't know something, keep your mouth shut.
Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

By nature of being a VESA spec, any member should be able to view and use the technology as they see fit.

This means that it is in fact not AMD's standard, at all or in any way, since VESA is the one in control of its design, implementation (in DP), and ultimately usage. Nor is AMD the one implementing the feature; that's the monitor manufacturers. AMD is providing a feature that makes use of adaptive sync. Nvidia supporting it as well in no way gives AMD an advantage; in fact, failing to support it provides AMD with an advantage by handing them an exclusive feature.
 

godihatework

Member
Apr 4, 2005
96
17
71
In that case, I hope it has a feature set equal to or better than G-Sync.

That would be the ideal situation, yes. If the open standard is demonstrably superior, one would hope that Intel would begin to support it. If Intel comes on board, that will truly define the "standard," as they drive some ridiculous percentage of all graphics displays right now.

All this discussion is pissing in the wind right now though - we don't know whether FreeSync is inferior, comparable, or superior to G-Sync because we haven't actually seen it.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Backlight pulsing can be done on any monitor with fast pixel transition times, and it's pretty cheap to implement. I expect to see it on the vast majority of 120Hz+ displays, regardless of whether they implement variable refresh.


Is it cheap? I seem to remember hearing the opposite...
 

godihatework

Member
Apr 4, 2005
96
17
71
I really don't even know what to make of this mess.

Let's make this clear: adaptive sync, being part of the DP spec, means any VESA member can get access to the info. It is in no way unknown to IHVs. What is unknown is how FreeSync, AMD's support for adaptive sync, functions. Nvidia does NOT have to follow AMD's implementation; they can (and in any case, will) come up with their own implementation to support adaptive sync.

What isn't good business is giving your main competitor an exclusive feature by failing to support third-party monitors. When adaptive sync monitors launch, AMD cards will gain an added selling point over Nvidia cards, exactly like what happened when G-Sync monitors launched.

"Active sync"? Please, if you don't know something, keep your mouth shut.


By nature of being a VESA spec, any member should be able to view and use the technology as they see fit.

This means that it is in fact not AMD's standard, at all or in any way, since VESA is the one in control of its design, implementation (in DP), and ultimately usage. Nor is AMD the one implementing the feature; that's the monitor manufacturers. AMD is providing a feature that makes use of adaptive sync. Nvidia supporting it as well in no way gives AMD an advantage; in fact, failing to support it provides AMD with an advantage by handing them an exclusive feature.

Is DisplayPort™ Adaptive-Sync the industry-standard version of Project FreeSync?
A: The DisplayPort™ Adaptive-Sync specification was ported from the Embedded DisplayPort™ specification through a proposal to the VESA group by AMD. DisplayPort™ Adaptive-Sync is an ingredient feature of a DisplayPort™ link and an industry standard that enables technologies like Project FreeSync.

Certainly looks like this was AMD's baby, per AMD.

EDIT

And to be quite frank, I don't have a horse in this race. But plain and simple - there is no good reason for Nvidia to implement a standard that directly competes with their own existing technology unless that standard actually defines the market. Their best play is to attempt to outcompete and look to establish the actual standard as defined by market share. A good example would be AMD64 vs. whatever travesty Intel came up with (Itanium? It's been a while).
 
Last edited:

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
AMD did. They had a Q&A (probably still do) where they asked themselves what the difference was and said theirs was better. It's not some forum dweller that said that; it was AMD.


Got a link, Bright? I don't recall ever reading such a claim from AMD, and the only superiority claim made in the official FAQ is that FreeSync is non-proprietary: http://support.amd.com/en-us/search/faq/220

That's not a claim that the technical performance is superior...
 

geoxile

Senior member
Sep 23, 2014
327
25
91
Is DisplayPort™ Adaptive-Sync the industry-standard version of Project FreeSync?
A: The DisplayPort™ Adaptive-Sync specification was ported from the Embedded DisplayPort™ specification through a proposal to the VESA group by AMD. DisplayPort™ Adaptive-Sync is an ingredient feature of a DisplayPort™ link and an industry standard that enables technologies like Project FreeSync.

Certainly looks like this was AMD's baby, per AMD.

If by AMD's baby you're talking about their efforts to get adaptive sync onto a new platform, then yes. That doesn't make it AMD's standard. Do you think that if company "Example" proposes that Mantle be ported to GNU/Linux and AMD does so, Mantle becomes "Example"'s baby? Absolutely not. Case closed: it's not AMD's baby; AMD just wanted VESA to port it over so they could use it for other purposes.

As for Intel supporting adaptive sync:

http://www.phoronix.com/scan.php?page=news_item&px=MTU0Njg

Dynamic Refresh Rate Switching is a power-saving feature that allows for the graphics driver to switch between low and high refresh rates based upon the system's usage scenario...

Dynamic Refresh Rate Switching is dependent upon having an internal embedded DisplayPort (eDP) panel and support for DRRS must be advertised by the panel's EDID data.
They've already used it on eDP as a power-saving feature for mobile platforms. For the final time: adaptive sync is not AMD's standard, it's not their technology, and it's certainly not an unknown feature. AMD FreeSync != adaptive sync.
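
To make that EDID point concrete, here is a minimal, purely illustrative sketch of how a driver could read the refresh range a panel advertises. The byte offsets follow the public EDID layout (the display range limits descriptor, tag 0xFD); the sample bytes at the bottom are made up for demonstration, and a real driver would also handle extension blocks, EDID 1.4 offset flags, and checksum validation.

```python
# Minimal sketch: pull the supported vertical refresh range out of a base
# 128-byte EDID block. Illustrative only; not any vendor's actual driver code.

def refresh_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) from the display range limits descriptor, or None."""
    if len(edid) < 128:
        return None
    # Four 18-byte descriptors live at offsets 54, 72, 90, 108.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # A display descriptor starts with 0x00 0x00 0x00; tag 0xFD means
        # "display range limits".
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            min_v, max_v = d[5], d[6]   # minimum/maximum vertical rate in Hz
            return (min_v, max_v)
    return None

# Hypothetical descriptor advertising a 40-144 Hz vertical range:
fake_descriptor = b"\x00\x00\x00\xFD\x00" + bytes([40, 144, 30, 160, 60]) + b"\x0A" + b"\x20" * 7
fake_edid = b"\x00" * 54 + fake_descriptor + b"\x00" * (128 - 54 - 18)
print(refresh_range_from_edid(fake_edid))  # -> (40, 144)
```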

And to be quite frank, I don't have a horse in this race. But plain and simple - there is no good reason for Nvidia to implement a standard that directly competes with their own existing technology unless that standard actually defines the market. Their best play is to attempt to outcompete and look to establish the actual standard as defined by market share. A good example would be AMD64 vs. whatever travesty Intel came up with (Itanium? It's been a while).
In fact, there is a good reason: AMD will pretty much be using adaptive sync exclusively and probably marketing it as such. Itanium lost because it had no legacy support, being a completely new architecture, whereas AMD64 was fully backwards compatible with existing x86 code. Not really applicable here.
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
I imagine Nvidia already uses a power-saving technique on their mobile platforms to lower refresh rates via eDP as well. That doesn't mean Intel or Nvidia will use adaptive-sync for variable refresh rates that change frame-by-frame.
 

godihatework

Member
Apr 4, 2005
96
17
71
I imagine Nvidia already uses a power-saving technique on their mobile platforms to lower refresh rates via eDP as well. That doesn't mean Intel or Nvidia will use adaptive-sync for variable refresh rates that change frame-by-frame.

Dingdingdingdingding!

Which is really the whole point of this is it not?

Also - is the root of this disagreement the semantic difference between active sync, adaptive sync, and FreeSync? Because if that's the case, then that's some pretty big BS there.

AMD64 outcompeted the Intel solution and became the standard, huh? And then Intel cross-licensed it. Wow. I wonder why we would expect Nvidia to latch onto their competitor's usage model rather than trying to pull the same thing.

Good for the business != good for the consumer. Maybe it's time for folks to take a step back and realize that. Every company would charge the highest market-equilibrium price they could to maximize profit, consumer be damned, if they could get away with it. Legitimizing a competitor who is late to market while simultaneously delegitimizing a proprietary solution owned by the company is something I would expect a company to be forced into by the market, not to voluntarily undertake on their own. Until FreeSync offers a more compelling solution (through price, features, or a combination of both), Nvidia isn't going to touch it with a ten-foot pole. [EDIT - and even then there are no guarantees]
 
Last edited:

SoulWager

Member
Jan 23, 2013
155
0
71
Also - is the root of this disagreement the semantic difference between active sync, adaptive sync, and FreeSync? Because if that's the case, then that's some pretty big BS there.

There are a couple main sources of disagreement:

1. Mistrust of AMD's representation of FreeSync/adaptive sync (particularly on how difficult it is to implement and how long it will take to get to market). Articles like the one in the OP don't exactly help.

2. Derision of Nvidia for locking down their implementation of a technology that should have been in monitors ever since the switch from CRT to LCD.



I don't really mind Nvidia locking down their implementation, so long as they keep improving monitors faster than the rest of the industry is willing to. There's still a lot of room to improve on G-Sync. From AMD, I want them to get the low-hanging fruit Nvidia missed, and push back on performance and price.
 

Fastx

Senior member
Dec 18, 2008
780
0
0
144Hz IPS-type Panels Finally On Their Way! 1440p as Well!


Below we bring you some news from AU Optronics about some of their forthcoming panels. Surely of most interest is a new 27" panel (M270DAN02.3) currently in development which will be based on AUO's AHVA panel technology, equivalent to LG.Display's IPS and with very similar performance characteristics. This is a 27" panel with a 2560 x 1440 resolution, 1000:1 contrast ratio, 350 cd/m2 brightness, sRGB gamut and 178/178 viewing angles. Nothing special you might think - wrong! This will be the first IPS-type panel to natively support 144Hz refresh rate, something buyers have been crying out for for a long time! 144Hz AHVA and 2560 x 1440 resolution, we can't wait!

Note: to clear up any confusion, the panel being discussed is an AU Optronics AHVA technology (Advanced Hyper Viewing-Angle), as opposed to their AMVA (Advanced Multi-Domain Vertical Alignment). AHVA is their equivalent to LG.Display's IPS (In Plane Switching) technology, as is Samsung's PLS (Plane to Line Switching). Both were designed as a competing alternative. All 3 technologies (AHVA, IPS, PLS) are very similar in characteristics and performance in practice, and are often simply labelled as "IPS" by manufacturers. This is why we refer to them as IPS-type in this news piece.


In other areas and in the 27" space AU Optronics are also set to start production on their AHVA (IPS-type) 3840 x 2160 resolution 4k panels. The M270QAN01.1 panel with an sRGB colour gamut will go into production in October while the Adobe RGB version, the M270QAN01.2 will go into production in Q4 of 2014.

Also in 27" size there are some new 2560 x 1440 AHVA (IPS-type) panels coming soon. The M270DAN02.0 with sRGB gamut and the M270DAN02.1 with Adobe RGB gamut go into production in October and December respectively. There is also the M270DAN02.2, M270DAN02.3 and M270DAN02.5 going into production in September with sRGB gamuts. The M270DAN02.3 is perhaps the most interesting as this is the 144Hz capable AHVA panel we already discussed.

http://www.tftcentral.co.uk/news_archive/31.htm#144hz_ips
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
If NV won't support the open-standard FreeSync and G-Sync at the same time, they are effectively making FreeSync artificially vendor-locked by stopping their cards from supporting this open-standard feature. This is really bad news for consumers, since it means that even if you get a FreeSync monitor, you essentially become vendor-locked to AMD.

Yes, but you won't be locked to any monitor vendor, thus giving you more options. I suspect Nvidia will lose.
 

Mushkins

Golden Member
Feb 11, 2013
1,631
0
0
Yes, but you won't be locked to any monitor vendor, thus giving you more options. I suspect Nvidia will lose.

Why is it win or lose? Why can't two products exist in the same space? We have more than one car manufacturer, more than one phone manufacturer, etc.

Monitors can absolutely support FreeSync. Monitors can also support Nvidia G-Sync. My money is on Nvidia continuing the brand even if FreeSync becomes readily available, by adding premium "gaming" features and such.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Why is it win or lose? Why can't two products exist in the same space? We have more than one car manufacturer, more than one phone manufacturer, etc.

Monitors can absolutely support FreeSync. Monitors can also support Nvidia G-Sync. My money is on Nvidia continuing the brand even if FreeSync becomes readily available, by adding premium "gaming" features and such.

Wow, this ^ this ^ this ^^

This whole thread is baseless. It's fallacy... hearsay.

People villainizing and pure deception. Talk about taking it to a whole new level.

But your post is like a breath of reality. There is no "do or die" here. No need to sensationalize invalidity. Why are we even talking about this in such absolutes? At this point there is very little known about FreeSync at all. At least let AMD's solution see the light of day before we start grabbing pitchforks and gathering troops against Nvidia.

There is no real indication that the FreeSync route is even on par with G-Sync. It might be, but wouldn't it be dumb of Nvidia to jump on board an inferior method? I am not trying to say FreeSync won't be as good, just that we are nowhere near knowing this. And I would think that is a very important thing you would want to consider when you're bashing Nvidia for not supporting it.

And this doesn't change the fact that there is nothing wrong with having different options out there. If FreeSync has its own set of pros and cons and G-Sync has its own, they don't have to win. Both can exist together.

Everyone is so quick to invent and make stuff up. It's really interesting at times.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Why is it win or lose? Why can't two products exist in the same space? We have more than one car manufacturer, more than one phone manufacturer, etc.

Monitors can absolutely support FreeSync. Monitors can also support Nvidia G-Sync. My money is on Nvidia continuing the brand even if FreeSync becomes readily available, by adding premium "gaming" features and such.

I'm not so sure monitors can actually support both. I suppose they could have two different controllers and pathways, but that seems excessive just to make the same feature compatible with two vendors simply because one refuses to adopt an industry standard.

The only way this would make sense is if G-Sync ends up being functionally superior, which AMD is claiming it's not. That remains to be seen, though.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
I wouldn't hold my breath on AMD's claims. Have there been any recent demos of FreeSync in action, especially with monitors coming in November?
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
I wouldn't hold my breath on AMD's claims. Have there been any recent demos of FreeSync in action, especially with monitors coming in November?

Besides the Nvidia demos and showcases, did any review sites receive and test the ROG Swift (since it was the first G-Sync monitor to market) before its official release date? I honestly don't remember, but it seems weird we haven't even seen a preview of FreeSync from anyone...

Edit - It seems reviews for the ROG Swift started July 4th with KitGuru. The release date is hard to find, but Amazon has had the product since August 26th. Not the most reliable source, but it's all I can find for now.

With the hardware being released next month, hopefully some reviews start trickling in...
 
Last edited:

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Could someone with a good understanding of G-Sync + FreeSync do a simple TL;DR?

What do these two techs do? What do they do to justify the associated cost?

I have a nice Dell monitor (1080p, 24 inch) which I bought 3 years ago, still chugging along, and it is still perfect for what I need. My new 290 Tri-X is owning every game I have atm, with zero FPS issues at max everything. What do the two sync techs do over my Dell monitor that has all of you guys' mouths watering? I want to understand.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Is it cheap? I seem to remember hearing the opposite...
Backlight pulsing is extremely cheap to implement, and any monitor manufacturer can implement it. You have to have fast pixel transition times so you don't have two images on the panel at the moment you pulse the backlight. There are some tricks that can make it look better, but it's not really a difficult technology to implement.

It would be much more expensive to make it work at the same time as variable refresh, not so much in per-unit cost as in research and engineering cost. You can't use backlight pulsing at low framerates, because you'd get very noticeable flicker, and you'd have to modify the pulse width on a per-frame basis in order to get a constant apparent brightness when pulsing the backlight at a changing refresh rate.
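
To make the per-frame adjustment concrete, here is a minimal sketch of one way a controller could pick a strobe pulse width each frame. It's my own illustration, not anything AMD or Nvidia has published: apparent brightness scales roughly with the duty cycle (pulse width divided by frame duration), so holding the duty cycle constant keeps brightness constant as the refresh interval changes, and below some assumed refresh rate the strobe is simply disabled to avoid visible flicker. The 75 Hz cutoff and 25% duty cycle are placeholder numbers.

```python
# Sketch: per-frame backlight strobe pulse width under variable refresh.
# Assumption: apparent brightness ~ duty cycle (pulse_ms / frame_ms) at a
# fixed backlight drive level, and strobing below ~75 Hz flickers visibly.

MIN_STROBE_HZ = 75.0      # below this, fall back to a steady backlight
TARGET_DUTY = 0.25        # fraction of each frame the backlight is lit

def strobe_pulse_ms(frame_ms: float):
    """Return the pulse width in ms for this frame, or None to disable strobing."""
    refresh_hz = 1000.0 / frame_ms
    if refresh_hz < MIN_STROBE_HZ:
        return None                      # too slow: constant backlight instead
    return TARGET_DUTY * frame_ms        # constant duty cycle -> constant brightness

# Hypothetical frame times from a variable-refresh source (in ms):
for frame_ms in (6.9, 8.3, 11.0, 16.7, 25.0):
    pulse = strobe_pulse_ms(frame_ms)
    label = f"{pulse:.2f} ms pulse" if pulse else "strobe off"
    print(f"{1000.0 / frame_ms:6.1f} Hz -> {label}")
```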
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Could someone with a good understanding of G-Sync + FreeSync do a simple TL;DR?

What do these two techs do? What do they do to justify the associated cost?

I have a nice Dell monitor (1080p, 24 inch) which I bought 3 years ago, still chugging along, and it is still perfect for what I need. My new 290 Tri-X is owning every game I have atm, with zero FPS issues at max everything. What do the two sync techs do over my Dell monitor that has all of you guys' mouths watering? I want to understand.

G-Sync will dynamically change the monitor's refresh rate to sync with the rate at which the GPU is producing frames, meaning no image tearing, and without the input lag that V-Sync can introduce.

FreeSync promises to deliver something similar, but we still know almost nothing about it, how it will actually work, or how it will perform.

Most G-Sync monitors (those that can do 120Hz) also include the option for ULMB to further improve motion clarity (it cannot be used in conjunction with variable refresh), something AMD has never addressed AFAIK, but that technology could be implemented by the monitor manufacturers on an individual basis.
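
As a rough illustration of why that matters, here is a toy model (my own sketch, not a description of either vendor's hardware): with a fixed-refresh panel and V-Sync, a finished frame waits for the next scheduled refresh, while a variable-refresh panel can start scanning the frame out as soon as it is ready, subject only to the panel's maximum refresh rate. The 60 Hz / 144 Hz figures and frame times are assumptions for the example.

```python
# Toy model: when does each finished frame appear on screen?
# Fixed refresh + V-Sync: wait for the next 16.67 ms boundary.
# Variable refresh: display as soon as ready, but no sooner than the
# panel's minimum refresh interval after the previous scan-out.

import math

REFRESH_MS = 1000.0 / 60.0        # assumed 60 Hz fixed-refresh panel
MIN_INTERVAL_MS = 1000.0 / 144.0  # assumed 144 Hz ceiling for variable refresh

def vsync_display_times(finish_times):
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in finish_times]

def vrr_display_times(finish_times):
    shown, last = [], -MIN_INTERVAL_MS
    for t in finish_times:
        last = max(t, last + MIN_INTERVAL_MS)
        shown.append(last)
    return shown

# Hypothetical GPU frame completion times (ms) with uneven pacing:
finishes = [10.0, 30.0, 42.0, 70.0]
for f, v, g in zip(finishes, vsync_display_times(finishes), vrr_display_times(finishes)):
    print(f"frame done {f:5.1f} ms -> v-sync shows {v:5.1f} ms, variable refresh shows {g:5.1f} ms")
```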
 
Last edited:

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
Besides the Nvidia demos and showcases, did any review sites receive and test the ROG Swift (since it was the first G-Sync monitor to market) before its official release date? I honestly don't remember, but it seems weird we haven't even seen a preview of FreeSync from anyone...

Edit - It seems reviews for the ROG Swift started July 4th with KitGuru. The release date is hard to find, but Amazon has had the product since August 26th. Not the most reliable source, but it's all I can find for now.

With the hardware being released next month, hopefully some reviews start trickling in...

THG - 11/8 http://www.tomshardware.com/reviews/rog-swift-pg278q-g-sync-monitor,3897.html
Tweak also 11/8 http://www.tweaktown.com/reviews/65...144hz-g-sync-gaming-monitor-review/index.html
HotHardware - 13/8 http://hothardware.com/Reviews/ASUS-ROG-SWIFT-PG278Q-GSYNC-Monitor-Review/
also 3dguru, overclockerclub, tftcentral and pcper...
 

SoulWager

Member
Jan 23, 2013
155
0
71
Could someone with a good understanding of G-Sync + FreeSync do a simple TL;DR?

What do these two techs do? What do they do to justify the associated cost?

I have a nice Dell monitor (1080p, 24 inch) which I bought 3 years ago, still chugging along, and it is still perfect for what I need. My new 290 Tri-X is owning every game I have atm, with zero FPS issues at max everything. What do the two sync techs do over my Dell monitor that has all of you guys' mouths watering? I want to understand.
V-sync works by delaying the graphics card to match the monitor's refresh cycle. G-Sync and FreeSync work by delaying the monitor's refresh cycle to match the timing of a newly completed frame. The advantages are smoother animation at non-maxed framerates, lower latency than V-Sync, and no tearing whatsoever.


If you consistently 100% max out your monitor's refresh rate, you will see no change in smoothness, but you can get a latency improvement over V-Sync by framecapping your game engine just under the monitor's max refresh rate and letting the variable refresh pull all the slack out of the render chain. To minimize input latency, you want the bottleneck to be just before you grab the input data, so that a frame is never waiting to be worked on; the resources needed are done with the last frame by the time they receive the next one.
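
A minimal sketch of that frame-capping idea, assuming a hypothetical 144 Hz variable-refresh panel and a game loop we control (nothing vendor-specific): the loop sleeps off its slack first so input is sampled as late as possible, then renders, holding the frame interval slightly above the panel's minimum so no frame ever queues up behind the display. The 141 Hz cap is a made-up example value.

```python
# Sketch: cap the game loop just under a 144 Hz panel's maximum refresh so the
# render chain never backs up. Illustrative only; a real engine would use the
# platform's high-resolution timers and swap-chain APIs.

import time

PANEL_MAX_HZ = 144.0
CAP_HZ = 141.0                    # a touch under max, so the display is never the bottleneck
FRAME_BUDGET = 1.0 / CAP_HZ

def game_loop(num_frames: int = 5):
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        # Sleep off any slack first, so input is sampled as late as possible.
        while time.perf_counter() < next_deadline:
            time.sleep(0.0005)
        sample_time = time.perf_counter()
        # ... poll input, simulate, and render here ...
        print(f"frame started at {sample_time:.4f}s")
        next_deadline = sample_time + FRAME_BUDGET

game_loop()
```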


tl;dr: It's smoother animation, lower latency, and no tearing.

As for the differences between the techs, we need to wait for a review of FreeSync before we can comment on that, but there is a lot of information on how G-Sync works.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
As far as the differences go:
The G-Sync module comes with ULMB and some color correction while using ULMB (similar to LightBoost, but improved with the color correction, and it requires no hack for 2D use). However, ULMB mode does not work at the same time as G-Sync.

FreeSync is simply the variable refresh rate implementation, though that doesn't mean some monitors couldn't include a ULMB mode.

The differences between the variable refresh rate implementations are unknown, though I wouldn't expect much difference in actual use.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'll be curious to see the prices of FreeSync-capable monitors, because G-Sync monitors are stupid expensive.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
G-Sync will dynamically change the monitor's refresh rate to sync with the rate at which the GPU is producing frames, meaning no image tearing, and without the input lag that V-Sync can introduce.

FreeSync promises to deliver something similar, but we still know almost nothing about it, how it will actually work, or how it will perform.

Most G-Sync monitors (those that can do 120Hz) also include the option for ULMB to further improve motion clarity (it cannot be used in conjunction with variable refresh), something AMD has never addressed AFAIK, but that technology could be implemented by the monitor manufacturers on an individual basis.
Thank you.
V-sync works by delaying the graphics card to match the monitor's refresh cycle. G-Sync and FreeSync work by delaying the monitor's refresh cycle to match the timing of a newly completed frame. The advantages are smoother animation at non-maxed framerates, lower latency than V-Sync, and no tearing whatsoever.


If you consistently 100% max out your monitor's refresh rate, you will see no change in smoothness, but you can get a latency improvement over V-Sync by framecapping your game engine just under the monitor's max refresh rate and letting the variable refresh pull all the slack out of the render chain. To minimize input latency, you want the bottleneck to be just before you grab the input data, so that a frame is never waiting to be worked on; the resources needed are done with the last frame by the time they receive the next one.


tl;dr: It's smoother animation, lower latency, and no tearing.

As for the differences between the techs, we need to wait for a review of FreeSync before we can comment on that, but there is a lot of information on how G-Sync works.
Thank you. Gonna google videos of G-Sync and check it out. I'm glad that, since I can max everything, the impact isn't much.
 