AMD Freesync Monitors & Reviews Thread


Elfear

Diamond Member
May 30, 2004
7,115
690
126
My absolute number one priority is getting a 3440x1440 IPS, preferably with LG's window management software or equivalent, because that's a really good screen and resolution for gaming.

That's what I'm holding out for as well: Freesync, 3440x1440, IPS, and a 120Hz refresh rate. 120Hz is probably asking too much at this point, but I can always dream.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You see ghosting with ULMB because the pixel response isn't fast enough to switch. But this is a problem on every display except OLED and plasma.

It has nothing to do with PCPer's findings.

Notice how the blurring/ghosting on the Freesync monitor is blamed on AMD: either Freesync is at fault, or AMD didn't make sure the panel was specced to a high enough standard. Notice how the same phenomenon on the Gsync panel has nothing to do with Gsync or nVidia. It's the 1ms panel that's too slow?

I agree with you sontin, it's an artifact of the panel tech. Now, let's see how many people continue to blame AMD though.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I remember the first time I turned on my ASUS VG248QE - out of the box - and the image was ghosting like crazy and looked absolutely horrid in motion. It all stemmed from the "Trace Free" setting being at maximum by default. I turned that off and the ghosting went away entirely. I certainly didn't blame my NVIDIA GPU or DVI cable for the issue. It was ASUS's fault.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I remember the first time I turned on my ASUS VG248QE - out of the box - and the image was ghosting like crazy and looked absolutely horrid in motion. It all stemmed from the "Trace Free" setting being at maximum by default. I turned that off and the ghosting went away entirely. I certainly didn't blame my NVIDIA GPU or DVI cable for the issue. It was ASUS's fault.

Right, it happens; monitors come tuned poorly all the time.

What we aren't sure of, though, is whether variable refresh rates require extra tuning.

Personally I'd just get a monitor with tweakable overdrive like the BenQ. This seems like a poor substitute for a scaler properly calibrated for VRR, though...

edit: well, "substitute" is a poor choice of words, since you'd want such a setting anyway. I just meant that constantly messing with the overdrive depending on what game you're playing is dumb (since every game performs differently).
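To see why one fixed overdrive value can't suit a whole VRR range, here's a toy sketch (my own illustration, not any vendor's actual algorithm): assume a pixel approaches its driven value exponentially; a drive level tuned to land exactly on target within a 144Hz frame then overshoots when the same drive is held for a longer 48Hz frame.

```python
import math

TAU_MS = 4.0  # hypothetical pixel response time constant, in ms


def drive_for(start, target, frame_ms, tau=TAU_MS):
    """Overdrive level that lands exactly on `target` after one frame,
    in a toy exponential pixel-response model."""
    e = math.exp(-frame_ms / tau)
    return (target - start * e) / (1 - e)


def level_after(start, drive, frame_ms, tau=TAU_MS):
    """Pixel level reached after holding `drive` for one frame."""
    e = math.exp(-frame_ms / tau)
    return drive * (1 - e) + start * e


# Tune overdrive for 144 Hz (~6.9 ms frames), transition 0 -> 200
drive = drive_for(0, 200, 1000 / 144)
print(level_after(0, drive, 1000 / 144))  # lands on ~200 at 144 Hz
print(level_after(0, drive, 1000 / 48))   # overshoots past 200 at 48 Hz
```

The numbers are made up; the point is only that the correct drive depends on how long the frame is held, which a fixed overdrive setting can't know when the refresh interval varies.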
 
Last edited:

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
That's what I'm holding out for as well: Freesync, 3440x1440, IPS, and a 120Hz refresh rate. 120Hz is probably asking too much at this point, but I can always dream.

Higher refresh rate is something I'd definitely want, but it's in the territory of things I'd be perfectly willing to compromise on if I couldn't get it but could get 3440x1440 IPS.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I can see it. I wouldn't have noticed if he hadn't pointed it out, though.

This is exactly the point. Without anyone pointing it out or looking for it, I would never see it.

It's not so much "Is there ghosting/flicker?" as "How bad is it?"
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Freesync will win the overall market due to widespread DisplayPort 1.2a adoption, but this choice won't be decided by AMD or Nvidia users; it will be decided by Intel, since Broadwell supports 1.2a. I suspect Nvidia will have to adopt some form of open 1.2a support for its mobile GPUs, since I doubt the laptop market will be as willing to pay a G-SYNC premium. Whether G-SYNC survives long term as a premium sync alternative with unique features for Nvidia users remains to be seen.


Intel is nearly irrelevant in this conversation. The benefit of these sync technologies is for gaming. Nobody does serious gaming with igpu.

When the market for gaming is dominated by nvidia I don't see how your scenario will work out.

Like I said, best to wait it out right now.
 
Last edited:

mindbomb

Senior member
May 30, 2013
363
0
0
What will eventually happen is that monitors will come out that support both, with one DisplayPort input for nvidia and another for AMD. That will come after costs drop enough to do it affordably.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Intel is nearly irrelevant in this conversation. The benefit of these sync technologies is for gaming. Nobody does serious gaming with igpu.

When the market for gaming is dominated by nvidia I don't see how your scenario will work out.

Like I said, best to wait it out right now.

But there are tons of people who do "non-serious" (i.e. casual) gaming on igpu, and they would absolutely stand to benefit from this.
 
Feb 19, 2009
10,457
10
76
Intel is nearly irrelevant in this conversation. The benefit of these sync technologies is for gaming. Nobody does serious gaming with igpu.

I hope you realize a lot of gamers on LoL or Dota run them fine on Intel iGPUs or AMD APUs. These games, along with CS:GO, are the most popular online games. Heck, even WoW on low details runs perfectly fine on iGPUs. These gamers are the ones who benefit from FS/GS technology, as they suffer bad FPS dips.

Serious gaming isn't a pedestal reserved for those with high-end rigs.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
what will eventually happen is that monitors will come out that support both and have a displayport for nvidia, and another for AMD. This after the costs come down so that you can do this in an affordable fashion.

nVidia isn't going to make Gsync come down in price. It exists purely as a way to increase profits for nVidia.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
But there are tons of people who do "non-serious" (i.e. casual) gaming on igpu, and they would absolutely stand to benefit from this.

Screens need to change radically first. 40-48Hz minimum ranges are useless for anything other than high-powered gaming. Even 24fps movies fall below that floor.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Freesync will win the overall market due to widespread DisplayPort 1.2a adoption, but this choice won't be decided by AMD or Nvidia users; it will be decided by Intel, since Broadwell supports 1.2a. I suspect Nvidia will have to adopt some form of open 1.2a support for its mobile GPUs, since I doubt the laptop market will be as willing to pay a G-SYNC premium. Whether G-SYNC survives long term as a premium sync alternative with unique features for Nvidia users remains to be seen.

Could you list where Broadwell supports DP1.2a?

Not even Skylake lists DP 1.2a on its diagrams.
http://www.hardwareluxx.de/images/stories/newsbilder/mniedersteberg/News_2014/skylake_PCHIO.JPG
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Intel is nearly irrelevant in this conversation. The benefit of these sync technologies is for gaming. Nobody does serious gaming with igpu.

When the market for gaming is dominated by nvidia I don't see how your scenario will work out.

Like I said, best to wait it out right now.

It doesn't matter who dominates the market. The manufacturers will decide what wins. It costs LG or BenQ or Acer or Samsung nothing to make a VRR monitor. All the major scalers already have the ability: the same scalers they are already using and will continue to use. That is why ALL of Samsung's 4K monitors going forward will have Freesync compatibility. LG doesn't make gaming monitors, yet they had monitors ready on launch day.

If they want Gsync, they would have to go buy nvidia's scaler. Samsung, LG and Dell are never going to do that. A large selling point of their monitors is the software, features and calibration settings; by going with Gsync they would be giving all of that up.

If Freesync didn't exist, you wouldn't have had VRR from LG or Samsung, but since it works with what they already have on their monitors, it was a no-brainer.

An example is Asus, which released a monitor with VRR that works with Freesync. It isn't a gaming monitor; it just happens to have a compatible scaler. That's how easy it is.

Nvidia has Asus, Acer and BenQ. Freesync has Asus, Acer, BenQ, Samsung, LG, Viewsonic and Nixeus. I am pretty sure Dell and HP will release monitors as well. There is going to be a much larger variety of Freesync monitors than Gsync.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
It doesn't matter who dominates the market. The manufacturers will decide what wins. It costs LG or BenQ or Acer or Samsung nothing to make a VRR monitor. All the major scalers already have the ability: the same scalers they are already using and will continue to use. That is why ALL of Samsung's 4K monitors going forward will have Freesync compatibility. LG doesn't make gaming monitors, yet they had monitors ready on launch day.

If they want Gsync, they would have to go buy nvidia's scaler. Samsung, LG and Dell are never going to do that. A large selling point of their monitors is the software, features and calibration settings; by going with Gsync they would be giving all of that up.

If Freesync didn't exist, you wouldn't have had VRR from LG or Samsung, but since it works with what they already have on their monitors, it was a no-brainer.

An example is Asus, which released a monitor with VRR that works with Freesync. It isn't a gaming monitor; it just happens to have a compatible scaler. That's how easy it is.

Nvidia has Asus, Acer and BenQ. Freesync has Asus, Acer, BenQ, Samsung, LG, Viewsonic and Nixeus. I am pretty sure Dell and HP will release monitors as well. There is going to be a much larger variety of Freesync monitors than Gsync.


It'll matter if Gsync displays are in demand. That's the point: the market will decide, not necessarily any display manufacturer. If the market starts to demand Gsync, then they will produce it. If not, then whatever. Still, you are locked into one brand of GPU either way for now.
 
Last edited:

RoarTiger

Member
Mar 30, 2013
67
33
91
Could you list where Broadwell supports DP1.2a?

Not even Skylake list DP 1.2a on its diagrams.
http://www.hardwareluxx.de/images/stories/newsbilder/mniedersteberg/News_2014/skylake_PCHIO.JPG

I read that yesterday.

As you can see from my post, I was referencing the mobile lineup. VESA Adaptive-Sync has been part of the eDP standard for years.

From the VESA 1.2a announcement last year
"Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."


This is how PCPer could magically access the G-SYNC feature without an adaptor, using just a driver. Adaptive-Sync is already available in many mobile GPUs; it just needs a driver and a compliant panel.

This is why I said Freesync (hardware-free adaptive sync) would win the market. How long will Nvidia justify charging a G-SYNC hardware premium to desktop users only? I guess that depends on how many value-added features they can enable with their hardware.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
As long as AdaptiveSync displays don't provide the same quality and experience as G-Sync, the market will have room for both techniques.

It is up to AMD to provide something as good.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Read that yesterday

As you can see from my post I was referencing the mobile lineup. VESA Adaptive Sync has been in eDP standards for years.

From the VESA 1.2a announcement last year
"Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."


This is how PCPer could magically access the G-SYNC feature without an adaptor, using just a driver. Adaptive-Sync is already available in many mobile GPUs; it just needs a driver and a compliant panel.

This is why I said Freesync (hardware-free adaptive sync) would win the market. How long will Nvidia justify charging a G-SYNC hardware premium to desktop users only? I guess that depends on how many value-added features they can enable with their hardware.

I read the techpowerup article, but unfortunately the writer forgot Intel's specs for the released products.

So we can agree there is no DP 1.2a in Broadwell or Skylake? The only support they have is via eDP 1.4, meaning Freesync/Adaptive-Sync monitors are completely irrelevant for Intel products until sometime after Skylake or later. At best the first Intel product, assuming Intel throws in full support, will be late 2016 with Cannonlake. Or in other words DP 1.3, because DP 1.2a will never get support outside AMD from the looks of it.
 
Last edited:

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
As long as AdaptiveSync displays don't provide the same quality and experience as G-Sync, the market will have room for both techniques. It is up to AMD to provide something as good.

Maybe, or maybe not. Freesync is likely to evolve faster due to its wider industry support, since it's based on VESA standards, and the monitor scaler vendors will look to reduce the VRR minimum to 30 Hz.

Moreover, the 16/14nm FinFET GPUs with HBM2 are going to bring a massive performance uplift in 2016 and 2017, so it will be much easier to maintain 40+ fps at 1440p and 4K with a single GPU. Games will continue to be hungry for more powerful hardware, but if a Titan X with 7 TFLOPS, 12 GB VRAM and 336 GB/s bandwidth or an R9 390X with 8.5+ TFLOPS, 8 GB HBM1 and 512+ GB/s bandwidth are powerful 4K cards, imagine what a GP300/GP400 or an R9 490X / R9 590X with 12-15 TFLOPS, 16-32 GB HBM2 and 1 TB/s bandwidth would be capable of.

We are also likely to see 4K 120-144 Hz monitors launch to exploit the power of two of these monster GPUs. Most console games ported to PC will run extremely well at 1440p and 4K on single-GPU flagships and even mid-range cards (USD 300-400), and with the VRR minimum improved to 30 Hz, monitor performance below 30 fps will be a moot point.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
The minimum rate for G-Sync/Freesync really needs to be 24Hz or lower. 30 is just barely acceptable gaming-wise. For AMD's APUs, for example, it needs to be significantly lower.
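One way a driver could live with a floor like 40Hz without new panels is to repeat frames below it: at 25fps, scanning each frame out twice keeps the panel at 50Hz, inside its range. A minimal sketch of that idea (the 40-144Hz window and the logic are my own illustration, not AMD's or Nvidia's actual implementation):

```python
import math


def refresh_for(fps, vrr_min=40, vrr_max=144):
    """Map a game frame rate to a panel refresh rate.

    Below the panel's VRR floor, repeat each frame an integer number
    of times so the panel still operates inside its supported window.
    Returns (refresh_hz, times_each_frame_is_shown).
    """
    if fps >= vrr_min:
        return min(fps, vrr_max), 1
    multiplier = math.ceil(vrr_min / fps)
    return fps * multiplier, multiplier


print(refresh_for(60))  # inside the window: panel just matches the fps
print(refresh_for(25))  # each frame shown twice -> panel runs at 50 Hz
print(refresh_for(13))  # each frame shown four times -> panel runs at 52 Hz
```

The trade-off is that frame repetition only smooths pacing; it can't restore the motion clarity lost at genuinely low frame rates, which is why a lower native floor would still matter.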
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
As long as AdaptiveSync displays don't provide the same quality and experience as G-Sync, the market will have room for both techniques.

It is up to AMD to provide something as good.

AMD-branded monitors? They could partner up with someone, I guess, to make it happen.

The downside is it could be 95% perfect and we'd still be reading countless posts/reviews about the 5% imperfections. Nitpicked to death, I'd imagine.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
As long as AdaptiveSync displays don't provide the same quality and experience as G-Sync, the market will have room for both techniques.

It is up to AMD to provide something as good.
G-sync wasn't exactly perfect when it came out and still isn't. Adaptive sync will evolve and get better too.

The minimum rate for G-Sync/Freesync really needs to be 24Hz or lower. 30 is just barely acceptable gaming-wise. For AMD's APUs, for example, it needs to be significantly lower.
It's panel technology that needs to change, not G-Sync or Freesync; both can already go lower.

AMD branded monitors? They could partner up with someone I guess to make it happen.

Downside is it could be 95% perfect and we'd be reading countless postings/reviews about the 5% imperfections. Nitpicked to death I'd imagine.

Lol, very true
 