[techreport] BenQ's XL2730Z 'FreeSync' monitor reviewed


Abwx

Lifer
Apr 2, 2011
11,172
3,869
136
AMD allows for a tear, or the method I mentioned.

Without the tear, you might get up to 7ms added to a frame if its frame time goes above 25ms on a monitor that has a minimum of 40Hz. Or at least that is what AMD says is supposed to happen.

When a new frame takes longer than 25ms to arrive, the monitor refreshes with the same frame again. It has no idea how long it will be before the next frame is ready, so it makes no attempt to refresh earlier than the 25ms mark. Given that the fastest a refresh can take is 7ms, once this happens, that frame is committed to the screen for at least 32ms, even if a new frame was ready at 26ms. This is still better than what typically happens with V-sync, but not as good as Nvidia's approach of averaging the previous two frames so it can preemptively refresh sooner than 25ms.

ocre is most definitely way off. I agreed on that. I was just trying to explain what actually happens, or is supposed to. Someone earlier was mentioning that it might be adding 25ms to the frame, rather than 7ms. This is not supposed to happen, and should be ironed out fairly easily.

What eventually allows for a tear is the screen manufacturer: if they use a panel whose minimum refresh rate corresponds to a 25ms maximum frame time, then FreeSync can do nothing about it. Just get a monitor with a lower minimum refresh rate. That's purely panel dependent, or possibly firmware dependent, but in the latter case there's little chance the manufacturer would artificially limit a panel below its capabilities.

Notice that G-Sync is not free from this issue; it's just that panels with lower minimum refresh rates were selected for the purpose. In this respect TechReport is wrong to assume that it could have the slightest advantage over FreeSync; the only thing they witnessed is that the BenQ does not have as good a panel (refresh-rate wise) as the G-Sync one used for comparison.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I haven't really dug into this, I was just thinking out loud I guess.

My post was simply based on what I remember about LEDs and how we achieve different levels of brightness.

So, brightness is perceived through the pulsing of the LED. I am no TV expert, nor a monitor expert. My thinking was that a sudden change in duration would affect brightness, which for LEDs is true. But I was assuming that the LED would be on for the duration between each refresh. Surely not everyone will have the brightness all the way up, but it makes me think of a lot of potential issues.

My post above was backwards, besides the fact that I never took the time to look at how LED pulsing for brightness is affected by a changing variable refresh rate. I also just assumed that all the LEDs go off when it is time to refresh the new frame. That might be totally wrong too.

So absolutely I could be totally wrong about how these things work. Just wrote about what I was thinking.

As for a single LED, a sudden drastic change in refresh rate would cause a very noticeable jump from bright to dark, a sudden flicker. This is absolutely true if you take any single LED and a pulse interval of 24ms that jumps to 7ms periodically.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
What eventually allows for a tear is the screen manufacturer: if they use a panel whose minimum refresh rate corresponds to a 25ms maximum frame time, then FreeSync can do nothing about it. Just get a monitor with a lower minimum refresh rate. That's purely panel dependent, or possibly firmware dependent, but in the latter case there's little chance the manufacturer would artificially limit a panel below its capabilities.

Notice that G-Sync is not free from this issue; it's just that panels with lower minimum refresh rates were selected for the purpose. In this respect TechReport is wrong to assume that it could have the slightest advantage over FreeSync; the only thing they witnessed is that the BenQ does not have as good a panel (refresh-rate wise) as the G-Sync one used for comparison.

Not according to the article.
http://techreport.com/review/28073/benq-xl2730z-freesync-monitor-reviewed/3
I asked AMD's David Glen, one of the engineers behind FreeSync, about how AMD's variable-refresh scheme handles this same sort of low-FPS scenario. The basic behavior is similar to G-Sync's. If the wait for a new frame exceeds the display's tolerance, Glen said, "we show the frame again, and show it at the max rate the monitor supports." Once the screen has been painted, which presumably takes less than 6.94 ms on a 144Hz display, the monitor should be ready to accept a new frame at any time.

What FreeSync apparently lacks is G-Sync's added timing logic to avoid collisions. However, FreeSync is capable of operating with vsync disabled outside of the display's refresh range. In the event of a collision with a required refresh, Glen pointed out, FreeSync can optionally swap to a new frame in the middle of that refresh. So FreeSync is not without its own unique means of dealing with collisions. Then again, the penalty for a collision with vsync enabled should be pretty minor. (My sense is that FreeSync should just paint the screen again with the new frame as soon as the current refresh ends.)
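If I'm reading that right, the vsync-on worst case is pretty small. Here's a rough back-of-the-envelope sketch (my own numbers, assuming a 40-144Hz range, so a ~25ms maximum wait and a ~7ms repaint; not anything from AMD's spec):

Code:
# Rough sketch of the vsync-on collision case described above.
# Assumptions (mine): 40-144Hz FreeSync range, so the monitor waits at most
# ~25ms for a new frame and a repaint takes ~6.94ms at the 144Hz max rate.

MAX_WAIT_MS = 1000 / 40     # ~25ms: longest the panel will hold the current frame
SCANOUT_MS = 1000 / 144     # ~6.94ms: time to repaint the screen at the max rate

def added_delay_ms(frame_ready_ms):
    """Extra wait a new frame suffers because a forced re-refresh started first."""
    if frame_ready_ms <= MAX_WAIT_MS:
        return 0.0                                    # arrived in time, no collision
    forced_refresh_end = MAX_WAIT_MS + SCANOUT_MS     # ~32ms after the last frame
    # with vsync on, the new frame can only go out once that repaint finishes
    return max(0.0, forced_refresh_end - frame_ready_ms)

for t in (24, 26, 30, 33):
    print(f"frame ready at {t}ms -> delayed by an extra {added_delay_ms(t):.1f}ms")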
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I haven't really dug into this, I was just thinking out loud I guess.

My post was simply based on what I remember about LEDs and how we achieve different levels of brightness.

So, brightness is perceived through the pulsing of the LED. I am no TV expert, nor a monitor expert. My thinking was that a sudden change in duration would affect brightness, which for LEDs is true. But I was assuming that the LED would be on for the duration between each refresh. Surely not everyone will have the brightness all the way up, but it makes me think of a lot of potential issues.

My post above was backwards, besides the fact that I never took the time to look at how LED pulsing for brightness is affected by a changing variable refresh rate. I also just assumed that all the LEDs go off when it is time to refresh the new frame. That might be totally wrong too.

So absolutely I could be totally wrong about how these things work. Just wrote about what I was thinking.

As for a single LED, a sudden drastic change in refresh rate would cause a very noticeable jump from bright to dark, a sudden flicker. This is absolutely true if you take any single LED and a pulse interval of 24ms that jumps to 7ms periodically.

That is only true on CRTs and in ULMB mode. With a pulsing backlight, faster refreshes result in brighter images, but in normal LED modes they are solid state and the brightness level never changes. When a frame is displayed for longer than about 33ms, though, the color can degrade.
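To make that distinction concrete, here's a toy model (my own sketch, not pulled from any panel's spec): with a strobed backlight the flash rides on the refresh, so average light output changes with the refresh interval; with an always-on, PWM-dimmed backlight it doesn't.

Code:
# Toy brightness model (illustrative only, not any actual monitor's firmware).
# Strobed mode (ULMB-style): one fixed-length flash per refresh, so average
# light output depends on how often the panel refreshes.
# Steady mode: a PWM duty cycle sets brightness regardless of the refresh interval.

def strobed_brightness(refresh_interval_ms, pulse_ms=2.0):
    return pulse_ms / refresh_interval_ms   # fraction of time the backlight is lit

def steady_brightness(duty_cycle=0.6):
    return duty_cycle                        # independent of the refresh interval

for interval in (7.0, 16.7, 25.0, 33.0):
    print(f"{interval:>5}ms refresh: strobed={strobed_brightness(interval):.2f}, "
          f"steady={steady_brightness():.2f}")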
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I don't know much about how monitors and TVs handle it or how the pulsing works with the refresh rate, but I am fairly certain that the brightness is a result of how fast the LEDs pulse. For full brightness, they will be on for their maximum duration, and at half brightness they will follow an on-off, on-off pattern. This pulsing will be evenly spaced out. That is how it should work.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I don't know much about how monitors and TVs handle it or how the pulsing works with the refresh rate, but I am fairly certain that the brightness is a result of how fast the LEDs pulse. For full brightness, they will be on for their maximum duration, and at half brightness they will follow an on-off, on-off pattern. This pulsing will be evenly spaced out. That is how it should work.

Normal LED/LCD panels don't pulse. That is the ULMB/LightBoost mode that sometimes exists, but it is not compatible with FreeSync or G-Sync. Now, the brightness levels may be a result of the pixels pulsing, but that is unaffected by the refresh rate, so it does not matter whether the Hz is higher or lower.
 
Last edited:

Leadbox

Senior member
Oct 25, 2010
744
63
91
How about instead of listening to what AMD has to say, you look at the screen of test equipment hooked up to an actual monitor? Mute the video if you think the reviewers are biased; the correlation between framerate and refresh interval speaks for itself: https://youtu.be/VkrJU5d2RfA?t=1410

I was not questioning the intervals and their correlation to framerate; I was merely pointing out the transition in and out of VRR and the effect it might have on how long a frame is displayed. What would be more useful is if PCPer set a sweep for 40 fps and below; if the difference is as stark as they claim it is, it should show in a side-by-side comparison. They don't have enough street cred with me for me to take them at their word. Here is a video of both solutions at ~30fps, FreeSync with vsync off.
 

SoulWager

Member
Jan 23, 2013
155
0
71
I was not questioning the intervals and their correlation to framerate; I was merely pointing out the transition in and out of VRR and the effect it might have on how long a frame is displayed. What would be more useful is if PCPer set a sweep for 40 fps and below; if the difference is as stark as they claim it is, it should show in a side-by-side comparison. They don't have enough street cred with me for me to take them at their word. Here is a video of both solutions at ~30fps, FreeSync with vsync off.
The tearing is pretty obvious in that video, but right now I'm curious about the AMD guy on the beyond3d forums. I looked through the posts in that freesync thread since march, and didn't see any new information about how freesync is implemented, just the blatantly obvious "It should be able to do this" posts. Which user is from AMD, and is that user involved in freesync development?
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Is pcper really that biased? It's not like their testing/results were patently false, it's not catastrophic for Shrout to prefer NVIDIA stuff in contemporary times.

Go read some of their articles, or follow some youtube videos. Apply critical thinking skills and watch how they portray things regarding AMD/NV. They are blatantly biased.

The final straw was pretending to develop their own software to test the smoothness of frame delivery. Well, later on it was revealed that they were the site publishing results from NV's internally developed FCAT tools, which showed NV in a good light. XDMA comes out and is better than Kepler, and FCAT basically dies out. Now that Maxwell has come out and XDMA CrossFire is as good or even better, the FCAT reviews have dried up or are minimal. Sure, they have taken a couple of games months after the Maxwell release and run them through FCAT (which basically confirmed that Maxwell was mediocre or lacking in SLI).

Did any of you who are claiming bias read the article? They are very favorable towards AMD's FreeSync. They said they think Nvidia will eventually be forced to use it, and while they did acknowledge Nvidia's slightly better features, they didn't think you could see the difference except in rare cases, if you are looking for it.

Just how was this article unfair towards AMD's new product?

It's hard not to be "favorable" when basically all reviews say that freesync is good. They just make sure to make mountains out of molehills from NV's marketing to ensure NV has something of value in the $150+ premium. NV basically reiterates the points NVPer mentions coincidentally.

I'm not claiming there aren't any issues, and the issue being discussed may be more of a panel flaw than FreeSync itself; the point is that PCPer is blatantly biased and/or sponsored by NV. They pick apart and nitpick AMD; it's almost as if they work with NV marketing directly (calling their "friends" at NV while on camera to discuss issues that will be fixed quickly so they can gloss over flaws).

I hesitate to call a site "biased", especially if I just question their results once or twice, but PCPer is consistent in that regard.

tldr;
Take off the green glasses and watch for that bias. PCPer is very biased.
 
Last edited:

SoulWager

Member
Jan 23, 2013
155
0
71
Go read some of their articles, or follow some youtube videos. Apply critical thinking skills and watch how they portray things regarding AMD/NV. They are blatantly biased.

The final straw was pretending to develop their own software to test the smoothness of frame delivery. Well, later on it was revealed that they were the site publishing results from NV's internally developed FCAT tools, which showed NV in a good light. XDMA comes out and is better than Kepler, and FCAT basically dies out. Now that Maxwell has come out and XDMA CrossFire is as good or even better, the FCAT reviews have dried up or are minimal. Sure, they have taken a couple of games months after the Maxwell release and run them through FCAT (which basically confirmed that Maxwell was mediocre or lacking in SLI).



It's hard not to be "favorable" when basically all reviews say that freesync is good. They just make sure to make mountains out of molehills from NV's marketing to ensure NV has something of value in the $150+ premium. NV basically reiterates the points NVPer mentions coincidentally.

I'm not claiming there aren't any issues, and the issue being discussed may be more of a panel flaw than FreeSync itself; the point is that PCPer is blatantly biased and/or sponsored by NV. They pick apart and nitpick AMD; it's almost as if they work with NV marketing directly (calling their "friends" at NV while on camera to discuss issues that will be fixed quickly so they can gloss over flaws).

I hesitate to call a site "biased", especially if I just question their results once or twice, but PCPer is consistent in that regard.

tldr;
Take off the green glasses and watch for that bias. PCPer is very biased.
What are you on about? PCPer still uses FCAT for their video card reviews. People were talking a lot about frame pacing because there was a big difference between AMD and Nvidia. Now there isn't, so why would people spend a lot of time talking about it?

It doesn't matter where the tools come from as long as they work as advertised. Nvidia had something to gain from helping shine a light on frame pacing, but it's not like adding a colored bar to the left side of each frame somehow made AMD perform worse.

You need to wake up to the fact that AMD is responsible if their own products perform poorly. Pointing out the flaws in a product is a reviewer's JOB; it's not an indication of bias. A bias would be noticing a flaw in a product and not reporting on it because of who the manufacturer is.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
PCPer's testing demonstrates a 25ms lockout

Until more reviews come out you can't take that test and come to a conclusion like that. If we see a trend, if we see similar results from more sources, then you can say yeah NV's solution is better. Meanwhile relax a bit.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Until more reviews come out you can't take that test and come to a conclusion like that. If we see a trend, if we see similar results from more sources, then you can say yeah NV's solution is better. Meanwhile relax a bit.

Actually, I can come to a conclusion like that. I can make that assessment confidently because I understand how an oscilloscope works, and what the waveform on the scope should look like. If it were working with a 7ms lockout, you'd see a big peak (25ms) followed by a short peak (7ms) when running at 33-35fps. As you go below 33, that short peak would slowly get bigger until you get to 20fps, where it's as long as the first peak. If FreeSync were working like this, there would be zero tearing or judder in a 30fps locked scenario.
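For reference, here is roughly the interval pattern I'd expect the scope to show if it worked that way (my own sketch, assuming a 25ms maximum wait and a 7ms minimum refresh interval, not measured data):

Code:
# Expected refresh-interval pairs per game frame, assuming ideal behavior:
# the panel repeats the old frame after a 25ms wait, then can show the new
# frame as soon as it arrives, but no sooner than 7ms after that repaint.
# Assumed figures: 40Hz floor, 144Hz ceiling.

MAX_WAIT_MS = 25.0       # forced repeat of the previous frame
MIN_INTERVAL_MS = 7.0    # fastest possible follow-up refresh

def expected_intervals(fps):
    frame_time = 1000.0 / fps
    forced = MAX_WAIT_MS                                          # first (big) peak
    follow_up = max(frame_time - MAX_WAIT_MS, MIN_INTERVAL_MS)    # second peak
    return forced, follow_up

for fps in (35, 33, 30, 25, 20):
    big, small = expected_intervals(fps)
    print(f"{fps}fps: {big:.0f}ms peak, then {small:.1f}ms peak")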
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's hard not to be "favorable" when basically all reviews say that freesync is good. They just make sure to make mountains out of molehills from NV's marketing to ensure NV has something of value in the $150+ premium. NV basically reiterates the points NVPer mentions coincidentally.

I'm not claiming there aren't any issues, and the issue being discussed may be more of a panel flaw than FreeSync itself; the point is that PCPer is blatantly biased and/or sponsored by NV. They pick apart and nitpick AMD; it's almost as if they work with NV marketing directly (calling their "friends" at NV while on camera to discuss issues that will be fixed quickly so they can gloss over flaws).

I hesitate to call a site "biased", especially if I just question their results once or twice, but PCPer is consistent in that regard.

tldr;
Take off the green glasses and watch for that bias. PCPer is very biased.

It sounds to me like you just don't want to see anything but the positive. They sure made it sound like a great monitor, even while uncovering its minor flaws. They seem to be a lot more technical than most sites, which is why they do find the minor flaws more often. That is not bias; that is just them doing a thorough job.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
What are you on about? PCPer still uses FCAT for their video card reviews. People were talking a lot about frame pacing because there was a big difference between AMD and Nvidia. Now there isn't, so why would people spend a lot of time talking about it?

It doesn't matter where the tools come from as long as they work as advertised. Nvidia had something to gain from helping shine a light on frame pacing, but it's not like adding a colored bar to the left side of each frame somehow made AMD perform worse.

You need to wake up to the fact that AMD is responsible if their own products perform poorly. Pointing out the flaws in a product is a reviewer's JOB; it's not an indication of bias. A bias would be noticing a flaw in a product and not reporting on it because of who the manufacturer is.

How do we know that when they barely cover anything with FCAT anymore? It does matter where the tool comes from, when you present it as your own (initially) on a supposedly neutral review site. Of course NV had everything to gain, why did they go to PCPer with it? Maybe they have pretty good ties? Rhetorical questions for thought.

Of course AMD is responsible for their products, and credit can be given where due (good or bad).

Anyways, I'm done here as long as you miss the point, just read their reviews and see for yourself. It's not about a single review, it's a pattern with them.


It sounds to me like you just don't want to see anything but the positive. They sure made it sound like a great monitor, even while uncovering its minor flaws. They seem to be a lot more technical than most sites, which is why they do find the minor flaws more often. That is not bias; that is just them doing a thorough job.

Nope, you missed the point. Just read their reviews critically and see whether they tend to be biased for yourself.

It's not about the monitor, nor a single review. :\
 
Last edited:

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Normal LED/LCD panels don't pulse. That is the ULMB/LightBoost mode that sometimes exists, but it is not compatible with FreeSync or G-Sync. Now, the brightness levels may be a result of the pixels pulsing, but that is unaffected by the refresh rate, so it does not matter whether the Hz is higher or lower.

Actually almost all LED panels pulse, just very fast. It's how brightness is controlled. It's called Pulse Width Modulation. You can read about it here: http://www.lutron.com/TechnicalDocumentLibrary/048360a_PWM_vs_CCR_LED_App_Note.pdf
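For a sense of scale (my own illustrative numbers, not taken from the app note): backlight PWM typically runs at several hundred Hz or faster, so even the shortest refresh interval spans many complete dimming cycles, and the average brightness the eye integrates doesn't move when the refresh interval changes.

Code:
# Illustrative scale check (assumed 1kHz PWM, not a measured figure for this monitor):
# many full PWM cycles fit inside any single refresh interval, so varying the
# interval between ~7ms and ~25ms leaves the average duty cycle unchanged.

PWM_FREQ_HZ = 1000
for refresh_ms in (6.94, 16.7, 25.0):
    cycles = PWM_FREQ_HZ * refresh_ms / 1000.0
    print(f"{refresh_ms:>5}ms refresh interval spans ~{cycles:.0f} PWM cycles")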

Edit:
I guess I should have read your entire post
 
Last edited:

SoulWager

Member
Jan 23, 2013
155
0
71
How do we know that when they barely cover anything with FCAT anymore? It does matter where the tool comes from, when you present it as your own (initially) on a supposedly neutral review site. Of course NV had everything to gain, why did they go to PCPer with it? Maybe they have pretty good ties? Rhetorical questions for thought.




Nope, you missed the point. Just read their reviews critically and see whether they tend to be biased for yourself.

It's not about the monitor, nor a single review. :\
I'd like you to come up with an example of a biased conclusion, i.e., a situation where AMD had a clearly better product and PCPer still pushed Nvidia. If you can't do that, then maybe you should question your assumptions.

PCPer did the exact same low framerate investigation with the g-sync diy kit over a year ago, and brought up suboptimal behavior below 30fps. Nvidia admitted that it was a problem, and fixed it for monitors shipping with g-sync. http://www.pcper.com/reviews/Graphi...lation-and-Performance-Review/Impressions#bug

But no, OBVIOUSLY pcper only tested fallback cases because they wanted to nitpick AMD.

PS: Ryan Shrout started doing the frame capture stuff before Nvidia made their color injection tool, and I think they currently use RivaTuner instead of anything made by Nvidia.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Wait just a minute here. 25ms + 7ms = 32ms. 30fps = 33ms. The FreeSync monitor should be displaying new frames exactly on time here, just like the G-Sync monitor. What gives?

I saw this posting: https://forum.beyond3d.com/posts/1837324/
So I downloaded the video and played it at 25% speed: obviously the driver doesn't behave the way AMD claimed it would.

You can even see at 60 FPS that G-Sync is smoother.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Some people just want to close their eyes, but whether or not you think the whole FCAT thing was a conspiracy, you cannot deny that it resulted in AMD fixing a big problem, and AMD users now have a much better CF experience as a result.

Them bringing out the FreeSync flaws is also likely going to result in a better FreeSync experience. The situations at the edges of the monitor's Hz range can easily be improved with some prediction in the software, and now that PCPer has brought up these minor flaws, they can be addressed.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Some people just want to close their eyes, but whether or not you think the whole FCAT thing was a conspiracy, you cannot deny that it resulted in AMD fixing a big problem, and AMD users now have a much better CF experience as a result.

Them bringing out the FreeSync flaws is also likely going to result in a better FreeSync experience. The situations at the edges of the monitor's Hz range can easily be improved with some prediction in the software, and now that PCPer has brought up these minor flaws, they can be addressed.

I fully agree with that. Who's closing their eyes? Questioning a site's bias has nothing to do with it.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I fully agree with that. Who's closing their eyes? Questioning a site's bias has nothing to do with it.

You aren't happy with them, because they detailed faults. How else can I take that? If they didn't show the faults, that would be closing their eyes.
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
5,622
8,847
136
Does anyone know where pcper's oscilloscope + screen camera setup came from and why it wasn't used previously (or was it)?
 

Abwx

Lifer
Apr 2, 2011
11,172
3,869
136
Some people just want to close their eyes, but whether or not you think the whole FCAT thing was a conspiracy, you cannot deny that it resulted in AMD fixing a big problem, and AMD users now have a much better CF experience as a result.

And its suppression resulted in Nvidia not needing to fix the same issues; why bother, when they can be downplayed at will...

Actually this had much more to do with viral marketing than anything else; Hawaii was just too good to be fought on a price/perf comparison alone.


As for the article below, that's total BS; it looks like even the graphs were supplied by Nvidia, if not the complete article. Besides, where are the oscilloscope measurements of G-Sync?

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Are we supposed to take this reviewer at face value, that is, measurements for AMD and plain marketing for Nvidia? Hey, they can't lie about the possibilities of their implementations...

Actually he's measuring nothing, just that the panel is limited to a 25ms frame window and the corresponding management by FreeSync.

Indeed, statements like this one marketing G-Sync say it all:

But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module.
As if there were no possible buffering by the GPU; how does it proceed when refreshing with the existing picture?

He's actually downplaying one of AMD's advantages, which is that the driver has full control of the screen, while with Nvidia's "solution" the screen can be outside the driver's control. To summarize, any update to G-Sync management will require an updated module, with previous monitors definitely missing out on future improvements.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
He's actually downplaying one of AMD's advantages, which is that the driver has full control of the screen, while with Nvidia's "solution" the screen can be outside the driver's control. To summarize, any update to G-Sync management will require an updated module, with previous monitors definitely missing out on future improvements.
This is a huge advantage for AMD; any improvements will be done at the driver level. If you have a G-Sync panel, you are frozen in time, with no updates for you.
 

Hitman928

Diamond Member
Apr 15, 2012
5,622
8,847
136
This is a huge advantage for AMD; any improvements will be done at the driver level. If you have a G-Sync panel, you are frozen in time, with no updates for you.

Not sure that's necessarily true. It depends on what type of FPGA is used and other factors. Without knowing the details, it's likely you could do what are essentially firmware updates to make changes if needed.
 

Abwx

Lifer
Apr 2, 2011
11,172
3,869
136
Does anyone know where pcper's oscilloscope + screen camera setup came from and why it wasn't used previously (or was it)?

Surely from the same source as their FCAT gear...

This is a huge advantage for AMD; any improvements will be done at the driver level. If you have a G-Sync panel, you are frozen in time, with no updates for you.

Indeed. That said, their competitor is known to be averse to future-proofing its own products; I guess planned obsolescence is part of their business model.

As for FreeSync, it's obvious that, as stated by Hardware.fr, AMD is quite tolerant in handing out the FreeSync sticker even when panels are quite average; manufacturers are somewhat abusing it as a means to sell what would otherwise be slightly less expensive screens. That's the only negative conclusion we can draw from these early reviews.
 