[Sweclockers] Asus MG279Q is to have the same panel as Acer XB270HU


rgallant

Golden Member
Apr 14, 2007
1,361
11
81
Seems that minimum refresh frequency is important to FreeSync, unless they fix the vsync/tearing at low fps through drivers.
Just saying, people are hung up on this low-fps case, and this is 1440p, not 4K.
Who buys a new high-end monitor to run at sub-30 fps? That's what game setting adjustments are for.
Plus the extra $200 buys a much better GPU at the next GPU upgrade, imo.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Dang Gryz, you sure know your stuff.
No, I don't. I know very little about monitors. I am a simple amateur.

But I know something about computer networks. When dealing with networks, you always have to take bandwidth and delay into account. The same applies here. You just have to use your common sense.

Too bad my post is at the bottom of the previous page. Most people will skip it. I would really like to hear if my simple reasoning has any big flaws.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Of course it's simple. Everything is simple.

I'll give you something to think about.
Maybe you can tell me the answer.

When you have a monitor that can do 30Hz - 144Hz, that actually means the monitor can hold a frame on screen for anywhere between 7 milliseconds and 33 milliseconds. G-Sync (and FreeSync) assure that everything on the screen looks smooth, as long as the monitor receives a new frame to display from the GPU at least every 33 milliseconds.

But what does the monitor do when, after 33 milliseconds, there is still no new frame?

There are 3 options.
1) It doesn't do anything. Result: the screen will turn white, until a new frame arrives. This gives flickering. We don't want that.
2) The monitor displays the last frame again. To do this, the monitor needs to have the last frame stored somewhere. The G-Sync module has memory holding the last frame, so the G-Sync module can do this. FreeSync cannot.
3) The monitor depends on the GPU. If the interval between two frames is longer than 33 milliseconds, the GPU needs to resend its last frame.

Now there is one thing that many people tend to forget when talking about networks. (And yes, the monitor and the PC form a network.) Networks never have infinite bandwidth and they never have zero delay. In our case, DP1.2a has an effective bandwidth of 17.28Gbps, which allows something like 180-190 1440p frames per second. That means that when the GPU sends a new frame, there will be ~6 milliseconds between the first bit and the last bit of the frame.

That means that if the GPU needs to send a duplicate frame, to prevent the monitor from showing white pixels, it needs to make that decision not 33 milliseconds after it finished sending its last frame, but 27 milliseconds after sending it. Otherwise the monitor will not have received the full frame by the time it needs to be displayed.

Did I make any mistakes so far?
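To make those numbers concrete, here is a rough back-of-the-envelope sketch of the arithmetic above, in Python. The link rate, the 24 bits per pixel, and the ignored blanking overhead are my own assumptions, so treat the outputs as ballpark figures rather than exact values:

```python
# Rough timing model for a 2560x1440 VRR display on DP 1.2a.
# Assumed values: 17.28 Gbps effective link rate, 24 bits per pixel,
# 30-144 Hz refresh range. Blanking/protocol overhead is ignored.

LINK_RATE_BPS  = 17.28e9            # effective DP 1.2a bandwidth
BITS_PER_FRAME = 2560 * 1440 * 24   # one uncompressed 1440p frame

max_hold_ms = 1000 / 30             # longest a frame may stay on screen (~33.3 ms)
min_hold_ms = 1000 / 144            # shortest hold time (~6.9 ms)

# Time between the first and last bit of a frame on the cable.
tx_time_ms = BITS_PER_FRAME / LINK_RATE_BPS * 1000   # ~5 ms here; ~6 ms with overhead

# To have a duplicate frame fully delivered before the 33 ms deadline,
# the GPU must decide to resend no later than roughly this long after the last frame:
resend_deadline_ms = max_hold_ms - tx_time_ms         # close to the 27 ms quoted above

print(f"hold time range : {min_hold_ms:.1f} - {max_hold_ms:.1f} ms")
print(f"frame tx time   : {tx_time_ms:.1f} ms")
print(f"resend deadline : {resend_deadline_ms:.1f} ms after the last frame was sent")
```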

Now what happens if the GPU finishes its next frame right after it started sending the duplicate frame? Does it stop sending the duplicate frame and immediately start sending the new frame? It can't, because then the monitor will not have received the full new frame before the previous frame has expired. Even if the GPU did stop sending, you'd get tearing on the monitor.

With G-Sync, this problem is easier. The monitor has a copy of the last frame. That means the monitor can decide whether to display the last frame again or not. So now let's look at what a G-Sync monitor can do. When the current frame is about to expire after 33 milliseconds, it has to make the decision to display the last frame again or not. So it has 6 milliseconds more time to make that decision!

G-Sync can do something even smarter.
When a monitor has displayed a frame for 27 milliseconds, it can look at its incoming data and see whether a new frame has started to arrive. If a new frame is indeed incoming, it can wait up to 6 milliseconds to receive the full new frame, and then display the new frame. The previous frame will not be displayed twice.

Now suppose a frame has been displayed for 27 ms and no new frame is incoming. The monitor can then decide to show the current frame a second time. Note that the minimum holding time for a frame on the screen is 7 milliseconds (on a 144Hz monitor). Now suppose 1 microsecond later a new frame starts coming in. It'll take 6 milliseconds to receive the full frame, and 1 ms later the screen is ready to display a new frame. Hardly any deviation from the points in time when frames should have been displayed.

Did I make myself clear?

A G-Sync monitor can be smoother at low frame-rates, because it can look 6 milliseconds into the future. A FreeSync monitor cannot.
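For illustration only, here is a toy sketch in Python of the decision logic being described. This is not how any real scaler firmware works (that is not public); the function names and constants are invented, and it only encodes the 30Hz floor / ~6 ms transfer-time reasoning from this post:

```python
# Toy model of the "look 6 ms into the future" idea. Purely hypothetical:
# function names and behaviour are invented for illustration.

MAX_HOLD_MS = 33.3   # frame must be refreshed at least this often (30 Hz floor)
TX_TIME_MS  = 6.0    # worst-case time for a full 1440p frame to cross the link

def buffered_monitor_step(time_on_screen_ms, new_frame_arriving):
    """A monitor with its own frame buffer (G-Sync-style) can wait until
    TX_TIME_MS before the deadline, peek at the link, and only self-refresh
    from local memory if nothing new has started arriving."""
    if time_on_screen_ms < MAX_HOLD_MS - TX_TIME_MS:
        return "keep showing the current frame"
    if new_frame_arriving:
        return "wait for the incoming frame, then display it"
    return "repeat the buffered frame from the module's own memory"

def bufferless_monitor_step(time_on_screen_ms):
    """Without a local buffer (FreeSync-style, as described above), the
    monitor cannot repeat a frame by itself; the GPU had to decide about
    TX_TIME_MS earlier whether to resend the previous frame."""
    if time_on_screen_ms < MAX_HOLD_MS:
        return "keep showing the current frame"
    return "display whatever the GPU chose to resend ~6 ms ago"

print(buffered_monitor_step(28.0, new_frame_arriving=False))
print(bufferless_monitor_step(34.0))
```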


But yeah, it's all simple. No reason to make things complicated. The G-Sync monitor is all bollocks. Any engineer can see that.

You might be right assuming the Gsync module can do all of that. In the end though, I'm gonna run my games faster than 30fps if I'm using a 144Hz monitor
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
That IPS display garnered a lot of positive commentary because it seemed to bridge the one fundamental weakness historically associated with IPS displays, namely response time.

To be honest IPS has never been slow, but very few IPS panels are really fast like the garbage TN panels.

I myself got an Asus PB278QR with a semi-glossy, PWM/flicker-free, 8-bit AHVA panel; this baby gets 1-3ms via Dual-Link DVI. AHVA is just another proprietary "IPS-like" panel type.

Review, for whoever is interested in getting a 1440p 60Hz monitor before 4K-capable GPUs come out that can drive a faster 120Hz+ panel: http://wecravegamestoo.com/forums/m...-pwm-flicker-free-8-bit-ahva.html#post1311889
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So anytime I want something of higher quality I am paying a tax? Interesting concept. I thought I was just paying more for a better product.

Did you not read the OP?

Sweclockers have managed to get official confirmation that Asus' answer to the Acer monitor - which was widely praised - will use the exact same panel.

We'll need a review to compare them but your assumption that the GSync monitor with the same panel will somehow be superior and warrant a $100-200 premium seems premature imo.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
This so reminds me of the slower but smoother campaign it isn't funny.

It's $200 more.
It's smoother @ 30Hz though.
It's a 144Hz panel though.
You might drop down to 30fps in some game someday though, you never know?
Oh, I have to get me one of those then.

Any measurable that nVidia is better on ends up being THE reason to purchase something.
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116

When doing work only on the GPU, it knows the state of the next frame and can take decisions based on that. Something the module does not know without polling the GPU.

It's the basic idea of VRR, don't let the monitor make decisions.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
This so reminds me of the slower but smoother campaign it isn't funny.

It's $200 more.
It's smoother @ 30Hz though.
It's a 144Hz panel though.
You might drop down to 30fps in some game someday though, you never know?
Oh, I have to get me one of those then.

Any measurable that nVidia is better on ends up being THE reason to purchase something.

That's the value of the brand. The process is that the buyer looks for a purpose to justify buying the brand. Often this process happens AFTER the product is bought.
In this case, that means reading reviews and participating in discussions for a product you already have. The reason is that we also apply meaning in retrospect. Having a good brand demands that you support that process.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Another way of expressing it: you buy a product and then you spend a lot of time confirming how good a decision it was and how clever and smart you are.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Another way of expressing it: you buy a product and then you spend a lot of time confirming how good a decision it was and how clever and smart you are.

I don't think it's simply human emotion. I think it's bought and paid for marketing. Why are we not hearing more about the Swift flickering at 40Hz?
 

omeds

Senior member
Dec 14, 2011
646
13
81
In the end though, I'm gonna run my games faster than 30fps if I'm using a 144Hz monitor

Exactly. I'm most interested in if there's any latency advantages between the two techs in say the 80 to 144hz range, or any other differences. From what I understand AMD's solution may prove to be better here, as the gsync module stores a 1 frame buffer (is this correct?).
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Exactly. I'm most interested in if there's any latency advantages between the two techs in say the 80 to 144hz range, or any other differences. From what I understand AMD's solution may prove to be better here, as the gsync module stores a 1 frame buffer (is this correct?).

Not sure. You can disable vsync while using FreeSync, which GSync doesn't allow. So when running at or above the refresh rate, FreeSync will add no latency.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Exactly. I'm most interested in if there's any latency advantages between the two techs in say the 80 to 144hz range, or any other differences. From what I understand AMD's solution may prove to be better here, as the gsync module stores a 1 frame buffer (is this correct?).
When blurbusters tested gsync for latency it was similar to vsync off as long as fps stayed under 144. If the framerate gets higher it's similar to vsync. With AMD you have the option to allow tearing above 144 fps, so latency should be better in that case. But in both cases it'd be better to use an in-game fps limiter to avoid both tearing and latency.
 

omeds

Senior member
Dec 14, 2011
646
13
81
So are you saying 80-144fps/hz has equal latency between the techs? I'm aware AMD's solution allows you to disable vsync above and below, which is almost a must, but tbh I think Nv will add that option shortly. I'm not too concerned that gsync offers better synchronisation below 37 fps or so on a 144hz display, but I can see the value for 4k.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
I don't think it's simply human emotion. I think it's bought and paid for marketing. Why are we not hearing more about the Swift flickering at 40Hz?

Of course it's made by marketing, e.g. treating the reviewers right and having good connections here. But it wouldn't work if there weren't some human emotion to attach that marketing to. Rational consumers would, for example, pose the question you do. But when a brand is strong, most just don't. They instead look for stories that confirm the brand.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
When doing work only on the GPU, it knows the state of the next frame and can take decisions based on that.
Are you suggesting that when a GPU is in the middle of rendering a frame, it can predict exactly how long it will take to finish rendering that frame?
I find that very unlikely.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
From what I understand AMD's solution may prove to be better here, as the gsync module stores a 1 frame buffer (is this correct?).
The G-Sync module can display a frame at the same time it stores it in its buffer. There is no extra delay.

Having the last frame in a buffer is only important when that frame is staying on screen longer than the maximum pixel hold time (e.g. 33ms when the minimum refresh rate is 30Hz). In the case of AMD, the GPU needs to resend the frame, while with G-Sync, the monitor still has that frame.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
So are you saying 80-144fps/hz has equal latency between the techs? I'm aware AMD's solution allows you to disable vsync above and below, which is almost a must
I've not seen any recent tests; blurbusters only tested the gsync kit for the asus screen. Maybe the more graceful fallback at low fps adds some latency. Seems unlikely though, and so far the gsync screens have been the fastest ones on the market when it comes to processing and pixel response.

I disagree that vsync off is a must; pretty much all games these days have fps limiters built in. If you cap fps at something like 125 fps you'll have extremely low latency and no tearing.
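A quick sketch of the arithmetic behind that suggestion; the 144Hz ceiling and 125fps cap are just the example numbers from this post:

```python
# Why a 125 fps cap keeps a 144 Hz VRR display inside its sync window.
refresh_ceiling_hz = 144
fps_cap            = 125

min_refresh_interval_ms = 1000 / refresh_ceiling_hz   # ~6.9 ms
capped_frametime_ms     = 1000 / fps_cap              # 8.0 ms

# If frames never arrive faster than the display's minimum refresh interval,
# each one can be shown as it arrives: no tearing and no vsync-style waiting.
inside_window = capped_frametime_ms >= min_refresh_interval_ms
print(f"capped frametime {capped_frametime_ms:.1f} ms vs display floor "
      f"{min_refresh_interval_ms:.1f} ms -> stays in VRR range: {inside_window}")
```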
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
Are you suggesting that when a GPU is in the middle of rendering a frame, it can predict exactly how long it will take to finish rendering that frame ?
I find that very unlikely.

Not precisely, but close enough (see frame pacing) to be ahead of the module, aka really "see into the future" like you said.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Waiting for R9 390X and GTX 980 Ti to hit this summer, and the whole split between Freesync/G-Sync has thrown yet another wrench into which direction I want to go.

If I'm going to be stuck with one of the two, I might err on the side of price. This market fragmentation is getting crazy.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That's the value of the brand. The process is that the buyer looks for a purpose to justify buying the brand. Often this process happens AFTER the product is bought.
In this case, that means reading reviews and participating in discussions for a product you already have. The reason is that we also apply meaning in retrospect. Having a good brand demands that you support that process.

It goes beyond that, like turning a blind eye to the awful price/performance of 750Ti and 960 cards, not just defending your own purchase, because you have allegiance to the brand and can't recognize/criticize less-than-stellar products that your own preferred brand makes.

Good article on this topic.

This is pretty good too, as he talks about a study done on how people with strong emotional attachment to a brand get offended/suffer in self-esteem when you criticize their preferred brand or when something positive is said about a brand they dislike.

Waiting for R9 390X and GTX 980 Ti to hit this summer, and the whole split between Freesync/G-Sync has thrown yet another wrench into which direction I want to go.

If I'm going to be stuck with one of the two, I might err on the side of price. This market fragmentation is getting crazy.

The market fragmentation has only happened because people are buying GSync monitors and NV refuses to support an open standard. AMD can never support GSync in its current form, but NV could easily support GSync + FreeSync simultaneously. Nothing is stopping NV from adding DP1.2+/1.3 to its future GPUs, which means A-Sync support out of the box. If NV thinks that GSync is superior, they would have nothing to fear. If they do not, well, no one will be buying $100-200 more expensive GSync monitors when FreeSync is as good or better. Hence why NV is not supporting FreeSync yet. NV also wants to do everything possible to lock you into their eco-system, like Apple does.

The strong supporters of GSync don't have long-term vision because 70% of the graphics card market is Intel and Intel cannot support GSync either. The cop out argument used is that "no one games on Intel GPUs" but Intel isn't going to be standing still. In the next 10 years we can expect Intel's GPU performance to increase 10-15X. As soon as Intel jumps onboard with FreeSync, then 85% of the GPU market will be supporting this standard.

Again, even existing GSync monitor owners would not have anything to fear if NV supports both standards. After all, if you bought a GSync monitor, chances are you are committed to using NV for 5-10 years.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Nvidia could support Freesync if they wanted to, but the "problem" is that they see G-sync as superior

This is just a PR statement. It's in fact the opposite. If NV thought GSync was truly superior, they would support both standards just like modern GPUs will support OpenGL/Vulkan and DX12 at the same time. NV has a financial incentive to not support FreeSync and they will exercise their market share advantage for as long as possible to not support FreeSync while collecting profits with every single GSync monitor sold. AMD gets $0 profits whenever a FreeSync monitor is sold because someone could buy a FreeSync monitor with a GeForce GPU/Intel APU. Since NV has almost 80% of discrete GPU market share, this FreeSync vs. GSync situation will drag for a long time unless Intel adopts FreeSync.

Correct me if I am wrong, but so far I don't think there is a killer 4K GSync/FreeSync monitor (i.e., an AHVA/IPS panel) for sale. 2560x1440 FreeSync/GSync monitors feel like a stop-gap solution. Therefore, despite GSync having a 1.5+ year head start, there is hardly a plethora of 4K GSync monitors worth buying, and only a handful of 2560x1440 IPS ones.

Shockingly, low resolution monitor purchases seem to keep increasing if Steam Survey is accurate. 1366x768 grew by 0.26%, while 1920x1080 grew by 0.12%. This may not be accurate but neither 2560x1440 nor 3840x2160 seems to be taking off despite much lower prices of both panel types in recent years. That's alarming regardless of GSync vs. FreeSync, and possibly a reason why monitor manufacturers are reluctant to divert their efforts towards high quality 4K GSync/FreeSync monitors.
 