Just saying: seems that the minimum frequency is important to FreeSync, unless they fix the vsync/tearing at low fps through drivers.
"Dang Gryz, you sure know your stuff."
No, I don't. I know very little about monitors. I am a simple amateur.
Of course it's simple. Everything is simple.
I'll give you something to think about.
Maybe you can tell me the answer.
When you have a monitor that can do 30Hz - 144Hz, that actually means the monitor can hold a frame on screen for anywhere between 7 milliseconds and 33 milliseconds. G-Sync (and FreeSync) ensure that everything on the screen looks smooth, as long as the monitor receives a new frame to display from the GPU at least every 33 milliseconds.
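To make those numbers concrete, here is the trivial arithmetic (nothing monitor-specific, just 1000 divided by the refresh rate):

```python
# Per-frame hold times for a 30 Hz - 144 Hz variable-refresh panel.
min_hz, max_hz = 30, 144
print(f"shortest hold: {1000 / max_hz:.1f} ms")   # ~6.9 ms at 144 Hz
print(f"longest hold:  {1000 / min_hz:.1f} ms")   # ~33.3 ms at 30 Hz
```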
But what does a monitor do when, after 33 milliseconds, there is still no new frame?
There are 3 options.
1) It doesn't do anything. Result: the screen will turn white, until a new frame arrives. This gives flickering. We don't want that.
2) The monitor displays the last frame again. To do this, the monitor needs to have the last frame stored somewhere. The G-Sync module has memory with the last frame in it, so the G-Sync module can do this. FreeSync cannot.
3) The monitor depends on the GPU. If the interval between two frames is longer than 33 milliseconds, the GPU needs to resend its last frame.
Now there is one thing that many people tend to forget when talking about networks (and yes, the monitor and the PC form a network): networks never have infinite bandwidth and they never have zero delay. In our case, DP1.2a has an effective bandwidth of 17.28Gbps, which allows something like 180-190 1440p frames per second. That means that when the GPU sends a new frame, there will be ~6 milliseconds between the first bit and the last bit of the frame.
That means that if the GPU needs to send a duplicate frame, to prevent the monitor from showing white pixels, it needs to make the decision not 33ms after it finished sending its last frame, but 27 milliseconds after sending its last frame. Otherwise the monitor will not have received the full frame when it needs to be displayed.
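A rough sketch of that arithmetic, assuming 24 bits per pixel and ignoring blanking/protocol overhead (which is why it lands a little under the ~6 ms figure used above):

```python
# Back-of-the-envelope transfer time for one 1440p frame over DP 1.2a.
bits_per_frame = 2560 * 1440 * 24              # ~88.5 Mbit per frame, 24 bpp assumed
link_bps       = 17.28e9                       # effective DP 1.2a bandwidth
transfer_ms    = bits_per_frame / link_bps * 1000
resend_deadline_ms = 1000 / 30 - transfer_ms   # when the GPU must start a resend
print(f"transfer ~{transfer_ms:.1f} ms, resend decision by ~{resend_deadline_ms:.1f} ms")
```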
Did I make any mistakes so far?
Now what happens if the GPU finishes its next frame right after it started sending the duplicate frame? Does it stop sending the duplicate frame and immediately start sending the new frame? It can't, because then the monitor will not have received the full new frame before the previous frame has expired. Even if the GPU did stop sending, you'll get tearing on the monitor.
With G-Sync, this problem is easier. The monitor has a copy of the last frame. That means the monitor can decide whether to display the last frame again or not. So now let's look at what a G-Sync monitor can do. When the current frame is about to expire after 33 milliseconds, it has to make the decision to display the last frame again or not. So it has 6 milliseconds more time to make that decision!
G-Sync can do something even smarter.
When a monitor has displayed a frame for 27 milliseconds, it can look at its incoming data, and see if a new frame has started to be sent or not. If indeed a new frame is incoming, it can wait up to 6 milliseconds to receive the full new frame. And then display the new frame. The previous frame will not be displayed twice.
Now suppose a frame has been displayed for 27 ms and no new frame is incoming. The monitor can then decide to show the current frame a second time. Note, the minimum hold time for a frame on the screen is 7 milliseconds (on a 144Hz monitor). Now suppose 1 microsecond later a new frame starts coming in. It'll take 6 milliseconds to receive the full frame, and 1 ms later the screen is ready to display a new frame. Hardly any deviation from the points in time when the frames should have been displayed.
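If it helps, here is a toy sketch of that panel-side decision as I understand it. It is illustrative only: the names and constants are mine, not anything from an actual G-Sync module.

```python
# Toy model of the panel-side choice described above (not real G-Sync firmware).
MAX_HOLD_MS  = 1000 / 30                    # ~33.3 ms: longest a frame may stay on screen
TRANSFER_MS  = 6.0                          # time to receive a full frame over the link
LOOKAHEAD_MS = MAX_HOLD_MS - TRANSFER_MS    # ~27 ms: the decision point

def panel_decision(ms_displayed: float, new_frame_in_flight: bool) -> str:
    """What the panel does at a given point in the current frame's lifetime."""
    if ms_displayed < LOOKAHEAD_MS:
        return "keep showing the current frame"
    if new_frame_in_flight:
        return "wait for the incoming frame and display it (no repeat needed)"
    # G-Sync can repeat from its own buffer; FreeSync must rely on the GPU resending.
    return "repeat the last frame from the local buffer"

print(panel_decision(20.0, False))  # well before the deadline
print(panel_decision(27.0, True))   # new frame already on the wire
print(panel_decision(27.0, False))  # nothing incoming -> repeat from buffer
```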
Did I make myself clear?
A G-Sync monitor can be smoother at low frame-rates, because it can look 6 milliseconds into the future. A FreeSync monitor cannot.
But yeah, it's all simple. No reason to make things complicated. The G-Sync monitor is all bollocks. Any engineer can see that.
That IPS display garnered a lot of positive commentary because it seemed to bridge the one fundamental weakness historically associated with IPS displays: namely, response time.
So anytime I want something of higher quality I am paying a tax? Interesting concept. I thought I was just paying more for a better product.
Sweclockers have managed to get official confirmation that Asus' answer to the Acer monitor - which was widely praised - will use the exact same panel.
This reminds me so much of the "slower but smoother" campaign it isn't funny.
It's $200 more.
It's smoother @ 30Hz though.
It's a 144Hz panel though.
You might drop down to 30fps in some game someday though, you never know?
Oh, I have to get me one of those then.
Any measurable thing that nVidia is better at ends up being THE reason to purchase something.
Another way of expressing it: you buy a product and then spend a lot of time confirming how good a decision it was and how clever and smart you are.
In the end though, I'm gonna run my games faster than 30fps if I'm using a 144Hz monitor.
Exactly. I'm most interested in whether there are any latency advantages between the two techs in, say, the 80 to 144Hz range, or any other differences. From what I understand AMD's solution may prove to be better here, as the G-Sync module stores a 1-frame buffer (is this correct?).
When blurbusters tested G-Sync for latency, it was similar to vsync off as long as fps stayed under 144. If the framerate gets higher, it's similar to vsync. With AMD you have the option to allow tearing above 144 fps, so latency should be better in that case. But in both cases it'd be better to use an in-game fps limiter to avoid tearing and latency.
I don't think it's simply human emotion. I think it's bought and paid for marketing. Why are we not hearing more about the Swift flickering at 40Hz?
"When doing work only on the GPU, it knows the state of the next frame and can take decisions based on that."
Are you suggesting that when a GPU is in the middle of rendering a frame, it can predict exactly how long it will take to finish rendering that frame?
"From what I understand AMD's solution may prove to be better here, as the gsync module stores a 1 frame buffer (is this correct?)"
The G-Sync module can display a frame at the same time it stores it in its buffer. There is no extra delay.
"So are you saying 80-144fps/Hz has equal latency between the techs? I'm aware AMD's solution allows you to disable vsync above and below, which is almost a must."
I've not seen any recent tests; blurbusters only tested the gsync kit for the asus screen. Maybe the more graceful fallback at low fps adds some latency. It seems unlikely though, and so far the gsync screens have been the fastest ones on the market when it comes to processing and pixel response.
Are you suggesting that when a GPU is in the middle of rendering a frame, it can predict exactly how long it will take to finish rendering that frame?
I find that very unlikely.
That's the value of the brand. The process is that the buyer looks for a purpose to buy the brand. Often this process happens AFTER the product is bought.
In this case, that means reading reviews and participating in discussions for a product you already have. The reason is that we also apply meaning in retrospect. Having a good brand demands that you support that process.
Waiting for R9 390X and GTX 980 Ti to hit this summer, and the whole split between Freesync/G-Sync has thrown yet another wrench into which direction I want to go.
If I'm going to be stuck with one of the two, I might err on the side of price. This market fragmentation is getting crazy.
Nvidia could support FreeSync if they wanted to, but the "problem" is that they see G-Sync as superior.