unseenmorbidity
Golden Member
Given how much more expensive a GSync monitor is (about 30% more), if Vega could actually reach 1080 Ti speeds it would be the only real choice for 4K gaming.
I don't expect it to reach those speeds. But I suppose it might. Then all we need is someone to come out with a 4k Freesync screen that will do more than 60 Hz.
Even 1080 speeds at 1080 prices would still be attractive, if only because, again, Freesync is so much cheaper than GSync.
4K gaming isn't really much of a "choice" so much as a set of tradeoffs at this point. 40-50fps Freesync actually looks worse than triple-buffered vsync as it still tears a bit. I don't consider Freesync/Gsync really viable tech for 4K until we have video cards that can consistently stay above 60fps 100% of the time, simply because < 60fps gameplay looks bad regardless of whether you have adaptive sync or not.
40-50fps Freesync actually looks worse than triple buffered vsync as it still tears a bit. (I had a Fury X previously).
I'm going to say your freesync wasn't working right, because it shouldn't show any tearing at all.
4K gaming isn't really much of a "choice" so much as a set of tradeoffs at this point. 40-50fps Freesync actually looks worse than triple-buffered vsync as it still tears a bit. (I had a Fury X previously).
I don't consider Freesync/Gsync really viable tech for 4K until we have video cards that can consistently stay above 60fps at 4K 100% of the time, simply because < 60fps gameplay looks bad regardless of whether you have adaptive sync or not.
That... shouldn't be happening.
Freesync is completely tear free within its range.
You might have had a problem with your setup.
And while Freesync/GSync should indeed be tear free, it will never be as smooth as VSync (assuming constant frame time), due to the inherent problem of displaying a frame at the wrong point in time which will result in jittery animation.
And while Freesync/GSync should indeed be tear free, it will never be as smooth as VSync (assuming constant frame time), due to the inherent problem of displaying a frame at the wrong point in time which will result in jittery animation.
I did some testing today of LFC. I managed to hack my Freesync 4K monitor's range from 40-60 Hz down to 33-60 Hz, and it's an amazing difference between just below the Freesync limit (tearing, choppy, bad!) and anything above it (smooth and an acceptable frame rate).
Now, most of my games are old enough I never notice anything. But it's really noticeable with the one (1) 2017 release I purchased, Total War: Warhammer. Also, that game doesn't really need high frame rates to be playable, so I could see how 40-60 Hz wouldn't be very fun if you like to play other types of games.
And while Freesync/GSync should indeed be tear free, it will never be as smooth as VSync (assuming constant frame time), due to the inherent problem of displaying a frame at the wrong point in time which will result in jittery animation.
..what?
It's displayed the very moment it's rendered. If the frametimes are constant, then the display will be constant.
I did some testing today of LFC.
When my range is 40-144, it remains tear free until 30FPS, and then there's slight tearing at 25FPS, then it gets worse the lower you go (but always better than no freesync).
When my range is 35-144, it pushes down the tear free range to 25FPS, and tears only start appearing at 20FPS.
Again, regardless of the range, Freesync + LFC is better than no freesync at all. Though it's best to remain within your native range.
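Since LFC keeps coming up: here's a minimal sketch of the idea behind it as I understand it (this is not AMD's driver logic, and the function name and numbers are just for illustration). Below the panel's minimum refresh, the driver shows each frame more than once so the effective refresh rate lands back inside the supported range, which is why a 40-144 panel can stay tear free well under 40fps.

```python
# Rough sketch of the idea behind LFC (Low Framerate Compensation).
# NOT AMD's driver code - just the arithmetic: repeat each frame until
# the effective refresh rate falls back inside the panel's range.

def lfc_refresh(fps, range_min, range_max):
    """Return (multiplier, effective refresh rate in Hz)."""
    if fps >= range_min:
        return 1, fps                    # already inside the range, no LFC needed
    multiplier = 2
    while fps * multiplier < range_min:  # repeat frames until we're back in range
        multiplier += 1
    return multiplier, min(fps * multiplier, range_max)

# Examples matching the 40-144 Hz range discussed above:
for fps in (30, 25, 20):
    m, hz = lfc_refresh(fps, 40, 144)
    print(f"{fps} fps -> show each frame {m}x -> panel runs at {hz:.0f} Hz")
# 30 fps -> show each frame 2x -> panel runs at 60 Hz
# 25 fps -> show each frame 2x -> panel runs at 50 Hz
# 20 fps -> show each frame 2x -> panel runs at 40 Hz
```

The trick only has room to work when the top of the range is at least roughly double the bottom, which is part of why the narrow 40-60 Hz ranges on current 4K Freesync screens are so limiting.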
"All signs point to the Radeon RX Vega launching sometime in May this year, meaning we've got somewhere in the region of 4-6 weeks before these are finally in our hands."
Specs-wise, we are looking at the following:
- 14 nm Vega 10 GPU
- 4096 Stream Processors
- 64 NCUs (Next Compute Units)
- 2048-bit memory bus
- 8GB HBM2
- 512GB/s Memory Bandwidth
- PCIe Gen 3 x16
- 225W TDP
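For what it's worth, those memory figures are self-consistent if you assume 2 Gbps per pin for the HBM2 (the pin speed isn't in the leak, so that part is an inference): 2048 bits × 2 Gbps ÷ 8 bits per byte = 512 GB/s.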
And while Freesync/GSync should indeed be tear free, it will never be as smooth as VSync (assuming constant frame time), due to the inherent problem of displaying a frame at the wrong point in time which will result in jittery animation.
I'm now convinced that you never had freesync working. First, it's much smoother than VSync, since there is no input lag, while VSync often adds 60ms+.
Freesync/GSync work by matching your monitor refresh rate to that of the program (game) output. So if the game renders @ 40fps the monitor will display @ 40Hz. If it renders @ 75, the monitor will refresh @ 75Hz. It will go up and down as the fps changes, always displaying "instantly".
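To illustrate the "always displaying instantly" part, here is a toy sketch with made-up frame times (not how any driver is actually implemented): with a fixed 60 Hz refresh a finished frame waits for the next scheduled scan-out, while with Freesync/GSync the scan-out starts when the frame is ready, provided the resulting interval stays inside the panel's supported range.

```python
import math

# Toy comparison only (made-up frame completion times, not a benchmark):
# when does a finished frame actually reach the screen?
frame_done_ms = [0.0, 22.0, 47.0, 66.0, 91.0]   # irregular ~40-50 fps pacing
REFRESH_MS = 1000 / 60                           # fixed 60 Hz refresh period

for t in frame_done_ms:
    vsync_shown = math.ceil(t / REFRESH_MS) * REFRESH_MS  # wait for the next 16.7 ms tick
    adaptive_shown = t                                     # scanned out as soon as it's done
    print(f"frame ready at {t:5.1f} ms | 60 Hz vsync shows it at {vsync_shown:5.1f} ms "
          f"| adaptive sync shows it at {adaptive_shown:5.1f} ms")
```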
You're describing Freesync in layman's fashion. The issue is that the game engine makes a prediction of when the next frame will be displayed. It makes this prediction before the frame is rendered, without any knowledge of how long rendering will take. When it takes longer, the frame is displayed too late, resulting in jitter. If it takes less time, Freesync will display it too early.
I really want to see some legit leaks of this thing. Given how close we are to launch, I'd have hoped to see something on Vega.
I'm sorry but can you please provide examples of games that do this?
Idk, if AMD had something that could compete with the Titanxppp, then I feel they would be shouting it from the rooftops. Nvidia's already laid their cards on the table in a preemptive strike against Vega.
If AMD run a tight ship without making too much noise, it's better than the usual marketing talk which unnecessarily hypes up the product and builds unrealistic expectations. AMD's best products generally came when nobody expected them to deliver, like the HD 4870. Imo, if AMD have a strong product they are better off keeping it a well-guarded secret and letting the press and the reviews speak for the product.
Essentially all 3D PC games work like this. It is the task of the engine, running on the CPU, to determine the time at which the next frame will be displayed. Based on this, the next world-space and view-space transforms are calculated. After this the GPU takes over. However, as I said, it is just a prediction, typically based on the frame intervals of the last few frames.
You can imagine that, given how Freesync works, the frame is almost never displayed at precisely the intended time, resulting in jitter.
That's the beauty of VSync. Assuming the GPU is always fast enough, the frame interval precisely matches the prediction, resulting in a jitter-free experience. Of course, I'll give you that in the case where the time it takes to render a frame is larger than the display refresh interval, VSync jitter is much larger than Freesync jitter.
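As a toy model of that argument (purely illustrative; the numbers and the "predict the last interval" rule are assumptions, not taken from any real engine): as long as frame times are constant the error stays at zero, which is the point made a few posts up, but every change in frame time shows up as a mismatch between the moment a frame depicts and the moment it is shown.

```python
# Toy model of the prediction argument above (illustrative only, not any real
# engine's code). The engine guesses when the frame will reach the screen
# (here: "same interval as last time"), animates the world to that moment,
# and adaptive sync then shows the frame whenever it actually finishes.

render_ms = [16, 16, 24, 12, 16, 20]   # how long each frame actually took (made up)

predicted_interval = 16.0              # engine's initial guess for the next interval
screen_time = 0.0                      # when the previous frame hit the screen

for actual in render_ms:
    depicted_time = screen_time + predicted_interval  # world time the frame shows
    screen_time += actual                              # adaptive sync: shown when done
    error = screen_time - depicted_time                # mismatch = perceived jitter
    print(f"frame depicts t={depicted_time:6.1f} ms, shown at t={screen_time:6.1f} ms, "
          f"error {error:+5.1f} ms")
    predicted_interval = actual                        # next guess: last frame's interval
```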
Download the AMD Freesync windmill demo, there's a red bar test pattern that works really well for telling if it's working. (It's running on a display on the floor of my store here and every customer that has seen it and done the on/off test has been amazed at the difference)
@tential https://community.amd.com/thread/180553
Edit: Found the link!