Essentially all 3D PC games work like this: it is the task of the engine, running on the CPU, to estimate the time at which the next frame will be displayed. Based on that estimate, the next world-space and view-space transforms are calculated, and then the GPU takes over. However, as I said, it is just a prediction, typically derived from the frame intervals of the last few frames.
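Here's a minimal sketch of what I mean (names like `FramePredictor` and `simulateWorldTo` are made up for illustration, not from any real engine):

```cpp
// Sketch: predict the next display time from the average of recent frame
// intervals, then animate the world for that predicted time.
#include <chrono>
#include <deque>
#include <numeric>

using Clock = std::chrono::steady_clock;
using Duration = Clock::duration;

struct FramePredictor {
    std::deque<Duration> recentIntervals;  // intervals of the last few frames
    static constexpr size_t kWindow = 8;

    void recordFrame(Duration interval) {
        recentIntervals.push_back(interval);
        if (recentIntervals.size() > kWindow)
            recentIntervals.pop_front();
    }

    // Predicted display time of the NEXT frame: last display time plus the
    // mean of the recent intervals. This is only a guess about the future.
    Clock::time_point predictNextDisplay(Clock::time_point lastDisplay) const {
        if (recentIntervals.empty())
            return lastDisplay + std::chrono::milliseconds(16);  // ~60 Hz fallback
        Duration sum = std::accumulate(recentIntervals.begin(),
                                       recentIntervals.end(), Duration::zero());
        return lastDisplay + sum / static_cast<long>(recentIntervals.size());
    }
};

// Per frame (pseudocode for the engine/GPU calls):
//   auto t = predictor.predictNextDisplay(lastDisplay);
//   engine.simulateWorldTo(t);   // world/view transforms for time t
//   gpu.render();                // GPU takes over from here
//   predictor.recordFrame(actualDisplay - lastDisplay);
```

The point is that the world state is computed for a time that hasn't happened yet, so if the frame actually lands earlier or later than predicted, the animation is slightly wrong for the moment it's shown.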
You can imagine that, given how Freesync works, the frame is almost never displayed at precisely the intended time, resulting in jitter.
That's the beauty of VSync: assuming the GPU is always fast enough, the frame interval precisely matches the prediction, resulting in a jitter-free experience. I grant you that if the time it takes to render a frame exceeds the display's refresh interval, VSync jitter becomes much larger than Freesync jitter.
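A rough toy example to make the comparison concrete (the render times here are made up; this assumes a simple model where VSync shows a frame at the next refresh boundary and Freesync shows it as soon as it's done):

```cpp
// Toy comparison: prediction error per frame under VSync vs Freesync,
// assuming the engine predicted exactly one 60 Hz refresh interval.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh = 1000.0 / 60.0;   // 16.67 ms at 60 Hz
    const double predicted = refresh;       // engine's predicted interval
    // Made-up GPU render times in ms; the last one misses the refresh window.
    const double renderTimes[] = {14.1, 15.8, 13.9, 16.2, 17.3};

    for (double t : renderTimes) {
        // VSync: the frame waits for the next refresh boundary.
        double vsyncShown = std::ceil(t / refresh) * refresh;
        // Freesync: the display refreshes as soon as the frame is done.
        double freesyncShown = t;
        std::printf("render %.1f ms -> vsync err %+.2f ms, freesync err %+.2f ms\n",
                    t, vsyncShown - predicted, freesyncShown - predicted);
    }
}
```

In this toy model, every frame that finishes within the refresh window has zero error under VSync, while Freesync's error is the gap between the actual render time and the prediction; the one slow frame (17.3 ms) flips it around, costing a full refresh interval under VSync but only a fraction of a millisecond under Freesync.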
I was asking for actual examples because I've played a ton of games on Freesync and have never noticed any issues with animation or jitter. I've never even seen it mentioned anywhere besides your comment.