I'm not understanding your last point. FreeSync doesn't have issues whenever the fps goes below 60; the issue seems to occur when the fps drops below the minimum rating of the monitor's panel. For example, if the panel is rated at 48-75Hz, then you have problems once the fps drops below 48. However, AMD has already stated, and many sites have confirmed, that FreeSync has an operational range of 9-240Hz. That means the limitations and compromises FreeSync has today in the 30-48Hz range are entirely down to the panel/monitor manufacturer, not FreeSync itself. To further support this point, some FreeSync monitors go as low as 30Hz, while others bottom out at 40Hz and others at 48Hz. The panel choice seems to be a major factor in how good the monitor is, regardless of whether it even supports FreeSync to begin with.
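To put the distinction in concrete terms, here's a rough sketch in Python (with made-up numbers, purely for illustration): the window that matters is whatever range the monitor itself is rated for, not the 9-240Hz the FreeSync spec can address.

FREESYNC_SPEC_RANGE = (9, 240)   # what the FreeSync spec itself can address (Hz)
PANEL_VRR_RANGE = (48, 75)       # what this particular monitor is rated for (Hz)

def fps_is_matched(fps, panel_range=PANEL_VRR_RANGE):
    # The panel can only sync 1:1 when the frame rate sits inside its rated window.
    low, high = panel_range
    return low <= fps <= high

for fps in (70, 60, 45, 30):
    print(fps, "fps ->", "matched by panel" if fps_is_matched(fps) else "outside panel range")

With a 48-75Hz panel, 45fps and 30fps already fall outside the window even though the spec itself could go far lower.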
Thus far, there are no FreeSync monitors with high quality panels in this first batch, yet that has been used to criticize FreeSync technology itself. It's important to separate those two things, because panel quality and the sync technology are independent of each other.
Part of the issue, though, even if the panels can go as low as 30Hz, is what it actually looks like at those low refresh rates. Go too low and you definitely get flicker.
That's the one-up G-Sync has over AdaptiveSync and FreeSync: once the frame rate drops below the panel's low limit, it starts doubling or even tripling the refresh, redrawing each frame two or three times to bring the refresh rate back into the panel's normal range. That avoids the flicker you'd get from refreshing too slowly.
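Here's a rough sketch of that idea in Python, using a hypothetical 30-144Hz panel (my numbers, not NVIDIA's actual implementation): when the frame rate falls below the panel minimum, redraw each frame enough times to push the refresh rate back inside the supported window.

PANEL_MIN, PANEL_MAX = 30, 144   # hypothetical panel limits (Hz)

def refresh_for(fps):
    # Inside the window: the refresh rate simply tracks the frame rate.
    if fps >= PANEL_MIN:
        return fps, 1
    # Below the window: show each frame 2x, 3x, ... until the refresh is back in range.
    multiplier = 2
    while fps * multiplier < PANEL_MIN:
        multiplier += 1
    return fps * multiplier, multiplier

for fps in (60, 40, 25, 12):
    refresh, mult = refresh_for(fps)
    print(f"{fps} fps -> refresh at {refresh} Hz (each frame drawn {mult}x)")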
I imagine there is a drawback to that solution as well. I haven't seen any in-depth analysis, but I'd love to know what kind of input lag, if any, is introduced when G-Sync starts doubling and tripling refreshes at low frame rates. Around 30fps, G-Sync is quite likely redrawing each frame to produce a faster refresh rate rather than running the panel at 30Hz and introducing flicker. Is the input lag comparable to V-Sync at that point, which is essentially doing the same thing in those situations?
That could be one thing AdaptiveSync is deliberately trying to mitigate: input lag. So if a panel cannot tolerate 30Hz but can do 35Hz, it will simply refresh at 35Hz and the frame rate will no longer match the refresh rate. Perhaps that's an intended trade-off so that no input lag is introduced?
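For comparison, a sketch of the behaviour I'm describing there (again just my own illustration, not anything from the spec): the refresh rate is simply clamped at the panel's floor, so below that point refresh and frame rate stop matching, but no frames are repeated to compensate.

PANEL_MIN = 35   # hypothetical panel floor (Hz)

def clamped_refresh(fps):
    # No frame repetition here: just never let the panel refresh slower than it tolerates.
    return max(fps, PANEL_MIN)

for fps in (60, 40, 30):
    status = "matched" if fps >= PANEL_MIN else "mismatch (refresh above frame rate)"
    print(f"{fps} fps -> refresh at {clamped_refresh(fps)} Hz, {status}")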
I hope AdaptiveSync can incorporate some new tricks in future revisions, because that would be the ultimate answer to the issue. Or maybe it's simply a matter of consumer education, because quite frankly, I think I'd take the AdaptiveSync route anyway, even if G-Sync might be smoother at 30fps. My whole goal in gaming is to never approach 30fps, period. I won't upgrade to 4K or beyond until I know I have, or can afford, the GPU muscle for it. Granted, I'm not far from it now with 3x1080p, which is 75% of 4K, but if I upgrade to 4K it's unlikely to be a single monitor, and regardless, the next upgrade will be at minimum an ultra widescreen at very high resolution. I'll hold off until: a) I can afford it; b) I can produce smooth visuals mostly around 60fps.
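For what it's worth on that 75% figure: 3 x 1920x1080 is 6,220,800 pixels, while 3840x2160 is 8,294,400, and 6,220,800 / 8,294,400 works out to exactly 75%.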
Both technologies would provide what I really want: smoothness when the game dips below 60fps into moments of 40-45fps, perhaps even down to 35fps for brief stretches, all while staying smooth without adding input lag. I'm the type who doesn't care that G-Sync brings the same visual smoothness to 30fps or lower, because I don't want to be anywhere near that frame rate anyway.