It was also good that we were unable to detect any input lag degradation when using G-SYNC instead of VSYNC OFF. There were many situations where G-SYNC's incredible ability to smooth out a low 45fps frame rate actually felt better than a stuttery 75fps; this is a case where G-SYNC's currently high price tag is justifiable, as Crysis 3 benefitted immensely from it.
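For what it's worth, the smoothness difference is easy to see with a little arithmetic. At 45fps on a fixed 60Hz refresh, frames land on the scanout grid in a roughly 16.7/16.7/33.3ms cadence, while G-Sync can show each frame the moment it's done at a steady ~22.2ms. A minimal sketch (Python, simplified assumed timing, not measurements):

```python
import math

def vsync_present_times(render_ms, refresh_hz, n_frames):
    """Each frame is shown at the first fixed refresh tick at or after it finishes.
    (Simplified: ignores V-Sync back-pressure on the render queue.)"""
    tick = 1000.0 / refresh_hz
    presents, finish = [], 0.0
    for _ in range(n_frames):
        finish += render_ms                          # frame finishes rendering
        presents.append(math.ceil(finish / tick - 1e-9) * tick)
    return presents

def gsync_present_times(render_ms, n_frames):
    """The panel refreshes the moment each frame is ready."""
    return [render_ms * (i + 1) for i in range(n_frames)]

def intervals(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

render_ms = 1000.0 / 45   # ~22.2 ms per frame at 45fps
print("fixed 60Hz + V-Sync:", intervals(vsync_present_times(render_ms, 60, 8)))
print("G-Sync             :", intervals(gsync_present_times(render_ms, 8)))
```

The fixed-refresh case alternates between ~16.7ms and ~33.3ms frame-to-frame, which is the judder people describe; the variable-refresh case stays at a constant ~22.2ms.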
Dang, really looking forward to a good monitor with this. Anyone want a 120hz Alienware monitor?
It looks like they are simply trying to convey that G-Sync adds almost nothing in input lag over running with V-Sync off, while still delivering the real improvements in tearing and stuttering.
I don't know if this is the right place for this question, but with 4K at something like 28", if you ran the monitor at 1080p while gaming, wouldn't it just be like having a 1080p monitor? Are the pixels small enough that several would just combine to give the same density as a regular 1080p monitor?
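For what it's worth, 3840x2160 is exactly twice 1920x1080 in each dimension, so each 1080p pixel can in principle map to a clean 2x2 block of physical pixels; whether it actually looks that crisp depends on the monitor's scaler doing integer scaling rather than blurry interpolation. A quick back-of-the-envelope sketch (Python, 28" taken from the post above):

```python
# Effective pixel density of a 28" 4K panel driven at 1080p vs its native mode.
def ppi(h_px, v_px, diagonal_in):
    return (h_px ** 2 + v_px ** 2) ** 0.5 / diagonal_in

native_4k = ppi(3840, 2160, 28)   # physical pixel density
effective = ppi(1920, 1080, 28)   # density of the 1080p image you'd actually see
print(f"native 4K: {native_4k:.0f} ppi, scaled 1080p looks like: {effective:.0f} ppi")
print("scale factor per axis:", 3840 / 1920, "x", 2160 / 1080)
```

So with clean 2:1 scaling it would look like a 28" 1080p monitor; with interpolated scaling it would look slightly softer than one.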
Would be nice to have the screen real estate on the desktop and the performance in gaming until GPUs catch up.
I think I am going 1440p so long as that Asus monitor isn't terrible.
Price, availability and the future of the technology remain the main concerns. When Nvidia released the news I was skeptical about how they would handle AMD, Intel and licensing concerns for the technology. The answer, I think, is "very badly". So badly, in fact, that AMD is creating its own implementation from scratch, one that is sort of similar but quite different. Rather than trying to work out a standard from the work Nvidia has done on this, they are completely ignoring it and moving on with their own somewhat different approach.
This remains a grave concern. I don't want to buy a G-Sync monitor (especially at the prices being talked about), use it for a year, and then find out that DP 1.3 includes the same capability by default, many monitors have it, and Nvidia either abandons G-Sync completely or, worse, keeps pushing a competing and incompatible technology. This sort of behaviour is bad for all of us.
I need a new monitor soon, but with all this new technology coming in '14 it's getting hard to choose. Probably gonna be easier to make a decision once we see the prices lol.
They are on the market... it is a matter of volume driving down the prices, hopefully. 4K seems to be dropping semi-quickly, although it isn't exactly ready for mass (price) appeal yet.
It looks like 4K is the next standard for consumer TVs - Vizio is bringing out a $999 4K TV this year. Lots of 4K TVs means cheaper 4K panels. I wouldn't be surprised if a 4K monitor was cheaper than a 1440p monitor by the end of 2015, the same way a 1920x1080 one is cheaper than a 1600x1200 one right now.
Of course a higher resolution will mean larger buffers in the G-Sync module, so the premium will go up a bit; but hopefully not too much.
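As a rough sense of scale (plain uncompressed-frame arithmetic, not Nvidia's actual module design, which isn't public here):

```python
# Size of a single 24-bit frame buffer at a few common resolutions.
BYTES_PER_PIXEL = 3  # 8-bit RGB

for name, (w, h) in {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}.items():
    mib = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{name:>5}: {mib:5.1f} MiB per frame")
```

A 4K frame is about four times the size of a 1080p one (~24 MiB vs ~6 MiB), so any per-frame buffering in the module scales up accordingly, though that alone shouldn't move the price much given how cheap commodity DRAM is.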
I don't know, but with 4K 60Hz TVs, won't that be DisplayPort only, leaving any add-on hardware with HDMI as a connection/bandwidth issue? DVD players, AVRs, DVRs and cable boxes would all be out of date.

If they use HDMI 2.0, it will support 60Hz while staying backward compatible with current HDMI versions for older gear like DVRs. So as a TV it'll likely run at 30Hz, or take a 30Hz input and convert it. On a PC, you use an HDMI 2.0 cable (or just a normal high-speed cable) and get 60Hz.
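Rough bandwidth arithmetic backs that up. Counting only raw active-pixel data (real links also carry blanking intervals and encoding overhead, so actual requirements are somewhat higher), 4K@30Hz fits comfortably within HDMI 1.4's 10.2 Gbit/s link, while 4K@60Hz needs HDMI 2.0's 18 Gbit/s or DisplayPort:

```python
# Ballpark pixel-data rates for 4K, ignoring blanking and encoding overhead.
def raw_gbps(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

for hz in (30, 60):
    print(f"4K @ {hz} Hz needs ~{raw_gbps(3840, 2160, hz):.1f} Gbit/s of pixel data")

print("HDMI 1.4 max link: 10.2 Gbit/s")
print("HDMI 2.0 max link: 18.0 Gbit/s")
```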