Do these tests exist in any real numbers?
We see tests at multiple resolutions, but why aren't tests at different graphics settings widely done?
Say, 1080p gets max settings, 1440p gets one step below that, and 2160p one step further down.
As a buyer of a card I'd be really interested in seeing what settings a game needs to hit something playable.
I've found that *sync makes games vastly more playable at low frame rates, so I could accept playing a game if 99.5% of the frames were at or above, say, 35 FPS.
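That playability bar is easy to check mechanically from frame-time logs (the kind tools like FRAPS or PresentMon produce). A minimal sketch, using made-up frame times purely for illustration:

```python
# Sketch: check whether a run of frame times meets a "99.5% of frames
# at or above 35 FPS" playability bar. Frame times below are hypothetical,
# not real benchmark data.
frame_times_ms = [22.0, 25.0, 24.0, 28.0, 26.0, 27.0, 23.0, 40.0]

threshold_fps = 35.0
max_frame_time_ms = 1000.0 / threshold_fps   # ~28.57 ms per frame

ok = sum(1 for t in frame_times_ms if t <= max_frame_time_ms)
fraction_ok = ok / len(frame_times_ms)
meets_bar = fraction_ok >= 0.995

print(f"{fraction_ok:.1%} of frames at or above {threshold_fps} FPS -> {meets_bar}")
```

Reviewers already collect this data for their 1%/0.1% low figures, so reporting a pass/fail against a threshold like this would be cheap to add.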
For example, the GTX 1080 (ironically named, because why would you use it for 1080p gaming?) is the only card I've seen that can handle 3840x2160 gaming at max settings. It rarely tops 60 FPS in a game (which is fine, as the monitors don't go above 60 Hz) but stays at 40+ in basically everything.
What settings would other cards need to achieve that performance? Can they even do it?