Yep, it's gonna be great. It's gonna be a pretty decent amount of time before it's feasible, though. Given current generational jumps, I'd say at least 6-8 generations before midrange hardware can do it at 60fps/4K. Right now, I think triple 7970GEs could do it with AA disabled and some details turned down. I also think AA might become a little pointless with 4K on a desktop display.
It would be doable with two flagship cards; no need for three.
Unless you want a constant 60fps at the highest settings, in which case you might need more horsepower.
I play some games at 6060x1080 (triple 1080p + bezel correction) using two 560 Ti 2GB cards in SLI. Granted, with that hardware I have to play BF3 at mostly low and some medium settings, and certainly no AA.
That works out to roughly 6.5 million pixels. Reaching 4K resolution would be like adding another 1080p monitor's worth of pixels to my setup. On my 560 Ti 2GB SLI setup, even BF3 at low settings would probably be fairly unplayable at 4K. 670s in SLI? Quite possible, perhaps at medium settings with some on high. It's very unlikely Ultra settings would produce a constant 60fps, but I'm not even sure what they're truly capable of on my Surround setup, so I can't really project an estimate for 4K.
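For anyone checking my math, here's the back-of-the-envelope arithmetic as a quick Python snippet (the resolutions are just the ones I mentioned above):

```python
# Rough pixel-count comparison: my bezel-corrected Surround setup vs. 4K.
surround = 6060 * 1080               # triple 1080p + bezel correction
uhd_4k = 3840 * 2160                 # 16:9 "4K" (UHD)
full_hd = 1920 * 1080                # a single 1080p panel

print(surround)                      # 6544800 -> roughly 6.5 million pixels
print(uhd_4k)                        # 8294400
print((uhd_4k - surround) / full_hd) # ~0.84, close to one extra 1080p panel
```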
Two 680s (an ungodly expensive setup) could probably produce a consistently smooth multiplayer experience with terrific visuals, though I'm not sure about Ultra settings.
If I got a 4K monitor, it wouldn't be anything less than a 30" model, though. I just can't imagine the rest of the computing experience at that resolution on anything smaller, let alone something like a 24" monitor (my monitors are 23").
I would definitely argue that the pixel density of a 4K display at 30" (or smaller) would render AA unnecessary. And with advances in post-process AA, any remaining AA needs could be satisfied with even the most minimal application of it. Ideally, such AA would be applied PRIOR to any UI being composited onto the image (cursors, minimaps, character names, other text, etc.).
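To put rough numbers on that pixel-density claim, here's a quick sketch (the diagonal sizes are just the examples from this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 23), 1))  # ~95.8 ppi, a 23" 1080p panel
print(round(ppi(3840, 2160, 30), 1))  # ~146.9 ppi, a 30" 4K panel
print(round(ppi(3840, 2160, 24), 1))  # ~183.6 ppi, a 24" 4K panel
```

So even the 30" 4K panel is roughly 1.5x the density of my current 23" monitors, which is where the "AA unnecessary" argument comes from.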
So with that, I don't think we're far at all from a pleasing 4K gaming experience, if it isn't already possible today. We just need the displays to start shipping, and an appreciative crowd would enjoy them. The present cost of entry to render most modern engines at that resolution, even at low or medium settings, would push away quite a few people (either due to cost or having to settle for low quality), but it could also be treated a bit like "but can it run Crysis?".
To see BF3, at Ultra, on a 4K display? I'd drool, and drool some more.
Or... have I projected a bit wrong here? I was basing everything on the pixel count of my display setup compared to a 4K pixel count. But there could be a key difference that impacts rendering quite a bit differently: field of view and aspect ratio.
Would a single 4K monitor with a 16:10/16:9 aspect ratio render vastly differently from, say, four 1080p monitors spanned horizontally?
Things like shadows, especially ambient occlusion, might have a far larger screen space to cover on a single high-resolution monitor than on a spanned setup of the same total pixel count. That is, a single AO shadow on the single monitor would need 4x the number of pixels shaded for it... would that cost the same as rendering 4x as many shadows, each covering 1/4 the pixels of the single-monitor case?
I don't have a good enough grasp of rendering requirements to know whether it would work out the same or not.