Well then, let's focus on Crysis 3, the game the OP specifically mentioned. Here's Crysis 3 at Medium, and here it is at Very High. Now there's obviously better IQ in the second screenshot in terms of LOD and ambient occlusion, no question. But I have a feeling that if you had two similar-sized monitors side by side, one running Medium at native 4K and the other running Very High at native 1080P, the vast majority of people would identify the Medium one as having better image quality. Resolution matters. Would you suggest someone gets better IQ playing at 800x600 with max details than at 1600x1200 with medium details?
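To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (the resolutions are just the standard ones, not measured from the screenshots):

```python
# Pixel counts for the resolutions mentioned above.
resolutions = {
    "800x600":   (800, 600),
    "1600x1200": (1600, 1200),
    "1080P":     (1920, 1080),
    "4K (UHD)":  (3840, 2160),
}

for name, (width, height) in resolutions.items():
    print(f"{name:>9}: {width * height:>9,} pixels")

# 4K renders exactly 4x the pixels of 1080P (8,294,400 vs 2,073,600),
# which is where the extra crispness comes from.
```

Incidentally, 1600x1200 is also exactly 4x the pixels of 800x600, so it's the same jump in both comparisons.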
I prefer gaming at 1080P with max details myself, because I do like the ambiance evoked by things like ambient occlusion, light shafts, soft shadows and shader effects; I'd personally opt for ultra water quality over just about any other graphical improvement because I'm a sucker for realistic water effects. But the reality is that the crispness offered by 4K, even without all the bells and whistles, is generally more visually striking than the subtle improvements gained through special effects. If you don't have a 4K monitor, it's hard to demonstrate; it's like when they advertised Blu-ray on old DVDs: they literally couldn't show you the resolution Blu-ray actually ran at, so they had to simulate it. It's basically like saying "ok, now imagine everything you're looking at is twice as sharp." But when you get a chance to see it in person, it's really striking.