I also know several people who still game at 720p, despite having GPUs far more powerful than they need for that. It's pretty obvious I'm referring to what the overwhelming majority of people are using.
If I say a human has two arms, that doesn't mean I haven't taken those born with one or three arms into account. It just means I'm referring to humans in general.
I'd bet there are more GTX 1080 users playing at 1440p than at 1080p. Using 4K as an example would have made my argument even stronger (CPU usage at that point with a single GTX 1080/Titan XP is around 40%), but I chose not to include it. Nor is it relevant to include the 1080p cases. The whole point of those online benchmark tests is to reflect everyday usage, or at least to help viewers/readers decide what hardware to buy for their games. So unless Digital Foundry wants us to pair a GTX 1080 or a Titan XP with a 1080p display, their test is meaningless.
Actually, I would be surprised if you were. The "overwhelming majority of people" are certainly not hardware enthusiasts, and they barely know what's in their boxes. Plenty of them go with overpriced, overpowered devices for their needs and generally want something that takes a few minutes to plug in, power on, and be done with. I think it's more than reasonable that the actual "overwhelming majority of people" [who have a 1070 or 1080 in their system] will, by and large, still be running cheap 1080p-or-less displays by the end of this year, and will happily do so until something breaks, triggering the next inadequate/overpriced upgrade.
More than on any other part of the system, general users--even enthusiasts--cheap out on the display. It still happens to this day. It's easier for someone to look at a handful of classes of cards, default to "that green company is better," and just go with the expensive one because they assume more expensive = better... whether or not they need it.
This person is now out of money, or simply burned out from making further decisions, so when faced with hundreds and hundreds of display options with nebulous specs, they get something cheap that "sounds right" and is least problematic. This isn't controversial; it's just consumer behavior.
Added to that, the tech for 4K right now just isn't all that special, and, cheap as it already seems, savvy planners are probably thinking 4K will have an incredibly short lifespan. Just look at the way TVs are more or less "skipping" 4K tech--there's barely any content available, we're still waiting for good HDR to make 4K worth the upgrade, and 8K is already trickling out... and the hardware isn't even here to really push 4K at 60 fps. Yes, I think anyone who goes 4K today and has the hardware to drive a good 4K display will be fine with it for years, but 1440p looks like the better value proposition in GPU/display cost and performance over the same several-year period while we sit in this 4K limbo.