Is it bad that I still game on a 20" 1600x900 monitor?
I would much rather spend $350-400 on a new 27-32" 1440p monitor and get a $250 R9 390/970/Polaris 10 on a fire sale than pair a 20" 1600x900 with a $600 1080/980Ti. Besides, there are much cheaper 1440p monitors. The GPU will be outdated, but the monitor you can enjoy for 5-7+ years. Think about how long you've kept yours. Furthermore, if you do anything else on your computer (productivity, media), 1440p would be a huge upgrade from 20" 1600x900. Of course, if your budget doesn't allow you to buy a new GPU+monitor, that's different. My 1998 ViewSonic was a 19" 1600x1200. If I had to use a monitor that size now with a modern PC for games, productivity and media, I'd snap!
To each his/her own. Some guys here have an i7 6700K, will be getting a 400-500 Euro GP104, and are happy using a 200 Euro 22-24" 1080p 60Hz monitor. I would never enjoy such a system. To me, the monitor and the power supply are the two most valuable components; everything else becomes outdated junk. For example, when I upgrade to a 4K A-Sync HDR monitor, I can reuse a 32" 2560x1440 in an office/work environment. That means such a monitor can easily last 10 years. Same with the PSU. A $650 flagship card from 2015 will be low end by 2020. If anything, I think I don't spend enough on an even better monitor.
I am not into competitive online shooters. If I was, I'd probably buy a 1080p/1440p 120-165Hz monitor, not a 1080p 60Hz one. I get why so many people use 1080p/60Hz: they're on a budget. Then I see people with i7s and 980/980Tis using 1080p 144Hz with VSR/DSR. That makes sense too. Playing modern games with Vaseline-smudged FXAA on an i7 6700K and 980Ti at 1080p on a 23.5" $200 monitor? Ya, that I don't get.
I also hate small monitors. It's one of those things -- immersion factor. The bigger the screen, the more immersed I feel. If I could afford 2016 LG OLED 4K 65" for PC gaming, I would get that. Once you use 27, 32-40+" gaming monitors, it's very hard to go back to 19-24" gaming. That's why I wouldn't buy a 25-25.5" 4K monitor either.
When I ask my friends who only game on consoles what they like most about it, the answer is almost always: "I get to play on a huge 50-65" screen in the living room instead of a crappy 22" monitor chained to a desk". Immersion isn't just about resolution. It's why watching a movie in IMAX blows the doors off my Panasonic plasma.
As I already said (and it keeps being conveniently ignored), the 980Ti and Fury X are both bottlenecked at 1080p. The benchmarks at TPU prove it.
Even an i7 4790K @ 4.9Ghz still bottlenecks the Fury X at 1080p:
i7-6700K@4.6 vs i7-4790K@4.9 in 10 Games (Fury X) (*Spoiler* Haswell still bottlenecks Fury X @ 1080p)
https://www.youtube.com/watch?v=f5lfMogcrPU&app=desktop
The irony is that most people gaming at 1080p aren't using an overclocked 4790K/6700K/5820K, and yet they keep comparing Polaris 10 vs. Fury X/980Ti @ 1080p. When someone compares these flagship cards to lesser cards at 1080p, the lesser cards naturally look way better, and they should, since you are basically concluding that flagship cards are bottlenecked and barely faster than $350 ones when they are CPU limited. That's stating the obvious, isn't it?
1600x900 - 980Ti only beats 980 by 14%
1920x1080 - 980Ti beats 980 by 19%
2560x1440 - 980Ti beats 980 by 24%
4K - 980Ti beats 980 by 27%
Why does this matter? Because if you use lower resolutions like 1600x900/1920x1080 to compare GPUs, you are automatically penalizing flagship cards by placing them under a greater probability of CPU bottlenecking. Put another way, you aren't doing a true graphics card comparison, since you aren't shifting the workload onto the GPU. Even a modern Skylake CPU isn't fast enough to prevent these CPU bottlenecks.
1600x900
Gigabyte 980Ti beats a reference 980 and 970 by 28% and 43%, respectively.
2560x1440
Gigabyte 980Ti now beats a reference 980 and 970 by 49% and 72%, respectively.
4K
Gigabyte 980Ti now beats a reference 980 and 970 by 56% and 82%, respectively.
Source
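The scaling pattern above falls out of simple arithmetic: delivered fps is capped by whichever of CPU or GPU is slower, so at low resolutions two very different GPUs can both hit the same CPU ceiling and look nearly identical. Here's a toy model in Python; all the fps numbers are hypothetical, picked only to illustrate the effect, not taken from any benchmark:

```python
# Toy model of CPU bottlenecking. The frame rate you actually see is
# capped by whichever is slower, the CPU or the GPU. All numbers below
# are hypothetical and only illustrate the effect.

def observed_fps(cpu_fps, gpu_fps):
    """Delivered frame rate is limited by the slower component."""
    return min(cpu_fps, gpu_fps)

def lead_pct(fast, slow):
    """Percent lead of the faster result over the slower one."""
    return 100.0 * (fast - slow) / slow

cpu_cap = 120.0  # hypothetical frames/s the CPU can feed the GPU in a given game

# Hypothetical raw GPU throughput at two resolutions
flagship = {"900p": 180.0, "4K": 55.0}
midrange = {"900p": 130.0, "4K": 33.0}

for res in ("900p", "4K"):
    fast = observed_fps(cpu_cap, flagship[res])
    slow = observed_fps(cpu_cap, midrange[res])
    print(f"{res}: measured flagship lead = {lead_pct(fast, slow):.0f}%")

# At 900p both cards slam into the 120 fps CPU cap, so the measured
# lead collapses to 0%. At 4K the GPUs are the limit, so the real
# ~67% gap between these hypothetical cards finally shows up.
```

That's the whole point of the 900p/1080p/1440p/4K numbers: the gap between a 980Ti and a 980/970 grows with resolution not because the flagship gets faster, but because the CPU stops hiding it.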
Moral of the story to me: for 1600x900 or even 1080p 60Hz gaming (without DSR/VSR/SSAA), flagship $600 cards are a waste of money in many PC games. This may change as more demanding games come out, but not right now. That's why comparing Polaris 10 to Fury X and concluding it's "just as good" using only 1080p results is highly unfair to the 980Ti and Fury X, and frankly misleading/unrepresentative of what those flagship AMD/NV cards are capable of at 1440p and 4K.