The great misconception about a graphics card being "overkill" for a resolution.


Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
This thread doesn't make sense. Near 60 FPS in all titles? That's great FPS for a single card.

Depends what kind of games you play and your standards. I have a 144 Hz monitor and the difference between 60 fps and 120-140 fps is huge for me. Playing games like The Witcher 3 with high fps (120+) and high visual fidelity is currently impossible with a single GPU like the GTX 1080 at 1440p@144Hz, so I'd have to compromise, since my GPU is a step below that anyway. That's fine for most people, but most people also don't use a 1440p@144Hz monitor.

It's why I'm on an enthusiast forum.
 
Reactions: NTMBK

amenx

Diamond Member
Dec 17, 2004
4,011
2,279
136
Does anyone remember when Crysis was released in 2007? No one got 60fps in that title. Did anyone complain about the "crappy" reduced settings needed to get it to run at 1600x1200 or 1280x1024? As far as I recall, most were quite enamored with the title even at the reduced settings. I certainly enjoyed it @ 1680x1050, and with reduced settings it still looked better than many other games. Did anyone set that game aside until they could run it at 60FPS? Or wait 3 or 4 years for more capable GPUs to do that?





I get far better performance at 1440p with most titles, even with a modest GTX 970. Does anyone remember Halo with the 30fps locked option? It played very smoothly for me and I thoroughly enjoyed it and its graphics. Of course, to each their own. Some can't stand less than 60 or even 120fps, and thank god I'm not one of them. Not saying they don't have valid arguments, but only in so far as they apply to them.

Secondly, FPS tolerances vary from game to game. Racing games or fast-paced shooters do annoy me somewhat below 60fps, but not enough to reduce resolution. Reduce settings, sure, just not the res. RPGs, RTS and most other games: no issue whatsoever with less than 60fps.

Thirdly, even if I were more insistent on higher fps in gaming, I realize it also comes down to priorities. Gaming comes a distant second to my overall PC usage/experience, and there I won't compromise to less than 1440p.

1080p vs 1440p, to me, today, is what 1024x768 was when most were on 1680x1050 some 10 years ago. I'm sure there were 1024x768 holdouts insisting on higher FPS back then, while most were just enjoying the higher-detail image quality of higher resolutions. Same silly argument then as now.
 

DamZe

Member
May 18, 2016
187
80
101
Does anyone remember when Crysis was released in 2007? No one got 60fps in that title. Did anyone complain about the "crappy" reduced settings needed to get it to run at 1600x1200 or 1280x1024? As far as I recall, most were quite enamored with the title even at the reduced settings. I certainly enjoyed it @ 1680x1050, and with reduced settings it still looked better than many other games. Did anyone set that game aside until they could run it at 60FPS? Or wait 3 or 4 years for more capable GPUs to do that?





I get far better performance at 1440p with most titles, even with a modest GTX 970. Does anyone remember Halo with the 30fps locked option? It played very smoothly for me and I thoroughly enjoyed it and its graphics. Of course, to each their own. Some can't stand less than 60 or even 120fps, and thank god I'm not one of them. Not saying they don't have valid arguments, but only in so far as they apply to them.

Secondly, FPS tolerances vary from game to game. Racing games or fast-paced shooters do annoy me somewhat below 60fps, but not enough to reduce resolution. Reduce settings, sure, just not the res. RPGs, RTS and most other games: no issue whatsoever with less than 60fps.

Thirdly, even if I were more insistent on higher fps in gaming, I realize it also comes down to priorities. Gaming comes a distant second to my overall PC usage/experience, and there I won't compromise to less than 1440p.

1080p vs 1440p, to me, today, is what 1024x768 was when most were on 1680x1050 some 10 years ago. I'm sure there were 1024x768 holdouts insisting on higher FPS back then, while most were just enjoying the higher-detail image quality of higher resolutions. Same silly argument then as now.

I'm afraid it's not the same as 768p vs 1050p. The difference going from 1080p to 1440p on screens smaller than 32" is minuscule, while the performance needed to drive 1440p at high detail in the newest titles becomes a real factor. Remember, back in the day games weren't nearly as demanding as they are today, which is why the 8800GT/9800GTX era lasted so long.

As for resolution, 4K on anything less than a 55" TV is wasted pixels. I've looked at 27" PC monitors at 1440p/4K, and you would have to sit incredibly close to notice any difference vs 1080p. Who wants to sit with his face glued to the screen when playing video games, and who wants to torture his wallet by artificially increasing pixel density for a nigh-unnoticeable effect?

I have a big desk and a great gaming chair, and I like being distanced from my screen (at least 3 feet), not hunched over like Gollum. So for me a 32" 1440p monitor could make sense (and is on my wishlist). 1440p/4K on a 27" monitor is a waste of pixels IMHO. 1080p, for all it's worth, is still an excellent resolution for monitors between 22" and 32", especially on quality panels with great colors and response times. There is a point of diminishing returns with resolution and screen size, and the new "I need to push a billion pixels because of e-peen" attitude doesn't add up. Sure, large screens benefit from higher resolutions, so to anyone who has their 1080 or Titan X/P hooked up to a sexy 65" OLED 4K HDR monstrosity, my hat's off to you! But pushing 4K at 27" is nigh pointless.
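As a rough way to put numbers on the "can you even see it at 3 feet" argument, here is a minimal Python sketch. It assumes the ~3-foot viewing distance mentioned above and the common rule of thumb that around 60 pixels per degree corresponds to 20/20 acuity; real perception also depends on content, aliasing and eyesight, so treat it as an illustration, not a verdict.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density (pixels per inch) from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(density, viewing_distance_in):
    """How many pixels fall inside one degree of the viewer's visual field."""
    # One degree of visual angle spans 2 * d * tan(0.5 deg) inches of screen.
    inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
    return density * inches_per_degree

distance_in = 36  # the ~3-foot seating distance mentioned above (assumption)
for label, w, h, diag in [("27in 1080p", 1920, 1080, 27),
                          ("27in 1440p", 2560, 1440, 27),
                          ("27in 4K",    3840, 2160, 27)]:
    density = ppi(w, h, diag)
    ppd = pixels_per_degree(density, distance_in)
    # ~60 px/deg roughly corresponds to 20/20 acuity (one pixel per arcminute).
    verdict = "pixels hard to resolve" if ppd >= 60 else "individual pixels resolvable"
    print(f"{label}: {density:5.1f} PPI, {ppd:5.1f} px/deg -> {verdict}")
```

At that distance the sketch puts 27" 1080p below the acuity threshold and 27" 1440p above it, which is roughly where the disagreement in this thread sits.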
 
Last edited:

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
OP and others are confused and reaching the wrong conclusions because they're evaluating performance at defined graphics settings.

This is almost a circular argument, since the defined settings are based on arbitrary performance goals in the first place. But if you look at the graphics settings available, especially at the top end, you'll see this line of thinking is backwards. One must remember that these particular maximum settings were chosen for a reason.

I blogged about this particular issue a long time ago when I was running PCgamingstandards.com (now defunct). There's sadly a crowd of people who insist that a game is "badly optimized" because no card can currently run it at max settings. The corollary is that any card which can't run games at max settings is a bad card.

I argued at the time that "max settings" is somewhat arbitrarily selected. In most modern games the video settings, most notably the ones that scale, such as draw distance, shadow resolution and tessellation, are essentially uncapped. When developers put a draw distance slider into their options, the minimum could be 20 in-game meters from your viewpoint and the maximum 200m, but it could just as easily be 2m and 2000m. Judging whether or not a game is "well optimised" from performance alone, without considering what exactly is being rendered, is just flat-out stupid.

And really, what you've pointed out here is kind of the inverse: developers deliberately set these limits to something sensible for the range of hardware available at the time. And frankly, in games like Skyrim you can open up the ini files and edit them manually, and many of these settings can be pushed well beyond what the in-game options allow: you can crank the view distance to something huge, and the same goes for LOD bias and things like grass/foliage density. Which comes neatly back to what I said before: anyone can write a game or a mod that demands unreasonable performance, so rigid standards like "the card has to be able to run everything at 1080p at 60fps, otherwise it's not a 1080p-ready card" become meaningless.
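As a rough illustration of why an "uncapped" setting can always outrun any GPU, here is a toy Python sketch. The numbers are illustrative only and not tied to any real engine: the point is simply that the cost of a square shadow map grows with the square of its edge length, so a developer can always expose one more notch that no current card handles.

```python
# Toy illustration: memory and fill cost of a square shadow map vs edge length.
BYTES_PER_TEXEL = 4  # e.g. a 32-bit depth format (assumption)

for edge in (1024, 2048, 4096, 8192, 16384):
    texels = edge * edge
    megabytes = texels * BYTES_PER_TEXEL / (1024 ** 2)
    relative_cost = texels / (1024 * 1024)  # cost relative to the 1024px map
    print(f"{edge:>5}px shadow map: {megabytes:6.0f} MB, ~{relative_cost:5.0f}x the 1024px cost")
```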

It depends on what games you spend the most time on. If 2% of the games in your library make up 90% of your gaming time going forward, that matters. If you are a hardcore gamer who plays new releases as they come out, that huge library might just be already-played games or what I call "Steam decoration" (games you buy on sale and never play).

What matters is how you spend your time, and some people want to play new releases at max settings. Other people want to play esports titles, or whatever we call older-but-popular games like CS:GO and Dota 2 nowadays. That is why the third question, after resolution and budget, is which games you play or plan to play.

I agree. I think this gets to the heart of how many gamers (but not all) approach buying their hardware: in the real world we have constraints such as how much money we have, how much time we can dedicate to different games, and other priorities, so almost all purchasing decisions are compromises of some description.

I think too many people live in "review world", that highly academic world of ideal frame rates, expensive hardware, and the assumption that common benchmarking lineups are representative of gaming in aggregate. If we forget that these are really the most demanding games on the market and try to target our purchases at just that information, we will tend to vastly overestimate the performance we need, which is a mistake.
 
Reactions: bystander36

monkeydelmagico

Diamond Member
Nov 16, 2011
3,961
145
106
Who wants to torture his wallet by artificially increasing pixel density for a nigh-unnoticeable effect?

I have a big desk and a great gaming chair, and I like being distanced from my screen (at least 3 feet), not hunched over like Gollum. So for me a 32" 1440p monitor could make sense (and is on my wishlist). 1440p/4K on a 27" monitor is a waste of pixels IMHO.

Agreed. Went from a 27" 1080p to a 32" 1440p. Honestly, I did not see that much of a difference. If I had it to do over again I would have stuck with 1080p and cheaper graphics cards.
 
Reactions: Thinker_145

tential

Diamond Member
May 13, 2008
7,355
642
121
The whole discussion makes zero sense in my opinion. So what if you have a 1080p screen with SLI 1080 Tis? There is DSR/VSR.

I still use 1080p and my only interest is in high-end GPUs. I would love the 1080's performance; I just don't want to pay the inflated price for it when there is zero competition.
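For anyone unfamiliar with how DSR/VSR loads the GPU on a 1080p panel, here is a minimal Python sketch. It assumes NVIDIA's stock DSR factors, which are pixel-count multipliers; the driver snaps the internal resolution to standard sizes (e.g. 2560x1440 for the 1.78x factor), so the rounded figures below are approximate.

```python
import math

# Rough sketch of what DSR/VSR does for a 1080p panel: the game renders at a
# higher internal resolution and the result is filtered back down to native.
NATIVE_W, NATIVE_H = 1920, 1080

for factor in (1.50, 1.78, 2.25, 4.00):  # a subset of the stock DSR factors
    scale = math.sqrt(factor)            # per-axis scale is sqrt(pixel-count factor)
    render_w, render_h = round(NATIVE_W * scale), round(NATIVE_H * scale)
    print(f"DSR {factor:.2f}x -> ~{render_w}x{render_h} internal render, "
          f"roughly {factor:.2f}x the shading work of native 1080p")
```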
 
Reactions: Thinker_145

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I blogged about this particular issue a long time ago when I was running PCgamingstandards.com (now defunct). There's sadly a crowd of people who insist that a game is "badly optimized" because no card can currently run it at max settings. The corollary is that any card which can't run games at max settings is a bad card.

I argued at the time that "max settings" is somewhat arbitrarily selected. In most modern games the video settings, most notably the ones that scale, such as draw distance, shadow resolution and tessellation, are essentially uncapped. When developers put a draw distance slider into their options, the minimum could be 20 in-game meters from your viewpoint and the maximum 200m, but it could just as easily be 2m and 2000m. Judging whether or not a game is "well optimised" from performance alone, without considering what exactly is being rendered, is just flat-out stupid.

And really, what you've pointed out here is kind of the inverse: developers deliberately set these limits to something sensible for the range of hardware available at the time. And frankly, in games like Skyrim you can open up the ini files and edit them manually, and many of these settings can be pushed well beyond what the in-game options allow: you can crank the view distance to something huge, and the same goes for LOD bias and things like grass/foliage density. Which comes neatly back to what I said before: anyone can write a game or a mod that demands unreasonable performance, so rigid standards like "the card has to be able to run everything at 1080p at 60fps, otherwise it's not a 1080p-ready card" become meaningless.

This is the same thinking that keeps people around here from using 4K monitors. The slider bars and settings presented to us are tuned to what can be pushed at 1080p, so it is up to us to turn down settings to use 4K; but because those settings are presented to us, the games become "unplayable" to these same people. So every new generation of cards about to come out is super exciting to them, because they will finally be able to play at 4K with a single card, only to find that the newest releases present sliders that push hardware even further.

If you realize, as you pointed out, that these settings are just a range the devs present to us, and that much higher and lower values still exist, you can start to understand that 4K vs 1440p vs 1080p becomes a choice of which settings you want to use, with resolution being one of those settings. If you use a higher resolution, you use lower settings, and vice versa. This will never change. The only thing that changes is what you find more meaningful: higher resolution or higher settings. It's all about finding the right balance. As resolution and settings increase, diminishing returns kick in.

One thing you have to consider is how much money you are willing to spend for settings you can't currently use. Is 4x MSAA instead of TAA in a single game worth spending $200? Is 80 FPS instead of 40 FPS worth $200? I don't think the first is worth it, but I will spend $200 for 80 FPS or more in the games I play. Maybe not for one game; if it's just one game I will turn down settings. But 80 FPS in first-person games is the point where motion sickness no longer affects me.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The whole discussion makes zero sense in my opinion. So what if you have a 1080p screen with SLI 1080 Tis? There is DSR/VSR.

I still use 1080p and my only interest is in high-end GPUs. I would love the 1080's performance; I just don't want to pay the inflated price for it when there is zero competition.

I think the point is that the OP is upset that people consider a GTX 1080 overkill for 1080p. I don't think most of us give a crap if he buys a GTX 1080 for 1080p, but if we were asked whether we thought it was overkill, we'd say yes. I buy GPUs that most would consider overkill too, but I consider my choice an exception, because 1) I use 3D Vision and 2) I get motion sickness in first-person games when I have less than 80 FPS (60 FPS is tolerable, lower is not).
 

amenx

Diamond Member
Dec 17, 2004
4,011
2,279
136
I'm afraid it's not the same as 768p vs 1050p. The difference going from 1080p to 1440p on screens smaller than 32" is minuscule, while the performance needed to drive 1440p at high detail in the newest titles becomes a real factor. Remember, back in the day games weren't nearly as demanding as they are today, which is why the 8800GT/9800GTX era lasted so long.

As for resolution, 4K on anything less than a 55" TV is wasted pixels. I've looked at 27" PC monitors at 1440p/4K, and you would have to sit incredibly close to notice any difference vs 1080p. Who wants to sit with his face glued to the screen when playing video games, and who wants to torture his wallet by artificially increasing pixel density for a nigh-unnoticeable effect?

I have a big desk and a great gaming chair, and I like being distanced from my screen (at least 3 feet), not hunched over like Gollum. So for me a 32" 1440p monitor could make sense (and is on my wishlist). 1440p/4K on a 27" monitor is a waste of pixels IMHO. 1080p, for all it's worth, is still an excellent resolution for monitors between 22" and 32", especially on quality panels with great colors and response times. There is a point of diminishing returns with resolution and screen size, and the new "I need to push a billion pixels because of e-peen" attitude doesn't add up. Sure, large screens benefit from higher resolutions, so to anyone who has their 1080 or Titan X/P hooked up to a sexy 65" OLED 4K HDR monstrosity, my hat's off to you! But pushing 4K at 27" is nigh pointless.
Agree with much of what you say in relation to pixels vs screen size. Screen size is of paramount importance to me. I can't go smaller than 27". And for 27" (up to 32"), 1440p is ideal for me; 1080p won't do. I have owned a 27" 1080p unit and the pixels on the desktop, text, and icons were quite noticeable vs a 1440p of the same size. 1080p makes sense up to 24" (unless it's a TV several feet back).

The larger the screen, the more involving, immersive (and practical) my PC experience is. The same applies to TVs, and is why consumers prefer larger ones to small ones. Each may have their own criteria, but for me 1440p is good enough for 27-32", 1080p only for screens up to 24" (unless it's a TV several feet back), and 4K for 32" and above.

So in essence, my argument against 1080p is entirely based on screen size. I just don't want the smaller screens that 1080p is better suited for.
 
Reactions: guachi

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Have owned a 27" 1080p unit and the pixels on desktop, text, icons, were quite noticeable vs a 1440p of same size.
That does not mean you will see those same pixels in-game. You need 4K at 27" to not be able to see pixels on the desktop.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I think the point is that the OP is upset that people consider a GTX 1080 overkill for 1080p. I don't think most of us give a crap if he buys a GTX 1080 for 1080p, but if we were asked whether we thought it was overkill, we'd say yes. I buy GPUs that most would consider overkill too, but I consider my choice an exception, because 1) I use 3D Vision and 2) I get motion sickness in first-person games when I have less than 80 FPS (60 FPS is tolerable, lower is not).
I still don't get it, because you aren't bound by your monitor's resolution. Unless you're stipulating that people only play at their native resolution; in which case, if you're doing that and have extra unused performance, someone needs to give you a lesson in how to use the full power of your GPU.

I'd actually say people buy underpowered GPUs, in my opinion, and expect more from a GPU in the long run than it's actually able to deliver.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I went from a 27-inch 1080p to a 27-inch 1440p less than a year ago. I don't know how someone could fail to appreciate the vast improvement. It is a tremendous improvement in PPI and I can't understand downplaying it. But I at least understand that 1080p has some strengths.

There are compromises at all three of the resolutions available.

1080p has very low PPI, and while it can limit aliasing just fine with proper AA, that introduces a lot of softness. The higher settings enabled by a less demanding resolution are partially wasted because of this, imo. A high-refresh-rate monitor is where 1080p can really shine, though, since it's the easiest way to hit 100+ fps.

4K is of course the opposite of 1080p: great PPI and sharpness of detail, but perhaps some of that is lost to lower quality settings? And you can only go up to 60-75Hz right now.

1440p is simply the best balance imo, especially because there are high-refresh-rate displays available now for less demanding games (Doom runs like a dream at 1440p and a high refresh rate).

It just depends on preferences. Some people even like an overly soft image (it reminds them of movies) and might actually prefer 1080p because of this, since it's easier to max out and throw on some AA for softness plus no aliasing. But I personally find that I like the detail from extra PPI, especially for long-range shooters like Battlefield, so I'm glad I went 1440p.
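To put rough numbers on the performance side of that balance, here is a naive pixel-count scaling sketch in Python; the 100 fps baseline is purely hypothetical, and real games usually land somewhere between this worst case and the original framerate, since not all per-frame work scales with resolution.

```python
# First-order estimate of the extra GPU work higher resolutions ask for:
# shading load scales roughly with pixel count, so dividing a 1080p framerate
# by the pixel ratio gives a pessimistic figure (geometry, CPU work and other
# per-frame costs don't grow with resolution).

BASELINE_FPS_1080P = 100  # hypothetical 1080p result, for illustration only

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base_pixels = 1920 * 1080

for name, (w, h) in resolutions.items():
    ratio = (w * h) / base_pixels
    est_fps = BASELINE_FPS_1080P / ratio
    print(f"{name}: {w * h / 1e6:.1f} MP, {ratio:.2f}x the pixels, "
          f"~{est_fps:.0f} fps worst-case from a {BASELINE_FPS_1080P} fps 1080p baseline")
```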
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
A GPU can be absolute overkill today and absolute trash tomorrow. See Deus Ex: Mankind Divided. It makes a new Titan X look like an underpowered 1080p card.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I still don't get it, because you aren't bound by your monitor's resolution. Unless you're stipulating that people only play at their native resolution; in which case, if you're doing that and have extra unused performance, someone needs to give you a lesson in how to use the full power of your GPU.

I'd actually say people buy underpowered GPUs, in my opinion, and expect more from a GPU in the long run than it's actually able to deliver.

I would stipulate that if you feel a need to downsample or use VSR/DSR, you are not "most people". There are severe diminishing returns when you do so. When people say a GPU is overkill, they are typically referring to the typical user.

There is always a way to use more GPU power: SSAA, VSR, .ini hacks and mods. But when people talk about overkill for 1080p or whatever resolution, they aren't talking about people rendering at 4K on a 1080p screen, or otherwise going way beyond reasonable.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
I still don't get it, because you aren't bound by your monitor's resolution. Unless you're stipulating that people only play at their native resolution; in which case, if you're doing that and have extra unused performance, someone needs to give you a lesson in how to use the full power of your GPU.

I'd actually say people buy underpowered GPUs, in my opinion, and expect more from a GPU in the long run than it's actually able to deliver.
Hmm, well, I am never changing resolution on an LCD display; it's like playing a game at low settings.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
I would stipulate that if you feel a need to downsample or use VSR/DSR, you are not "most people". There are severe diminishing returns when you do so. When people say a GPU is overkill, they are typically referring to the typical user.

There is always a way to use more GPU power: SSAA, VSR, .ini hacks and mods. But when people talk about overkill for 1080p or whatever resolution, they aren't talking about people rendering at 4K on a 1080p screen, or otherwise going way beyond reasonable.
So let me ask this: why is it necessary to call a GPU "overkill" to begin with and potentially mislead people? What does this argument achieve? Nobody ever said, "my new PC eats all my games for breakfast and now I'm regretting buying it." If someone is not happy with their monitor/TV, they will say so themselves, and nobody needs to convince them to dislike it.

On the other hand, plenty of people complain about the performance of their gaming PC sooner or later.

Constantly moving up in resolution impedes the progress of graphical fidelity. 1080p is enough resolution to provide photorealism.
 

WR4P-TP

Junior Member
Aug 25, 2016
6
1
81
I always play at 1920x1200, 60Hz, with vsync, on my GTX 580 (before), GTX 780 (present) and GTX 1080 (soon). I prefer smooth, stable 60fps gameplay for a long period of time, rather than higher fps now and performance hiccups a year or two later. 30, 60 or 120Hz/fps are all fine, but imo the GPU should be able to maintain it for a long time without crying 'upgrade now'. This makes my gaming experience much better.
 
Reactions: Thinker_145

Excessi0n

Member
Jul 25, 2014
140
36
101
As for resolution, 4K on anything less than a 55" TV is wasted pixels. I've looked at 27" PC monitors at 1440p/4K, and you would have to sit incredibly close to notice any difference vs 1080p. Who wants to sit with his face glued to the screen when playing video games, and who wants to torture his wallet by artificially increasing pixel density for a nigh-unnoticeable effect?

What's your eyesight like? Honest question. Because I sit about 2.5-3 feet from a 23-inch 1080p monitor and it's like looking through a screen door when gaming. Aliasing and the lack of fine detail are extremely obvious and honestly painful to look at. And yet you can seriously advocate screens (4K at 55") with a significantly lower pixel density?

I'm looking at upgrading to a 27" 4K monitor next year because it's a near-doubling of pixel density with a screen-size increase on top, and I imagine I'll upgrade to 8K (32" or so sounds good) when a GPU exists that can drive it.
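For reference, the "near-doubling" figure roughly checks out with a quick calculation (panel sizes are the ones named in the post; it works out to about 1.7x):

```python
import math

# Quick check of the "near-doubling of pixel density" figure: diagonal pixel
# count divided by diagonal size gives the PPI of each panel.
print(math.hypot(1920, 1080) / 23)  # ~95.8 PPI on the current 23" 1080p panel
print(math.hypot(3840, 2160) / 27)  # ~163.2 PPI on a 27" 4K panel (~1.7x denser)
```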
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So let me ask this: why is it necessary to call a GPU "overkill" to begin with and potentially mislead people? What does this argument achieve? Nobody ever said, "my new PC eats all my games for breakfast and now I'm regretting buying it." If someone is not happy with their monitor/TV, they will say so themselves, and nobody needs to convince them to dislike it.

On the other hand, plenty of people complain about the performance of their gaming PC sooner or later.

Constantly moving up in resolution impedes the progress of graphical fidelity. 1080p is enough resolution to provide photorealism.

We call it overkill so people don't waste money on a component that will give them extremely small benefits over much cheaper cards. That money can be spent on other components, or a monitor, or saved for the future. Using VSR, SSAA or 16x MSAA is an example of overkill in itself: extremely small improvements for insanely higher requirements.

Overkill doesn't mean there is zero benefit, just that the benefit is extremely small. If you know you want to go with overkill, nothing is stopping you. You've been told, and chose to ignore the advice of others. That's the beauty of free will. You don't have to follow the advice of others if you know you want to spend your money that way.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
What's your eyesight like? Honest question. Because I sit about 2.5-3 feet from a 23-inch 1080p monitor and it's like looking through a screen door when gaming. Aliasing and the lack of fine detail are extremely obvious and honestly painful to look at. And yet you can seriously advocate screens (4K at 55") with a significantly lower pixel density?

I'm looking at upgrading to a 27" 4K monitor next year because it's a near-doubling of pixel density with a screen-size increase on top, and I imagine I'll upgrade to 8K (32" or so sounds good) when a GPU exists that can drive it.

The distance you sit from your monitor makes a pretty big difference in immersion, in how big something looks, and in how obvious the pixels are. And you might also find that when gaming you don't look for pixels as much as you do when comparing two products.

2.5-3 feet from your monitor is pretty far away if you ask me.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
And yet in this particular argument you DON'T need to do any of that to push a card to its limits at 1080p. An RX 480 or GTX 1060 cannot do 1080p with 4x MSAA in quite a lot of games, and that is NOT an unreasonable setting to want.

A GTX 1080 offers considerably worse value for money at any resolution. If the argument is that "you can always sacrifice a setting or two," then I don't see the point of getting a 1080 at all, even at 1440p, since you are only a setting or two away from achieving the same performance with a 1070.

If someone is buying a 1080, I would tell them it's poor value for money if they ask for advice. But I am not going to say that a 1070 is all you need, because that's simply not true.

If someone has plenty of money but still games at 1080p, then obviously they don't care much about the resolution. Even a noob knows they could buy a better screen for gaming than a 24" 1080p monitor, so it's not a matter of ignorance.

The point about spending the extra money elsewhere is only valid when the other components come out of the same budget, which is often not the case. Balancing out your rig is an entirely different argument anyway and has nothing to do with overkill.

Sent from my HTC One M9
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Good grief, this lasted 8 pages? Dudes, just buy a video card that can run the games you play at your display's native res at a reasonable speed.

I bought a Dell 3014 30" 1600p display three years ago, and there is no way in hell I'm playing games below that res.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
And yet in this particular argument you DON'T need to do any of that to push a card to its limits at 1080p. An RX 480 or GTX 1060 cannot do 1080p with 4x MSAA in quite a lot of games, and that is NOT an unreasonable setting to want.

A GTX 1080 offers considerably worse value for money at any resolution. If the argument is that "you can always sacrifice a setting or two," then I don't see the point of getting a 1080 at all, even at 1440p, since you are only a setting or two away from achieving the same performance with a 1070.

If someone is buying a 1080, I would tell them it's poor value for money if they ask for advice. But I am not going to say that a 1070 is all you need, because that's simply not true.

If someone has plenty of money but still games at 1080p, then obviously they don't care much about the resolution. Even a noob knows they could buy a better screen for gaming than a 24" 1080p monitor, so it's not a matter of ignorance.

The point about spending the extra money elsewhere is only valid when the other components come out of the same budget, which is often not the case. Balancing out your rig is an entirely different argument anyway and has nothing to do with overkill.

Sent from my HTC One M9

I've only said the 1080 is overkill at 1080p. I say that because even in the one game you can come up with where a 1080 is needed, it's only when 4x MSAA is used; if you use TAA, it works great. I find it hard to consider that anything but overkill. It's not like you gain anything in 99% of the games you play, and in that 1% (or less), you are talking about one level of AA.

At 1440p, you see quite notable improvements in many games with a 1080.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
If there is a notable difference with a GTX 1080 at 1440p already, then you can be assured it won't be long before there is a notable difference at 1080p as well.

I will agree, though, that buying a 1080 for 1080p is bad only because of its price, and that you are better off saving the money for a future upgrade.

PS: I really wish we had better high refresh rate 1080p monitors.

Sent from my HTC One M9
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
Hmm, well, I am never changing resolution on an LCD display; it's like playing a game at low settings.
Why would you not go above the resolution of your display if you have the extra GPU power? I'd prefer supersampling over anything else. Just leaving your resolution static makes zero sense.
 