Can someone explain to me why people think they still need things like 4x MSAA on a 4k display?
Pixel size has a significant impact on the presence of aliasing. Is it really that necessary once they get that small?
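Rough math on pixel size, for scale (the 28-inch panel size below is my assumption, purely for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal length in pixels divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical 28-inch panel at both resolutions
print(f"4K    @ 28in: {ppi(3840, 2160, 28):.0f} PPI")  # ~157 PPI
print(f"1080p @ 28in: {ppi(1920, 1080, 28):.0f} PPI")  # ~79 PPI
```

Pixels at 4K are half the linear size they are at 1080p on the same panel, which is why I'd expect jaggies to shrink below the point of noticing.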
When 1024x768 was the standard res, everyone said the same thing about 1600x1200 ("you won't need FSAA at that res, you'll barely see the jaggies!"). They said the same thing about 1080p, and now people are saying it about 4K... I think you'll always need some level of AA. Jaggies and shimmering on staircases and such drive me crazy, and nothing but AA will fix it.
You still see aliasing. The jaggies are just as noticeable to me as they are at lower resolutions.
24-inch here. I agree a trained eye will catch jaggies, but acting like nobody can live with that trade-off, and even then it only comes up when you can't run AA, is kind of silly.
My OP was about the general sentiment going around that modern GPUs fail at 4K. That is far from the case, and I wanted to shed some light on the subject. A single 780 Ti, or even less, is going to give you a great experience. Not one game I have makes playing a chore.
There was a Metro LL bench tossed in here as proof against my claim. That was merely a maxed-out run sans AA. You could turn one setting down from the ones used and that game would fly, and even screenshots would be hard pressed to show the IQ difference. Sure, true max settings can sink any card combination out there, but if your goal is to enjoy 4K right now at NEARLY "maxed" out settings, it won't break the bank.
My buddy is using the same screen on a $375 used 780 and claims to have great performance too. Not too hard to believe, eh?
Why would you pay $700+ for a 4K screen to run it at inferior settings? That's like buying a $60k Shelby and leaving it in first gear. I would rather drop down in resolution and crank up the settings. Both will likely rock your socks, but the latter is much cheaper.
I can't stand stutter or FPS drops. Once you play at 120Hz, it's hard to look at 60Hz. 20-40 fps, no matter how pretty, is painful.
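To put numbers on it, here's the frame-time math (plain arithmetic, nothing from any benchmark):

```python
# Time each frame stays on screen at a given frame rate
for fps in (120, 60, 40, 20):
    print(f"{fps:3d} fps -> {1000 / fps:4.1f} ms per frame")
# 120 fps ->  8.3 ms
#  60 fps -> 16.7 ms
#  40 fps -> 25.0 ms
#  20 fps -> 50.0 ms
```

Going from 120 to 60 fps doubles the frame time, and a dip to 20 fps leaves each frame on screen six times as long. That's exactly what reads as stutter.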
Just for the sake of beating my point to death: the OP was not about how 4-way Titans can drop below 60 fps when four grenades blow up in your face while you're parachuting into the Shanghai building as it topples onto the rest of your squad, at 8x AA and 4K.
My point is that for the other times in your gaming adventures, a single GPU is capable of blowing you away on a 4K screen.
Now, in case anyone missed the other fact: I do have two cards, and let's just say I couldn't be happier with my overall experience.
I don't consider it inferior to drop a setting that costs 25% performance just for slightly DARKER shadows. I mean, if you care about stuff like that, go ahead and get quad GPUs. I don't.
Your car analogy doesn't work.
Why buy that much vehicle when public speed limits rarely let you go above 70 mph?
I drive a 911; does that mean I go 175 mph regularly?
How would having to drop a setting or two equate to driving in first gear? Wouldn't it be more like not pressing the nitrous button when passing? Some settings only show themselves on random occasions; that's not full-on neutering the experience like driving in first gear would be.
What's better: a higher XXX setting, or a higher resolution? So play at 1080 or 1440.
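For scale, the raw pixel counts (standard resolution math, nothing game-specific):

```python
# Pixels pushed per frame at common resolutions, relative to 1080p
base = 1920 * 1080
for name, (w, h) in [("1080p", (1920, 1080)), ("1440p", (2560, 1440)), ("4K", (3840, 2160))]:
    print(f"{name}: {w * h / 1e6:.1f} MP ({w * h / base:.1f}x 1080p)")
# 1080p: 2.1 MP (1.0x 1080p)
# 1440p: 3.7 MP (1.8x 1080p)
# 4K: 8.3 MP (4.0x 1080p)
```

4K pushes four times the pixels of 1080p, so the GPU headroom you'd burn on resolution could buy a lot of eye candy at 1440p instead.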
Which 4k panel do you have? Do you get 60Hz?