In that case you’re oblivious to aliasing, especially shader aliasing. That’s the only conclusion I can reach here. This is not unusual though, given the vast majority of the population didn’t notice the filtering transitions on the 5xxx series for example.
I am not oblivious, mind you. My mind just doesn't work like yours - I look at a game from an artistic/realistic perspective. I place more emphasis on the artistic representation of the game, and very little on the exact geometric accuracy of "line smoothing". Beyond 4AA, the visual differences have such a minor impact that they are almost irrelevant to me in motion. Make the game more realistic/visually appealing and then we are talking.
Question: Do you focus your attention on the fact that every single brush stroke is visible in the Van Gogh painting below? If this were an actual game, would the fact that these brush strokes aren't perfect irritate you?
I don't look at the brush strokes at all. My mind sees the setting being presented, the mood created by the artist, and how I feel looking at it. The brush strokes (i.e., pixels) are just a means to an end. To you, it's a deal breaker when you see tiny pixels. So your solution is just to throw more resolution, more AA, more AF at a game.
Your mind focuses on these minor details (perhaps because you are used to it), while my mind looks at the overall picture instead and much less at minute details such as the edges of fences or how sharp the tips of the grass are. Anti-aliasing/AF don't make anything more realistic whatsoever. Sure, these filters can make a crappy fence look sharper or the dirt on the ground appear cleaner. However, two-dimensional dirt is still two-dimensional, and that crappy fence is still crappy because the number of polygons used to create it doesn't change at all! Without the artist designing dynamic foliage, all you are doing is applying a bunch of filtering techniques to primitive textures and polygons. That doesn't make the plants or the ground any more real.
Doom 3, Quake 4, Call of Duty 2, etc. all look horrible, especially by the standards of modern games. Especially the ancient OpenGL games, where all the characters look like they are made out of clay with shiny reflections applied to them, and the walls look like they are made from 100 polygons.
No amount of anti-aliasing or anisotropic filtering can fix low-polygon models, static lighting techniques, and the lack of any realistic physics in ancient games. This is why I'll pick the modern game graphics of Crysis, Metro 2033, or Just Cause 2 with 0AA over any old game any day of the week. Do you seriously think old games "look good" just because you threw them on a 30-inch monitor with 16AA? Look how awful the character models are, the primitive lighting, the static shadows, etc.
Doom 3
Crysis
Go ahead, increase the resolution of Doom 3 to 2560x1600 and apply 16AA/16AF, the game's textures aren't going to get any better and its character models aren't going to gain 10x the amount of polygons.
There is no excuse for stressing videocards with high AA/AF. Videocards should be stressed by good graphics, not AA/AF. To me, realistic graphics and high AA/AF are two completely unrelated concepts.
That image is a classic ray tracing example. Realistic ray tracing goes back decades, but also has nothing to do with gaming, so I'm not sure if I get your point. I've heard some talk about "real-time ray-tracing" in recent years, now that would be an interesting step up in realism.
In the future videocards may be able to do ray-tracing in real time and then rasterization as we know it will be obsolete.
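For what it's worth, the core idea behind ray tracing is simple enough to sketch: instead of rasterizing triangles onto the screen, you shoot one ray per pixel into the scene and test what it hits. Here is a toy, self-contained illustration (my own made-up example, not a real renderer - a single hard-coded sphere traced into an ASCII "framebuffer"):

```python
import math

def trace(ox, oy, oz, dx, dy, dz, cx, cy, cz, r):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Ray: origin (ox,oy,oz), normalized direction (dx,dy,dz).
    Sphere: center (cx,cy,cz), radius r. Solves |o + t*d - c|^2 = r^2.
    """
    lx, ly, lz = cx - ox, cy - oy, cz - oz
    tca = lx*dx + ly*dy + lz*dz              # projection of center onto the ray
    d2 = lx*lx + ly*ly + lz*lz - tca*tca     # squared distance from center to ray
    if d2 > r*r:
        return None                          # ray passes outside the sphere
    thc = math.sqrt(r*r - d2)
    t = tca - thc                            # nearest intersection
    return t if t > 0 else None

# Render a tiny ASCII image: one ray per "pixel", one sphere in the scene.
W, H = 24, 12
sphere = (0.0, 0.0, -3.0, 1.0)               # center and radius
rows = []
for j in range(H):
    row = ""
    for i in range(W):
        # Map the pixel to a point on an image plane at z = -1.
        x = (i + 0.5) / W * 2 - 1
        y = 1 - (j + 0.5) / H * 2
        n = math.sqrt(x*x + y*y + 1)         # normalize the ray direction
        hit = trace(0, 0, 0, x/n, y/n, -1/n, *sphere)
        row += "#" if hit else "."
    rows.append(row)
print("\n".join(rows))
```

A real renderer then bounces secondary rays for shadows, reflections, and refraction - that per-pixel, per-bounce cost is exactly why real-time ray tracing is so demanding compared to rasterization.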
My point is that you can increase graphics quality without increasing the resolution. Yet BFG is saying that 1920x1080 is a crappy resolution and that, as a result, games look crappy - so if a gamer wants to stress his videocard, he should upgrade the resolution. I am saying the OP has a point in that the games themselves should look better. We shouldn't be forced to increase resolution just to "stress" videocards. There is no excuse why games can't look better despite a "low" 1920x1080 or even 720P resolution. I have shown many examples in this thread of how resolution currently is not what limits games from being more realistic. If you applied ray tracing, you could make a 720P game look many times better than a 2560x1600 rasterized game. Perhaps in 20-30 years 1920x1080 may become a limiting factor for realism, but right now it's nowhere near that point.
Absolution75 brought up a good point. Discussing resolution by itself, without considering viewing distance, is not entirely accurate. The iPhone 4 has a low resolution but roughly 3x the pixel density of a 30-inch monitor. Thus, (1) having a higher resolution doesn't necessarily equate to more detail; and (2) no current game comes close to maxing out what a 720P grid of pixels can show, since alternatives to rasterization such as ray tracing have already demonstrated what is possible within the scope of just 720P.
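To put rough numbers on that density comparison (using the commonly quoted panel specs - 960x640 on a 3.5-inch diagonal for the iPhone 4, and 2560x1600 for a typical 30-inch monitor):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(w_px, h_px) / diag_in

iphone4 = ppi(960, 640, 3.5)      # roughly 330 PPI
monitor30 = ppi(2560, 1600, 30)   # roughly 100 PPI

print(f"iPhone 4: {iphone4:.0f} PPI, 30-inch: {monitor30:.0f} PPI, "
      f"ratio: {iphone4 / monitor30:.1f}x")
```

So the phone packs about 3x the linear pixel density, even though its total resolution is far lower - which is exactly why pixels-per-degree at the actual viewing distance matters more than the raw resolution number.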