The concept of expecting to play with massive amounts of AF/AA (more so AA) on a mid-range budget seems rather odd to me, especially since it was not until the GeForce 6 series that playing with AA or AF was even plausible at resolutions such as 1280x1024.
So what are we arguing about? This argument seems rather pointless.
My view is that artistic in-game visuals are the most important factor in advancing video game graphics in the near future, far more so than advancements in resolution or AA/AF. I have provided examples of how something can be made visually stunning through ray-tracing despite a lower resolution than 2560x1600. Even at 720p, ray-tracing looks visually superior to rasterization at 2560x1600 with 16xAA. I have also provided examples of how modern games look vastly superior to older games, even without AA/AF, due to higher artistic detail (e.g., polygon count on character models, facial features/animations, dynamic lighting effects, etc.). It is games like Crysis, Metro 2033 and STALKER: CoP that continue to push the envelope of graphics, not Call of Duty 4 with transparency anti-aliasing.
Therefore, I agree with the OP's viewpoint that not a lot of games push the artistic envelope, which more or less leaves either high levels of AA/AF or a higher-resolution monitor as the primary differentiators between a mid-range ($200-250) and a high-end ($500) videocard. In the past, however, the main differentiator between a mid-range and a high-end videocard was not just AA/AF or the ability to run at 2560x1600, but also the number of in-game visual settings one could maximize (e.g., could you turn on HDR, soft shadows, dynamic lighting?). With today's games, you can generally max out almost all in-game visual settings, outside of extreme tessellation and Depth of Field cases. You can most definitely play the majority of games at 2560x1600 with 0AA even on a GTX 285. Before, you were more or less forced to upgrade your videocard every 2 years to keep playing new games. Today, gaming hardware has far surpassed the software, which is mostly still stuck in DX9/10 land.
This implies that most games just aren't demanding enough from an artistic perspective. If games were truly demanding artistically, you wouldn't have the GPU headroom left to turn on any AA at all. Outside of Metro 2033, can you name even one game that looks better than Crysis from 2007? I can hardly come up with a single better-looking game even 3 years after its release (pretty shocking!). PC gaming graphics have stagnated, and we need another revolution (which will likely come with the next generation of consoles).
The fact that some reviewers are now pushing super-sampling in reviews and comparing cards almost exclusively at 2560x1600 only proves the point that PC gaming graphics have stagnated since Crysis, while the hardware has made gigantic leaps during the same period. I think NV and AMD are realizing this and are therefore pushing 3D gaming and multi-monitor gaming in order to sell videocards. I can't find the exact Jon Peddie article right now, but I remember reading that in the last 2-3 years the market for discrete desktop GPUs has shrunk by almost half from its peak. While I imagine the economy has taken some toll, and some users have switched from desktops to notebooks or jumped to consoles, I wouldn't be surprised if the average PC gamer now takes a lot longer to upgrade a graphics card than, say, in the 2002-2008 period.