What's happening is we're seeing the stunting of a generation of consumers. I think Nvidia is ahead of the game in recognizing this. They make less robust hardware than AMD and then lock it down so it can't be tweaked beyond its narrow specification margins, and make up the difference with marketing. You can see it in the idiots who post here who would rather talk about a company's financials (which they, as consumers, are being taken advantage of to improve) than the strengths of its products (which should be the consumer's focus).
It's unfortunate, because the enthusiast communities aren't what they were a decade ago when I got into computers. There are a lot more ignorant fanboys sh*tposting all day and less talk about new tech, mods, or other interesting aspects of the hobby. In any case, this is the direction the community chose and what it has become. Shame, really.
In many ways, I think this is what integration and commoditization have wrought. I used to really like tinkering, because there was always something that could be done a bit better, a bit faster, a bit quieter, a bit cheaper, etc. There's just not as much of that that needs doing anymore, and what remains is far easier to accomplish. Keeping my Athlon XP overclocked and quiet, for instance, was a real accomplishment (I got an FX 5900 successfully cooled with a Zalman ZM80, which was pretty awesome at the time!). It took time, effort, planning, testing, note-taking, logging temperatures, etc. Today, it takes careful part selection and $100-200 more of a build budget. It's just not as interesting anymore, because while you can DIY some of it, there's no real need, and you'll end up spending as much or more than using COTS parts.
Back then, if you wanted more performance, you had to really push the hardware. The best hardware of the day couldn't get good enough visuals with current games maxed out. Half the first- and third-person games were maybe 75% of what Crysis was, for its time.
Back then, there were major differences in image quality (IQ) between games and hardware, to the point that you could tell the brand of card someone used from a shrunken screenshot if it had AA and/or AF enabled, and sometimes even without that, based on differences in mipmap and depth handling.
If you wanted to quiet things down, that took real effort, too, not unlike trying to squeeze out the last drop of performance. Also, mildly overclocked Nehalems are almost as fast as Ivy Bridges (i.e., why spend $500+ on an upgrade when all you'll get is the gains you'd see from overclocking the new CPU anyway?). You couldn't get more than 1-2 extra years out of overclocking a CPU back in 1998, 2000, 2002, or 2004, and 2006 only worked out if you got your timing just right, with fast RAM and a nice mobo.
Today, you can easily and cheaply overclock, get big coolers, get cases with good airflow, etc., and you can typically just pay more for more performance. Today, everything works. In yesteryear, though, hardware was indeed far more interesting. Notice how the backlash against limiting overclocking via overvolting restrictions has been barely visible, even on hardware and gaming forums.
"Good enough" can look like a vulture, with the right perspective, I guess.