Concillian
Diamond Member
- May 26, 2004
Originally posted by: Soviet
Short version of rant: Why do graphics cards go out of date and get brought to their knees so soon? It sucks! No other component becomes obsolete as fast as a graphics card. By obsolete I mean no longer useful for its original intended purpose.
It's a vicious cycle:
1) A card is introduced that can play all new games VERY well
2) Games add new features to take advantage of said card
3) Gamers enable said features, making such a card a necessity
4) A new baseline exists
The reality is that games without ALL the eye candy still look damn good. Gamers could be happy with lower res/quality, but they aren't.
CPUs don't become obsolete the same way because the fastest CPU is, what, about 50% faster than the slowest modern CPU? The fastest video card is about 700% faster than the slowest modern video card: a 7800 GTX has 24 pipes at ~450 MHz, while a 6200 has 4 pipes at 350 MHz.
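As a rough sanity check on that 700% figure, here's a back-of-the-envelope sketch treating pipes times core clock as a crude proxy for fillrate (this ignores memory bandwidth, architecture differences, etc., and uses the clock speeds quoted above):

```python
# Crude performance proxy: pixel pipelines x core clock (MHz).
# Real performance also depends on memory bandwidth, drivers, etc.
gtx_7800 = 24 * 450   # 7800 GTX: 24 pipes at ~450 MHz (as quoted above)
gf_6200 = 4 * 350     # 6200: 4 pipes at 350 MHz

ratio = gtx_7800 / gf_6200
print(f"{ratio:.1f}x")  # about 7.7x, i.e. roughly 670% faster
```

So "about 700% faster" holds up, at least by this crude pipe-count math.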
A video card is only as slow as the settings you use. I don't blame game designers for implementing features that modern video cards can barely handle, not if they want their game to have a usable life beyond the current generation of cards. Many modern games scale performance quite well, but gamers often refuse to run anything other than max settings, even though there are usually a few settings you can lower for a considerable performance gain at minimal quality loss.
If a game developer doesn't include features that the best graphics cards can take advantage of, gamers will chide them for leaving those features out. They're pretty much forced to include the features and then add scalability for lower-performing cards.