"This game alone gives players without high end monitors a real reason to justify saving up for a 7800 GTX. Those who want to play FEAR at the highest resolution and settings with AA enabled (without soft shadows) will basically have to use the 7800 GTX, as no other card available gets playable framerates at those settings, and the 7800 GTX does just barely (if uncomfortably)."
I took that from the F.E.A.R. performance review on the front page. I think it sucks that the 7800 GTX, the absolute best card out there, is brought to its knees at 1600x1200 with AA/AF on. This card is what, three or four months old? Many have been saying that none of today's games would even trouble this card at any setting. Well, that view was correct for all of three months. A CPU would never be brought to its knees by a game in such a short amount of time. Buy an Athlon 64 FX-57 or X2 4800+ today, and three years from now you'll still be playing games on it! Hell, my Pentium 4 1.7 GHz lasted me that long; all I needed to upgrade was the graphics card. With the healthy competition between NVIDIA and ATI, why can't they release something that stays future-proof for at least a little longer? I can accept that any card might start to stumble at some ridiculous resolution/settings like 2048x2048x32 with 8xAA and 16xAF, but the settings F.E.A.R. was run at... it doesn't impress me one bit.
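To put some rough numbers on why those settings hurt so much, here's a back-of-the-envelope sketch of how raw per-frame pixel work scales with resolution and multisampling. The resolutions, sample counts, and frame rate are my own illustrative assumptions, not figures from the review, and this ignores shader cost, memory bandwidth, and everything else that actually varies between games:

```python
def pixels_per_second(width, height, msaa_samples, fps):
    """Rough fill-rate demand: each frame touches width*height pixels,
    and MSAA multiplies the samples stored and resolved per pixel."""
    return width * height * msaa_samples * fps

# Hypothetical comparison: a modest no-AA mode vs. 1600x1200 with 4xAA,
# both targeting 60 fps.
base = pixels_per_second(1024, 768, 1, 60)
high = pixels_per_second(1600, 1200, 4, 60)
print(f"1600x1200 4xAA needs ~{high / base:.1f}x the pixel work of 1024x768 no-AA")
```

So even before any shader or engine overhead, cranking resolution and AA together roughly decuples the pixel throughput the card has to sustain, which goes some way toward explaining why a brand-new flagship can still choke there.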
Sure, game programmers should use graphics cards' most up-to-date features; it's just that whenever they do, it brings the card to its knees, and it takes a whole new generation to do what the older one couldn't at an acceptable framerate. Why is this? Do NVIDIA and ATI suck? Are graphics cards far more complex than CPUs to design and make future-proof? Are games becoming too graphically impressive too fast? Are newer game engines incredibly inefficient, delivering only a marginal increase in image quality? What am I missing here?
Short version of rant: why do graphics cards go out of date and get brought to their knees so soon? It sucks! No other component becomes obsolete as fast as a graphics card. By obsolete I mean no longer useful for its originally intended purpose.