Originally posted by: akugami
Let's just say that with no physical mods and using only the included software, the X1800XT still comes out on top, both performance- and feature-wise. Each card has specific games it works better with, nVidia's stronghold being OpenGL titles, but for the most part it's not as if either card performs terribly in games that favor the other. Remember that the wins and losses between the two cards are usually by a few frames; when you run the settings on high and one card comes out on top by only 2-3 frames, it's a win, but not a significant one. What makes the X1800XT the better buy at this time is the great deals to be had on ATI cards right now; the price and availability currently make them a much better value.
Originally posted by: the Chase
I only went to the Quake benchies because I was curious about recent ATI gains in OpenGL games. Anyone notice the problem with their results? Or at least an unlikely result when upping the resolution from 1024x768 to 1600x1200. And they couldn't get the 1900XTX to work at 1600x1200? Hmmm....
Edit- and the 7600GT is beating the 7900GT at 1024x768? A lot of weird results. Maybe they mislabeled the two cards in the 1024 graph??
Originally posted by: ForumMaster
7900gt ftw. it's a newer generation card and will perform better.
Hey, could you show me some benchies for Oblivion with HDR+AA enabled? So much for "newer generation." :roll:
Originally posted by: DerekWilson
While the 7900 GT generally spent its time at the bottom of our high-end tests, remember that it performs slightly better than a stock 7800 GTX. This puts it squarely at or above the X1800 XL and X1800 XT.
Originally posted by: thilan29
Oh and for the people telling him to keep his current card, he already said he has to give it back to his friend.
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.
Originally posted by: Crusader
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.
That's with a hack, and only in 1 game vs thousands of games on the market.
The X1800 is too slow to use HDR+AA at any reasonable resolution.
X1900XT Crossfire gets a minimum of 35FPS at an Oblivion Gate with HDR and NO AA @1280x1024.
That's a little too close for comfort for me @1280x1024!
Don't forget this is on an FX60.
The X1800XT by its lonesome gets under 20FPS with HDR @1280x1024. source If you actually run 1280x1024 (most people I know run 1680x1050 or 1920x1200), you'll be sitting at 10FPS or less with HDR+AA... possibly 5FPS.
HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX60!
The X1800/X1900 are too slow to run HDR+AA.
You'll have to wait for the GeForce 8 or Radeon X2K if you want HDR+AA to actually be playable.
Originally posted by: thilan29
Originally posted by: Crusader
That's with a hack, and only in 1 game vs thousands of games on the market. <snip>
And why do you say it was a hack?? Bethesda said it wasn't possible, so it wasn't built into the game. It had to be coded separately; I don't see anything wrong with that.
So was NVidia PureVideo a hack because they had to enable it via the drivers, even though the hardware was present on the card??
Originally posted by: DeathReborn
Originally posted by: thilan29
And why do you say it was a hack?? <snip>
The "Chuck Patch" was made by someone not employed by Bethesda therefore it is a hack and is in fact not supported by Bethesda or ATI themselves. PureVideo being added later is not a hack, it's a driver update by the company that created it.
OP, since you intend to play Prey/Quake Wars i'd say go with the 7900GT & overclock it.
Originally posted by: munky
Originally posted by: Crusader
The X1800/X1900 are too slow to run HDR+AA. <snip>
LOL, BS! Let me play the tRollo card and ask "Do you have an X1K card to see how it performs with HDR+AA?" Didn't think so...
I play Oblivion with HDR+AA at 1280x960 on a single X1900XTX, and it's quite playable. It's so playable that I just can't force myself away from the game.
And here's a list of some other games that would be even more playable with HDR+AA:
SS2
AOE3
Farcry
<insert future TWIMTBP title here...>
HDR+AA is something that should have been supported by NV, but they took the lazy way out and did not implement it. After two refreshes of the NV40, it's still the same old HDR; that featureset is just primitive for today's high-end cards.
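(For context: the missing feature here is multisampling on FP16 render targets, the format this style of HDR renders into. Below is a minimal Direct3D 9 sketch of the relevant capability check; CheckDeviceMultiSampleType and D3DFMT_A16B16G16R16F are the standard D3D9 API, while the scaffolding around the call is illustrative only. On NV4x/G7x-era hardware this check fails for FP16 surfaces, which is why HDR and MSAA are mutually exclusive there.)

#include <d3d9.h>
#include <cstdio>

int main() {
    // Create the D3D9 object so we can query device capabilities.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask whether the default adapter can multisample a 64-bit FP16
    // (A16B16G16R16F) render target -- the surface format HDR uses.
    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // FP16 HDR render-target format
        FALSE,                     // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &quality);

    std::printf("FP16 + 4x MSAA: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}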
Originally posted by: ST
Originally posted by: munky
LOL, BS! Let me play the tRollo card and ask "Do you have an X1K card to see how it performs with HDR+AA?" Didn't think so... <snip>
I'll play your game... yes, I have an X1900XT... yes, I have tried it with HDR+AA. At 1920x1080, no less. Without much elaboration: you will usually incur a 5-8fps hit with HDR+6xAA enabled, and an even more dramatic one if you want AAA. For folks at lower resolutions, this is a great feature with an acceptable performance loss. At higher resolutions the gain IMHO isn't worth it, as it is hard to distinguish between AA on/off unless you zoom in and look for it specifically. Would I want it on the G71? Sure... but is it a deal breaker? Very subjective, IMHO, and again all dependent on what resolutions you want to play at...
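(A rough back-of-envelope suggests why the hit grows with resolution. Assuming a straightforward multisampled FP16 color buffer at 8 bytes per pixel and N samples, and ignoring any compression:

\text{color buffer} = W \cdot H \cdot 8 \cdot N = 1920 \cdot 1080 \cdot 8 \cdot 6 \approx 95\ \text{MB}

plus a multisampled Z/stencil buffer of roughly half that again at 4 bytes per sample, all of it written and resolved every frame.)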
Originally posted by: Extelleron
Jaggies will occur at any resolution, be it 2560x1600, 1920x1080, or 1600x1200. Anti-aliasing is useful no matter what you say; claiming jaggies aren't noticeable, and that HDR+AA is therefore useless at high resolutions, is just making excuses for nVidia's laziness.
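(For illustration: with N-sample MSAA, the resolve step averages the coverage samples in each pixel, so a pixel straddling an edge with k samples covered by the foreground comes out as an intermediate shade rather than a hard step:

c_{\text{pixel}} = \frac{k}{N}\,c_{\text{fg}} + \frac{N-k}{N}\,c_{\text{bg}}

Higher resolution shrinks the staircase but never removes it; only more samples per pixel smooth it out.)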
Originally posted by: ST
Originally posted by: Extelleron
Jaggies will occur at any resolution... <snip>
Agreed, but at higher resolutions it is diminished, and the difference in perceived quality is very subjective.
btw> what are your "smooth" framerates with AAA on? I find it to be a huge, I mean huge, fps hit when enabled at high res... with negligible differences at that...
Originally posted by: Extelleron
Originally posted by: ST
btw> what are your "smooth" framerates with AAA on? <snip>
Actually, I've never tested the framerate. It's smooth for the most part w/ my settings (1280x1024, max w/ HDR, some optimizations to performance/visual quality) + 2xAAA/16xHQAF. I remember having it @ 4xAAA and it being slow, but that's also when I was making tons of crazy visual optimizations that killed performance and caused all kinds of glitches. Anyway, it's not "smooth as butter"; there are a few slowdowns in the heavily forested/grassy areas, but nothing like the 360 version, if you've played that.
What Crusader said is completely wrong; X1900 Crossfire would plow through even 1600x1200 4x/16x without a problem.
Originally posted by: ST
Originally posted by: Extelleron
What Crusader said is completely wrong; X1900 Crossfire would plow through even 1600x1200 4x/16x without a problem. <snip>
Without having much experience with XFire, I will say that on my X1900XT I don't have many issues at 1920x1080 with HDR+6xAA+8xHQAF in outdoor areas (tweaked to ALL high image settings), although it is a little slower (~20fps average), so I would suspect Crossfire would be fine.
edit: hmm, I might try comparing IQ at lower res (1280x1024) with and without AA and HDR enabled... looks to be something most people utilize.