7900GT or X1800XT


Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Originally posted by: akugami
Let's just say that with no physical mods and using only the included software, the X1800XT still comes out on top, performance- and feature-wise. Now, each card has specific games it works better with, nVidia's stronghold being OpenGL-type games. However, for the most part, it's not like each card performs terribly in games that the other will play. One has to remember that the wins and losses between the two cards are usually by a few frames. I mean, when you run the settings on high and one card only comes out on top by 2-3 frames... it's a win, but at the same time it's not that significant. What makes the X1800XT a better buy at this time is the great deals that can be had on their cards at the moment. The price and availability make ATI cards currently a much better buy.

See, this is the debatable part. As shown in previously posted benchies, the 7900GT pulls away from the 18XT when you look at several of the retail cards out there. Most of the higher-clocked retail cards perform better than the reference 512 GTX, and the Superclocked/XXX 7900GTs match the 512MB GTX XXX (the fastest 7800GTX 512). Since the 7800GTX 512 took the lead over the X1800XT by a decent margin, and the XXX really puts the screws to the XT, it's reasonable (for people with reason) to say that the 79GT at reference is on par with the 18XT, at 512 GTX speeds beats the 18XT, and at 512 GTX XXX speeds beats it by a decent margin.

As said before, while 512MB of RAM can be helpful, very few games take advantage of it. In fact, several games that seemed to show a decent advantage with the extra memory were patched and the differences disappeared.

So the X1800XT is a good bargain depending on which you choose, the 7900GT is worth the money if you get one of the many higher-clocked cards, and finally that uber deal on the X1900XT is a hard price-vs-performance ratio to beat.

My one comment that may or may not have an effect on any of these purchases is dual-card configurations. The GT allows enormous performance marks to be hit at a reasonable $600-$700 price point, whereas not only is the Crossfire configuration clumsy (sorry, I can't get over the external cables), but the CF Master card for the X1900 is still expensive and the X1800 one is almost completely removed from the market (if they were ever truly there). Going the GT route would allow for an increase in performance now, and a giant leap in a month or two that would put you above any of these other options while costing about the same as the highest performers from anybody, again with higher performance.

The 7900GT is the SLI darling that Nvidia has been trying to make since day one.

Edit: I put an X next to a T that shouldn't have been there.
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: Wreckage
Here are some benchmarks for a 7900GT vs X1800XT
LINK

I only went to the Quake benchies because I was just curious about recent ATI gains in OpenGL games. Anyone notice the problem with their results? Or at least an unlikely result when upping the resolution from 1024x768 to 1600x1200? And they couldn't get the 1900XTX to work at 1600x1200? Hmmm....

Edit: and the 7600GT is beating the 7900GT at 1024x768? A lot of weird results. Maybe they mislabeled the two cards in the 1024 graph??
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: the Chase
Originally posted by: Wreckage
Here are some benchmarks for a 7900GT vs X1800XT
LINK

I only went to the Quake benchies because I was just curious on recent ATI gains made on OPEN GL games. Anyone notice the problem with their results? Or at least an unlikely result for upping the resolution from 1024x768 to 1600x1200. And they couldn't get the 1900XTX to work at 1600x1200? HHmmm....

Edit- and the 7600GT is beating the 7900GT at 1024x768? A lot of weird results. Maybe they mislabeled the 2 cards in the 1024 graph??


Yeah, they messed up something there. The 4xAA benches look fine though, I think.
 

Exsomnis

Banned
Nov 21, 2005
428
0
0
Originally posted by: ForumMaster
7900gt ftw. it's a newer generation card and will perform better.
Hey, could you show me some benchies for Oblivion with HDR+AA enabled? So much for "newer generation." :roll:
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: ForumMaster
7900gt ftw. it's a newer generation card and will perform better.

That's a pretty vague and misleading comment. The 7900GT is newer, yes, but differs in no way as far as features go from the 7800 series. All that is different is higher clocks, which is due to the fact that the GPU die was shrunk. I can tell you from personal experience, coming from two 7800GTs overclocked to 535/1200 to a stock X1900XTX, that the Radeon does better in all of my gaming. You'll find that 7800GT SLI outperforms the 7900GT in just about... everything. I'm not saying that it won't compete well with the X1800 series, but being a newer generation does not necessarily mean it will perform better. Heck, look at Oblivion. However, in OpenGL it does have a lead, but of course that has always been there.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
If he has to give it back to his friend, I say 7900GT.

Originally posted by: DerekWilson
While the 7900 GT generally spent its time at the bottom of our high end tests, remember that it performs slightly better than a stock 7800 GTX. This puts it squarely at or better than the X1800 XL and X1800 XT.

I'd go with Derek Wilson's opinion.
Source
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
7900GT if you're overclocking, especially if you're willing to do the volt mod (yes, I did read the OP, but the results are too awesome to not at least consider).

X1800XT if you're staying stock.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: thilan29
Oh and for the people telling him to keep his current card, he already said he has to give it back to his friend.

Whoops... sorry, missed that. Umm, I would say it's a toss-up, but leaning a bit more toward the 7900GT.

 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.

That's with a hack and only in 1 game vs. thousands of games on the market.
The X1800 is too slow to use HDR+AA at any reasonable resolution.

X1900XT Crossfire gets a minimum of 35FPS at the Oblivion Gate with HDR and NO AA @ 1280x1024.
That's a little too close for comfort for me @ 1280x1024!
Don't forget this is on an FX-60.

The X1800XT by its lonesome gets under 20FPS with HDR @ 1280x1024. source If you actually run 1280x1024 (most people I know run 1680x1050 or 1920x1200), you'll be sitting at 10FPS or less with HDR+AA... possibly 5FPS.

HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX-60!

The X1800/X1900 are too slow to run HDR+AA.
You'll have to wait for the GeForce 8 or Radeon X2K if you want HDR+AA to actually be playable.
 

gobucks

Golden Member
Oct 22, 2004
1,166
0
0
I think if you overclocked the memory a bit, you'd be able to pull off 2xAA + HDR in Oblivion. ATI's memory bus is really robust, and the performance hit from enabling AA is not as severe as in previous architectures or nVidia's. I think I saw a lot of people hitting 1.7GHz-1.8GHz on the X1800XT's memory, which would certainly help in this area.

Please don't call me an ATI fanboy, I happen to prefer nVidia most of the time, but in this case, with 512MB X1800XTs being found for $250ish, the price/performance is simply too good to pass up. Also, HDR + AA is a big deal for me, since even if this game can't run HDR with any antialiasing well (which I contend it probably can with some creativity on settings and a little overclocking), there are probably plenty of other games that will be able to take advantage of it.
 

1Dark1Sharigan1

Golden Member
Oct 5, 2005
1,466
0
0
Originally posted by: Crusader
The X1800/X1900 are to slow to run HDR+AA.
Have to wait for the Geforce8 or Radeon X2K if you want HDR+AA to actually be playable.

I know for a fact that 1680x1050 is pretty much playable on my 7900GT (690/1880) with HDR enabled. Adding some AA doesn't really decrease performance by much so I couldn't see how X1900XT crossfire could not handle HDR+AA.

In any case, I'll have my X1900XT in a couple of weeks so I'll see how HDR+AA turns out then . . .
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: Crusader
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.

Thats with a hack and only in 1 game vs thousands of games on the market.
The X1800 is too slow to use HDR+AA at any reasonable resolution.

X1900XT Crossfire gets minimum FPS of 35FPS at Oblivion Gate with HDR and NO AA @1280x1024.
Thats a little to close for comfort for me @1280x1024!
Dont forget this is on a FX60.

The X1800XT by its lonesome gets under 20FPS with HDR@1280x1024. source If you actually run 1280x1024 (most people I know run 1680x1050 or 1920x1200), you'll be sitting at 10FPS or less with HDR+AA.. possibly 5FPS.

HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX60!

The X1800/X1900 are to slow to run HDR+AA.
Have to wait for the Geforce8 or Radeon X2K if you want HDR+AA to actually be playable.


Serious Sam 2, Farcry, Splinter Cell: Chaos Theory, Age of Empires 3, Oblivion (can't think of any more off the top of my head) can all do HDR + AA.

And why do you say it was a hack?? Bethesda said it wasn't possible so it wasn't built into the game. It had to be coded separately; I don't see anything wrong with that.
So NVidia PureVideo was a hack because they had to enable it via the drivers even though the hardware was present on the card??

By your logic the 6800 cards were useless because they took a massive performance hit when enabling HDR in a game like FarCry. The featureset argument was a favourite amongst NVidia fans before the X1800 series came out.

And going by the link you provided, the X1800XT beats the 7900GT so does that mean you recommend the X1800XT also??? I'd tell the OP to decide what games he's gonna use it for and the features he wants and then to decide on a card.

This time around ATI has a better featureset which is actually more playable than HDR was on the 6800 cards. Get over it. They're just video cards. Buy which one suits you and at least TRY to provide impartial information to others.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Crusader
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.

Thats with a hack and only in 1 game vs thousands of games on the market.
The X1800 is too slow to use HDR+AA at any reasonable resolution.

X1900XT Crossfire gets minimum FPS of 35FPS at Oblivion Gate with HDR and NO AA @1280x1024.
Thats a little to close for comfort for me @1280x1024!
Dont forget this is on a FX60.

The X1800XT by its lonesome gets under 20FPS with HDR@1280x1024. source If you actually run 1280x1024 (most people I know run 1680x1050 or 1920x1200), you'll be sitting at 10FPS or less with HDR+AA.. possibly 5FPS.

HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX60!

The X1800/X1900 are to slow to run HDR+AA.
Have to wait for the Geforce8 or Radeon X2K if you want HDR+AA to actually be playable.

LOL, BS! Let me play the tRollo card and ask "Do you have an X1K card to see how it performs with HDR+AA?" Didn't think so...

I play Oblivion with HDR+AA at 1280x960 on a single X1900XTX, and it's quite playable. It's so playable that I just can't force myself away from the game.
And here's a list of some other games that would be even more playable with HDR+AA:
SS2
AOE3
Farcry
<insert future TWIMTBP title here...>

HDR+AA is something that should have been supported by NV, but they took the lazy way out and did not implement it. After two refreshes of the NV40 and still the same old HDR, that featureset is just primitive for today's high-end cards.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,770
775
136
Originally posted by: thilan29
And why do you say it was a hack?? Bethesda said it wasn't possible so it wasn't built into the game. It had to be coded separately, I don't see anything wrong with that.
So NVidia PureVideo was a hack cause they had to enable it via the drivers even though the hardware was present on the card??

The "Chuck Patch" was made by someone not employed by Bethesda therefore it is a hack and is in fact not supported by Bethesda or ATI themselves. PureVideo being added later is not a hack, it's a driver update by the company that created it.


OP, since you intend to play Prey/Quake Wars, I'd say go with the 7900GT and overclock it.
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: DeathReborn
Originally posted by: thilan29
And why do you say it was a hack?? Bethesda said it wasn't possible so it wasn't built into the game. It had to be coded separately, I don't see anything wrong with that.
So NVidia PureVideo was a hack cause they had to enable it via the drivers even though the hardware was present on the card??

The "Chuck Patch" was made by someone not employed by Bethesda therefore it is a hack and is in fact not supported by Bethesda or ATI themselves. PureVideo being added later is not a hack, it's a driver update by the company that created it.


OP, since you intend to play Prey/Quake Wars i'd say go with the 7900GT & overclock it.


Saying it's a hack implies something negative, but how can the ability to add AA be classed as negative?? And regardless of what you consider it... it WORKS, doesn't it??
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Crusader
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.


HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX60!

The X1800/X1900 are to slow to run HDR+AA.
Have to wait for the Geforce8 or Radeon X2K if you want HDR+AA to actually be playable.

Really? I guess that's why I'm playing at 1280x1024, Maximum with extra visual tweaks including more grass, and 2xAAA/16xHQAF in Oblivion on an Opteron 144 + X1900XT @ stock, and getting smooth framerates that are more than playable. :disgust:

Ah, fanboys who don't know what they're talking about make my day.

If I had Crossfire, it'd be an easy 1680x1050/1600x1200, 2x/16x.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: munky
Originally posted by: Crusader
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.

Thats with a hack and only in 1 game vs thousands of games on the market.
The X1800 is too slow to use HDR+AA at any reasonable resolution.

X1900XT Crossfire gets minimum FPS of 35FPS at Oblivion Gate with HDR and NO AA @1280x1024.
Thats a little to close for comfort for me @1280x1024!
Dont forget this is on a FX60.

The X1800XT by its lonesome gets under 20FPS with HDR@1280x1024. source If you actually run 1280x1024 (most people I know run 1680x1050 or 1920x1200), you'll be sitting at 10FPS or less with HDR+AA.. possibly 5FPS.

HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX60!

The X1800/X1900 are to slow to run HDR+AA.
Have to wait for the Geforce8 or Radeon X2K if you want HDR+AA to actually be playable.

LOL, BS! Let me play the tRollo card and ask "Do you have a x1k card to see how it performs with HDR+AA?" Didnt think so...

I play Oblivion with HDR+AA at 1280x960 on a single x1900xtx, and it's quite playable. It's so playable that I just can't force myself away from the game - .
And here's a list of some other games that would be even more playable with HDR+AA:
SS2
AOE3
Farcry
<insert future TWIMTBP title here...>

HDR+AA is something that should have been supported by NV, but they took the lazy way out and did not implement it. After 2 refreshes of the nv40 and still the same old HDR, that feature set is just primitive for today's high end cards.

I'll play your game... yes, I have an X1900XT... yes, I have tried it with HDR+AA, at 1920x1080 no less. Without much elaboration, you will usually incur a 5-8fps hit with HDR+6xAA enabled; even more dramatic if you want AAA. For folks at lower resolutions, this is a great feature with an acceptable performance loss. At higher resolutions, the gain IMHO isn't worth it, as it is hard to distinguish between AA on and off unless you zoom in and look for it specifically. Would I want it on the NV71? Sure... but is it a deal breaker? Very subjective IMHO, all again dependent on what resolutions you want to play at....

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: ST
Originally posted by: munky
Originally posted by: Crusader
Originally posted by: thilan29
The featureset of the X1800XT is a bit better though.

Thats with a hack and only in 1 game vs thousands of games on the market.
The X1800 is too slow to use HDR+AA at any reasonable resolution.

X1900XT Crossfire gets minimum FPS of 35FPS at Oblivion Gate with HDR and NO AA @1280x1024.
Thats a little to close for comfort for me @1280x1024!
Dont forget this is on a FX60.

The X1800XT by its lonesome gets under 20FPS with HDR@1280x1024. source If you actually run 1280x1024 (most people I know run 1680x1050 or 1920x1200), you'll be sitting at 10FPS or less with HDR+AA.. possibly 5FPS.

HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX60!

The X1800/X1900 are to slow to run HDR+AA.
Have to wait for the Geforce8 or Radeon X2K if you want HDR+AA to actually be playable.

LOL, BS! Let me play the tRollo card and ask "Do you have a x1k card to see how it performs with HDR+AA?" Didnt think so...

I play Oblivion with HDR+AA at 1280x960 on a single x1900xtx, and it's quite playable. It's so playable that I just can't force myself away from the game - .
And here's a list of some other games that would be even more playable with HDR+AA:
SS2
AOE3
Farcry
<insert future TWIMTBP title here...>

HDR+AA is something that should have been supported by NV, but they took the lazy way out and did not implement it. After 2 refreshes of the nv40 and still the same old HDR, that feature set is just primitive for today's high end cards.

I'll play your game...yes i have a x1900xt...yes i have tried it with HDR+AA. @ 1920x1080p no less. Without much elaboration, you will usually incur a 5-8fps hit with HDR+6XAA enabled; even more dramatic if you want AAA. for folks at lower resolution, this is a great feature with acceptable performance loss. At higher resolutions, the gain imho isn't worth it, as it is hard to distinguish between AA on/off unless you zoom in and look for it specifically. Would i want it on the NV71, sure...but is it a deal breaker, very subjective IMHO all again dependent on what resolutions you want to play at....

Jaggies will occur at any resolution, be it 2560x1600, 1920x1080, or 1600x1200. Anti-aliasing is useful no matter what you say; claiming jaggies aren't noticeable and therefore HDR+AA is useless at high resolutions is just making excuses for the laziness of nVidia.

 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: Extelleron

Jaggies will occur at any resolution, be it 2560x1600, 1920x1080, or 1600x1200. Anti-Aliasing is useful no matter what you say, saying jaggies arent noticeable and therefore HDR+AA is useless at high resolutions is just making excuses for the lazyness of nVidia.

Agreed, but at higher resolutions it is diminished, and the difference in perceived quality is very subjective.

BTW, what are your "smooth" framerates with AAA on? I find it to be a huge, I mean huge, fps hit when enabled at high res... with negligible differences at that...

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: ST
Originally posted by: Extelleron

Jaggies will occur at any resolution, be it 2560x1600, 1920x1080, or 1600x1200. Anti-Aliasing is useful no matter what you say, saying jaggies arent noticeable and therefore HDR+AA is useless at high resolutions is just making excuses for the lazyness of nVidia.

Agreed, but at higher resolutions, it is diminished, and the differences in perceived quality is very subjective.

btw> what is your "smooth" framerates with AAA on? I find it to be a huge, i mean huge fps impact when enabled at high res....with neglible differences at that...

Actually I've never tested the framerate. It's smooth for the most part w/ my settings (1280x1024, max w/HDR, some optimizations to performance/visual quality) + 2xAAA/16xHQAF. I remember having it @ 4xAAA and it being slow, but that's also when I was making tons of crazy visual optimizations that killed performance and caused all kinds of glitches. Anyway, it's not "smooth as butter", there are a few slowdowns in the heavily forested/grass areas, but nothing like the 360 version if you've played that.

What Crusader said is completely wrong, X1900 Crossfire would plow through even 1600x1200 4x/16x without a problem.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: Extelleron
Originally posted by: ST
Originally posted by: Extelleron

Jaggies will occur at any resolution, be it 2560x1600, 1920x1080, or 1600x1200. Anti-Aliasing is useful no matter what you say, saying jaggies arent noticeable and therefore HDR+AA is useless at high resolutions is just making excuses for the lazyness of nVidia.

Agreed, but at higher resolutions, it is diminished, and the differences in perceived quality is very subjective.

btw> what is your "smooth" framerates with AAA on? I find it to be a huge, i mean huge fps impact when enabled at high res....with neglible differences at that...

Actually I've never tested the framerate. It's smooth for the most part w/ my settings (1280x1024, max w/HDR, some optimizations to performance/visual quality) + 2xAAA/16xHQAF. I remember having it @ 4xAAA and it being slow, but that's also when I was making tons of crazy visual optimizations that killed performance and caused all kinds of glitches. Anyway, it's not "smooth as butter", there are a few slowdowns in the heavily forested/grass areas, but nothing like the 360 version if you've played that.

What Crusader said is completely wrong, X1900 Crossfire would plow through even 1600x1200 4x/16x without a problem.


Without having much experience with CrossFire, I will say that on my X1900XT I don't have many issues at 1920x1080 with HDR+6xAA+8xHQAF in outdoor areas (tweaked to ALL high image settings), although it is a little slower (~20fps average), so I would suspect it would be fine.

Edit: hmm, I might try comparing IQ at lower res (1280x1024) with and without AA and HDR enabled... looks to be something most people utilize.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: ST
Originally posted by: Extelleron
Originally posted by: ST
Originally posted by: Extelleron

Jaggies will occur at any resolution, be it 2560x1600, 1920x1080, or 1600x1200. Anti-Aliasing is useful no matter what you say, saying jaggies arent noticeable and therefore HDR+AA is useless at high resolutions is just making excuses for the lazyness of nVidia.

Agreed, but at higher resolutions, it is diminished, and the differences in perceived quality is very subjective.

btw> what is your "smooth" framerates with AAA on? I find it to be a huge, i mean huge fps impact when enabled at high res....with neglible differences at that...

Actually I've never tested the framerate. It's smooth for the most part w/ my settings (1280x1024, max w/HDR, some optimizations to performance/visual quality) + 2xAAA/16xHQAF. I remember having it @ 4xAAA and it being slow, but that's also when I was making tons of crazy visual optimizations that killed performance and caused all kinds of glitches. Anyway, it's not "smooth as butter", there are a few slowdowns in the heavily forested/grass areas, but nothing like the 360 version if you've played that.

What Crusader said is completely wrong, X1900 Crossfire would plow through even 1600x1200 4x/16x without a problem.


Without having much experiences with XFire, I will say that on my x1900xt, i dont have much issues at 1920x1080p HDR+6XAA+8XHQAF in outdoors area (tweaked to ALL high image settings), although it is a little slower (~20fps average), so i would suspect it would be fine.

edit: hmm i might try comparing IQ at lower res (1280x1024) with and without AA and HDR enabled....looks to be something most people utilize.

So you're saying it's playable (~20FPS; Oblivion as an RPG doesn't demand high FPS) @ 1920x1080, max settings with HDR, and 6xAA/8xHQAF? I find that amazing, even if it is 20FPS. I would have thought Crossfire would be required to even consider using resolutions that high with HDR+AA.

And yes, I think most people use 1280x1024 with Oblivion at least. Unless you have an X1800/X1900 or 7800GTX/7900, playing at higher than 1280x1024 with high settings is impossible.

 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Extelleron
Actually I've never tested the framerate. It's smooth for the most part w/ my settings (1280x1024, max w/HDR, some optimizations to performance/visual quality) + 2xAAA/16xHQAF. I remember having it @ 4xAAA and it being slow, but that's also when I was making tons of crazy visual optimizations that killed performance and caused all kinds of glitches. Anyway, it's not "smooth as butter", there are a few slowdowns in the heavily forested/grass areas, but nothing like the 360 version if you've played that.

What Crusader said is completely wrong, X1900 Crossfire would plow through even 1600x1200 4x/16x without a problem.

Not according to AnandTech's results at the Oblivion Gate.
35FPS minimum at the OG @ 1280x1024 with HDR + no AA. Bump up the res by a large margin (1600x1200 as you stated), add in AA, and you are pooped out.
Sure, you can lag around with an XTX Crossfire rig + FX-60... but it's not worth the cash necessary, and it's certainly not possible with a single X1800XT!

The OP needs the 7900GT... it's the better card considering everything: not as loud/hot, and it performs the same or better according to the ultimate authority around here, Derek Wilson. The 7900 also overclocks extremely well, especially with the volt mod, so there's more potential there. Not to mention it's a single-slot solution (ATI kids love single-slot solutions, right? At least they used to!)
The X1800XT is too slow to run HDR+AA at any high resolution, and if you intend to, you'd better pair it with a very fast A64 because that old card needs all the help it can get.
 