7900GT or X1800XT


Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Crusader
Originally posted by: Extelleron
Actually I've never tested the framerate. It's smooth for the most part w/ my settings (1280x1024, max w/HDR, some optimizations to performance/visual quality) + 2xAAA/16xHQAF. I remember having it @ 4xAAA and it being slow, but that's also when I was making tons of crazy visual optimizations that killed performance and caused all kinds of glitches. Anyway, it's not "smooth as butter", there are a few slowdowns in the heavily forested/grass areas, but nothing like the 360 version if you've played that.

What Crusader said is completely wrong, X1900 Crossfire would plow through even 1600x1200 4x/16x without a problem.

Not according to AnandTech's results at Oblivion Gate.
35FPS minimum at OG @ 1280x1024 with HDR + no AA. Bump up the res by a large margin (1600x1200 as you stated), add in AA and you are pooped out.
Sure you can lag around with an XTX Crossfire rig + FX60.. but it's not worth the cash necessary and certainly not possible with a single X1800XT!

The OP needs the 7900GT.. it's the better card considering everything, not as loud/hot, and it performs the same or better according to the ultimate authority around here, Derek Wilson. The 7900 also overclocks extremely well, especially with a volt mod, so there's more potential there. Not to mention it's a single-slot solution (ATI kids love single-slot solutions right? At least they used to!)
X1800XT is too slow to run HDR+AA at any high resolution and if you intend to, better pair it with a very fast A64 cuz that old card needs all the help it can get.

Yes, but there is something you forget: the Oblivion gate test here at AnandTech is one of those "worst-case scenario" things. Nowhere else in the game will you get performance THAT bad. Most of the time it'll be running at 35-40FPS on those settings with a single card. With Crossfire, you can pretty easily enable 1600x1200/1680x1050 with some AA/AF, probably even 4xAAA/16xHQAF, and still get decent performance.

And finally, that's the MINIMUM framerate, not the average. The average is 46. Last I checked, 46 FPS was very, very good. In a game such as Oblivion, as long as the average is above ~25FPS and the minimum above ~20, you're fine.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: Extelleron
So you're saying it's playable (~20FPS; Oblivion as an RPG doesn't demand high FPS) @ 1920x1080, max settings with HDR, and 6xAA/8xHQAF? I find that amazing, even if it is 20 FPS. I would have thought Crossfire would be required to even consider using resolutions that high w/ HDR+AA.

And yes, I think most people use 1280x1024 with Oblivion at least. Unless you have X1800/1900 or 7800GTX/7900, playing @ higher than 1280x1024 with high settings is impossible.

Yes, these are some of the results I have had on my system (Opty 165 @ 3.0GHz, 2GB RAM, 1.5TB RAID 0) in a specific scenario (a FRAPS run of ~5-10 minutes through outdoor scenery)... it can be choppy at times though (min frame rates can get quite low), but it is not an FPS game as you stated. My initial observations will be posted in a new thread, albeit with 1920x1080 results of my setup in Oblivion first. btw> this was on an OC'd X1900XT (655/790)... I don't recollect the fps at stock settings off hand.

btw> While it is true you can get acceptable fps with HDR+AA+AF, enabling AAA will yield a huge performance impact (half the normal framerate at least), so that might be pushing it.
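If anyone wants to pull min/avg numbers out of a FRAPS frametimes log themselves, here is a rough Python sketch. It assumes the log is the usual two-column export (frame index, cumulative time in milliseconds) and it approximates "minimum FPS" as the slowest full one-second bucket; both of those are assumptions on my part, so adjust it to match your own log.

# fraps_stats.py -- rough average and worst-second FPS from a FRAPS frametimes log.
# Assumes each row is "frame index, cumulative time in milliseconds".
import csv
import sys

def load_timestamps(path):
    """Return cumulative millisecond timestamps, one per rendered frame."""
    stamps = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if len(row) >= 2:
                try:
                    stamps.append(float(row[1]))
                except ValueError:
                    pass  # skip the header or any stray non-numeric row
    return stamps

def fps_stats(stamps):
    """Average FPS over the whole run plus the slowest full one-second bucket."""
    total_s = (stamps[-1] - stamps[0]) / 1000.0
    avg_fps = (len(stamps) - 1) / total_s
    buckets = {}
    for t in stamps:
        sec = int(t // 1000)
        buckets[sec] = buckets.get(sec, 0) + 1
    seconds = sorted(buckets)
    if len(seconds) > 2:
        seconds = seconds[1:-1]  # drop the partial first/last seconds
    min_fps = min(buckets[s] for s in seconds)
    return avg_fps, min_fps

if __name__ == "__main__":
    stamps = load_timestamps(sys.argv[1])
    avg_fps, min_fps = fps_stats(stamps)
    print("average: %.1f fps, worst second: %d fps" % (avg_fps, min_fps))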
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: 5150Joker
Of course AAA will give a huge perf. impact in Oblivion, there's an assload of grass outdoors.

Aye, but even with 2xAAA and some visual tweaks to make everything look better, including more grass, I'm still running fine.
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
at times, i feel like my SLI setup is inadequate,
but i'm using 1680x1050 with HDR and tweaks in the .ini file.
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Crusader, you conveniently skipped Munky's posts and mine??

You're going on and on about ONE level in Oblivion and the MINIMUM framerate (the average is pretty respectable). You're telling us the X1800/X1900 suck when in fact they do better than the 7900GT, and then you turn around and say get the 7900GT. If you're gonna tell us that the X1K cards suck then at least don't use a benchmark that they WIN.

I know you brought up that benchmark to show something about HDR performance, but as several people have pointed out in this thread (including ST, who has both a 7900GT and an X1900XT), it's very playable even with HDR + AA. If you don't mind I'd like a reply to the post I made on the previous page.

I think the big framerate killer is the shadows/self shadows. After turning them off I get consistently above 25fps at 1280x1024 with HDR, with all the detail other than shadows turned up, plus some of the texture mods and tweaks.
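For anyone who wants to try the same thing, these are the kinds of Oblivion.ini entries the shadow tweak guides usually point at. I'm going from memory of those guides, so treat the section and key names as assumptions and check them against your own ini; most of these also have equivalents in the in-game video options.

[Display]
; illustrative values only - verify these keys exist in your own Oblivion.ini
bActorSelfShadowing=0
iActorShadowCountExt=0
iActorShadowCountInt=0
; grass shadows are another big outdoor framerate sink
bShadowsOnGrass=0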
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Crusader
That's with a hack and only in 1 game vs thousands of games on the market.
The X1800 is too slow to use HDR+AA at any reasonable resolution.

my native res is 1440x900 (widescreen). how is that not a "reasonable resolution"? 2xAAA and concurrent HDR has lows around 30; with 4xAAA it's in the low 20s - and that's just in certain outdoor areas; most of the time it's 30+. that's with 16x HQAF. turning off adaptive increases it a couple of fps, as does turning down AF or turning HQ mode off. there are plenty of ways to adjust for performance given one's personal taste.

and frankly, 1280 res is by far the most common.

Originally posted by: Crusader
X1900XT Crossfire gets a minimum of 35FPS at Oblivion Gate with HDR and NO AA @ 1280x1024.
That's a little too close for comfort for me @ 1280x1024!
Don't forget this is on an FX60.

The X1800XT by its lonesome gets under 20FPS with HDR @ 1280x1024 (source). If you actually run 1280x1024 (most people I know run 1680x1050 or 1920x1200), you'll be sitting at 10FPS or less with HDR+AA.. possibly 5FPS.

it's certainly never hit below about 19 fps (measured w/ fraps), and again, that's in certain outdoor areas. most of the time it's nearer to 30fps than 20.

Originally posted by: Crusader
HDR+AA is simply unplayable in intensive areas of the game with these cards, even Crossfire X1900XTX + FX60!

i've yet to run into any situation that's "unplayable".

Originally posted by: Crusader
The X1800/X1900 are too slow to run HDR+AA.
Have to wait for the GeForce 8 or Radeon X2K if you want HDR+AA to actually be playable.

while we would certainly have to wait for future hardware to run it at a consistent 30+fps, it's still quite usable now.

at any rate, if you are bummed because you can't do it, and would rather hold to the opinion that it's useless, that's fine. it's certainly your prerogative, but you shouldn't try to justify your opinion and force it on others as fact when you have not had any experience with it whatsoever.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Extelleron
Ah, fanboys who don't know what they're talking about make my day.

LOL! your day is made just by looking in the mirror when you wake up :laugh:

Quake Wars and Prey will play better on a 7900GT. Just look at any Quake 4 benchmark (Quake Wars uses a modified Quake 4 engine).

Besides, a factory-overclocked XFX will equal or top an X1800, so it's an easy choice to take the nice, cool, single-slot GT. :thumbsup:
 

Alaa

Senior member
Apr 26, 2005
839
8
81
Originally posted by: nguyen1025
I don't play Oblivion but I play a lot of FPS games and plan on playing Quake Wars/Prey.

For all the people who are talking about Oblivion: please read the OP first.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Wreckage
Originally posted by: Extelleron
Ah, fanboys who don't know what they're talking about make my day.

LOL! your day is made just by looking in the mirror when you wake up :laugh:

Quake Wars and Prey will play better on a 7900GT. Just look at any Quake 4 benchmark (Quake Wars uses a modified Quake 4 engine).

Besides, a factory-overclocked XFX will equal or top an X1800, so it's an easy choice to take the nice, cool, single-slot GT. :thumbsup:

umm.. according to your own benchmarks linked above, even the XXX edition (560MHz core) is even with the X1800XT @ 1600 with AA/AF (at lower res the XT is a nudge faster). how does that equate to "will play better on a 7900GT"?

while in some cases the single slot may be an advantage, i'd hardly say the GT is quieter (the GTX certainly is, but the GT comes with a crap cooler).

even then, Prey and Quake Wars are not the only games in town (Oblivion being the hottest at this time, and it runs significantly better on the XT, which also offers more features), and the textures would arguably look much better on the XT due to the superior AF mode it offers.

not knocking the GT at all as it's certainly a very good value, but it seems to me the choice is not quite so clear as you'd like people to believe, and largely depends on what's most important for the individual who is buying it.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: CaiNaM
while in some cases the single slot may be an advantage, i'd hardly say the GT is quieter (the GTX certainly is, but the GT comes with a crap cooler).
The GT's cooler is still worlds better than the XT's. EVGA has a new cooler out that is quieter and cooler still.
even then, Prey and Quake Wars are not the only games in town (Oblivion being the hottest at this time, and it runs significantly better on the XT, which also offers more features), and the textures would arguably look much better on the XT due to the superior AF mode it offers.
Oblivion is just ONE game and is a sad glimmer of hope to hang on to for some people here. I look at overall performance in all games as I tend to play as many as possible. Look at Pacific Fighters, Chronicles of Riddick, Black & White 2, etc. If you blow $300+ on a video card just for one game, well, I wish I had your money.
not knocking the GT at all as it's certainly a very good value, but it seems to me the choice is not quite so clear as you'd like people to believe, and largely depends on what's most important for the individual who is buying it.

The OP stated Quake Wars & Prey, so the choice becomes pretty clear.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Wreckage
The GT's cooler is still worlds better than the XT's. EVGA has a new cooler out that is quieter and cooler still.
they are within a few decibels of each other.

nice to hear though that EVGA will come out with a better one.

Originally posted by: Wreckage
Oblivion is just ONE game and is a sad glimmer of hope to hang on to for some people here. I look at overall performance in all games as I tend to play as many as possible. Look at Pacific Fighters, Chronicles of Riddick, Black & White 2, etc. If you blow $300+ on a video card just for one game, well, I wish I had your money.

but that's not really the issue here. you stated the GT was much better for Q4-based games, when in fact that's false, as your own (linked) benchmarks showed the 560MHz XXX edition (which, incidentally, isn't even available on Newegg at this time, though the slightly slower 550MHz version is, for $339 - considerably more than the 512MB XT) basically caught up to the XT.

and why would you choose to ignore the games in which the XT is faster and list only the ones that support your view? list them all if you're going to list them.

Originally posted by: Wreckage
The OP stated Quake Wars & Prey, so the choice becomes pretty clear.

since you conveniently ignored the question completely the first time, i'll ask again: why? when your own examples show the (more expensive and harder to find) 560MHz XXX edition only catches up to the XT in Q4, offers an inferior-quality AF method, and costs considerably more?

how is that "clear"?

 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: 1Dark1Sharigan1
Originally posted by: Wreckage
Look at Pacific Figheters, Chronicles of Riddick, Black & White 2, etc.

What about COD2, F.E.A.R., BF2, FarCry, etc.?

heh.. yes, funny examples to use when he made a big deal about the OP playing FPS games - so why do PF, BW2, and CoR matter? aren't CoD2, FEAR, BF2, and FarCry FPS games? lol...

don't get me wrong, i think the GT would be a fine choice; it's just funny to see how fanbois offer such one-sided examples just so "their" card appears better
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: CaiNaM
Originally posted by: 1Dark1Sharigan1
Originally posted by: Wreckage
Look at Pacific Figheters, Chronicles of Riddick, Black & White 2, etc.

What about COD2, F.E.A.R., BF2, FarCry, etc.?

heh.. yes, funny examples to use when he made a big deal about the OP playing FPS games - so why do PF, BW2, and CoR matter? aren't CoD2, FEAR, BF2, and FarCry FPS games? lol...

don't get me wrong, i think the GT would be a fine choice; it's just funny to see how fanbois offer such one-sided examples just so "their" card appears better

Exactly.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: DeathReborn
The "Chuck Patch" was made by someone not employed by Bethesda, therefore it is a hack and is in fact not supported by Bethesda or ATI themselves. PureVideo being added later is not a hack; it's a driver update by the company that created it.


I guess every beta driver from nvzone used in reviews is a hack as well then, according to your logic. The fact is, anyone can download it from ATi, and it will be in a future Cat release. Then all the NV fans won't have an argument. ATi wouldn't have had to do this if Bethesda wasn't either A) too lazy, B) incompetent, or C) paid off.

OP, for a good review, check out FS's: http://www.firingsquad.com/hardware/powercolor_radeon_x1800_gto/page4.asp It's a GTO review, but it has the 7900GT and X1800XT in it as well. The 512MB XT is $255 shipped at Newegg right now. Faster, cheaper, and better IQ? Easy choice to me.

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: CaiNaM
since you conveniently ignored the question completely the first time, i'll ask again: why? when your own examples show the (more expensive and harder to find) 560MHz XXX edition only catches up to the XT in Q4, offers an inferior-quality AF method, and costs considerably more?

how is that "clear"?
At high resolutions the GT was ahead. That's pretty clear. ATI's AF method does not apply to OpenGL now does it? With the coupon expiring soon on the XT your cost argument will also be moot. With better AA, better drivers, better OpenGL, better H.264 support, better HSF, better Linux support, Better AA, etc. The choice is again, very clear.

 

Elfear

Diamond Member
May 30, 2004
7,126
738
126
Originally posted by: Wreckage

At high resolutions the GT was ahead. That's pretty clear. ATI's AF method does not apply to OpenGL now does it? With the coupon expired on the XT your cost argument is also moot. With better AA, better drivers, better OpenGL, better H.264 support, better HSF, better Linux support, Better AA, etc. The choice is again, very clear.

I don't have the time to look up benchmarks to disprove your argument, but the bolded statement is rather easy to dispel. Try putting the promo code in before you start posting about it being dead.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Wreckage
Originally posted by: CaiNaM
since you conveniently ignored the question completely the first time, i'll ask again: why? when your own examples show the (more expensive and harder to find) 560MHz XXX edition only catches up to the XT in Q4, offers an inferior-quality AF method, and costs considerably more?

how is that "clear"?
At high resolutions the GT was ahead. That's pretty clear. ATI's AF method does not apply to OpenGL now does it? With the coupon expired on the XT your cost argument is also moot. With better AA, better drivers, better OpenGL, better H.264 support, better HSF, better Linux support, Better AA, etc. The choice is again, very clear.

The coupon is still working, last I tried. You can still get an X1800XT 512MB for $236, when you'd be lucky to get a GT at all, and if you could, you'd get one well above MSRP ($300+; good XFX/EVGA ones, ~$350). And you're right, I agree. The choice is again, very clear: the X1800XT is much, much cheaper (by nearly $100), has better IQ, and has equal performance, sometimes better. Heck, with the cost of the 7900GT, the X1900XT can almost compete with it in price, and don't try to argue the 7900GT is even half the card the X1900 is.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Elfear
Originally posted by: Wreckage

At high resolutions the GT was ahead. That's pretty clear. ATI's AF method does not apply to OpenGL now does it? With the coupon expired on the XT your cost argument is also moot. With better AA, better drivers, better OpenGL, better H.264 support, better HSF, better Linux support, Better AA, etc. The choice is again, very clear.

I don't have the time to look up benchmarks to disprove your argument, but the bolded statement is rather easy to dispel. Try putting the promo code in before you start posting about it being dead.

I rescind the statement. I was repeating information someone else had posted; I went back and looked it up, and it actually ends May 3rd.

I apologize for the error.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: thilan29
Serious Sam 2, Farcry, Splinter Cell:Chaos Theory, Age of Empires 3, Oblivion (can't think of any more off the top of my head) can all do HDR + AA.

Originally posted by: munky
And here's a list of some other games that would be even more playable with HDR+AA:
SS2
AOE3
Farcry
<insert future TWIMTBP title here...>

HDR+AA is something that should have been supported by NV, but they took the lazy way out and did not implement it.

You do realize that NV supports HDR+AA in AOE3 as well.

This guy wants Prey/Quake Wars performance and that's all Nvidia FTW, with the cool, quiet, single-slot solution.
/thread
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Wreckage
At high resolutions the GT was ahead. That's pretty clear.

dude... by 2 freakin fps! lol...

ATI's AF method does not apply to OpenGL now does it?

why wouldn't it (you're the first person who's brought this to my attention; i've never heard that before)?

Originally posted by: Wreckage
With the coupon expiring soon on the XT your cost argument will also be moot.

how is that? the XXX still can't be found, and the slower version is $339... how is that competitive with a $289 price on a 512MB card?

Originally posted by: Wreckage
With better AA

how so? you'll find "reviews" quite contradictory on this matter, and overall they are pretty close in comparison.

Originally posted by: Wreckage
better drivers,

eh?

Originally posted by: Wreckage
better OpenGL,

eh? aside from the few titles that actually use it, the most popular (id Software titles) play equally well on ATI hardware.

Originally posted by: Wreckage
better H.264 support,

lol.. what does it matter if your card doesn't support HDCP (for whatever reason both NV and ATI have ignored this to date)? without it you can't use a digital connection to play HD content at full resolution.

Originally posted by: Wreckage
better HSF

that cheap thing found on a GT? NOT. the GTX, yes, far superior. the GT? try again.

Originally posted by: Wreckage
better Linux support,

your first valid point (for the small % it matters), however little relevance it may actually hold. i actually use ATI on some of my Linux boxes and they work just fine (even on my laptop w/ widescreen support), but for gaming, NV is clearly superior when it comes to 'nix support - but just what % of gamers use Linux for gaming? and how many games are available for Linux?

Originally posted by: Wreckage
Better AA, etc.

is there an echo where you are? is it so difficult to try and support your own point of view that you have to use the same reason over again? at any rate, i covered this above..

Originally posted by: Wreckage
The choice is again, very clear.

no offense, but it's pretty obvious that the only thing "clear" here is your lack of objectivity...
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: Crusader
Originally posted by: thilan29
Serious Sam 2, Farcry, Splinter Cell:Chaos Theory, Age of Empires 3, Oblivion (can't think of any more off the top of my head) can all do HDR + AA.

Originally posted by: munky
And here's a list of some other games that would be even more playable with HDR+AA:
SS2
AOE3
Farcry
<insert future TWIMTBP title here...>

HDR+AA is something that should have been supported by NV, but they took the lazy way out and did not implement it.

You do realize that NV supports HDR+AA in AOE3 as well.

This guy wants Prey/Quake Wars performance and that's all Nvidia FTW, with the cool, quiet, single-slot solution.
/thread

Did he say he ONLY wants to play Prey and Quake Wars?? He said he plays FPS games. And those games aren't even out, yet you're assuming Nvidia has already won them.

I guess since ATI cards are better in FarCry, we should all buy ATI cards to play Crysis, which is also highly anticipated.
 

Elfear

Diamond Member
May 30, 2004
7,126
738
126
Originally posted by: Wreckage

I rescind the statement. I was repeating information someone else had posted; I went back and looked it up, and it actually ends May 3rd.

I apologize for the error.

Thanks for withdrawing that statement. It was big of you.

(The above commentary contains no sarcasm and is really meant as a compliment)
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
All I can say, and I think Elfear will agree since he had a very similar setup, is that I had two 7800GTs, got the X1900XTX, and I've noticed the difference.

OP, games look better on the ATI; isn't that what everyone wants? As long as the ATI isn't behind by 10-15, hell, even 20 frames per second, you will love the visual quality in most first-person shooters and still be able to experience it at (noticeably, with the naked eye) the same performance. The X1*** series competes very closely in the current OpenGL games, and will probably continue to compete closely with the 7900 when Quake Wars and Prey hit. Plus, you'll have the option to do great in every other FPS if you ever get bored with one of those games.

Crusader, I don't know what you were smoking with your Oblivion and X1800 comments. I'm really surprised with how closely the X1800 competes with the X1900 in some games. The X1900 is an awesome card. I'll play at 1680x1050, 16xHQAF, 4xAA + HDR, plus ini tweaks and mods that make it look freakin awesome (512MB helps with those especially), and at stock speeds... it only goes below 20 in heavy, heavy trees and grass or, obviously, Oblivion gates (which I just walk through; it's not like I stand there and fight for hours). Most of the time it gets 25-40fps.

Yes, the 7 series is nice, but not as revolutionary as the X1*** series. Heck, ATI is going to be releasing an X1900GT (which may have unlockable pixel shaders) which might be good as well. I don't know when it will hit though. I was just glad to see ATI actually spending time to make a quality midrange card rather than a worthless Quad CrossFire. Nvidia's 7 series GPUs are just about as good as they can get right now, and right now the majority of games are being played better on ATI. OpenGL will yield small but clear wins for Nvidia. I just don't see the point in buying one card for one or two games, and neither does Wreckage:

"Oblivion is just ONE game and is a sad glimmer of hope to hang on to for some people here. I look at overall performance in all games as I tend to play as many as possible. Look at Pacific Fighters, Chronicles of Riddick, Black & White 2, etc. If you blow $300+ on a video card just for one game, well, I wish I had your money."

I agree with him. The X1*** series will give the best overall performance and be close to Nvidia's OpenGL wins, so IMHO the X1*** series is the best bang for the buck.
 