X1800 XT or 7900 GT

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: CaiNaM
Originally posted by: ST
Originally posted by: CaiNaM

and what clocks are you talking about? most ppl who volt mod the GT are only attaining core speeds in the 600's...

wow, just wow... every time i read one of your posts, it gets less and less credible... so volt modded GT's are in the "600s", does this mean it ranges from 601-699 then? I guess my 685 core is just "adequate" then, although it is higher than a 7900gtx stock clock. wait, does this mean then that a x1800xt will also outperform a 7900gtx now? next thing you know, even a 850xt will outperform any G70/G71 based cores.../rolleyes

do you have a reading comprehension problem?

i stated 745mhz obtained in the article is an exception (your uber oc didn't get there, did it?), and oc's in the 600 range are the best ppl are reporting (yours falls into that range, doesn't it? and frankly you're getting better than most).

and why is it you ignore the fact that XT's can be overclocked as well? you ignore what's actually posted, and manipulate the context of your arguments to support your bias.

i never stated a GT @ 685mhz is slower than a XT @ 625, did i? so why are you making that comparison?

how would it compare to an XT @ 700mhz? 725mhz? higher? you don't have that info, and neither do i. i only stated i'd be interested in seeing that.

and you call me a fanboy? lol...

first off don't attempt to ASSume what you know about my card. The 685Mhz (and testing) was on my last 7900GT. I have yet to play with my new one, due to my present business trip. secondly, if you'd shut your trap for 1 second and stop the bsing of a card you have no experience with whatsoever, then you'd get a lot more credibility. trying to cite an ambiguous range of 601-699 MHz is a tell-tale sign of your ignorance on the subject; you think there's no difference from 625 to 685 MHz?

and on the matter of x1ks overclocking, i don't ignore. but i dont make a dumbass out of myself speaking of matters i am not familiar with. i only speak of what i can account for and with facts to back it up, not some lame ass hypothetical numbers concocted for the sake of my argument.

 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: ST
first off don't attempt to ASSume what you know about my card. The 685Mhz (and testing) was on my last 7900GT. I have yet to play with my new one, due to my present business trip.

yet it's ok for you to assume your "new" one will achieve a higher overclock? again, you apply different rules to your own arguments to support your bias :Q

Originally posted by: ST
secondly, if you'd shut your trap for 1 second and stop the bsing of a card you have no experience with whatsoever, then you'd get a lot more credibility. trying to cite an ambiguous range of 601-699 MHz is a tell-tale sign of your ignorance on the subject; you think there's no difference from 625 to 685 MHz?

but it's okay for you to talk about a card (x1800xt) you have no experience with? yet you have the audacity to call me ignorant, lol.... again, apply the same rules to yourself you apply to others.

and what do you think 60mhz gets you? 2-3 fps? 4, 5?

Originally posted by: ST
and on the matter of x1ks overclocking, i don't ignore. but i dont make a dumbass out of myself speaking of matters i am not familiar with. i only speak of what i can account for and with facts to back it up, not some lame ass hypothetical numbers concocted for the sake of my argument.

what, exactly, have you backed up? you oc the hell out of 1 card, compare it to another, and say yours is faster? you've offered nothing to support that. nothing whatsoever.

all you've shown is your propensity for name calling and your ability to ignore pertinent facts while changing the argument to support your own biased conclusions.
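As an editorial aside, the "what do you think 60mhz gets you?" question above can be bounded with quick arithmetic. A minimal sketch, assuming a purely core-clock-bound game and a hypothetical 60 fps baseline (both are illustrative assumptions, not figures from the thread); real gains are smaller once memory bandwidth or the CPU becomes the limit:

```python
# Back-of-the-envelope answer to "what does 60 MHz get you?"
# Assumes frame rate scales linearly with core clock (fully core-bound),
# which makes this an upper bound on the real-world gain.
base_clock = 625   # MHz, roughly where the argument places a typical volt-modded GT
oc_clock   = 685   # MHz, the overclock under discussion
base_fps   = 60.0  # hypothetical frame rate at the base clock (assumption)

speedup = oc_clock / base_clock
extra_fps = base_fps * (speedup - 1)
print(f"best-case speedup: {speedup:.3f}x")
print(f"at {base_fps:.0f} fps, that is ~{extra_fps:.1f} extra fps")
```

So 625 to 685 MHz is at most a ~10% core speedup, i.e. roughly 4-6 fps on a 60 fps game once real-world bottlenecks shave the ceiling, which is about the range being argued over.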

 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: CaiNaM
yet it's ok for you to assume your "new" one will achieve a higher overclock? again, you apply different rules to your own arguments to support your bias :Q

but it's okay for you to talk about a card (x1800xt) you have no experience with? yet you have the audacity to call me ignorant, lol.... again, apply the same rules to yourself you apply to others.

and what do you think 60mhz gets you? 2-3 fps? 4, 5?

what, exactly, have you backed up? you oc the hell out of 1 card, compare it to another, and say yours is faster? you've offered nothing to support that. nothing whatsoever.

all you've shown is your propensity for name calling and your ability to ignore pertinent facts while changing the argument to support your own biased conclusions.

lol....maybe i "assume" my new card will overclock better, because i might possibly be going a different route, possibly in cooling, possibly in voltages, possibly in ..... nah you already know it don't you?

what did i say about the x1800xt hehe...did i not ask whether another member's card or yours can seriously beat out a 7900gtx? did i get a response in either case?

biased conclusions or facts i know about my own card? maybe you can cite something about yours for once and not be hypocritical about it? impress me please, i dare you!

 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Well yeah my 1800XT can beat out a stock 7900GTX in various things. Not a fair comparison as you can OC the GTX and thus beat out the XT quite easily I would think. I score over 11K in 3D Mark 05 as an easy comparison to the GTX's 10500-10800? stock scores. Of course CPU plays a role in all of this as well. But I am also running it on a POS Chaintech motherboard at 2T timing so that can't be helping much.....
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: ST
lol....maybe i "assume" my new card will overclock better, because i might possibly be going a different route, possibly in cooling, possibly in voltages, possibly in ..... nah you already know it don't you?

again, i'll reiterate. apply the same rules to yourself as others. don't call out others for drawing conclusions when you do it to support your viewpoint. that's not a hard concept to understand, is it?

Originally posted by: ST
what did i say about the x1800xt hehe...did i not ask whether another member's card or yours can seriously beat out a 7900gtx? did i get a response in either case?

i don't recall.. and are you asking how an overclocked XT compares to a stock GTX? not sure where you are going here...

in the post above, chase claims his XT beats out the GTX in various things (no idea if that's true or not).. is that an example? or do you wish to "call him out" as well?

Originally posted by: ST
biased conclusions or facts i know about my own card?

when you are comparing it to another card based on assumptions of your own bias, yes.

Originally posted by: ST
maybe you can cite something about yours for once and not be hypocritical about it?

such as what?

Originally posted by: ST
impress me please, i dare you!

but i don't have a need, nor any desire to impress you. i favor accuracy over misinformation. whether you are impressed or not means nothing to me, i simply prefer people post accurate info rather than FUD.

 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Regardless of what card I choose....Im ordering in 2 days....What is the best aftermarket cooler for each ?
 

thepd7

Diamond Member
Jan 2, 2005
9,429
0
0
Originally posted by: ST
Originally posted by: CaiNaM
...i favor accuracy over misinformation...



irony at its finest!

roflmao

LOL you are making yourself look dumber and dumber with your responses and he is not falling into your trap to get in a huge internet fight. Good for you CaiNaM.
 

shortylickens

No Lifer
Jul 15, 2003
82,854
17,365
136
Well, I actually went with evga as you can see in this thread.
After being a fan boy since the Rage days I finally got sick of ATI.
http://forums.anandtech.com/messageview.aspx?catid=31&threadid=1848190

EVGA has the step up program and they gave me full retail credit for my old card. So I get to move from a 7800GT to a 7900GTX for a reasonable price.

And I dont get what all this image quality bickering is about.
At 1600x1200 with no AA/AF both a 7800GT and an X800 Pro produce the exact same image in older and newer games. The only difference is the frame rate, which is noticeably different in FEAR and BF2.

And maybe my eyes arent what they used to be but in order to tell the difference between nothing and 6X Anti-aliasing, I have to turn the resolution down a ways just to see it. And I would much rather play any game at 1600x1200 than 800x600. Especially if I now have the hardware to do so.

Anisotropic filtering is similar. I cant tell unless I run some very specific tests with textures and geometry designed to show the difference.
Running around in a game, I TRULY cannot tell.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Games I play are these....Age Of Empires 3, Rome Total War, Doom 3, Quake 4, NFS Most Wanted, COD 2, Unreal 2004. Would the GT with VF900 be good for these ? Or would the XT with an X2 be better ?
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
Originally posted by: x80064
Games I play are these....Age Of Empires 3, Rome Total War, Doom 3, Quake 4, NFS Most Wanted, COD 2, Unreal 2004. Would the GT with VF900 be good for these ? Or would the XT with an X2 be better ?

Just get whichever one is cheaper. The GT will however play worse in upcoming shader games as proven by oblivion.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: x80064
Games I play are these....Age Of Empires 3, Rome Total War, Doom 3, Quake 4, NFS Most Wanted, COD 2, Unreal 2004. Would the GT with VF900 be good for these ? Or would the XT with an X2 be better ?

The GT would be perfect for those games. Although other than COD2 those games are old enough that any new card would be fine.
 

Crashedout

Member
Jan 11, 2000
177
0
0
Originally posted by: Zstream
Originally posted by: x80064
Games I play are these....Age Of Empires 3, Rome Total War, Doom 3, Quake 4, NFS Most Wanted, COD 2, Unreal 2004. Would the GT with VF900 be good for these ? Or would the XT with an X2 be better ?

Just get whichever one is cheaper. The GT will however play worse in upcoming shader games as proven by oblivion.

I would not use that as proof yet. A few more SM 3.0 games and I may agree, but basing that opinion on a game that was cross-developed on a console with an ATI GPU is premature. Again this may prove correct, but the facts are not in yet.

I do agree with the price argument.
 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
A game that is a TWIMTBP game. Obviously NV had more to do with developing the game, than ATi did.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Ackmed
A game that is a TWIMTBP game. Obviously NV had more to do with developing the game, than ATi did.

Not so obvious I think. It is possible the dev just took NV's TWIMTBP money and put the logo on the game. What was actually done to the game code to make it perform any better on certain products could be just based on a few sentences at the opening of the game. Total speculation at best.

We may now have to wait for patches from Oblivion devs for NV hardware to improve performance. Kind of like Crytek did with FarCry. Who knows.

 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: keysplayr2003
Originally posted by: Ackmed
A game that is a TWIMTBP game. Obviously NV had more to do with developing the game, than ATi did.

Not so obvious I think. It is possible the dev just took NV's TWIMTBP money and put the logo on the game. What was actually done to the game code to make it perform any better on certain products is based on a few sentences at the opening of the game. Total speculation at best.

This is evident in the fact that many if not most if the TWIMTBP games run better on ATI hardware.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ZimZum
Originally posted by: keysplayr2003
Originally posted by: Ackmed
A game that is a TWIMTBP game. Obviously NV had more to do with developing the game, than ATi did.

Not so obvious I think. It is possible the dev just took NV's TWIMTBP money and put the logo on the game. What was actually done to the game code to make it perform any better on certain products is based on a few sentences at the opening of the game. Total speculation at best.

This is evident in the fact that many if not most if the TWIMTBP games run better on ATI hardware.

It's why I felt it was worth mentioning.

 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: ZimZum
Originally posted by: keysplayr2003
Originally posted by: Ackmed
A game that is a TWIMTBP game. Obviously NV had more to do with developing the game, than ATi did.

Not so obvious I think. It is possible the dev just took NV's TWIMTBP money and put the logo on the game. What was actually done to the game code to make it perform any better on certain products is based on a few sentences at the opening of the game. Total speculation at best.

This is evident in the fact that many if not most if the TWIMTBP games run better on ATI hardware.

Maybe because ATi just has better cards? I doubt NV would let Bethesda just take their money, and run as claimed. The fact is, its an NV supported game. And Bethesda didnt work on HDR+AA with ATi, as we have come to find out. Logically the signs point to Bethesda working with NV more, than ATi. They even go as far as to recommend an NV card, which is not the best choice. However, since we do not know for 100% certain that they worked with NV more than ATi, its up for debate. I think common sense would dictate that NV was more closely worked with than ATi though.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Ackmed
Originally posted by: ZimZum
Originally posted by: keysplayr2003
Originally posted by: Ackmed
A game that is a TWIMTBP game. Obviously NV had more to do with developing the game, than ATi did.

Not so obvious I think. It is possible the dev just took NV's TWIMTBP money and put the logo on the game. What was actually done to the game code to make it perform any better on certain products is based on a few sentences at the opening of the game. Total speculation at best.

This is evident in the fact that many if not most if the TWIMTBP games run better on ATI hardware.

Maybe because ATi just has better cards? I doubt NV would let Bethesda just take their money, and run as claimed. The fact is, its an NV supported game. And Bethesda didnt work on HDR+AA with ATi, as we have come to find out. Logically the signs point to Bethesda working with NV more, than ATi. They even go as far as to recommend an NV card, which is not the best choice. However, since we do not know for 100% certain that they worked with NV more than ATi, its up for debate. I think common sense would dictate that NV was more closely worked with than ATi though.

Common sense has no place in corporations when it comes to money changing hands.
So Ackmed, it is very safe for both you and I and anyone else to say, we know nothing about it. And we would all do well to leave it at that.

 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
I didnt say anything about common sense and corporations. Believe what you want. The facts we do have, support that Bethesda worked with NV closer than ATi. But since that thinking favors ATi, I dont expect you to believe in it.
 

Crashedout

Member
Jan 11, 2000
177
0
0
My point was based on the hype MS piled on Oblivion leading up to and after the 360 launch. This IMPLIED that Bethesda was focusing their coding efforts on that platform. From this I deduced that if you were coding for a platform with an ATI GPU you would get to know that type of coding better than a competing model. That could explain why they called in NV; they already knew the tricks with ATI and needed help. Only Bethesda knows for sure, and since they have a vested interest in both sides I doubt they will say anything.

To further confound the argument, MS makes the 360 coding tools and they have implied that if you code for the 360 you can easily make a PC port. Since the coding tools would be focused on the 360's GPU, those tools could give an advantage to ATI. GRAW for the PC may support this assumption as well.

Nonetheless...basing future predictions on one bit of information is premature. NV may have the inferior design or it may not...we will not know until more games come out. Plus with all the tweaks out there, unless you have tried both I don't even know if you can say for sure one way or the other.

Sorry to start a flame war. I think you will be happy with either. If you can wait I suspect the ATI prices will drop even more.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
http://www.elderscrolls.com/forums/index.php?showtopic=303890&hl=#

What the developers are using

In addition to the official system specs, we also know some of the machines the actual developers are using. Here is the standard set of specs used in what appears to be the "standard" dev machine, according to the German magazine GameStar:
o AMD AthlonXP 2500+ or 2600+ (Barton Core, 333MHz FSB)
o 1024MB (1GB) RAM
o ATi Radeon 9800pro, AGP with 256MB video RAM
The above settings would likely run the game at a fairly decent clip, and more importantly, if such machines are in use for development, they obviously run the games with all the settings "turned on."

Note that there are MANY different types of PC in use at the BethSoft HQ. From what I understand, the PC used to render that mind-blowing trailer used a pair of GeForce 6800 (GT or Ultra) cards in SLI mode. Also, the machines have varied processors; a dev (Steve Meister, a.k.a. MSFD) noted that Dell is the company's main supplier of PCs; he apparently works at debugging and optimizing from his computer, which is the same as the one mentioned by GameStar, except that it has a 3.0GHz Pentium 4. MSFD also stated that he still uses the same computer he had when Bloodmoon shipped (in 2003) to code, test, and play Oblivion. It is believed that he has the following specs:
o Intel Pentium 4 3.0GHz with hyper-threading technology
o 1024MB RAM
o ATi Radeon 9800pro, AGP with 256MB video RAM
Likewise, that's similar to the machine used by animator Gary Noonan (VXSS a.k.a. Wormgod)
o Intel Pentium 4 2.6GHz with hyper-threading technology
o 1024MB RAM
o ATi Radeon 9800pro, AGP with 256MB video RAM
Lastly, when the final build was done, or close to it, BethSoft finally pulled out the big guns, demonstrating how it plays on a really high-end system; most recent previews saw it on the following specs:
o Intel Pentium 4 3.2GHz with hyper-threading technology (possibly a Pentium D 3.2GHz instead)
o 2048MB (2GB) RAM
o ATi Radeon X1900XT, PCI-e with 512MB video RAM
The last system was capable of playing it at a nearly flawless 60fps, if the reporter's eyes are to be believed, at a resolution of perhaps around 1600x1200.

Take it for what you will.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Ackmed
I didnt say anything about common sense and corporations. Believe what you want. The facts we do have, support that Bethesda worked with NV closer than ATi. But since that thinking favors ATi, I dont expect you to believe in it.

You mentioned common sense and how it dictates the point you are trying to make.
I mentioned how common sense has no place in this "debate" and why, if you can call it that.
A mere child could have understood this. But you insist on playing the semantics card? Why are you wasting my time?

 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Originally posted by: ST


and on the matter of x1ks overclocking, i don't ignore. but i dont make a dumbass out of myself speaking of matters i am not familiar with. i only speak of what i can account for and with facts to back it up, not some lame ass hypothetical numbers concocted for the sake of my argument.


Seems to me you do it all too often. Fact is to get the 7900 GT up to par with a higher end part (assuming they don't kill it in the process) one has to do the following:

1. Buy a conductive pen for $13 + $5.50 shipping and/or tax
2. Multimeter for $20 + $5.50 shipping and/or tax
3. Aftermarket cooler for another $30 + $6 shipping + tax (if inside CA)


So lets total that up:

1. $319 + $4.65 shipping for an eVGA 7900 GT = $323.65
2. $20 for a multimeter + $5.50 shipping = $25.50
3. $13 + $5.50 for shipping = $18.50
4. Aftermarket cooler for $36

Total cost to volt mod a 7900 GT (while killing its warranty in the process): $403.65 + voided warranty

vs.

Price of an X1800XT at Newegg: $259 + $4.89 shipping=$263.89

or

Price of an X1900XT at Newegg: $419 + $5.09 shipping=$424.09


So one can choose to buy a 7900 GT, order all the junk above to volt mod it and kill your warranty for $403.65 or buy an X1800XT that costs $263.89 shipped which amounts to a whopping 53% difference in price between them and you still get to keep the warranty on your X1800XT. Or for approximately $20 more than a volt modded 7900 GT you can go with an X1900XT that will give you the same performance as a volt modded 7900 GT and you don't have to spend a dime extra on mods or aftermarket coolers AND you keep your warranty. This is all ignoring the fact that ATi cards have HQ AF, HDR+AA and more stable drivers in addition to their lower prices. Seems to me the 7900 GT simply is not worth getting.
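For what it's worth, the totals in the post above do add up. A quick editorial sketch, using exactly the prices quoted in the post (2006 retail figures, not current ones):

```python
# Totals from the post above: volt-modded 7900 GT vs. X1800XT / X1900XT.
gt_total = (319 + 4.65) + (20 + 5.50) + (13 + 5.50) + 36  # card, multimeter, pen, cooler
x1800xt = 259 + 4.89   # Newegg price + shipping, as quoted
x1900xt = 419 + 5.09   # Newegg price + shipping, as quoted

premium_pct = (gt_total - x1800xt) / x1800xt * 100
print(f"modded 7900 GT total:    ${gt_total:.2f}")
print(f"X1800XT shipped:         ${x1800xt:.2f}")
print(f"GT premium over X1800XT: {premium_pct:.0f}%")
print(f"X1900XT vs modded GT:    ${x1900xt - gt_total:.2f} more")
```

The "53% difference" and "approximately $20 more" figures in the post both check out against the quoted prices.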
 

Crashedout

Member
Jan 11, 2000
177
0
0
What if you have conductive paint from your athlon xp days, a multimeter from the PS1/PS2 modding you did, and a universal cooler from your last video card? But I do agree with you: if you have to buy all of it, get the X1900.
 