OMG X800 only 30+ fps

Page 2 - AnandTech Forums
Apr 14, 2004
1,599
0
0
Whether you can buy the card or not is irrelevant, as I hear the X800 XTs are having clocking problems in production and it's slim pickins for them as well. And since you only choose to acknowledge the Ultra, then yes, the XT at 16x12 is a whole 5 fps faster (50 to 55), or 10%, than nvidia's 2nd fastest card. Not the fastest.
Where can I buy an Ultra Extreme? I wasn't aware the card even hit market yet. At least the XT is available in some places to buy now, and many more places for preorder.
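As a side note on the numbers above: 50 to 55 fps is indeed a 10% gap. A quick sketch of that relative-difference arithmetic (the function name is just illustrative, not from any benchmark tool):

```python
def relative_gain(base_fps: float, faster_fps: float) -> float:
    """Percent by which faster_fps exceeds base_fps.

    Multiplying before dividing keeps the round-number cases exact
    in floating point (e.g. 5 * 100 / 50 == 10.0 exactly).
    """
    return (faster_fps - base_fps) * 100 / base_fps

# 6800 Ultra at ~50 fps vs X800 XT at ~55 fps, as quoted in the post above
print(relative_gain(50, 55))  # 10.0
```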
 

NokiaDude

Diamond Member
Oct 13, 2002
3,966
0
0
Currently I have a BBA Radeon 9800 Pro. I can play @ 1024x768 perfectly. Even 1280x1024 runs great. Even if I could run 1600x1200, I'd rather play at lower resolutions with AA/AF turned up. Right now, if I enable AA/AF, my FPS drop a lot. So when I have the $$$, I'm definitely grabbing an X800 XT or 6800 GT, whichever is cheaper in the long run.
 

Shad0hawK

Banned
May 26, 2003
1,456
0
0
Originally posted by: Slimline
From what I hear, the X800 will walk all over the GT in Half Life 2... so it all depends on the game. And isn't Doom 3 OpenGL? ATI has never been THAT successful with OpenGL... Direct3D is their accelerant.

dont know where you "heard" that, the only benchmarks available were on the old beta and ATI did not "pwn" then...
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
I think I will load up the HL2 beta again tonight and see how it does. It played great on my 9500 Pro @ 10x7. I would expect even more with my GT. Either way, I am betting the retail release will have much better performance than any beta out there. So I don't think anyone is going to have to worry, NV or ATI.

As for DIII, if the X800 can play it at 16x12 with AA and AF on, I don't think you should worry in the least... Just turn AA down or turn it off completely at a resolution like that. I bet you can get much higher frame rates no problem....
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
This'll probably come off as "fanboyish" to some (welcome to the internet, where applying labels is easier than thinking critically!), but the following two posts just don't seem to be accurate at all. More like "BS and baseless rhetoric," to me....

Originally posted by: Genx87
Doubt it to be honest. The HL2 beta being used by Digit-Life is probably the worst case scenario for the 6800. The game has not been optimized worth salt, and if it was, it is only minimally optimized for the 5900.
Where'd you get that idea from? Read Anandtech's HL2 preview to see how the FX cards got their own paths for HL2. IIRC, they performed on par with ATi in DX8 mode, and only lost (big) in DX9 mode, so they've obviously got the speed, just not in DX9. This is not news to anyone who's been following the FX line.

Unless by "optimized for the 5900" you mean "avoid DX9 usage," or "wait for nV to replace shaders in their drivers."

But "worst case scenario" for the 6800? Its performance (in a leaked beta) doesn't look very worst case to me. Suboptimal, maybe, as they're testing on a six-month-old leaked beta with hardware that wasn't around when the game was coded, but not worst case. Unless you think the GF6 series still requires a custom "DX9" path to not show "worst case" performance (like the GF5 series), rather than simply accepting regular DX9 commands?

Originally posted by: Shad0hawK
dont know where you "heard" that, the only benchmarks available were on the old beta and ATI did not "pwn" then...
I don't claim that the situation is the same today (updated game, drivers, and cards), but ATi did embarrass nVidia in the first public and official HL2 benchmarks. IIRC, the 9600P was as fast as the 5900 in the DX9 mode. I'm purposefully ignoring the Digit-Life (aka ixbt-labs in Russian, not to be confused with xbitlabs) benches of the HL2 leak that this thread is referencing.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Pete
This'll probably come off as "fanboyish" to some (welcome to the internet, where applying labels is easier than thinking critically!), but the following two posts just don't seem to be accurate at all. More like "BS and baseless rhetoric," to me....

Originally posted by: Genx87
Doubt it to be honest. The HL2 beta being used by Digit-Life is probably the worst case scenario for the 6800. The game has not been optimized worth salt, and if it was, it is only minimally optimized for the 5900.
Where'd you get that idea from? Read Anandtech's HL2 preview to see how the FX cards got their own paths for HL2. IIRC, they performed on par with ATi in DX8 mode, and only lost (big) in DX9 mode, so they've obviously got the speed, just not in DX9. This is not news to anyone who's been following the FX line.

Unless by "optimized for the 5900" you mean "avoid DX9 usage," or "wait for nV to replace shaders in their drivers."

But "worst case scenario" for the 6800? Its performance (in a leaked beta) doesn't look very worst case to me. Suboptimal, maybe, as they're testing on a six-month-old leaked beta with hardware that wasn't around when the game was coded, but not worst case. Unless you think the GF6 series still requires a custom "DX9" path to not show "worst case" performance (like the GF5 series), rather than simply accepting regular DX9 commands?

Consider the source, Pete - Genx87 is the less subtle evil twin of Rollo in the pandering for Nvidia crusade.

'Not been optimized worth salt' for the 6800 and minimally optimized for the 5900? He was part of the arguments when the news came out of Valve taking 5X longer working on the NV3x "mixed mode" path. He knows exactly what Valve said about doing extra work to get a mixed PS 1.1/2.0 path. But I guess since Gabe Newell is a very heavyset guy, and Genx87 and Rollo started all of those "Gabe Newell is a fat pig and a liar" threads, his word is worthless. Carmack, on the other hand, apparently can't lie because his game runs so well on NV hardware ("Doom 3 will be out in 2002, I mean 2003, I mean 2004!!!").


And what kind of drugs are you on when you say the game is 'not optimized worth salt' for the 6800? The 6800 has full PS 2.0 support last I checked, so it's optimized very well for that card! I guess unless a game is under Nvidia's TWIMTBP umbrella, Rollo, ehrm, Genx87 considers it unoptimized.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: GeneralGrievous
Whether you can buy the card or not is irrelevant, as I hear the X800 XTs are having clocking problems in production and it's slim pickins for them as well. And since you only choose to acknowledge the Ultra, then yes, the XT at 16x12 is a whole 5 fps faster (50 to 55), or 10%, than nvidia's 2nd fastest card. Not the fastest.
Where can I buy an Ultra Extreme? I wasn't aware the card even hit market yet. At least the XT is available in some places to buy now, and many more places for preorder.

On EVGA's website? They're selling them as they make them.
 

welst10

Platinum Member
Mar 2, 2004
2,562
1
0
Originally posted by: GeneralGrievous
Let me know when you pay $400+ for a card to have it crap out 2 months after you get it. People paying that much expect a lot; when I paid $400 for a GPU I expected rock solid 1600x1200 gaming for at least a year, preferably with some AF.

Who told you to buy so early, before the official benchmarks were out? You guys deserve it.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
jl2^10, no need to pile on. I was harsh enough as it is, I'd just like those two to respond. Yes, G is typically on nV's side of the fence, but he doesn't always post from there, and I don't think we accomplish anything by focusing solely on his less admirable tendencies.

Speaking of which, let's not get into the past name-calling, either--the technical discussions are depressing enough. :\
 

Shad0hawK

Banned
May 26, 2003
1,456
0
0
Originally posted by: jiffylube1024
'Not been optimized worth salt' for the 6800 and minimally optimized for the 5900? He was part of the arguments when the news came out of Valve taking 5X longer working on the NV3x "mixed mode" path. He knows exactly what Valve said about doing extra work to get a mixed PS 1.1/2.0 path. But I guess since Gabe Newell is a very heavyset guy and Genx87 and Rollo started all of those "Gabe Newell is a fat pig and a liar" veined threads, his word is worthless. Carmack, on the other hand apparently can't lie because his game runs so well on Nv hardware ("Doom 3 will be out in 2002, I mean 2003, I mean 2004!!! ).


And what kind of drugs are you on when you say the card is 'not optimized worth salt' for the 6800? The 6800 has full PS2.0 support last I checked, so it's optimized very well for that card! I guess unless a card is under Nvidia's TWIMTBP umbrella, Rollo, ehrm, Genx87 considers the card unoptimized .


i guess that just shows id has the programming (and objectivity) skills valve does not, after all id's game apparently will run reasonably well on all relatively modern cards, not just those from the vid card company "sponsoring" them...

do i need sarcasm tags?
 

Shad0hawK

Banned
May 26, 2003
1,456
0
0
Originally posted by: Pete
jl2^10, no need to pile on. I was harsh enough as it is, I'd just like those two to respond. Yes, G is typically on nV's side of the fence, but he doesn't always post from there, and I don't think we accomplish anything by focusing solely on his less admirable tendencies.

Speaking of which, let's not get into the past name-calling, either--the technical discussions are depressing enough. :\


what i wonder is, since the 5900 is "fully" programmable, why not a nice firmware upgrade to bring the NV3x into DX9b spec... (which is the whole issue with NV3x) they could do it... if they wanted to
 

imported_obsidian

Senior member
May 4, 2004
438
0
0
Originally posted by: GeneralGrievous
Thought this had been disproved already? Unless you're listening to Gabe Newell
God forbid we listen to the guy who made the game.....

According to Digit-Life, the XT is 10% ahead of the Ultra in the HL2 beta @ 1600x1200. Adding AA/AF will likely increase that lead. In Far Cry the XT leads the Ultra by 15% at 1600x1200, and by 34% once AA/AF are enabled.

http://www.anandtech.com/video/showdoc.aspx?i=2113&p=7

I gave up on listening to the "guy" when he started flat out lying about release dates. Valve could say the sky was blue and I wouldn't believe them until I stepped outside.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Shad0hawK
card is under Nvidia's TWIMTBP umbrella, Rollo, ehrm, Genx87 considers the card unoptimized .

i guess that just shows id has the programming (and objectivity) skills valve does not, after all id's game apparently will run reasonably well on all relatively modern cards, not just those from the vid card company "sponsoring" them...

do i need sarcasm tags?

Do you really want to display your bias so nakedly, Shad0hawK? Even if you think Half Life was a piece of cr@p, you have to admit that Valve has a talented team of developers that produced an extremely well received and universally praised (by review sites) game in Half Life, and Half Life 2 is one of the most anticipated PC games ever, along with Doom 3.

All of the issues of developing Doom 3 and Half Life 2 are right on the table, and there's no need to take potshots at either company; both have shown their aptitude in developing game engines.

id decided to go with Carmack's preference, OpenGL. OpenGL is traditionally a better performer on Nvidia hardware; kudos to Nvidia for their better optimized OpenGL drivers. Doom3 is almost out; it's taken 5+ years to develop.

Half Life 2 won best of E3 last year, and it's still being updated and worked on to this date. It was one of the most brilliant tech demos ever; their physics engine looks amazing, and the game, if ever completed, is going to sell millions. They chose what is more or less the industry standard, DirectX, and decided to go with the (then-current) DirectX 9.0 standard, established by Microsoft, not ATI.

Nvidia decided not to design full DX9 (PS/VS 2.0) compatibility into their NV3x architecture; obviously this was because the card was in planning stages before the DX9 spec was finalized. So instead of ignoring Nvidia altogether, Valve released a public statement saying they worked 5X longer completing a PS 1.1/2.0 mixed mode for NV3x cards, to go along with their 'full' PS 2.0 mode.

Nvidia could have put resources into making NV35 fully DX9.0 compatible, and they had a couple of years to do this while NV30 waited for the .13 micron process to be mature enough for release. Instead they (IMO wisely) waited, put all their eggs into the 'future' basket, and planned out the NV4x series to be the advanced chipset it is today. However, NV30 is a misstep that is impossible to just ignore.


Doom3 looks to run well on Radeon 9700+ /FX5800+ architectures, but looks to really be beautiful on current gen hardware.

HL2 is in beta and looks to be a similar performer to Doom 3 when finished, maybe a bit slower (judging by beta results); playable on 9700+ hardware but running even better on current gen hardware. Valve has spent more like 6 years on HL2, and the game is supposedly set to release this fall (although we know how accurate these predictions can be...)


Where is the lack of programming skills and "objectivity" (???) skills on Valve's part? It's all on the table - the rendering paths that the two development houses have chosen, the strengths and weaknesses of ATI's and Nvidia's hardware, and ATI's and NVidia's choices for supported features in their architectures. How does that translate to a lack of foresight by Valve, who made the most of a bad situation with the NV3x architecture (see: moratorium on NV3x hardware @ Anandtech)? Will you not admit to this being an error on Nvidia's part?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Shad0hawK
Originally posted by: Pete
jl2^10, no need to pile on. I was harsh enough as it is, I'd just like those two to respond. Yes, G is typically on nV's side of the fence, but he doesn't always post from there, and I don't think we accomplish anything by focusing solely on his less admirable tendencies.

Speaking of which, let's not get into the past name-calling, either--the technical discussions are depressing enough. :\


what i wonder is, since the 5900 is "fully" programmable, why not a nice firmware upgrade to bring the NV3x into DX9b spec... (which is the whole issue with NV3x) they could do it... if they wanted to

I realize I wasted my time writing that lengthy previous post, as you have no idea what you're talking about with this statement.

I don't even want to get into an explanation as to why you can't flash more transistors into a card; it's not worth it.
 

Falloutboy

Diamond Member
Jan 2, 2003
5,916
0
76
the way I see it, I'd rather have a 6800 GT than an X800 due to the fact that there are going to be a whole lot more games based on Doom 3 tech over the next few years than games based on the Source engine. I mean, look at how many games were based on the Q3 engine.
 

bamacre

Lifer
Jul 1, 2004
21,029
2
61
Originally posted by: Falloutboy
the way I see it, I'd rather have a 6800 GT than an X800 due to the fact that there are going to be a whole lot more games based on Doom 3 tech over the next few years than games based on the Source engine. I mean, look at how many games were based on the Q3 engine.

There are also a lot of games coming out using the CryEngine. And don't forget HL2 and all of its mods.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: bamacre
Originally posted by: Falloutboy
the way I see it, I'd rather have a 6800 GT than an X800 due to the fact that there are going to be a whole lot more games based on Doom 3 tech over the next few years than games based on the Source engine. I mean, look at how many games were based on the Q3 engine.

There are also a lot of games coming out using the CryEngine. And don't forget HL2 and all of its mods.

I have the solution, guys, and there's no reason to ever post anything again in this forum. R U ready?

Buy both cards.................... I don't care if you don't have the money. Sell blood... Anything to get your sniveling nitpicking snobby A$$ES out of this frickin forum!!!

/rant Ahhhhhhhhhhh.... Much better now..
 

Overkast

Senior member
Aug 1, 2003
337
0
0
I think it's funny when I hear everyone bitching about Valve for their constant game-release delay "antics" regarding HL2.

Have any of you given thought to the fact that Valve might actually be doing all of this on purpose???

Think of all the controversy/publicity that has developed around this game by now. All the "disappointment" that has been generated by constant release delays essentially gets more and more people talking about the game, and therefore more anxious for the true release date when it comes.

In this case, the more "let downs" there are, the more mystery there is behind the whole project. "Delayed again? What the heck is Valve doing? Why is it taking so long? What is wrong with them? Is this game really that complex? It better be a whopper of a game!"

Can you see the trend? And the end result is going to be EXACTLY what JiffyLube said.... Valve is going to make MILLIONS off this game. Because every last one of you is going to run right out and buy it after all this time and waiting.

Whether you have an x800 or a GT, you know you're gonna buy it. Heck, I have a 9600Pro and I don't give a crap on the requirements... I'm gonna buy it too.

So considering the fact that we're all just really sheep in the herd... this whole thread is pretty much a moot point (although fun reading nonetheless).
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Overkast
I think it's funny when I hear everyone bitching about Valve for their constant game-release delay "antics" regarding HL2.

Have any of you given thought to the fact that Valve might actually be doing all of this on purpose???

Think of all the controversy/publicity that has developed around this game by now. All the "disappointment" that has been generated by constant release delays essentially gets more and more people talking about the game, and therefore more anxious for the true release date when it comes.

In this case, the more "let downs" there are, the more mystery there is behind the whole project. "Delayed again? What the heck is Valve doing? Why is it taking so long? What is wrong with them? Is this game really that complex? It better be a whopper of a game!"

Can you see the trend? And the end result is going to be EXACTLY what JiffyLube said.... Valve is going to make MILLIONS off this game. Because every last one of you is going to run right out and buy it after all this time and waiting.

Whether you have an x800 or a GT, you know you're gonna buy it. Heck, I have a 9600Pro and I don't give a crap on the requirements... I'm gonna buy it too.

So considering the fact that we're all just really sheep in the herd... this whole thread is pretty much a moot point (although fun reading nonetheless).

Why is it funny again? I missed it. Besides, I don't think anyone cares how much money Valve makes as long as they get the damn game to market.
 
Apr 14, 2004
1,599
0
0
It's probably for the better that HL2 was delayed. Without the R420/NV40, people would have been bitching constantly that the GPU requirements were too high. I'd have preferred to wait a year to run it on my GT as opposed to at 1024x768 on my 9800 Pro.

But now that they are (somewhat) here, it's really time for Valve to come through.
 

Overkast

Senior member
Aug 1, 2003
337
0
0
I have the solution, guys, and there's no reason to ever post anything again in this forum. R U ready?

Buy both cards.................... I don't care if you don't have the money. Sell blood... Anything to get your sniveling nitpicking snobby A$$ES out of this frickin forum!!!

How is this a solution again? I missed it. I don't think anyone cares if you don't want them in this forum since they're obviously quite happy to post here.

pfft
 

bamacre

Lifer
Jul 1, 2004
21,029
2
61
Originally posted by: keysplayr2003
Originally posted by: bamacre
Originally posted by: Falloutboy
the way I see it, I'd rather have a 6800 GT than an X800 due to the fact that there are going to be a whole lot more games based on Doom 3 tech over the next few years than games based on the Source engine. I mean, look at how many games were based on the Q3 engine.

There are also a lot of games coming out using the CryEngine. And don't forget HL2 and all of its mods.

I have the solution, guys, and there's no reason to ever post anything again in this forum. R U ready?

Buy both cards.................... I don't care if you don't have the money. Sell blood... Anything to get your sniveling nitpicking snobby A$$ES out of this frickin forum!!!

/rant Ahhhhhhhhhhh.... Much better now..

:disgust:
 

Selso2109

Member
Jun 20, 2004
71
0
0
30 FPS should run completely fine. My X800 XT is also powered by my Athlon FX-53, and I'm sure I'll be getting over 35 FPS.

And if I want to, I'll buy a 6800 Ultra too... because I make good cash and I spend it how I please..... how's that for an idea?
 