X800XL vs. 6800GT


nRollo

Banned
Jan 11, 2002
10,460
0
0
OP:
If you want to wait for the AGP X800XL and see what it costs, that is your prerogative.

I will tell you this, which no one can dispute: the 6800GT is faster at most games/settings now. The 6800GT has a more advanced feature set. The 6800GT will allow you to see things you cannot see with a X800XL. (SM3 on Splinter Cell Chaos Theory, Soft Stencil Shadows on Riddick, and HDR on Far Cry)

If the AGP X800XL is "good enough" for you, you should buy one. It's not a gamble I would take; we've already seen three big games with nVidia-only features released or demoed in the year the NV40 has been out. How many more will there be over the two years you say you want to own this card?

Good luck with it.
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
Yes, but those features are not worth the extra $100+ for a lot of people, including me. I doubt anyone could tell the difference in performance.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BouZouki
Yes, but those features are not worth the extra $100+ for a lot of people, including me. I doubt anyone could tell the difference in performance.

I can tell you really squeeze the pennies on your system, Bouzouki. Those FXs, 6800Us, and 2-2-2 RAM sticks come cheap.

For me, with these two cards the difference in the Doom3 engine alone would seal the deal. Even my much higher-end X800XT PE couldn't run that as smoothly as one 6800GT, and I'm likely to want some D3 licenses in the next two years.

You might not be able to tell the difference in performance, but you can't see the HDR or soft shadows at all with the X800XL, and SM3 vs. SM1.1 will look different in Splinter Cell.
 

sellmen

Senior member
May 4, 2003
459
0
0
Farcry HDR might matter, if you could use AA along with it. Crytek's next generation games seem to support HDR on ATI's cards as well.

Soft Shadows (I assume you mean riddick) might matter, if using the feature didn't cripple the framerate to the point that the game is unplayable. Here are some benchmarks; unplayable on a 6800 Ultra even at 10x7, 0x AA/AF.

Splinter Cell - we'll see.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,810
126
By the time SM3 matters, the next gen or two of Nvidia and ATI cards will be out and both will support it.

As for Farcry HDR, I tried it when I had my 6800 Ultra OC. While it was neat to check out, it made Farcry basically unplayable even at 1024x768. It totally kills performance, and even the 6800 Ultra OC was too slow to use it. Same with Riddick, it seems from the benchmarks. So what's the point in having these features when the current cards themselves are too slow to take advantage of them? Will the current Nvidia cards suddenly become faster in the future?
 

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
Farcry HDR might matter, if you could use AA along with it. Crytek's next generation games seem to support HDR on ATI's cards as well.

Soft Shadows (I assume you mean riddick) might matter, if using the feature didn't cripple the framerate to the point that the game is unplayable. Here are some benchmarks; unplayable on a 6800 Ultra even at 10x7, 0x AA/AF.

Splinter Cell - we'll see.

As for Farcry HDR, I tried it when I had my 6800 Ultra OC. While it was neat to check out, it made Farcry basically unplayable even at 1024x768. It totally kills performance, and even the 6800 Ultra OC was too slow to use it. Same with Riddick, it seems from the benchmarks. So what's the point in having these features when the current cards themselves are too slow to take advantage of them? Will the current Nvidia cards suddenly become faster in the future?

LOL. Great to have those nvidia card features, yet you can't play at decent frame rates.
Me happy with my $249 x800xl. Works great too with the included component outputs.

I've seen the difference between SM3 and SM2, and when playing a game you can't notice the difference. So the argument is a moot point. Hell, you can't tell the difference when looking at a screenshot.

HDR, while nice in Far Cry, could have been implemented for ATI if Crytek had chosen to. I'm sure there was a deal made with nvidia for this. ATI has had HDR as far back as the 9700, but in 12-bit, not 16-bit. The difference would not be noticeable if they did it right, plus the performance hit wouldn't be there.

So this cry that ATI can't do HDR is false; it's the programmer who chose to make it nvidia-specific. In its grandest form, it's just an extreme contrast lighting enhancement. In some cases it actually makes the scene look worse than without it.
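
For what it's worth, the alternative route looks roughly like this: render the scene into an ordinary 8-bit target with the brightness range packed into the pixels (RGBE-style), then decode and tone-map it in a fullscreen pixel-shader pass. Here's a hypothetical sketch of that decode/tone-map shader (my own illustration, not Crytek's or ATI's code, and the names are made up) - nothing in it needs more than SM2.0-class hardware:

// Fragment half of a fullscreen pass; a trivial vertex shader supplies "uv".
static const char* kToneMapFrag = R"(
#version 120
uniform sampler2D sceneRGBE;   // RGB mantissa, shared exponent packed in alpha
uniform float exposure;
varying vec2 uv;

void main()
{
    vec4 p = texture2D(sceneRGBE, uv);

    // Decode the RGBE-style encoding: scale RGB by 2^(biased exponent).
    float scale = exp2(p.a * 255.0 - 128.0);
    vec3 hdr = p.rgb * scale * exposure;

    // Simple Reinhard tone map back into the displayable 0..1 range.
    gl_FragColor = vec4(hdr / (hdr + vec3(1.0)), 1.0);
}
)";

Whether the encode/decode cost matters in practice depends on the game, but nothing about it is nvidia-specific.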

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
LOL. Great to have those nvidia card features, yet you can't play at decent frame rates.
Hmmm. I ran three of the Far Cry benchmarks at 10X7 0X8X with HDR enabled, and got 42, 58, and 55 fps.
Seems playable to me.
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
Well, I'm assuming those are average framerates, which means that the lows might be annoying to some.

My feeling is this:

AGP - 6800GT - Price seems to be about $365 at well-known places, and there really is no AGP competition @ that speed and price currently.

PCI-E - X800XL - @ $400, the price of the PCIe 6800GT is too high IMO, and the X800XL gives the 6800GT good competition from a performance standpoint and is much less expensive. If you have $100 to burn, then there is nothing wrong with a 6800GT, but I'd rather have 2 free games than a marginally faster card.

SM3 won't really matter in the next couple years, since most games will look the same and run the same either way (given that the newest engines out now - the engines the games of the next couple years will be based on - don't really support SM3). If you are expecting a video card to last more than a couple years, you will have to turn down the features/resolution anyway, so future performance won't be high end for either card.

Finally, the X800XL is not very quiet, but it's not too loud either. I think it is louder than a 6600GT (which I have @ work for CAD while I wait for a NV Quadro 1400), but I don't know how it compares to a 6800GT. I think for a truly silent computer enthusiast, both cards will be too loud and will require aftermarket cooling.

Oh yeah, the 6800GT consumes more power than an X800XL, but it is not THAT much more, and a good 350W supply is certainly enough to handle either card.

-D'oh!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: AnnoyedGrunt
Well, I'm assuming those are average framerates, which means that the lows might be annoying to some.
http://www.firingsquad.com/hardware/msi_nx6800/page8.asp
A Radeon X800XL gets 58 fps average at 12X10 4X8X at the Volcano level of Far Cry on an A64 4000+. I get 58 fps average at 10X7 0X8X HDR at the Volcano level of Far Cry on an A64 3800+.
If your position is that the framerates are too low for HDR, your position must also be that the X800XL is limited to 10X7 4X8X in Far Cry? So much for ATI's "High Res High Detail gaming".
http://www.firingsquad.com/hardware/sapphire_radeon_x850_xt/page7.asp
It must also be your position that no single card is capable of running Far Cry at 16X12 4X8X, as the X850 XT PE is at 63fps at this setting.
Interesting.

My feeling is this:

SM3 won't really matter in the next couple years, since most games will look the same and run the same either way (given that the newest engines out now - the engines the games of the next couple years will be based on - don't really support SM3).
1. Tell that to the guys producing Splinter Cell Chaos Theory 2. Your theory assumes all games in the next two years will be based on engines that are out now, which isn't true.

I agree with the rest of what you said.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo
Tell that to the guys producing Splinter Cell Chaos Theory 2.

He was referring to the norm, not the exception. Splinter Cell Chaos Theory 2 is the only title announced that doesn't plan to support SM2.0 (which happens to look nearly identical to SM3.0).




 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
What's the big deal about nvidia's HDR? I downloaded the ATI SDK from their developer website, and they have an HDR demo using OpenGL, which runs fine on my 9800pro. It seems like the only big deal is that developers chose to use 32-bit precision for HDR, and therefore only nvidia cards can run HDR. But if you're using 24-bit precision, it seems like even older ATI cards have the hardware capability to do HDR. Anyone else care to discuss this?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: munky
What's the big deal about nvidia's HDR? I downloaded the ATI SDK from their developer website, and they have an HDR demo using OpenGL, which runs fine on my 9800pro. It seems like the only big deal is that developers chose to use 32-bit precision for HDR, and therefore only nvidia cards can run HDR. But if you're using 24-bit precision, it seems like even older ATI cards have the hardware capability to do HDR. Anyone else care to discuss this?

As I've said several times in this thread, FarCry's HDR does not use pixel shaders; it is implemented through a floating point framebuffer, which is (currently) a GF6-only feature. It has nothing to do with 24- versus 32-bit precision, or SM2.0/2.0b/3.0. If ATI's next card has FP framebuffer support, Farcry's HDR will work on it as well.

HDR using pixel shaders, while more complex to implement (and possibly slower, although I have yet to see any hard numbers comparing them) will work on any SM2.0/3.0 card, and you could probably get most of the performance benefits of SM3.0 with SM2.0b on ATI hardware. I suppose you could write it so that it only works with SM3.0, or requires 32-bit precision (which would limit it to the GF5/6, since ATI "only" has 24-bit available in the R3XX and R4XX), but that would be pretty goofy IMO.
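
To make the distinction concrete, here's a rough sketch of what "rendering into a floating-point framebuffer" means. This is my own illustration in modern OpenGL terms (Far Cry itself is Direct3D, and the function and variable names here are made up), but the completeness check at the end is exactly the step that fails on hardware without FP16 render-target support, which is why this path is GF6-only right now:

#include <GL/glew.h>

// Hypothetical helper, not Far Cry's code: set up an FP16 color target.
// Values above 1.0 survive in this buffer, which is what a later
// tone-mapping pass turns into the HDR look.
GLuint CreateHDRTarget(int width, int height)
{
    GLuint fbo = 0, colorTex = 0;

    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
                 GL_RGBA, GL_HALF_FLOAT, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Attach the half-float texture to a framebuffer object so the scene
    // can be rendered into it instead of the normal 8-bit backbuffer.
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, colorTex, 0);

    // A card without FP render-target support fails this check, and the
    // engine has to fall back to a non-HDR (or shader-encoded) path.
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    {
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        return 0;
    }

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return fbo;
}

Note also that the GF6 can't multisample an FP16 target, which is why HDR and AA are mutually exclusive in Far Cry.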

Look, we can keep going back and forth on this for days.

The 6800GT is slightly faster overall than the X800XL (faster in Doom3 and most OpenGL games, slower in HL2, close to even in everything else)
The 6800GT has SM3.0, which may give slight performance improvements in future games that support it, and may enable better visuals in some future games (although such visuals may also run so slowly as to be unusable).
All GF6 cards have floating-point framebuffers, which enables HDR in Far Cry. HDR support for future games depends on how they implement it -- it is possible to do HDR through pixel shaders, which will work on both ATI and NVIDIA hardware (HL2 plans to do this at some point; I am less sure about other future games).

ATI has PS2.0b, which (if the developer chooses to support it) can offer most of the performance benefits of PS3.0 (Far Cry does this).

In AGP, currently there's no question -- the X800XL is either unavailable or overpriced, so the 6800GT is it. In a month or two this may change.

In PCIe, for $100 less, I'd take the X800XL. If having SM3.0 and FP framebuffers is worth paying 33% more (~$400 vs. ~$300), or you highly value Doom3 engine performance, by all means don't let us stop you.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Ok there should be 0 argument in this thread.

$300 X800XL for PCIe systems
$340 6800GT for AGP systems.

end of story.
 

SneakyStuff

Diamond Member
Jan 13, 2004
4,294
0
76
Originally posted by: RussianSensation
Ok there should be 0 argument in this thread.

$300 X800XL for PCIe systems
$340 6800GT for AGP systems.

end of story.

Agreed. Oh wait, I hope hans doesn't come and tell me I need SM3.0
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Matthias99
Originally posted by: munky
What's the big deal about nvidia's HDR? I downloaded the ATI SDK from their developer website, and they have an HDR demo using OpenGL, which runs fine on my 9800pro. It seems like the only big deal is that developers chose to use 32-bit precision for HDR, and therefore only nvidia cards can run HDR. But if you're using 24-bit precision, it seems like even older ATI cards have the hardware capability to do HDR. Anyone else care to discuss this?

As I've said several times in this thread, FarCry's HDR does not use pixel shaders; it is implemented through a floating point framebuffer, which is (currently) a GF6-only feature. It has nothing to do with 24- versus 32-bit precision, or SM2.0/2.0b/3.0. If ATI's next card has FP framebuffer support, Farcry's HDR will work on it as well.

HDR using pixel shaders, while more complex to implement (and possibly slower, although I have yet to see any hard numbers comparing them) will work on any SM2.0/3.0 card, and you could probably get most of the performance benefits of SM3.0 with SM2.0b on ATI hardware. I suppose you could write it so that it only works with SM3.0, or requires 32-bit precision (which would limit it to the GF5/6, since ATI "only" has 24-bit available in the R3XX and R4XX), but that would be pretty goofy IMO.

Look, we can keep going back and forth on this for days.

The 6800GT is slightly faster overall than the X800XL (faster in Doom3 and most OpenGL games, slower in HL2, close to even in everything else)
The 6800GT has SM3.0, which may give slight performance improvements in future games that support it, and may enable better visuals in some future games (although such visuals may also run so slowly as to be unusable).
All GF6 cards have floating-point framebuffers, which enables HDR in Far Cry. HDR support for future games depends on how they implement it -- it is possible to do HDR through pixel shaders, which will work on both ATI and NVIDIA hardware (HL2 plans to do this at some point; I am less sure about other future games).

ATI has PS2.0b, which (if the developer chooses to support it) can offer most of the performance benefits of PS3.0 (Far Cry does this).

In AGP, currently there's no question -- the X800XL is either unavailable or overpriced, so the 6800GT is it. In a month or two this may change.

In PCIe, for $100 less, I'd take the X800XL. If having SM3.0 and FP framebuffers is worth paying 33% more (~$400 vs. ~$300), or you highly value Doom3 engine performance, by all means don't let us stop you.


OMG a serious post in here...
 

doublejbass

Banned
May 30, 2004
258
0
0
Originally posted by: RussianSensation
Ok there should be 0 argument in this thread.

$300 X800XL for PCIe systems
$340 6800GT for AGP systems.

end of story.

Show me a single-slot AGP 6800GT with quality build, decent noise management (once again, the BFG is LOUD), and a 3-year warranty minimum.

EDIT: I'm not saying there isn't one, I'm saying show me.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: doublejbass
Originally posted by: RussianSensation
Ok there should be 0 argument in this thread.

$300 X800XL for PCIe systems
$340 6800GT for AGP systems.

end of story.

Show me a single-slot AGP 6800GT with quality build, decent noise management (once again, the BFG is LOUD), and a 3-year warranty minimum.


Albatron 6800GT
 

doublejbass

Banned
May 30, 2004
258
0
0
Originally posted by: jim1976
Originally posted by: doublejbass
Originally posted by: RussianSensation
Ok there should be 0 argument in this thread.

$300 X800XL for PCIe systems
$340 6800GT for AGP systems.

end of story.

Show me a single-slot AGP 6800GT with quality build, decent noise management (once again, the BFG is LOUD), and a 3-year warranty minimum.


Albatron 6800GT

Looks like they have the reference cooler on it, which is the same one that's on the non-dual-fan BFG GTs, and is really loud. Is there a difference I'm not seeing?
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: doublejbass
Originally posted by: jim1976
Originally posted by: doublejbass
Originally posted by: RussianSensation
Ok there should be 0 argument in this thread.

$300 X800XL for PCIe systems
$340 6800GT for AGP systems.

end of story.

Show me a single-slot AGP 6800GT with quality build, decent noise management (once again, the BFG is LOUD), and a 3-year warranty minimum.


Albatron 6800GT

Looks like they have the reference cooler on it, which is the same one that's on the non-dual-fan BFG GTs, and is really loud. Is there a difference I'm not seeing?

So you want one slot, a better HSF, and a 3-year warranty minimum? I think you'll wait a long time, my friend; it's difficult to find all of these together. I can guarantee you that the Albatron card is really good and highly o/cable.
 

doublejbass

Banned
May 30, 2004
258
0
0
That's why I'd go with a BBATI card. I haven't tried out the X800XL's HSF, but the X800 Pro reference HSF wasn't anywhere NEAR as loud as the BFG 6800GT's (Reference cooler edition).

Those restrictions may seem extreme, but for an SFF system (which will not accept a two-slot card) engineered for as little noise as possible in a high-performance build, and for someone who cares about service and support, those criteria are what make BBATI cards attractive.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: doublejbass
Originally posted by: RussianSensation
Ok there should be 0 argument in this thread.

$300 X800XL for PCIe systems
$340 6800GT for AGP systems.

end of story.

Show me a single-slot AGP 6800GT with quality build, decent noise management (once again, the BFG is LOUD), and a 3-year warranty minimum.

EDIT: I'm not saying there isn't one, I'm saying show me.

1) How often do videocards fail, though? I am certain the quality of 6800GTs is just as good as ATI cards, given that neither of us has any statistical data to see which fails more often. If Nvidia videocards were of much worse quality, it would be in the news. Remember, forums are biased, since you'll only see people who report problems - it's not like someone is gonna start a post saying "oh my 6800GT hasn't failed yet, oh no!" Also, given that the 6800GT sold more units than ATI's high-end cards, the probability of consumers reporting problems is higher simply because more cards are out there with owners. That doesn't tell you anything about issues per 1,000 units sold, for instance. Also, how many people keep a $400 videocard for 3 years?

2) If you don't like the noise, download the Gainward Expert Tool and reduce the fan speed by 50%.

3) Given that today's motherboards have onboard sound, LAN, etc., a two-slot solution is no longer an issue for anyone with a mid-tower case. Your motherboard probably has 4 slots open, so how is this an issue in the real world? Besides, you can find single-slot solutions on the market if you really want to put one in an SFF PC.
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
Originally posted by: Rollo
Originally posted by: AnnoyedGrunt
Well, I'm assuming those are average framerates, which means that the lows might be annoying to some.
http://www.firingsquad.com/hardware/msi_nx6800/page8.asp
A Radeon X800XL gets 58 fps average at 12X10 4X8X at the Volcano level of Far Cry on an A64 4000+. I get 58 fps average at 10X7 0X8X HDR at the Volcano level of Far Cry on an A64 3800+.
If your position is that the framerates are too low for HDR, your position must also be that the X800XL is limited to 10X7 4X8X in Far Cry? So much for ATI's "High Res High Detail gaming".
http://www.firingsquad.com/hardware/sapphire_radeon_x850_xt/page7.asp
It must also be your position that no single card is capable of running Far Cry at 16X12 4X8X, as the X850 XT PE is at 63fps at this setting.
Interesting.

My feeling is this:

SM3 won't really matter in the next couple years, since most games will look the same and run the same either way (given that the newest engines out now - the engines the games of the next couple years will be based on - don't really support SM3).
1. Tell that to the guys producing Splinter Cell Chaos Theory 2. Your theory assumes all games in the next two years will be based on engines that are out now, which isn't true.

I agree with the rest of what you said.

Well, I never actually said MY position was that the frames were too low, just that some might find them that way. Those benches are strange in that the X800XL seemed to be faster with 16X AF than with 8X, but in either case the minimum frames were above 30. I'd consider that an acceptable level, but I certainly wouldn't want to go any slower. So I'd then have to ask whether HDR is worth the resolution and AA/AF tradeoff. I haven't seen HDR first hand, but I'm guessing that I would rather play @ 12x10 with AA/AF than 10x7 with HDR and no AA. Also, we have a situation where the 6800GT is basically @ marginal performance (some might say) with only this single effect, and the only way to improve performance is to turn it off, in which case you are running about the same as the X800XL for $100 more. Again, which route someone chooses is going to be based on personal preference; my only point is that IMO SM3/HDR (the features being touted as the reasons for spending the extra money) are not a big deal.

Also, regarding next-gen games, if you double-check my post you'll see that I said in MOST next-gen games having SM3 won't make a difference (again, IMO, based on what we know now and what we've seen in the past).

Anyhow, what I was trying to say was basically what Matthias said, and what has been echoed a few times since, so the point has been made more clearly than I was able to initially.

-D'oh!
 