Rollo's 6800NU SLI Benches

nRollo

Banned
Jan 11, 2002
10,460
0
0
I have noticed there is an incredible dearth of information regarding the 6800NU SLI configuration since the launch of SLI. AFAIK, the only online review would be Firing Squad's excellent article here.

For those unfamiliar with the PCIe 6800NU, it's a 12-pipeline, 5-vertex-unit card that is PCIe native--the NV41 chip at 325MHz. It sports 256MB of 600MHz effective DDR1 RAM, as well as the second-generation video processor with WMV9 decode acceleration. I don't think this card is soft-moddable, but I'll check on that later.

I've been having a lot of fun using and benchmarking my 6600GT SLI rig, so when I saw a couple of Leadtek 6800NU refurbs on Newegg the other night for $239 each, I jumped. (Note: kudos, Newegg. Both cards arrived in full retail packaging and seem to work flawlessly. My first "refurb" experience with them has definitely been positive.)

I'll get more games up in the next week or so, but I wanted to share my initial impressions.

I played 45 minutes of Half Life 2 at 16X12 4X8X and found it very smooth and responsive. I agree with H on this one: SLI just feels more fluid.

I did some benching; it all ran smoothly and without a hitch.

System:
A64 3800+/Asus A8N SLI/ 1GB Geil PC3500/2 Leadtek 6800 256MB/SB Audigy 2
Drivers: 71.80, High Quality

Doom3

12X10 4X8X = 57fps

12X10 4X8X=77fps

16X12 2X8X=61fps

16X12 4X8X=42fps

Half Life 2

12X9 4X8X, all "High"

at_c17_12=58fps

at_canals_08=88fps

at_coast_05=91fps

at_prison_05=90fps

16X12 4X8X, all "High"

at_c17_12=53fps

at_canals_08=76fps

at_coast_05=85fps

at_prison_05=70fps

3DMark2003

14275


3DMark 2005

6761

Far Cry Maximum Details

10X7 4X8X
Regulator=57fps
Research=91fps
Training=63fps
Volcano=91fps

12X10 4X8X
Regulator=44fps
Research=66fps
Training=50fps
Volcano=65fps

16X12 4X8X
Regulator=34fps
Research=50fps
Training=39fps
Volcano=49fps

Painkiller
16X12 4X8X =67fps
19X14 0X0X= 66fps

Half Life2/High Res
19X14 0X0X
at_c17_12=56fps

at_canals_08=77fps

at_coast_12=76fps

Far Cry/High Res
19X14 0X0X
Unplayable, very sluggish; I didn't bother benching it. Far Cry uses the Split Frame Rendering (SFR) method of SLI, which doesn't get as large a performance boost as the Alternate Frame Rendering (AFR) method, and I think that explains the difference.
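
Rough illustration of why that is--a toy Python sketch, not NVIDIA's actual scheduler, and the frame costs are made up:

# Toy sketch of why AFR usually scales better than SFR.
# Not NVIDIA's actual scheduler; the per-frame costs are invented.

VERTEX_COST = 40   # geometry work per frame (arbitrary units)
PIXEL_COST = 100   # pixel/fill work per frame (arbitrary units)
FRAMES = 100

def afr_time():
    """Alternate Frame Rendering: each GPU renders every other frame in
    its entirety, so both vertex and pixel work get split across the pair."""
    return (FRAMES / 2) * (VERTEX_COST + PIXEL_COST)

def sfr_time():
    """Split Frame Rendering: each GPU renders part of every frame.
    Pixel work is divided, but roughly all of the geometry still has to be
    processed on both GPUs, so the overall speedup is smaller."""
    return FRAMES * (VERTEX_COST + PIXEL_COST / 2)

single_gpu = FRAMES * (VERTEX_COST + PIXEL_COST)
print("AFR speedup: %.2fx" % (single_gpu / afr_time()))  # ~2.0x
print("SFR speedup: %.2fx" % (single_gpu / sfr_time()))  # ~1.56x with these numbers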
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
Cool thanks, Rollo, can't say I have seen any 6800 NU's benchmarked before.
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
:thumbsup:
you would be right about the softmodding, no go for the PCIe version
not that you need it

You should run some benches on your x800 rig and see how it compares to the SLi rig. BTW, did you sell your 6600's or did you keep them?
 

joliver

Junior Member
Mar 1, 2005
15
0
0
So are these going to be replacing the 6600GT's? Seems like a much better setup! I can't imagine what it would do OC'd (not that anyone should ever do such a thing)!
-Jason
 

Avalon

Diamond Member
Jul 16, 2001
7,567
156
106
The memory seems to be begging for an overclock. You'd get a healthy increase in bandwidth which could be fairly noticeable at the highest of settings (16x12 @ 4x/8x).
I'm guessing you won't be risking refurbs, though. Neat benches.
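
Back-of-the-envelope math on that (assuming the PCIe NU keeps the 256-bit bus; the 700MHz figure is just a hypothetical overclock):

# Rough memory-bandwidth math for a 6800NU, assuming a 256-bit bus.
# "Effective" DDR clock already counts both edges, so no extra factor of 2.
BUS_WIDTH_BITS = 256

def bandwidth_gb_per_s(effective_mhz):
    return effective_mhz * 1e6 * BUS_WIDTH_BITS / 8 / 1e9

print("Stock 600MHz effective: %.1f GB/s" % bandwidth_gb_per_s(600))  # ~19.2 GB/s
print("Hypothetical 700MHz OC: %.1f GB/s" % bandwidth_gb_per_s(700))  # ~22.4 GB/s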

I love my 6800NU
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: zakee00
:thumbsup:
you would be right about the softmodding, no go for the PCIe version
not that you need it

You should run some benches on your x800 rig and see how it compares to the SLi rig. BTW, did you sell your 6600's or did you keep them?

I am keeping the 6600GTs to have an entry-level SLI benchmarking rig at this point. (Not to mention I like them!)

I don't know how informative it would be to bench the XT PE against the SLI setups, given it's a 3800+ vs. a 3000+ and single-channel vs. dual-channel RAM.
In any case, the XT PE will soon be property of Keysplayr2003, who wanted to try an X800 card and is purchasing it.

 

ScrewFace

Banned
Sep 21, 2002
3,812
0
0
That's not bad, but it's only a couple hundred points over my single Sapphire Radeon X800XT PE overclocked to 560/590. I'd like to see two GeForce 6800 GTs or Ultras in SLI mode. That'd wipe the floor with my X800XT PE.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ScrewFace
That's not bad but it's only a coupla 100 points over my single Sapphire Radeon X800XT PE overclocked to 560/590. I'd like to see two GeForce 6800 GTs or Ultras in SLI mode. That'd wipe the floor with my X800XT PE.

That may be true, but:
1. The Splinter Cell: Chaos Theory developer says "No SM2 for us"
Originally Posted by Dany Lepage, Lead Programmer of Splinter Cell: Chaos Theory (1/25/05)
In my opinion, Shader Model 3.0 is a huge step forward compared with Shader Model 2.0. Shader Model 3.0 adds dynamic branching in the pixel shader and while it's not required, I'm expecting the major IHVs to provide a complete orthogonal solution (FP16 texture filtering/blending) in their next HW iteration.

At this point (things may change), I'm expecting that Splinter Cell - X will only support SM 1.1 and SM 3.0 when it comes out.

-Very significant performance improvement because of dynamic branching in the PS unit.
-Orthogonal FP16 operations
-Market (can't discuss that yet)

SM 3.0 is going to be good enough for some time. There is only one big step left (before GPU start evolving just like CPUs --> performance only) that should allow classic global illumination algorithms to be efficient on GPUs. I doubt SM 4.0 will provide that.

The wind of change is blowing, and SM2 is on the way out. (A rough sketch of what dynamic branching actually buys is below, after this list.)

2. No WMV9 decode acceleration for the X800XT PE.

3. No stencil shadows for the X800XT PE
http://www.firingsquad.com/hardware/chronicles_of_riddick_performance/
Chronicles of Riddick takes advantage of normal maps, while the game's "2.0++" mode enables soft stencil shadows for GeForce 6 users

4. No HDR for X800XT PE users in Far Cry.
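
As referenced above, here's a rough sketch of what per-pixel dynamic branching buys. Plain Python rather than shader code, and the costs are invented--the point is just the early-out per pixel:

# Toy illustration of SM3.0-style dynamic branching in a pixel shader.
# Written as plain Python; real shaders aren't Python and the cost numbers
# are invented. The point is skipping expensive work per pixel.
import random

def shade_pixel_sm2(in_shadow):
    # SM2.0-style: no per-pixel dynamic branch, so the expensive lighting
    # path effectively runs for every pixel (results masked afterwards).
    return 50  # arbitrary cost units

def shade_pixel_sm3(in_shadow):
    # SM3.0-style: branch per pixel and skip the lighting for shadowed pixels.
    return 5 if in_shadow else 50

pixels = [random.random() < 0.4 for _ in range(100_000)]  # ~40% of pixels in shadow
cost_sm2 = sum(shade_pixel_sm2(p) for p in pixels)
cost_sm3 = sum(shade_pixel_sm3(p) for p in pixels)
print("SM3 cost relative to SM2: %.2f" % (cost_sm3 / cost_sm2))  # ~0.64 here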

Don't get me wrong, X800s of all flavors are great cards and perform very well in most games that are currently out.

I think more and more this year X800 cards are going to become like 3dfx cards became in the age of GeForces- they could run the old games fast, but people missed out on the new features in the games being released. Far Cry, Painkiller, Riddick, Splinter Cell CT, etc. are the beginning of this.

BTW- I haven't OCd yet either, and the FS article showed a huge jump with their OCing.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Rollo
2. No WMV9 decode acceleration for the X800XT PE.
I've read that ATi has been waiting on an MS patch before they can enable their WMV decode acceleration.

3. No stencil shadows for the X800XT PE
Huh, Doom 3 must be using something else, then. If you mean soft stencil shadows, well, apparently they exact a mighty performance hit (on the level of Far Cry's HDR?)--people are reporting 2-3x slower for "bevelled" edges. Nice, but no thanks.

4. No HDR for X800XT PE users in Far Cry.
Ah, here you're actually technically correct, but that Crytek demo they did for ATi (The Project) features the same bloom effect, doesn't it? Far Cry's HDR being nV-specific and SC:CT's being GF6-only appear to be more marketing moves than anything else.

But then, what isn't?
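
FWIW, the difference people usually mean by "real" FP16 HDR vs. an LDR bloom pass is roughly this (toy numbers, not Far Cry's actual shader):

# Toy numbers only -- not Far Cry's actual shader. With an FP16 render target,
# values brighter than 1.0 survive until tone mapping, so a bright window and a
# blown-out sky stay distinguishable. With a clamped 8-bit target they are both
# already flattened to 1.0, and a bloom pass can only blur that flat white.

scene = [0.2, 0.9, 6.0, 14.0]  # linear brightness; >1.0 means brighter than white

def ldr_pipeline(values):
    return [min(v, 1.0) for v in values]  # clamp first: highlight detail is gone

def hdr_pipeline(values, exposure=0.25):
    # simple Reinhard-style tone map applied to the unclamped values
    return [round(v * exposure / (1.0 + v * exposure), 3) for v in values]

print("LDR (clamped):", ldr_pipeline(scene))  # [0.2, 0.9, 1.0, 1.0]
print("HDR (FP16):   ", hdr_pipeline(scene))  # 6.0 and 14.0 remain distinct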

Yeah, SM3 (and FP buffers) will become the new standard once ATi gets with the program, but there's still no reason for game devs not to cater to the sizable number of Radeon 9500-X850 owners. HL2 packs a pretty punch even without HDR or bloom, and it can do so at a good framerate even on 9600XTs. It doesn't seem necessary to ditch SM2 just yet.

I think more and more this year X800 cards are going to become like 3dfx cards became in the age of GeForces- they could run the old games fast, but people missed out on the new features in the games being released. Far Cry, Painkiller, Riddick, Splinter Cell CT, etc. are the beginning of this.
The key was that nV eventually equalled 3dfx in speed, and also gave devs neat new features to play with. ATi is still even with nV in terms of performance. I don't deny that these extra features are very nice, though.

Anyway, Painkiller has SM3-only features?
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
2. No WMV9 decode acceleration for the X800XT PE.

Really? Why don't you come up with something to back that one up? You keep pimping Nvidia's WMV acceleration (which of course still doesn't work as of March '05) for whatever reason, and then dodging the questions every time I ask you about it.

ATI touts WMV decode for the X800 that also isn't enabled yet:
RADEON® X800 graphics technology takes advantage of its advanced shader processing engine for user programmable video effects, video quality enhancement, and encoding and decoding of many video standards, including MPEG1/2/4, Real Media, DivX and WMV9.
However, ATI will use the proven shader pipeline that already supports RM deblocking (8500 going forward) and DivX acceleration (9500 going forward)--something Nvidia still hasn't been able to do with their so-called "programmable processor". I think my money is on ATI on this one.

ATI has already proven they can program the shader pipeline for video decode/encode acceleration, deblocking, and filtering, whereas Nvidia hasn't programmed anything into the PVP on NV4x; the MPEG decoding was hardcoded into the chip, and it still took them seven months to get it working with decoders you have to pay extra for. (To be honest, I don't even think the PVP is doing much work there either, since my AIW has nearly the same performance with the exact same software. If I didn't know better, I'd say DXVA is doing most of the work, and any DX9 card supports that out of the box.)
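
For anyone wondering what "deblocking on the shader pipeline" even means in practice, it's basically per-pixel filtering along the lines of this toy sketch (not ATI's actual filter):

# Toy version of the kind of per-pixel work deblocking involves -- NOT ATI's
# actual filter. MPEG/WMV codecs compress in 8x8 blocks, so visible seams show
# up at block edges; a deblocking pass smooths the pixels straddling those edges.
# Because it's just "read neighbors, blend, write pixel", it maps naturally onto
# pixel shaders.
BLOCK = 8

def deblock_row(row, strength=0.5):
    out = list(row)
    for x in range(BLOCK, len(row), BLOCK):       # every vertical block boundary
        left, right = row[x - 1], row[x]
        edge = right - left
        out[x - 1] = left + edge * strength / 2   # pull both sides toward each other
        out[x] = right - edge * strength / 2
    return out

row = [16] * 8 + [40] * 8                         # a hard seam between two blocks
print(deblock_row(row))                           # boundary pixels become 22 and 34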

I don't see how you can keep pimping Nvidia's WMV acceleration when it clearly does not work yet. To put it out as a feature difference is laughable considering what it does support compared to what ATI has supported for some time. I'm interested to see your response, after you've crapped up virtually every PVP thread with your nonsense.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: rbV5
2. No WMV9 decode acceleration for the X800XT PE.

Really? Why don't you come up with something to back that one up? You keep pimping Nvidias WMV acceleration for whatever reason, and then dodging the Q's every time I ask you about it (which of course still doesn't work as of March 05).

ATI touts WMV decode for X800 that is still not yet enabled either
RADEON® X800 graphics technology takes advantage of its advanced shader processing engine for user programmable video effects, video quality enhancement, and encoding and decoding of many video standards, including MPEG1/2/4, Real Media, DivX and WMV9.
However, ATI will use the proven shader pipeline that already supports RM deblocking (8500 going forward) and Divx acceleration (9500 going forward) Something Nvidia still hasn't been able to do with their so called "progammable processor". I think my money is on ATI on this one.

ATI has already proven they can program the shader pipeline for video decode/encode acceleration, deblocking, and filtering however Nvidia hasn't programmed anything into the PVP on NV4xx, the MPEG decoding was hardcoded into the chip and it still took them 7 months to get it working with decoders you have to pay extra for. (to be honest, I don't even think the PVP is doing much work there either since my AIW has nearly the same performance with the same exact software, if I didn't know better DXVA is doing most of the work and any DX9 card supports that out of the box)

I don't see how you can keep pimping Nvidias WMV acceleration, when clearly it does not work yet. To put it out as a feature difference is laughable considering that what does support compared to what ATI has supported for sometime. I'm interested to see your response after crapping virtually every PVP thread with your nonsense.

I don't consider ATI's software hack, using the shader pipeline, to be "dedicated hardware" acceleration of WMV9.

The lengthy posts about the nVidia PVP tell me that the people here agree- they howled for blood when it was suggested nVidia might be trying to do the same thing.

I'd test the WMV9 on my 6600GTs and 6800NUs, but alas, MS's fabled .dll stays my hand.

All of which you know, so there hasn't been any point in responding to your questions about it.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: rbV5
2. No WMV9 decode acceleration for the X800XT PE.

Really? Why don't you come up with something to back that one up? You keep pimping Nvidias WMV acceleration for whatever reason, and then dodging the Q's every time I ask you about it (which of course still doesn't work as of March 05).

ATI touts WMV decode for X800 that is still not yet enabled either
RADEON® X800 graphics technology takes advantage of its advanced shader processing engine for user programmable video effects, video quality enhancement, and encoding and decoding of many video standards, including MPEG1/2/4, Real Media, DivX and WMV9.
However, ATI will use the proven shader pipeline that already supports RM deblocking (8500 going forward) and Divx acceleration (9500 going forward) Something Nvidia still hasn't been able to do with their so called "progammable processor". I think my money is on ATI on this one.

ATI has already proven they can program the shader pipeline for video decode/encode acceleration, deblocking, and filtering however Nvidia hasn't programmed anything into the PVP on NV4xx, the MPEG decoding was hardcoded into the chip and it still took them 7 months to get it working with decoders you have to pay extra for. (to be honest, I don't even think the PVP is doing much work there either since my AIW has nearly the same performance with the same exact software, if I didn't know better DXVA is doing most of the work and any DX9 card supports that out of the box)

I don't see how you can keep pimping Nvidias WMV acceleration, when clearly it does not work yet. To put it out as a feature difference is laughable considering that what does support compared to what ATI has supported for sometime. I'm interested to see your response after crapping virtually every PVP thread with your nonsense.

Remember the PureVideo article. ATI may support hardware acceleration, but the VPU on the Nvidia cards is far superior to channeling it through the pixel pipeline. Although they get similar performance out of it, Nvidia's quality is much better than ATI's. (Neither can attain the level of detail that a 3rd party TV capture card can.)

-Kevin
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Pete
2. No WMV9 decode acceleration for the X800XT PE.
I've read that ATi has been waiting on an MS patch before they can enable their WMV decode acceleration.
As you know from the famous PVP wars of days gone by, using the pixel shaders as a workaround for video acceleration isn't looked highly upon around here.

3. No stencil shadows for the X800XT PE
Huh, Doom 3 must be using something else, then. If you mean soft stencil shadows, well, apparently they exact a mighty performance hit (on the level of Far Cry's HDR?). While they may be pretty, people are also reporting a huge performance hit (2-3x slower for "bevelled" edges--nice, but no thanks).
You know I was talking about the soft shadows in Riddick. Their performance hit is irrelevant because a) it may improve with driver revisions, b) we haven't seen how well they run on SLI, and c) just HAVING the feature is better. Remember DX9/HL2? AT's performance-hit chart where they showed HL2 taking a 50% performance hit on the 9800 Pro? Back then the 9800 Pro was heralded as the "second coming of 3D" because it ONLY had a 50% performance hit, compared to the FX series. Can't have it both ways.

4. No HDR for X800XT PE users in Far Cry.
Ah, here you're actually technically correct, but that Crytek demo they did for ATi (The Project) features the same bloom effect, doesn't it? Far Cry's HDR being nV-specific and SC:CT's being GF6-only appear to be more marketing moves than anything else.
You will see more and more of this as time passes. nV's vendor relations through the TWIMTBP program encompass 90% of the major developers, who could be found touting SM3 on nVidia's website last year. If they have been developing on nVidia hardware and are funded by nVidia to make use of SM3 functionality, do you think they will spend their time coding for 9800s? Sure thing.

Yeah, SM3 (and FP buffers) will become the new standard once ATi gets with the program, but there still seems to be no point in game devs not catering to the decent amount of Radeon 9500-X850 owners. HL2 packs a pretty punch even without HDR or bloom, and it can do so at a good framerate on even 9600XTs. Doesn't seem nec'y to ditch SM2 just yet.
HL2 is an example where ATI tried to do what I believe nVidia does: provide hardware and funding to software developers to optimize for their hardware. Unfortunately for all R3XX-R4XX owners, Valve seems to be the only major developer who is working with ATI.
I'd love to see a list of games with GITG (ATI's "Get in the Game" program) on the box--I can point you to a very, very long list of TWIMTBP games.

I think more and more this year X800 cards are going to become like 3dfx cards became in the age of GeForces- they could run the old games fast, but people missed out on the new features in the games being released. Far Cry, Painkiller, Riddick, Splinter Cell CT, etc. are the beginning of this.
The key was that nV eventually equalled 3dfx in speed, and also gave devs neat new features to play with. ATi is still even with nV in terms of performance. I don't deny that these extra features are very nice, though.
Then we fundamentally agree on everything. I've never said the X800 line isn't great for 99.9% of existing games. It's just my position that ATI doesn't offer anything with enough value to make me gamble that SM3/NV40-specific features won't become more widespread. ATI's functionally transparent fps advantages in some games aren't enough to make me give up seeing the new features.

Anyway, Painkiller has SM3-only features?
It has an SM3 patch that I haven't gotten around to trying yet. I've heard you need the "Battle Out of Hell" expansion pack, which I currently lack.



 

ScrewFace

Banned
Sep 21, 2002
3,812
0
0
That's the point. For the foreseeable future, game developers will be using Shader Model 2.0b while issuing voluntary patches for Shader Model 3.0 for GeForce 6*** (GT)(Ultra) cards. They want to sell games, and they're not going to limit themselves to just nVidia cards. They would not be able to sell to the other half of the market who are Radeon 9*** or X*** (Pro)(XL)(XT)(PE) owners. So I think we ATI owners are safe for a while yet.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
SM2.0b is not a recognized standard. It was developed only by ATI, and it's more a marketing ploy to say "Hey, we improved something!" Name one game, just one, that says "SUPPORTS SM2.0B!!"

If they wanted to sell the most games possible, well, the majority of users still have last gen's hardware, so why even look at SM3 if you're thinking like that?

Why do people have to turn this into Nvidia vs. ATI every time Rollo posts some benchmarks?

-Kevin
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I don't consider ATIs software hack, using the shader pipeline, to be "dedicated hardware" acceleration of WMV9.

The lengthy posts about the nVidia PVP tell me that the people here agree- they howled for blood when it was suggested nVidia might be trying to do the same thing.

I'd test the WMV9 on my 6600GTs and 6800NUs, but alas, MS's fabled .dll stays my hand.

All of which you know, so there hasn't been any point in responding to your questions about it.

Get it straight...there is no hardware acceleration on ATI or Nvidia's cards currently. ATI's use of the 3D pipeline is a mature, well-thought-out use of the hardware. Even DXVA acceleration (using the 3D pipeline) is part of the DX spec...I suppose that's a hack as well. People howled because they expected a dedicated, programmable video processor in the Nvidia cards they bought, and they haven't got it. You crapped up every PVP thread with your nonsense and "inside" information...we know how that played out, didn't we? Why do you continue to claim a feature set that continues to be MIA, and spread FUD about ATI's features? You don't respond because you don't have the answer.

Don't you think it's a little suspect that ATI and Nvidia are waiting for the same patch to deliver the same feature on the same software...yet Nvidia is allegedly using a discrete processor and ATI is using a hack? Something tells me they may be using a similar approach...what does it tell you?



Remember the PureVideo article. ATI may support Hardware acceleration, but the VPU on the Nvidia cards is far superior than channeling it throuhg the pixel pipeline. Although they get similiar performance out of it, Nvidias quality is muc much better than ATIs. .

I'm sorry, but the PureVideo article was pretty weak compared to AnandTech's normal standard. It was hardly comprehensive. Why didn't he compare the cards directly using the same software, for instance? "You" may say it's far superior, but my testing with the cards I own shows not a whole lot of difference in PQ or CPU utilization for MPEG decoding. I do think the 6800 cards have very good PQ decoding video, but I also have a good eye for detail, and my AIW is very close...perhaps just as good.

(Both cannot attain the level of detail that a 3rd party TV Capture card can)

Kevin, a TV Capture card has virtually nothing to do with video playback.

As you know from the famous PVP wars of days gone by, using the pixels shaders as a workaround for video acceleration isn't looked highly upon around here.

It's not when you promise discrete hardware support. ATI's use of the pixel shaders is a proven solution that works and has worked for some time; they don't claim it to be otherwise, or something it isn't. I won't be a bit surprised if we find that the shader pipeline on NV4x is used to provide what the PVP could not.

Why do people have to turn this into Nvidia vs. ATI everytime Rollo posts some benchmarks?

Think about it:roll:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Gamingphreek
SM2.0b is not a recognized standard. It was developed only by ATI and it is more a marketing ploy to say "Hey, we improved something!". You name one game, just one that says "SUPPORTS SM2.0B!!"

If they wanted to sell the most games possible a majority of users still have last gens hardware so why even look to SM3 if you are thinking like that.

Why do people have to turn this into Nvidia vs. ATI everytime Rollo posts some benchmarks?

-Kevin


I'd like to know that as well.

Thought I was doing the board a favor, spending $480 and some time to provide benches on a card setup no one on the web is reviewing, and instead I'm fighting the same tired old "Ford vs. Chevy" argument these threads always devolve into.

I suppose it can't be avoided.
 