ATI tries to downplay SLI


Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: lordtyranus
Originally posted by: Acanthus
Originally posted by: lordtyranus
Check again.

http://www.anandtech.com/video/showdoc.aspx?i=2182&p=5

25% difference at 16x12 4x/8x, 32% difference at 20x15

With COLOR BANDING!!! WOOOO!!!!

Evidence?

You can only see it in motion. Screenshots won't work.

The AF optimization makes clear bands at the changes in mip levels on the ground; as you walk there is a "circle" around your view out on the horizon that looks like ass.

But of course no one investigates that because it's ATI. Looking at frame 3478374879 in 3DMark 2001 is far more important.
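For illustration, here is a rough Python sketch (the function names and the blend window are invented; this is not ATI's actual driver logic) of why trimming full trilinear filtering down to a narrow blend window around each mip transition leaves abrupt steps that read as bands moving with the viewer:

```python
import math

def mip_lod(distance, base=1.0):
    """Level of detail grows with the log2 of distance (simplified model)."""
    return max(0.0, math.log2(max(distance, base) / base))

def trilinear_weight(lod):
    """Full trilinear: smooth blend fraction between adjacent mip levels."""
    return lod - int(lod)

def brilinear_weight(lod, window=0.2):
    """Reduced blending: snap to the nearest level outside a small window."""
    frac = lod - int(lod)
    if frac < 0.5 - window / 2:
        return 0.0          # pure lower mip -> hard edge where the band starts
    if frac > 0.5 + window / 2:
        return 1.0          # pure upper mip
    return (frac - (0.5 - window / 2)) / window

# Compare blend weights across a range of distances: trilinear ramps
# smoothly, while the reduced version sits at 0 or 1 most of the time.
for d in (1.0, 1.5, 2.0, 2.5, 3.0):
    lod = mip_lod(d)
    print(d, round(trilinear_weight(lod), 2), round(brilinear_weight(lod), 2))
```

With a static screenshot you only see one distance at a time, which is why the stepping mostly shows up in motion, as the band sweeps across the ground.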
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Acanthus
Originally posted by: lordtyranus
Originally posted by: Acanthus
Originally posted by: lordtyranus
Check again.

http://www.anandtech.com/video/showdoc.aspx?i=2182&p=5

25% difference at 16x12 4x/8x, 32% difference at 20x15

With COLOR BANDING!!! WOOOO!!!!

Evidence?

You can only see it in motion. Screenshots won't work.

The AF optimization makes clear bands at the changes in mip levels on the ground; as you walk there is a "circle" around your view out on the horizon that looks like ass.

But of course no one investigates that because it's ATI. Looking at frame 3478374879 in 3DMark 2001 is far more important.

that's strange.. mipmap borders aren't circular...

also there are a few specific instances where i've seen this on ati.. one example was daoc, and i must assume it's something with the game, textures, or even the optimizations by both ati and nvidia, as i started playing the game again this weekend, and my GT exhibits the exact same effect.

 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: CaiNaM
Originally posted by: Acanthus
Originally posted by: lordtyranus
Originally posted by: Acanthus
Originally posted by: lordtyranus
Check again.

http://www.anandtech.com/video/showdoc.aspx?i=2182&p=5

25% difference at 16x12 4x/8x, 32% difference at 20x15

With COLOR BANDING!!! WOOOO!!!!

Evidence?

You can only see it in motion. Screenshots won't work.

The AF optimization makes clear bands at the changes in mip levels on the ground; as you walk there is a "circle" around your view out on the horizon that looks like ass.

But of course no one investigates that because it's ATI. Looking at frame 3478374879 in 3DMark 2001 is far more important.

that's strange.. mipmap borders aren't circular...

also there are a few specific instances where i've seen this on ati.. one example was daoc, and i must assume it's something with the game, textures, or even the optimizations by both ati and nvidia, as i started playing the game again this weekend, and my GT exhibits the exact same effect.

It's not literally a circle, it's a wedge shape in the view outside of your viewpoint.

The current engine for DAoC is a perfect example of what it looks like, but that one affects both cards and is engine related.
 

lordtyranus

Banned
Aug 23, 2004
1,324
0
0
Originally posted by: Acanthus
Originally posted by: lordtyranus
Originally posted by: Acanthus
Originally posted by: lordtyranus
Check again.

http://www.anandtech.com/video/showdoc.aspx?i=2182&p=5

25% difference at 16x12 4x/8x, 32% difference at 20x15

With COLOR BANDING!!! WOOOO!!!!

Evidence?

You can only see it in motion. Screenshots won't work.

The AF optimization makes clear bands at the changes in mip levels on the ground; as you walk there is a "circle" around your view out on the horizon that looks like ass.

But of course no one investigates that because it's ATI. Looking at frame 3478374879 in 3DMark 2001 is far more important.

link to source for this information? I assume you don't have an x800 card.

cainam, besides daoc, in what other games are the x800 optimizations noticeable?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: lordtyranus
link to source for this information? I assume you don't have an x800 card.

cainam, besides daoc, in what other games are the x800 optimizations noticeable?

hmm.. i've had call of duty, far cry, daoc, rise of nations, ut2k4, doom3, halo, and painkiller. daoc was the only one where i noticed it. now, that's not to say one couldn't find an example in some map on a particular texture, but daoc was the only game where i noticed this effect.

the GT is also showing this effect, so I'm kind of thinking the engine brings out the optimizations on both - the 9800 doesn't exhibit this (at least not nearly to the same extent). i haven't gotten around to running "opts off" on the GT, so perhaps i should check that later. i'll update after i try that.

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
You need to get a life beyond worrying so much about which video cards other people buy?
Said the man who purchased three 5800s and endlessly pimps them as the greatest thing since sliced bread.

What do you care?
And what do you care if I respond?

1. I can't win the argument, so there's no point to try.
You got the first half correct yet you still appear to be "trying" the second half.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
B3D is already using a game that doesn't support AA for benching, Halo could bring any of the last gen parts to their knees without AA and AF thanks to its shader load.
I'm reasonably happy to chuck in Halo as a benchmark (the more the merrier I say), I just cringe at Gearbox's horrendously poor shader writing skills. They really need to take some lessons from Crytek.

But none of them were PS 2.0 level shaders which is what they were pushing so hard.
I dunno, I suspect the newest games (COD, EF2 and JA) might have some SM 2.0 shaders in there. I know COD was showing a large advantage on the R300 series despite OpenGL being nVidia's turf.

Only GeForce for DX7 level hardware, it won't run at all on the others.
The point is that there are better examples of minimum shader requirements than Doom III.

Just saw that for Thief3; DeusEx2 doesn't claim to have any shader requirement that I can find anywhere.
Both use the exact same Unreal 2 derivative engine and both require a DirectX 8.0 card as a minimum; they will not launch on any DirectX 7 or lower hardware.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
So if Unreal 2 supports SM2 or PS2, or whatever
The core engine doesn't although some derivative Unreal 2 engines may support SM 2.0 (DEIW and Thief 3 for example).

doesn't that mean we already have over 30 games which support PS2?
I'd estimate the number of current SM 2.0 games to be one or two dozen.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
that's strange.. mipmap borders aren't circular...

Most of the time they should be (you may not be able to see far enough out to make that out, depending on the situation though).

BFG-

I'm reasonably happy to chuck in Halo as a benchmark (the more the merrier I say), I just cringe at Gearbox's horrendously poor shader writing skills. They really need to take some lessons from Crytek.

You don't have Halo then. Check it out sometime and see for yourself. FarCry's shader load is quite a bit lighter than Halo's (Halo has many levels where nearly every pixel on the screen is covered with shaders, sometimes multiple shaders per pixel). Most people don't understand what they are looking at though; they just see cool water and think 'man, what a shader heavy game'.

I dunno, I suspect the newest games (COD, EF2 and JA) might have some SM 2.0 shaders in there. I know COD was showing a large advantage on the R300 series despite OpenGL being nVidia's turf.

The functionality isn't even there without using proprietary extensions. For CoD it has more to do with the texture load than anything else (nV's double full buffers when running AA kill them there).

The point is that there are better examples of minimum shader requirements than Doom III.

I guess it depends, if you are running ATi it makes no difference, nor does it if you are running Matrox, PowerVR or 3dfx. It only impacts nV boards.

Both use the exact same Unreal 2 derivative engine and both require a DirectX 8.0 card as a minimum; they will not launch on any DirectX 7 or lower hardware.

I don't doubt you, just hadn't seen it before (still haven't seen it listed anywhere for DeusEx2, though I don't doubt you at all - no way would I buy a game from those lowlifes at Eidos to check, though).

So how does this strengthen Dave's laughable assertion two years ago that PS 2.0 was some crucial technology? I'm still waiting for him to answer about his 'number of games', which of course he won't (nor will he about his claims that D3 will not be a popular engine, nor will he comment on the number of games he has played through). Maybe you can try and answer for him? You have forgotten more about games in the last week than Dave will ever know.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Most people don't understand what they are looking at though, they just see cool water and think 'man what a shader heavy game'.
Perhaps. But it's also a fact that Gearbox's implementation of these shaders is rather heinous to say the least.

The functionality isn't even there without using proprietary extensions.
And?

For CoD it has more to do with the texture load then anything else(nV's double full buffer when running AA kills them there).
MSAA shouldn't be allocating full buffers like that.

It only impacts nV boards.
Again, my point is Doom III is not at the top of the shader requirements ladder. FC for example looks nothing like it should on DX7 cards (unlike D3) and both TH3 and DEIW require DirectX 8 hardware to run at all.

I don't doubt you,
It's just as well because I'm correct.

So how does this strengthen Dave's laughable assertion two years ago that PS 2.0 was some crucial technology?
Two years ago it was in its infancy; these days there's a high chance that the latest game you're playing has SM 2.0 in it.

You have forgotten more about games in the last week than Dave will ever know.
What exactly have I forgotten this week?
 
Aug 6, 2004
33
0
0
I'm planning on building a new computer as soon as possible. I want a 939 CPU (probably a 3500+) with a 6800GT. I don't see any reason to buy an Nforce3 board with an AGP card today and not be able to use that card in a later system.

If I end up with a 3500+, 6800GT, and 520W OCZ PowerStream, am I gonna have bottlenecking and/or power problems if I add another 6800GT in a couple years for SLI action?
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Everything is bottlenecking the new GPUs except at the very highest detail levels and resolutions. I wouldn't worry too much about it though. Just go for it; either way you'll have an insanely fat PC.

-Kevin
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Childs
Originally posted by: jim1976
Great. ROTFLMAO
Now the thread is more like "ATI tries to downplay SLI. But who gives a $hit? Let's discuss what the fvck we want"

I don't really care to partake in the pissing match, but I'd just like to point out that STALKER will probably do well on the NV cards. At E3 this year it was one of the games NV was pimping. In fact they gave me a STALKER tshirt with a big NV logo on it.

It was just an example. No one knows for STALKER AFAIK. I just mentioned it because NV30 seems to have problems with shadow-intensive games. As for the E3 pimping, should I remind you of UT2004?

PS: I stop here because I'm starting to sound like a fanatic...

I'm not sure what this and a couple of comments after yours mean. There is a 10fps difference between an X800XTPE and a 6800GT at 1600x1200 4xAA/8xAF, or about $10 a frame. 3fps separating the PE from a 6800UE, 4fps separating the XT from the Ultra, and 1.3fps separating the Pro and the GT. If the performance of the 6800s is bad, then the x800 can't really be considered good either.

Perhaps I should have stated that they weren't pimping the NV30s at this year's E3. Heck, the difference between a 9800XT and 5950U is 2.5fps. Anyways, I still think the NV40s will do well in STALKER. I can't imagine NV promoting a game where it can't compete with the competition.

Source

LOL. The upper comment was not meant for you...
As for my quote, I wasn't referring to the NV40 cards of course. I was talking about possible problems with NV3x-based cards. And I mentioned UT2K4 because it's a game that Nvidia was pimping but that ATI was eventually better at.



 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Apoppin
For Stalker, it looks like the GeForce FXes are hangin' with the Radeons.

Yeah, I've seen these benches a long time ago now. Excuse me for taking them with a grain of salt. If you look at the prev pg, "Highly antic game 1", the 6800U is better than the X800XT :roll:

But I don't doubt the fact that there's a high possibility that NV3x cards could do well at STALKER. I just state my opinion based on the fact that there are no real benches for this game out there, and the game will use, if I'm not mistaken, the HL2 engine?? (or do they just use the same physics engine, Havok?). If the first is true then there might be some similarities in perf.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: jim1976
Apoppin
For Stalker, it looks like the GeForce FXes are hangin' with the Radeons.

Yeah, I've seen these benches a long time ago now. Excuse me for taking them with a grain of salt. If you look at the prev pg, "Highly antic game 1", the 6800U is better than the X800XT :roll:

But I don't doubt the fact that there's a high possibility that NV3x cards could do well at STALKER. I just state my opinion based on the fact that there are no real benches for this game out there, and the game will use, if I'm not mistaken, the HL2 engine?? (or do they just use the same physics engine, Havok?). If the first is true then there might be some similarities in perf.

Stalker has its own proprietary engine, but they are using the same Havok 2 physics. That wasn't made by ATI though; it was made by, well, the Havok team, which licenses it out to companies. The physics engine is what you buy: the speed of calculations, the choice of changing weight and stuff, but everything else is up to the devs to change.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Perhaps. But it's also a fact that Gearbox's implementation of these shaders is rather heinous to say the least.

I suppose that is possible, but given the sheer amount of shaders in the game I thought it performed quite well actually.


There is no way that the different boards are running the same shaders if they are SM2 level. Besides that, they aren't running any 2.0 level shaders.

MSAA shouldn't be allocating full buffers like that.

That's how nV does it.

Two years ago it was in its infancy; these days there's a high chance that the latest game you're playing has SM 2.0 in it.

No, there's all of one game out that has PS2.0 support that I don't own, and that one isn't limited by them in terms of performance in any way.

What exactly have I forgotten this week?

Likely very small, obscure, unimportant facts from many years ago - more than Dave will ever know.
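To put the "double full buffer when running AA" point in rough numbers, here is a back-of-envelope Python sketch (the formula is an illustrative simplification, not vendor-documented allocation behavior):

```python
# Storing full-size multisampled color and depth buffers multiplies memory
# use quickly, which is why keeping two full copies under AA hurts at high
# resolutions on the 128 MB cards of the era.

def buffer_mb(width, height, bytes_per_pixel=4, samples=1, copies=1):
    """Memory for one logical buffer: resolution x samples x copies, in MB."""
    return width * height * bytes_per_pixel * samples * copies / 2**20

# 1600x1200, 32-bit color, 4x AA sample storage:
color = buffer_mb(1600, 1200, samples=4)   # ~29.3 MB
depth = buffer_mb(1600, 1200, samples=4)   # ~29.3 MB (24-bit Z + 8-bit stencil)
print(round(color + depth, 1))             # ~58.6 MB for one multisampled frame

# Keeping two full copies (double-buffering the multisampled surface)
# doubles that again:
print(round(2 * (color + depth), 1))       # ~117.2 MB
```

The same arithmetic also shows why texture-heavy games like CoD leave little headroom once the AA buffers take their cut.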
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
I suppose that is possible, but given the sheer amount of shaders in the game I thought it performed quite well actually.
Did you read the press release from Gearbox where they basically admitted they had stuffed up the shaders and promised a new patch that improved things by up to 50%? Also just look at 007 Nightfire to see past trends of Gearbox's poor development.

There is no way that the different boards are running the same shaders if they are SM2 level.
Usually it's just ATi and nVidia supported by specific extensions for each. Of course now with OGL 1.5 there should be a lot more shader stuff in there at the core level.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: BFG10K
I suppose that is possible, but given the sheer amount of shaders in the game I thought it performed quite well actually.
Did you read the press release from Gearbox where they basically admitted they had stuffed up the shaders and promised a new patch that improved things by up to 50%? Also just look at 007 Nightfire to see past trends of Gearbox's poor development.

There is no way that the different boards are running the same shaders if they are SM2 level.
Usually it's just ATi and nVidia supported by specific extensions for each. Of course now with OGL 1.5 there should be a lot more shader stuff in there at the core level.

There's an enormous amount of shader effects available in OGL 1.5 through extensions as well.

I'd even go as far as to say it matches DX9.0c in programmability through extensions.
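As a toy illustration of how an app would discover that shader functionality at runtime, here is a Python sketch that checks a GL_EXTENSIONS-style string for the ARB shader extensions (the sample string is made up; a real app would read the string from the driver via glGetString):

```python
# Hypothetical extensions string of the kind a driver of that era might
# report; the ARB_*_program and GLSL extensions are what delivered
# SM2-class programmability before they were folded into the core.
SAMPLE_EXTENSIONS = (
    "GL_ARB_vertex_program GL_ARB_fragment_program "
    "GL_ARB_vertex_shader GL_ARB_fragment_shader GL_ARB_shading_language_100"
)

def supports(ext_string, name):
    """True if the space-separated extensions string advertises `name`."""
    return name in ext_string.split()

for ext in ("GL_ARB_fragment_program", "GL_ARB_fragment_shader"):
    print(ext, supports(SAMPLE_EXTENSIONS, ext))
```

Splitting on whitespace (rather than substring search) matters because one extension name can be a prefix of another.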
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: sandorski
It was great, for a while. Once 3Dfx was able to achieve better performance in a single-chip configuration they dropped SLI like a bad habit. It pushed gaming to new heights though.

What in the world are you talking about...the V5500 had on-board SLI, and that was the last board 3dfx made before going bankrupt...

3dfx just decided to put all their chips onto one board, although the real reason they went bankrupt was that they attempted to start manufacturing boards themselves and not just chips, and this alienated a lot of their distributors. That, and they couldn't quite get the V6000 to work right or use power properly.

I notice that nvidia has lately started showing a lot of weird 3dfx-ish leanings, but SLI was something I think 3dfx really did right back in the day (before they started trying to shove 4 GPUs onto one board, anyway)...

I toyed with a V2 SLI setup once, although this was WAY after its heyday when it had become pretty cheap, and it was still able to hold its own in the then brand-new Counter-Strike (although to be fair it had some unfortunate FPS dips even running wickedgl drivers). I think speed-wise it was still quite potent even compared with the then brand-new G400 I had bought (Matrox), but visual-quality-wise the Matrox obviously beat the utter living daylights out of it (it looked like 2 different games... almost).

Anyway, I'm curious exactly how well their dynamic load balancing works... I know that ATI DOES have a similar technology already available, where the screen is split for rendering between two GPUs. The original MAXX technology didn't do that and instead simply had each GPU render alternate frames, which made it quite inefficient in terms of load balancing, since often one frame can be much harder to render than another. So ATI went back to the drawing board and came up with a similar "each GPU will draw half of the screen" idea with load balancing (at least this is what I recall... I might just be insane). However, I don't know how well they got load balancing to work, since they never actually released another MAXX product, but I'd be willing to bet that if these SLI'ed nVidias actually sell, ATI will quickly release their next-gen MAXX line.

But this totally brings back memories of when I would drool over pictures of Voodoo2 SLIs in boot magazine...*drool*
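The split-screen-with-load-balancing idea described above can be sketched in a few lines of Python (the controller and its gain are invented for illustration; this is not ATI's or nVidia's actual algorithm):

```python
# Each GPU renders one horizontal slice, and the split line drifts toward
# the faster GPU so both finish a frame at about the same time.

def rebalance(split, time_top, time_bottom, gain=0.05):
    """Move the split line toward equal per-GPU frame times.

    split: fraction of screen height given to GPU 0 (renders the top).
    time_top/time_bottom: last frame's render time for each GPU.
    """
    total = time_top + time_bottom
    if total == 0:
        return split
    # If the top GPU took longer, shrink its slice; otherwise grow it.
    error = (time_top - time_bottom) / total
    return min(0.9, max(0.1, split - gain * error))

# Simulate a scene where the bottom half (detailed ground) is 3x as
# expensive per unit of screen as the top half (sky).
split = 0.5
for _ in range(200):
    t0 = split * 1.0          # GPU 0 renders the cheap top slice
    t1 = (1 - split) * 3.0    # GPU 1 renders the expensive bottom slice
    split = rebalance(split, t0, t1)
print(round(split, 2))         # converges near 0.75: GPU 0 takes more screen
```

Alternate-frame rendering (the original MAXX scheme) has no knob like this to turn, which is why a run of expensive frames lands entirely on one GPU.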
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Did you read the press release from Gearbox where they basically admitted they had stuffed up the shaders and promised a new patch that improved things by up to 50%? Also just look at 007 Nightfire to see past trends of Gearbox's poor development.

And we never saw this rumored patch either. I think they were trying to CYA over people's anger with performance (which went away when people realized their boards just sucked at running shaders). As far as Nightfire goes - Opposing Force kicked all sorts of @ss.

Usually it's just ATi and nVidia supported by specific extensions for each.

nV tends to have extensions supported under OGL where ATi has nothing comparable.

Of course now with OGL 1.5 there should be a lot more shader stuff in there at the core level.

IIRC, didn't OGL 2.0 come out too? Not that there are any drivers available for it or anything like that, but it was released a couple weeks back.
 

sandorski

No Lifer
Oct 10, 1999
70,218
5,797
126
Originally posted by: TekDemon
Originally posted by: sandorski
It was great, for a while. Once 3Dfx was able to achieve better performance in a single-chip configuration they dropped SLI like a bad habit. It pushed gaming to new heights though.

What in the world are you talking about...the V5500 had on-board SLI, and that was the last board 3dfx made before going bankrupt...

3dfx just decided to put all their chips onto one board, although the real reason they went bankrupt was because they attempted to start manufacturing boards themselves and not just chips, and this alienated a lot of their distributors. That and that they couldn't quite get the V6000 to work right or use power properly.

I notice that nvidia has lately started showing a lot of weird 3dfx-ish leanings, but SLI was something I think that 3dfx really did right back in the day (before they started trying to shove 4 CPUs onto one board anyway)...

I toyed with a V2 SLI setup once, although this was WAY after its heyday when it had become pretty cheap, and it was still able to hold its own in the then brand-new Counter-Strike (although to be fair it had some unfortunate FPS dips even running wickedgl drivers). I think speed-wise it was still quite potent even compared with the then brand-new G400 I had bought (Matrox), but visual-quality-wise the Matrox obviously beat the utter living daylights out of it (it looked like 2 different games... almost).

Anyway, I'm curious exactly how well their dynamic load balancing works...I know that ATI DOES have a similar technology already available, where the screen is split for rendering between two GPUs. The original MAXX technology didn't do that and instead simply had each GPU render alternate frames, which made it quite inefficient in terms of load balancing since often one frame can be much harder to render than another. So ATI went back to the drawing boards and came up with a similar "each GPU will draw half of the screen" idea with load balancing (at least this is what I recall...I might just be insane). However, I don't know how well they got load balancing to work since they never actually released another MAXX product ever again, but I'd be willing to bet that if these SLI'ed nvidia's actually sell, ATI will quickly release their next-gen MAXX line.

But this totally brings back memories of when I would drool over pictures of Voodoo2 SLIs in boot magazine...*drool*

I was talking about the Voodoo 2/3. Yes, 3dfx did go SLI, in a manner, again with the Voodoo 5, but their next chip was again going to be a single-chip (Rampage) solution. SLI is OK as a temporary solution, but a single-chip solution is always preferable.
 