HD 2900XTX Benches


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

what are the fan boys gonna say when DT "conclusions" are supported by *everyone* else?


In conclusion, what did the HD 2900 XTX benchmarks really show? I think the most obvious answer is that the jump from 512MB GDDR3 to 1GB GDDR4 is certainly not enough to make the XTX a viable option for R600. Like we mentioned before, this card is at least partially scrapped at this point, save for a few OEMs who will be putting it into workstations.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Two more posts on Rage3D/HardForum:
Originally Posted by loafer87gt
Kombatant, an employee of AMD/ATI, has also said that the benchmarks are bunk and that the card Daily Tech tested was not the XTX, but rather an OEM XT. He also said earlier on that the card will be well worth the wait, and hinted at some sort of surprise for ATI faithful. Not too sure what that is supposed to mean. ??
Originally Posted by Kombatant
They christened the OEM version we've known and loved for quite a few months as an "XTX". That should tell you a lot about their credibility actually.

Haha DT must've forgotten to enter the super sekret performance unlock code....up up down down left right...

Seriously though, there's little doubt DT did the benchmarks, just as there's little doubt there may be some holes or flaws in their testing. Performance might improve with drivers or be closer with other sites' test results, but will it be enough to beat a GTX at the end of the day? I'm inclined to think no, and AMD said the same at their press conference.

WTF at the ATI guy saying the card is worth the wait.....another 3-4 months for something close to a GTX when you could have just gotten a GTX last year. And another WTF at the random comment about CF results against the GTX being much different. Um...no sh1t?

Anyways, glad there's a buzz again in the community; video was getting pretty boring, and I think this is the spark people have been waiting for (for the last 6 months to a year).
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Originally posted by: jpeyton
Two more posts on Rage3D/HardForum:
Originally Posted by loafer87gt
Kombatant, an employee of AMD/ATI, has also said that the benchmarks are bunk and that the card Daily Tech tested was not the XTX, but rather an OEM XT. He also said earlier on that the card will be well worth the wait, and hinted at some sort of surprise for ATI faithful. Not too sure what that is supposed to mean. ??

lol, this is soooo typical AMD. Vague statements, talk about "some sort" of surprise for "ATI faithful" (even though ATI doesn't even exist anymore). The irony of the whole statement is that the "ATI faithful" will be buying this card whether it's a flop or not out of blind loyalty. Kombatant should be more worried about producing a card that will bring intelligent, right-thinking consumers and "nVidia faithful" into the AMD camp, not appeasing rabid AMD fanboys.

I can see it now... if you buy an XTX and sign a pledge stating "I'm faithful to ATI, a company that no longer exists, and will never purchase an nVidia product ever again" you get a free "AMD Faithful" T-shirt! Awesome surprise!

I wouldn't be surprised to see it based on the performance of AMD's pathetic PR department lately. :thumbsdown:
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: yacoub
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.

And it won't matter for another 2-4 (to 10) months until it's set to release. Or maybe we can get a 9900 GTX ES for sh1ts and giggles lol.

 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Originally posted by: yacoub
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.

Have you personally seen the card DT used to benchmark? Do you have a link to anything saying there will be an OEM XT with 1GB GDDR4? Guessing no on both of those.

I'm sure you "just know" that this "can't be" the state of the XTX because it doesn't crush the GTX. :roll:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: yacoub
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.

we don't *know* that it isn't an XTX ... look at what DT posted:
you've probably noticed we have access to a few R600 cards.

One thing that concerns me is that many of our readers may think the 750 MHz core clock on the Radeon HD 2900 XTX, if it even comes to market, will be the final clock. I can almost guarantee you that if anyone decides to bring this to channel, the core clock will get a bump. Keep in mind that Sven managed to overclock the core to 845 MHz on the XT card.

In addition, I was quite pleased with the HD 2900 XT tests. I do believe when overclocked -- and this card seems to have a bit of headroom -- the card does well against the 8800 GTS and starts to encroach on the 8800 GTX territory. In our testing, it doesn't beat a GTX, but then again it won't be a $500 graphics card either. That task was reserved for the XTX and I think it's quite obvious that increasing the memory size and frequencies on the R600 GPU won't usurp the 8800 GTX.

it's pretty *obvious* that it won't matter what clock speeds the XTX gets ... his conclusions are accurate relative to their testing

maybe r660
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: SexyK
Originally posted by: yacoub
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.

Have you personally seen the card DT used to benchmark? Do you have a link to anything saying there will be an OEM XT with 1GB GDDR4? Guessing no on both of those.

I'm sure you "just know" that this "can't be" the state of the XTX because it doesn't crush the GTX. :roll:

Does it really matter, XT or XTX? Even with optimized drivers you will get around a 10 to 20% boost. Big deal for fans who have waited for this since March 2006.

I think GTX owners (myself included) made the sanest choice months ago instead of waiting for R600, whose performance is still doubtful.
 

BlizzardOne

Member
Nov 4, 2006
88
0
0
Originally posted by: apoppin
Originally posted by: yacoub
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.

we don't *know* that it isn't an XTX ... look at what DT posted:
you've probably noticed we have access to a few R600 cards.

One thing that concerns me is that many of our readers may think the 750 MHz core clock on the Radeon HD 2900 XTX, if it even comes to market, will be the final clock. I can almost guarantee you that if anyone decides to bring this to channel, the core clock will get a bump. Keep in mind that Sven managed to overclock the core to 845 MHz on the XT card.

In addition, I was quite pleased with the HD 2900 XT tests. I do believe when overclocked -- and this card seems to have a bit of headroom -- the card does well against the 8800 GTS and starts to encroach on the 8800 GTX territory. In our testing, it doesn't beat a GTX, but then again it won't be a $500 graphics card either. That task was reserved for the XTX and I think it's quite obvious that increasing the memory size and frequencies on the R600 GPU won't usurp the 8800 GTX.

it's pretty *obvious* that it won't matter what clock speeds the XTX gets ... his conclusions are accurate relative to their testing

maybe r660


not quite..

I believe that last bolded comment is referring to the memory frequency only, and does not take core frequency into consideration. The core on the "XT" was 742 MHz, the core of the supposed "XTX" was 750 MHz... 8 MHz? That's pretty much within the margin of error. The only substantial frequency difference between the cards tested by DT was the memory frequency, which clearly had virtually no impact on the results.

So I think your comment that XTX frequencies won't matter is premature. At least IMO.
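A quick back-of-the-envelope check on that margin-of-error point, as a minimal sketch (Python, illustrative only; the two core clocks are the figures quoted above):

# Relative core-clock delta between the two cards DT tested,
# using the 742 MHz ("XT") and 750 MHz (supposed "XTX") figures above.
xt_core = 742.0   # MHz
xtx_core = 750.0  # MHz
delta_pct = (xtx_core - xt_core) / xt_core * 100
print(f"Core clock delta: {delta_pct:.1f}%")  # ~1.1%, well within benchmark noise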
 

olmer

Senior member
Dec 28, 2006
324
0
0
Well, I am glad I listened to Kyle and bought an 8800 GTX for my main gaming rig. Now I guess the outcome of the 2900 XT / 8800 GTS price war will decide what goes into my HTPC/racing-games rig (unless the XT is way too hot for a Fusion). Overall, if the DailyTech benchmarks are correct-ish, it would mean a serious price war in the middle sector.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Wow... You people even know Kombatant? He tells the truth, no more, no less. I'll take his word over DT any day.

Looks like everyone's just enjoying jumping on the AMD hate bandwagon, sort of like with Sony.

Well, i am glad i listened to Kyle

Don't ever do that again.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BlizzardOne
Originally posted by: apoppin
Originally posted by: yacoub
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.

we don't *know* that it isn't an XTX ... look at what DT posted:
you've probably noticed we have access to a few R600 cards.

One thing that concerns me is that many of our readers may think the 750 MHz core clock on the Radeon HD 2900 XTX, if it even comes to market, will be the final clock. I can almost guarantee you that if anyone decides to bring this to channel, the core clock will get a bump. Keep in mind that Sven managed to overclock the core to 845 MHz on the XT card.

In addition, I was quite pleased with the HD 2900 XT tests. I do believe when overclocked -- and this card seems to have a bit of headroom -- the card does well against the 8800 GTS and starts to encroach on the 8800 GTX territory. In our testing, it doesn't beat a GTX, but then again it won't be a $500 graphics card either. That task was reserved for the XTX and I think it's quite obvious that increasing the memory size and frequencies on the R600 GPU won't usurp the 8800 GTX.

it's pretty *obvious* that it won't matter what clock speeds the XTX gets ... his conclusions are accurate relative to their testing

maybe r660


not quite..

I believe that last bolded comment is referring to the memory frequency only, and does not take core frequency into consideration. The core on the "XT" was 742 MHz, the core of the supposed "XTX" was 750 MHz... 8 MHz? That's pretty much within the margin of error. The only substantial frequency difference between the cards tested by DT was the memory frequency, which clearly had virtually no impact on the results.

So I think your comment that XTX frequencies won't matter is premature. At least IMO.

premature ... i don't think so

but to give you the 'benefit of the doubt' ...

... we'll wait a couple of weeks to *confirm* it

and THEN what will you say ?


you must be preparing your "losing" speech
[if not, get right on it ... you will need it]
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: apoppin
Originally posted by: BlizzardOne
Originally posted by: apoppin
Originally posted by: yacoub
Originally posted by: apoppin
Originally posted by: SexyK
Only time will tell I guess. After all the FUD, lies and general BS that AMD and their "followers" have been spreading about R600 and this launch, though, I tend to believe that the DT benchmarks are valid and represent a close approximation of final performance.

we now *know* AMD was lying all the time

or at the very least trying to obfuscate the truth about their seriously under-performing GPU

You can stop now. It was an OEM XT, not an XTX. That is why it performed the same as the XT. Real XTX benchmarks have yet to be seen.

we don't *know* that it isn't an XTX ... look at what DT posted:
you've probably noticed we have access to a few R600 cards.

One thing that concerns me is that many of our readers may think the 750 MHz core clock on the Radeon HD 2900 XTX, if it even comes to market, will be the final clock. I can almost guarantee you that if anyone decides to bring this to channel, the core clock will get a bump. Keep in mind that Sven managed to overclock the core to 845 MHz on the XT card.

In addition, I was quite pleased with the HD 2900 XT tests. I do believe when overclocked -- and this card seems to have a bit of headroom -- the card does well against the 8800 GTS and starts to encroach on the 8800 GTX territory. In our testing, it doesn't beat a GTX, but then again it won't be a $500 graphics card either. That task was reserved for the XTX and I think it's quite obvious that increasing the memory size and frequencies on the R600 GPU won't usurp the 8800 GTX.

it's pretty *obvious* that it won't matter what clock speeds the XTX gets ... his conclusions are accurate relative to their testing

maybe r660


not quite..

I believe that last bolded comment is referring to the memory frequency only, and does not take core frequency into consideration. The core on the "XT" was 742 MHz, the core of the supposed "XTX" was 750 MHz... 8 MHz? That's pretty much within the margin of error. The only substantial frequency difference between the cards tested by DT was the memory frequency, which clearly had virtually no impact on the results.

So I think your comment that XTX frequencies won't matter is premature. At least IMO.

premature ... i don't think so

but to give you the 'benefit of the doubt' ...

... we'll wait a couple of weeks to *confirm* it

and THEN what will you say ?


you must be preparing your "losing" speech
[if not, get right on it ... you will need it]


thems is fighting words!

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ShadowOfMyself
Wow... You people even know Kombatant? He tells the truth, no more, no less. I'll take his word over DT any day.

Looks like everyone's just enjoying jumping on the AMD hate bandwagon, sort of like with Sony.

you mean flop-wagon

kinda like the PS3's failure to sell ... or have exclusives ...

--yeah ... you DO get it

a failure is a failure .... AMD failed to take the performance crown

worst of all it isn't even close ... we have a "reverse DustBuster" ... with AMD in the DustBuster role this time and nvidia in the 9700p role
- and its fans are crying "it IS competitive"

sure ... cause it will be cheap .... and lots of nifty "features"
:roll:

... this AMD debacle was not unexpected --if your eyes weren't tightly shut for the last 3 months

AMD made us wait SIX months ... for this underperformer ?
:brokenheart:

:thumbsdown:

yeah ... more "fighting words", swtethan

except there is *nothing to defend*

AMD has a new performance turkey ... but they really missed Thanksgiving


makes me feel better about ATi's LAST gen card

--too bad to see ATi go out that way
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: n7
The R600 has 64x5 Vec5D units which ... but makes its worst-case scenario a lot worse, with 64 stream ops per clock
...
Nvidia has ... 128 scalar units, in a worst case scenario you'd still issue 128 stream ops.
That is the issue in my opinion. The scalar units are just easier to utilize at a higher level of efficiency than the vector units. The way pixel shaders work, there's a lot of scalar math involved. Even though Vec5D has a higher ultimate GFLOPS rating, it's tough to find situations where you're using mostly vector math in pixel shaders. Geometry shaders may be a different story. They may be more vector-heavy.

Think about it: if you open up your architecture with CTM and give people the power of 64x5 Vec5D units, you end up with an amazing amount of processing power. That's where I think they are focusing.
A lot of scientific apps do mostly matrix multiplication, so the R600 architecture would rock in such a situation.
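To put rough numbers on the utilization argument above, here is a minimal sketch (Python, illustrative only; per-clock issue rates from the unit counts quoted in this post, actual clock speeds ignored):

# Best/worst-case shader op throughput per clock:
# 64 five-wide Vec5D units (R600) vs 128 scalar units (G80).
r600_units, r600_width = 64, 5
g80_units = 128

r600_best = r600_units * r600_width  # all 5 lanes filled: 320 ops/clock
r600_worst = r600_units              # purely scalar code: 64 ops/clock
g80_any = g80_units                  # scalar units: 128 ops/clock regardless

print(f"R600 best case:  {r600_best} ops/clock")
print(f"R600 worst case: {r600_worst} ops/clock")
print(f"G80 any case:    {g80_any} ops/clock")

That is the whole asymmetry in two lines: R600's ceiling is far higher, but its floor is half of G80's, and pixel-shader code full of scalar math spends a lot of time near the floor.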

 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
I believe that last bolded comment is referring to the memory frequency only, and does not take core frequency into consideration. The core on the "XT" was 742 MHz, the core of the supposed "XTX" was 750 MHz... 8 MHz? That's pretty much within the margin of error. The only substantial frequency difference between the cards tested by DT was the memory frequency, which clearly had virtually no impact on the results.

What are you talking about, "margin of error"? It's a clock speed; it either is or it isn't. As someone proved in the DT comments, AMD has a history of clocking their bleeding-edge high-end card not much higher than the card right before it.

So I think your comment that XTX frequencies won't matter is premature. At least IMO.

To further debunk your little defense of AMD: DT overclocked the XT extensively earlier and claimed that there was VERY VERY little difference in the results.

People are really grasping at nothing here. DT released some results that didn't go along with what everyone wanted and are now immediately being accused of libel. Get a grip, people!! If you look at Kristopher's responses in the thread, they clear up almost every single argument people make!

-Kevin
 

HopJokey

Platinum Member
May 6, 2005
2,110
0
0
Whether or not the DT benchmarks are totally representative of the released R600 XT/XTX, I think it is safe to say that the R600 will not blow away Nvidia's current offerings. Arriving six months or so later, the R600 should trounce the older 8800 GTX card, but it doesn't; at best it is barely on par with it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gamingphreek
I believe that last bolded comment is referring to the memory frequency only, and does not take core frequency into consideration. The core on the "XT" was 742 MHz, the core of the supposed "XTX" was 750 MHz... 8 MHz? That's pretty much within the margin of error. The only substantial frequency difference between the cards tested by DT was the memory frequency, which clearly had virtually no impact on the results.

What are you talking about, "margin of error"? It's a clock speed; it either is or it isn't. As someone proved in the DT comments, AMD has a history of clocking their bleeding-edge high-end card not much higher than the card right before it.

So I think your comment that XTX frequencies won't matter is premature. At least IMO.

To further debunk your little defense of AMD: DT overclocked the XT extensively earlier and claimed that there was VERY VERY little difference in the results.

People are really grasping at nothing here. DT released some results that didn't go along with what everyone wanted and are now immediately being accused of libel. Get a grip, people!! If you look at Kristopher's responses in the thread, they clear up almost every single argument people make!

-Kevin

hey Kev,

how long have you been here? ... 4 years ... do you remember when the *reverse* happened here?


NV30 vs r300

ultra dustBuster vs 9700p

*same thing* identical responses ... like C&P from old posts

IF the 8800GTX was TWICE as fast as XTX - across the board in every bench and at every site - you'd *still* have AMD fans nitpicking benches and saying "IF they Only tested with blahblahblah ... " ... and touting up their "IQ" [which is extraordinarily low, btw] and cool little "features" that have nothing to do with gaming or performance

--and NEVER FORGET we ALSO have AMD's viral marketing at work here on this forum
[and *bitter* amd fans]

--OTOH nvidia's fud-spinners get a Well-Deserved "break" ... thanks to their engineering/marketing of G80 and r600's flop.
:shocked:



there is a LOT of high-fiving and celebrating at nvidia ... they don't even need to *answer* r600

profits soar

prices stay high

bad for us
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
IF the 8800GTX was TWICE as fast as XTX - across the board in every bench and at every site - you'd *still* have AMD fans nitpicking benches and saying "IF they Only tested with blahblahblah ... " ... and touting up their "IQ" [which is extraordinarily low, btw] and cool little "features" that have nothing to do with gaming or performance

--and NEVER FORGET we ALSO have AMD's viral marketing at work here on this forum
[and *bitter* amd fans]

--OTOH nvidia's fud-spinners get a Well-Deserved "break" ... thanks to their engineering/marketing of G80 and r600's flop.

Humm... What do you mean by extraordinarily low IQ? Are you just using a random example? We know nothing about R600 IQ (image quality) other than the fact that it has 24xAA.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
apoppin - I sincerely doubt that we have many 'bitter AMD fans' or 'bitter' nVidia fans on this forum. What we do have are a lot of 'bitter' people who feel that they have been treated with little or no respect when posting on this board. I can't count how many times I've seen arguments get personal here, and they shouldn't. We are discussing ideas, not our children. I don't immediately become worthless if I get an idea wrong. Nor do I immediately become 'king of the hill' if I get one right.

Let's not mistake bitterness over rude behavior for bitterness over how a particular company performs.

Similarly, I'd be hard pressed to name many posters that I thought were 'fanboys' in the sense that everyone seems to use the word: mindless followers of a particular company at the expense of all reason and reliable thought. There just aren't many on AT who are actually like that. Most folks seem to lean towards one company or another, probably because of past business dealings, but very few folks fall into the 'fanboy' category.

The real purpose for using the term 'fanboy' is to be able to utterly disregard any and all points made by someone with whom you disagree. After all, they are simply 'mindless followers of a particular company at the expense of all reason and reliable thought'. You can't trust what *those* people say.


You know, I wasn't around when it was revealed that 'viral marketing' took place on these boards. Between that revelation and the already existing belief in die-hard 'fanboyism', there isn't much chance that relaxed but insightful discussions about the GPU industry will be common on these boards, and that's too bad.

In all probability, there are only a few (a very few) posters who honestly fit the mold of a 'fanboy' or who consciously are espousing FUD. The rest are just giving their opinion about something. If that opinion doesn't make sense to someone, by all means offer a different one, but there is no need to deride anyone in the process. Calling someone a 'fanboy' and responding to them in a rude fashion is what creates the bitterness.



On topic - if this is an OEM XT, that would explain some, but not all, of the card's failings. If the architecture of the XTX is precisely the same, then, given that they pushed the XT's clocks and got mediocre results, I'd be surprised if we see much improvement later.

Here's my thoughts (thoughts, not baby children) on the status of things right now.

The current released HD 2900 XT architecture is what was expected, with the XTX to include more (and higher-rated) memory as well as a bump in clockspeed. I looked at the 2900 XT overclocking in detail. The card does seem to need a balance between memory and core overclocks to perform efficiently. For whatever reason, in the released benchmarks, the 2900 XT performed better when the ratio of core to memory overclock was a little more than 1:2 (almost twice as much of a bump, in percentage terms, for memory as for core). For a part which theoretically shouldn't be limited by its memory subsystem, this seems awfully odd, but it does indicate that some benefit may be possible when moving to GDDR4. The question is how much. I buy the argument that with the 64 Vec5 shader system, the bad scenario for the R600 is still worse. So while bumps in clocks and memory may get us a little performance boost, I doubt we'll see that much. DX10 is still the wildcard; if it uses the Vec5 system better, then you might actually get much, much better scaling out of the R600 architecture.
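To put rough numbers on that 1:2 balance, a minimal sketch (Python; only the 742 MHz stock and 845 MHz overclocked core figures come from this thread, and the memory number is extrapolated from the ratio described above, so treat it as illustrative):

# Core overclock from the figures quoted in this thread, plus the memory
# bump implied by the "little more than 1:2" core:memory ratio above.
stock_core, oc_core = 742.0, 845.0  # MHz
core_pct = (oc_core - stock_core) / stock_core * 100  # ~13.9%
mem_pct = 2 * core_pct                                # implied: ~28%
print(f"Core overclock: {core_pct:.1f}%")
print(f"Implied memory overclock for balanced scaling: ~{mem_pct:.0f}%")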

Second, my guess is that these current parts would have spawned the XTX (at naturally higher core and memory frequencies) had the performance been there. Indeed, that's what everyone has been assuming: both an XT and an XTX part on this process node and with this architecture. But the performance (thus far) just isn't there. It isn't. I'm beginning to wonder if the 'surprise' is something that has already been discussed ad nauseam on these boards: AMD going to 65nm, frantically working on the refresh at that node and planning to dub that part the 'XTX'. Still, they've gone down the Vec5 path. If they ditch that with their next part, but it then ends up being better for DX10, that doesn't make much sense. Doubling the Vec5 shaders to 128 also seems like it would cost too much money. So I'm a little cautious about what kind of 'surprise' AMD could have. Still, nVidia went from the 7900 GTX to the 8800 GTX in one generation. ATI had the 9700 Pro. So it's possible; it just doesn't seem likely.

At any rate, I don't think any of this is good for AMD at the moment. They seem to have made a bad judgment in R600's design. Best case, it was a bad judgment relative to the anticipated release of DX10 games. Worst case, it was just a bad judgment, period. Middle-of-the-road case (as some have mentioned): AMD did this deliberately to be able to leverage the Vec5 system in markets other than gaming (drop-in lab supercomputing).

[shameless bad joke]

I'm sure I overheard Henri Richard the other day saying, "If I say it's safe to make a Vec5 GPU, then it's safe to make a Vec5 GPU!"

[/shameless bad joke]

AMD is a little like the surfer on the beach in Apocalypse Now.


[edited for Vec5 mistake]
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: zephyrprime
Originally posted by: n7
The R600 has 64x5 Vec5D units which ... but makes its worst-case scenario a lot worse, with 64 stream ops per clock
...
Nvidia has ... 128 scalar units, in a worst case scenario you'd still issue 128 stream ops.
That is the issue in my opinion. The scalar units are just easier to utilize at a higher level of efficiency than the vector units. The way pixel shaders work, there's a lot of scalar math involved. Even though Vec5D has a higher ultimate GFLOPS rating, it's tough to find situations where you're using mostly vector math in pixel shaders. Geometry shaders may be a different story. They may be more vector-heavy.

Actually, I've written shaders myself and looked at some shaders used by games, and... vector math is at least as common as scalar math, if not more so. Data like texture coordinates, vertex positions, and color values are all represented by 2-, 3-, or 4-component vectors. True, with scalar shaders it's easier to get better efficiency and utilization, but that's not to say vector shaders are inappropriate for games. Vector shaders are more dependent on how well the compiler optimizes the code, but seeing how the R600 vec5 architecture supposedly has much more theoretical computational power than the G80 scalar architecture, there must be something else causing the poor performance.
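For readers who haven't written shaders, a tiny numeric sketch (Python/NumPy, purely illustrative) of the point above: typical per-pixel data such as colors and texture coordinates are 2- to 4-component vectors, and most shader math touches all components at once.

import numpy as np

# Component-wise vec4 math of the kind pixel shaders do constantly.
# A vec5 unit can issue this in one slot (one lane idle); a purely
# scalar architecture issues it as four independent ops.
color = np.array([0.8, 0.6, 0.4, 1.0])  # RGBA
light = np.array([1.0, 0.9, 0.7, 1.0])  # light color
n_dot_l = 0.75                          # scalar result of a dot product

lit = color * light * n_dot_l           # four lanes of work per operator
print(lit)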
 