HD 2900XTX Benches

Page 6 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Aikouka
Originally posted by: apoppin
clean up the 'mess' in their company ... no big severance packages, either
- not financial


AMD needs to blame it on the Apprentice.

What I think DailyTech needs to do is explain what their benchmarks were. People are complaining about differences between benchmarks and I checked the settings listed and both are the same. So it's quite possible that DT did not attempt to run the same exact test as they did in the XT preview (note that it's not a review, but a preview.. they don't try to review products but give an idea).

i guess you want the rest of the viewers to *bury* XTX

just hang on ... for the *official* conclusions ...
it will be far worse, i expect


we now have the apparent answer to my question:
--"Where's the Beef?"

there really isn't any ... we have a stale, warmed-over giant fluffy moldy green bun
-- with a giant *question mark?* as to DX10 performance ... next year
who gives a damn about next year's DX10 performance? :|

*nothing* about r600 here fills me with any joy ....
---i take a very [very] slight consolation that my long-term analysis of r600's delay proved to be dead-on

unfortunately ... i so wanted to be wrong ... this time



 

coolpurplefan

Golden Member
Mar 2, 2006
1,243
0
0
Originally posted by: TecHNooB
Ah well. All aboard the S.S. NVidia.

Not me, I'm going for image quality and HD of the R630.

If it beats X1950 Pro with good drivers, DX10 and Shader Model 4.0, it's all I need. I'll be able to complete my second machine and will use the current one as a backup.
 

GTaudiophile

Lifer
Oct 24, 2000
29,776
31
81
Well, here's a thought. If Crysis Demo comes out in all its full-blast, max-settings, DX10 glory and R600 wins by huge margins, then ATi has won the day.

Perhaps R600's killer app is DX10.

Otherwise, get ready for CPU and GPU monopolies, folks.
 

swtethan

Diamond Member
Aug 5, 2005
9,083
0
0
Originally posted by: ArchAngel777
Well, I am quite disappointed myself. I was really hoping that the R600 would at least equal the 8800 series. Oh well, I am just disappointed with performance in general. I purchased my 7800 GTX almost two years ago now and in most games, the 8800GTX is not even quite 2X the speed of my current card. I dunno, maybe I just expect too much. But in two years' time, I really expected more performance from either nVidia or AMD/ATi. This does indeed disappoint me. With the way things are going, I am pretty sure my rig will last me for another year at least. So, if anything, R600 postponed my system rebuild another year with its lacklustre performance.

http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=279

24fps vs 57fps is not 2x?

http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=288

67 vs 113


http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=298

7 vs 28
(4x!!!!)

http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=294

26 vs 69

You don't have an 8800GTX, so you cannot say that it wouldn't double performance in your system.
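For reference, the speedups implied by the fps pairs quoted above work out as follows (a quick sketch; the numbers are copied from the truncated chart links above, not re-measured):

```python
# Speedup ratios for the 7800 GTX -> 8800 GTX fps pairs quoted above,
# one pair per Tom's Hardware chart.
pairs = [(24, 57), (67, 113), (7, 28), (26, 69)]

for old_fps, new_fps in pairs:
    print(f"{old_fps} -> {new_fps} fps: {new_fps / old_fps:.2f}x")
```

So three of the four charts show 2x or better, and one shows roughly 1.7x.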
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: n7
Originally posted by: n7

I suspect the poor performance is due to the stream processors layout for the AMD cards vs. nV's, but i can't find anything stating just how it's set up.

From a poster commenting on the DT article:

The R600 has 64x5 Vec5D units, which means it can issue a maximum of 320 stream ops per clock, but makes its worst-case scenario a lot worse, at 64 stream ops per clock (for lack of a better term). You can think of that in the same manner as SIMD units in our current processors: they can deliver huge amounts of processing power if, and only if, used correctly and optimized accordingly; otherwise we see no gains.

In my opinion AMD/ATI made a design compromise, they used this approach as it could prove to be way better in the dx10 world, and in a much more interesting way, in the GPGPU world.

Think about it, if you open up your architecture with CTM and give the people the power of 64x5 vec5d units you end up with an amazing amount of processing power. That's where I think they are focusing.

Nvidia has a much more favorable place in the gaming world. If you have 128 scalar units, in a worst case scenario you'd still issue 128 stream ops (all else constant, and given you have the bandwidth). But your best case scenario isn't that good.

I believe they delayed it because they were expecting dx10 games (of course, this is just speculation on my part). And I hope, for their sake, that it performs a lot better in that world.

Still, if I am somewhat right, drivers could provide better optimization for shader programs that aren't made with a simd architecture in mind, but then again, I could be entirely wrong.


This is what i was thinking too.

Ugh... I thought I read something similar to this, but nothing that detailed. If this is true, R600 has more in common with NV30 than just being late and underwhelming. It seems very similar to NV banking on vendor-specific programming for performance, going with a 1x2 pixel pipe instead of the typical 1x1 design. Best case, I think NV30 had 8 pixel pipes, but worst case it only had 4, which was less than the 9600 Pro. It's been a while, so the actual pipe numbers may be a bit off.

That pretty much explains it though... R600 at its worst case really has only 64 shaders, less than both the GTS and GTX. The relatively high clock speeds keep it competitive, but the GTX, with 2x as many worst-case shaders, is just too much to overcome. Still, I'm interested to see why the XT doesn't scale well at higher core clock speeds, unless the shader clock scales independently and becomes the bottleneck. We might see some improvements if developers take advantage of the extra vector streams, but given the history of PC gaming in general, I wouldn't bank on it.
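The best-case/worst-case issue-rate argument above can be sketched numerically. This is a simplification using the unit counts quoted in the thread (64x5 VLIW-style units for R600, 128 scalar units for G80), not verified specs; real utilization also depends on the shader compiler and the instruction mix:

```python
# Per-clock shader-op bounds for a wide-vector design vs. a scalar design.
def issue_bounds(units, slots_per_unit):
    best = units * slots_per_unit  # every slot filled (perfectly vectorizable code)
    worst = units                  # one slot per unit filled (scalar-heavy code)
    return best, worst

r600_best, r600_worst = issue_bounds(64, 5)   # 64x5 Vec5D units, as quoted
g80_best, g80_worst = issue_bounds(128, 1)    # 128 scalar units, as quoted

print(f"R600: best {r600_best}, worst {r600_worst} ops/clock")
print(f"G80:  best {g80_best}, worst {g80_worst} ops/clock")
```

The scalar design gives up the high ceiling (128 vs. 320 ops/clock) in exchange for a floor twice as high (128 vs. 64), which is exactly the trade-off the posts above describe.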

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: GTaudiophile
Well, here's a thought. If Crysis Demo comes out in all its full-blast, max-settings, DX10 glory and R600 wins by huge margins, then ATi has won the day.

Perhaps R600's killer app is DX10.

Otherwise, get ready for CPU and GPU monopolies, folks.

that IS the "last hope"

but it doesn't look that way ... we got the 'hint' all along that r600 was delayed for "performance" reasons and then 'the family launch'

Crysis isn't "full" dx10 either ... expect *those* after g80 has been refreshed - twice ...
next year

it doesn't perform well - ONLY in comparison to g80

but then 2nd place is *last* place in this race

and i also predicted that we would *know* about r600 - THIS week ... that NDA was ridiculous
:lips:
 

terentenet

Senior member
Nov 8, 2005
387
0
0
Originally posted by: ArchAngel777
Well, I am quite disappointed myself. I was really hoping that the R600 would at least equal the 8800 series. Oh well, I am just disappointed with performance in general. I purchased my 7800 GTX almost two years ago now and in most games, the 8800GTX is not even quite 2X the speed of my current card. I dunno, maybe I just expect too much. But in two years' time, I really expected more performance from either nVidia or AMD/ATi. This does indeed disappoint me. With the way things are going, I am pretty sure my rig will last me for another year at least. So, if anything, R600 postponed my system rebuild another year with its lacklustre performance.

What? Are you playing your games at 800x600? Because at anything above 1600x1200, a single 8800GTX rapes a 7800GTX SLI setup. Stop making BS assessments.
The 8800GTX fixed the IQ issues of the 7 series and in most parts doubled the performance. Performance for the 8 series is also likely to improve with more mature drivers.
Until now they haven't even fixed all the issues. Once that's done, drivers will be tweaked to offer more performance.
 

AnotherGuy

Senior member
Dec 9, 2003
678
0
71
apoppin, you already said it like 4 times that that's what you predicted... we got it already... and you were not the only one...

Now... let's look at it again... $400 for the XT for performance between the GTS and GTX... is that worth it or not? I think it is.
And also, what will we have for the $300 price bracket? I hope for performance a little under the GTS but with 512MB of RAM... that'd be a sweet mainstream spot... Still, nV has the Vista drivers to fix... so Vista users will go ATI.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: swtethan
Originally posted by: ArchAngel777
Well, I am quite disappointed myself. I was really hoping that the R600 would at least equal the 8800 series. Oh well, I am just disappointed with performance in general. I purchased my 7800 GTX almost two years ago now and in most games, the 8800GTX is not even quite 2X the speed of my current card. I dunno, maybe I just expect too much. But in two years' time, I really expected more performance from either nVidia or AMD/ATi. This does indeed disappoint me. With the way things are going, I am pretty sure my rig will last me for another year at least. So, if anything, R600 postponed my system rebuild another year with its lacklustre performance.

http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=279

24fps vs 57fps is not 2x?

http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=288

67 vs 113


http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=298

7 vs 28
(4x!!!!)

http://www23.tomshardware.com/graphics_...elx=33&model1=706&model2=713&chart=294

26 vs 69

You don't have an 8800GTX, so you cannot say that it wouldn't double performance in your system.

1) Thanks for the correction.

2) Tom's Hardware recently updated that list. When I checked 3 weeks ago, in almost every benchmark at the highest resolution, the GTX did NOT double the performance of my card. With the new list, I am actually interested in purchasing an 8800 GTX now.

Anyway, glad you brought this to my attention.
 

ObscureCaucasian

Diamond Member
Jul 23, 2006
3,934
0
0
This will probably be another 1800; hopefully the R650 will give some performance increase. ATI just needs to start releasing on schedule, and not 5 months late, which is most of a product cycle in this business.


EDIT: I believe the lower-end cards will sell very well to OEMs due to the HDMI integration.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: AnotherGuy
apoppin, you already said it like 4 times that that's what you predicted... we got it already... and you were not the only one...

Now... let's look at it again... $400 for the XT for performance between the GTS and GTX... is that worth it or not? I think it is.
And also, what will we have for the $300 price bracket? I hope for performance a little under the GTS but with 512MB of RAM... that'd be a sweet mainstream spot... Still, nV has the Vista drivers to fix... so Vista users will go ATI.
i know ... you hate to hear it
... try to justify it is "good value" ... the nvidia boys bought the DustBuster ... just like the fans will buy r600 and console themselves with "features" they don't really like.

i guess i'm sorry for being right
:roll:

and this thread and all like them are turning into *Pity*
--or *disgust* and "all apologies" for the *FLOP* that r600 is.

i don't need to post further ... you DO know what i predicted

and thank-you very much for the public acknowledgment
:thumbsup:



i'll go quietly and with my head held high


 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I still find it hard to believe that the xtx has only a 5mhz core clock advantage over the xt. All that extra memory and bandwidth won't help much if the core is not fast enough. As bad as that makes the xtx look, it sure makes the xt look much better given the right price.
 

AnotherGuy

Senior member
Dec 9, 2003
678
0
71
Originally posted by: apoppin

i know ... you hate to hear it
... try to justify it is "good value" ... the nvidia boys bought the DustBuster ... just like the fans will buy r600 and console themselves with "features" they don't really like.

My first post in this thread was 'what a disappointment...', so no, you got the wrong guy for a fanboy...
I will say it though: I prefer ATI in general over NV, only for the fact that ATI has been doing a better job with their cards when it came down to eye candy and higher resolutions... that's for the last 2 generations at least... while nV had the raw speed mostly...
I also liked ATI because of the types of games I play mostly... which is DX, vs. the OpenGL that nV favors... but if nV got it right this round and scales better both at higher res and in general, then I'll go with nVidia... I don't have stock in AMD, so I don't care that much.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Originally posted by: n7
Originally posted by: n7

I suspect the poor performance is due to the stream processors layout for the AMD cards vs. nV's, but i can't find anything stating just how it's set up.

From a poster commenting on the DT article:

The R600 has 64x5 Vec5D units, which means it can issue a maximum of 320 stream ops per clock, but makes its worst-case scenario a lot worse, at 64 stream ops per clock (for lack of a better term). You can think of that in the same manner as SIMD units in our current processors: they can deliver huge amounts of processing power if, and only if, used correctly and optimized accordingly; otherwise we see no gains.

In my opinion AMD/ATI made a design compromise, they used this approach as it could prove to be way better in the dx10 world, and in a much more interesting way, in the GPGPU world.

Think about it, if you open up your architecture with CTM and give the people the power of 64x5 vec5d units you end up with an amazing amount of processing power. That's where I think they are focusing.

Nvidia has a much more favorable place in the gaming world. If you have 128 scalar units, in a worst case scenario you'd still issue 128 stream ops (all else constant, and given you have the bandwidth). But your best case scenario isn't that good.

I believe they delayed it because they were expecting dx10 games (of course, this is just speculation on my part). And I hope, for their sake, that it performs a lot better in that world.

Still, if I am somewhat right, drivers could provide better optimization for shader programs that aren't made with a simd architecture in mind, but then again, I could be entirely wrong.


This is what i was thinking too.

If that is true, they made a terrible design decision for today's market, which will be 99% DX9 titles for the next 12-18 months. By then the R600 and G80 will be nothing.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
I'm just so sad! Where's Gstanfor? I need a shoulder to cry on! Grats Nvidia my next card will be one of yours.
 

TecHNooB

Diamond Member
Sep 10, 2005
7,460
1
76
Originally posted by: redbox
I'm just so sad! Where's Gstanfor? I need a shoulder to cry on! Grats Nvidia my next card will be one of yours.

Me too T__T; Sad day for ATI. At least we can look forward to NVidia's next offering. The 8950GTX is going to be mighty sweet! Hopefully they slap two GTX cores together instead of two GTS cores.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Sigh, if these performance benchmarks are true, AMD/ATI is in big, big trouble in the discrete PC graphics card business. They've outdone themselves with a late, power-hungry (they invented the new 8-pin GPU connector for this!?!?) bust, while the 8800GTX has been a solid, true next-gen offering available for some time now.

If I didn't feel almost utter apathy for the entire PC gaming industry right now, I would really be upset for ATI, and about the development of the PC graphics world into essentially a one-horse race. As it is, AMD/ATI are going to have to hit a home run with their integrated CPU/GPU offering in a couple of years (and given their track record apart, it will take a miracle of engineering harmony for that to happen), or face a serious loss of sales.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
As long as it drives nVidia's prices down and sports a somewhat different feature set for visuals, I couldn't care less which company is performing better. At least we have some kind of competition now. Maybe not XTX vs. GTX, but certainly XT vs. GTS and, for some, XT vs. GTX.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Genx87
Originally posted by: n7
Originally posted by: n7

I suspect the poor performance is due to the stream processors layout for the AMD cards vs. nV's, but i can't find anything stating just how it's set up.

From a poster commenting on the DT article:

The R600 has 64x5 Vec5D units, which means it can issue a maximum of 320 stream ops per clock, but makes its worst-case scenario a lot worse, at 64 stream ops per clock (for lack of a better term). You can think of that in the same manner as SIMD units in our current processors: they can deliver huge amounts of processing power if, and only if, used correctly and optimized accordingly; otherwise we see no gains.

In my opinion AMD/ATI made a design compromise, they used this approach as it could prove to be way better in the dx10 world, and in a much more interesting way, in the GPGPU world.

Think about it, if you open up your architecture with CTM and give the people the power of 64x5 vec5d units you end up with an amazing amount of processing power. That's where I think they are focusing.

Nvidia has a much more favorable place in the gaming world. If you have 128 scalar units, in a worst case scenario you'd still issue 128 stream ops (all else constant, and given you have the bandwidth). But your best case scenario isn't that good.

I believe they delayed it because they were expecting dx10 games (of course, this is just speculation on my part). And I hope, for their sake, that it performs a lot better in that world.

Still, if I am somewhat right, drivers could provide better optimization for shader programs that aren't made with a simd architecture in mind, but then again, I could be entirely wrong.


This is what i was thinking too.

If that is true, they made a terrible design decision for today's market, which will be 99% DX9 titles for the next 12-18 months. By then the R600 and G80 will be nothing.

Typically I don't agree with you because your posts often display a certain 'bias', but in this case you're bang on. Poor performance for DX9 titles is a serious blow because DX10 games are just not coming en masse in the near future. We've got Halo 2 (been there, done that - plus doesn't it work with DX9 cards, just requiring Vista?), a possible Crysis patch (which, if their unsupported and buggy SM 3.0 patch for Far Cry is any indication, will be late to the table and buggy), and I don't know what else. Just a total disaster for ATI/AMD (the HD 2900XT could be OK if it's priced at $399, but the power consumption figures are ridiculous).

Honestly, what's the incentive to go ATI right now, other than future performance? And future performance has never been a smart basis for buying, because there's always something new on the horizon, and ATI/NVIDIA have been on a 6-month refresh cycle for years now.

-----------------

As an aside, it seems that in general the gaming industry is in a lull, as development teams are all working on "next-gen" looking games, which means even longer development times than before...

Not to mention an increase in console exclusive (or cross-platform) titles, such as Gears of War and many more Unreal 3 engine games, which appear to be in development for Xbox360, PS3 (in many cases) as well as PC. Ever since Microsoft introduced the original Xbox, it has stolen a lot of PC gaming's thunder, and the mockery that has become the expensive PC graphics card race has really turned a lot of people off (myself included). [though MS has their own problems with the ****** life expectancy of Xbox360 systems, and PS3 reliability is still to be determined]
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Hmm, wait for the real reviews to decide.

However, to add insult to injury, the 8800 Ultra (a G80 on steroids) will release soon, and that would create an even bigger gap compared with DT's leaked benches.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
I hope you guys have had the mind to actually check those results against benchmarks from other sites, because I'm just using AT's review of the 630MHz/2000MHz XFX 8800GTX XXX.

If the settings haven't changed from DailyTech's 8800GTS 640MB review vs. the 2900XT, it somehow shows the 8800GTX having 2x the performance compared to the AT review.

It also shows the Half-Life 2 performance at about half the fps from the AT review. There are discrepancies with the FEAR benchmark also, and I can't find a Company of Heroes benchmark, but come on guys, can't you see something is really wrong with these benchmarks here? Until I see some sort of real proof from a real review site, I won't say whether it's crap or good.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: TecHNooB

Me too T__T; Sad day for ATI. At least we can look forward to NVidia's next offering. The 8950GTX is going to be mighty sweet! Hopefully they slap two GTX cores together instead of two GTS cores.
It won't be 2 GTX cores. The 7950 was 2 mobile cores, I believe, so don't expect the 8950 until the mobile G80s start popping up and/or a refresh is around.
 