2900XT vs 8800GTS 320


evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: n7
Hard OCP's reviews = hilarious nV biased joke.

Their initial review's results didn't match up with basically all the other reviews out there.

Now they must retest to prove their bias apparently?

I'm not going to even read their garbage.

We all know here that HardOCP is biased towards nVidia, like Wreckage. All the other websites state that the Radeon HD 2900XT falls between the GeForce 8800GTS 640MB and the 8800GTX on average; sometimes it is slower than an 8800GTS and sometimes as fast as or a bit faster than a GTX. And as the drivers mature, we will see performance improvements.
 
Jun 14, 2003
10,442
0
0
Originally posted by: ArchAngel777
Originally posted by: ShadowOfMyself
Funny how HOCP comes up with a completely different conclusion than tweaktown or xbit labs or any other recent review site... I wonder why :roll:

Add that to the fact that Wreckage isn't exactly non-biased... Nonetheless, I am happy for another review, even if the review is shoddy and pathetic! I am just glad people ARE retesting these cards. I am pretty much waiting on AT and TH to do a retest before I judge whether the card can hold its own.

I will also trust apoppin's review...

toms are a bag of chit too
 

jjzelinski

Diamond Member
Aug 23, 2004
3,750
0
0
I can't believe I just read this whole thread. Must be a slow day...

Umm... to justify this post, I like nvidia, I like ATI, I like the 8800GTS, ATI got rolled this round
 
Jun 14, 2003
10,442
0
0
Originally posted by: munky
I never liked their review style. That's like testing a civic vs. a mustang, and saying "to get enjoyable driving out of the civic we had to rice it up a bit with things like NOS, new suspension and a turbo, and look - the civic is faster!"

[cheapshotaboutamerican"sports"carsandcorners]

oh you dont need to do that, just show both cars to a few sweeping bends, and the civic would win [/cheapshotaboutamerican"sports"carsandcorners]

Hmm, seems there's nothing overly questionable about the numbers, but it kinda seems like they kept the 320 in its comfort zone, where basically it is just as fast as the 640.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: gramboh
What a useless review style. How is it relevant having someone judge what is playable/looks nice? Raw FPS data at CONSTANT detail settings with various resolution/AA/AF like Anandtech does is the ONLY relevant measure. I don't load Hardocp.com because I don't want to give them ad revenue.

:::sigh::: Please look more carefully. The detail settings are NOT constant; they always change according to performance ability at a given resolution. I think that most people calling this review shoddy, or crap, really don't understand what it is. When I first saw this review style last year, I was like "WTF is this crap?", so I understand if others feel the same way. But now that I am used to it, I like it.

It shows a buyer what performance can be expected. If somebody wants to know if they can play a certain game at certain settings, the review helps. They can ask, "What detail settings can I run on this card, for this game, at this resolution, at playable fps?"

More important than MAX framerate is AVERAGE framerate. And more important than AVERAGE framerate is MINIMUM framerate. If you guys don't understand the review, take a few extra minutes and go through it again. Eventually you'll say, "Hmmm. Now I see what they are trying to show us."
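The max/average/minimum distinction above is easy to see with a toy example. A rough sketch (the frame-time numbers below are invented for illustration, not taken from any review):

```python
# Why MIN framerate matters more than MAX or AVERAGE: a run that is
# smooth on average can still hitch badly on a few heavy frames.

def fps_stats(frame_times_ms):
    """Return (min, avg, max) FPS from per-frame render times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps), max(fps)

# Mostly-smooth run (~60 fps) with two heavy frames, e.g. a texture swap stall:
frame_times = [16.7] * 58 + [50.0, 40.0]

lo, avg, hi = fps_stats(frame_times)
print(f"min {lo:.0f} fps, avg {avg:.0f} fps, max {hi:.0f} fps")
```

The max and average both look fine here, but the minimum exposes the stutter the player actually feels, which is exactly what a "highest playable settings" style review is trying to capture.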
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: keysplayr2003
*snip*

Been there, done that. They are trying to show us that BFG has the best cards, bar none.
It's hilarious how they don't even test another brand... They are the only site doing it; reeks of PAID.

But still, look at the numbers yourself, compare the first review and the new one... And let's not even mention how they chose the exact games in which the 2900 has problems (except Oblivion, where it should be much faster).

Oh, and I quickly read through the forum thread over there, and a user with a 2900 said the STALKER results are BS too; he gets much higher than that... What more evidence do you need? HOCP to come out and state publicly that they are bought? Won't happen.

Another hilarious thing: JethroBodine (Rollo, as everyone knows) posted that review on Rage3D... What a surprise.
If everyone just ignored that site, I would too, but I won't shut up until people stop crediting them for helping Nvidia sell cards.

And just for the record, when FiringSquad, TechReport, Guru3D or Anandtech come up with a review showing similar results, I'll just shut up about HOCP for once, but we all know that's not gonna happen.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ShadowOfMyself
*snip*

It could easily be said, from this post, that you are biased against HOCP. Should we now take your word for it that they are? Because that's what you're asking us to do: to take your word for it that HOCP is paid for its results. Again, you need to show us what they did wrong, or what they did that was geared to skew any results. And I don't know how you can compare AT, Tom's and others to HOCP. The methods are different, are they not? For example, we don't know how much grass draw distance was used on any of the other review sites, do we? That's just one example in a certain game. Am I mistaken? I need to go back and look for myself again.

When apoppin gets his rig underway, he and I will try to mimic HOCP's benches for whatever games we have in common. I don't expect there will be any differences, except for the fact that we don't have 4GHz CPUs and I have a 640GTS instead of a 320GTS.

We will see. I'll make a spreadsheet. We should probably start a new thread just for this.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: BS
Oh boy another H review bashing the 2900XT with non mature drivers.

At what point would you say drivers become mature? 2 months? 6? A year?
I agree about the drivers and AMD should be able to get more performance out of it. With all that bandwidth available to the 2900, you would think a simple driver revision would literally "wake" the card up. But it looks like that core might be designed in a way that current games cannot take advantage of. Maybe down the road a ways, devs will utilize the superscalar shaders efficiently. After reading the architectural differences between G80 and R600, and how they handle data, I can see how this presents a real challenge for the driver team at AMD.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: keysplayr2003
*snip*

Stop being so nVidia biased. It takes quite a while for a driver to mature; don't you remember that certain driver fixes improved performance greatly in many scenarios? Why don't you just go read Anandtech's "Catalyst, under the knife" or "Forceware, under the knife"? You will see how long it took for nVidia to extract the maximum performance from the GeForce 6 series; the Radeon X800 series didn't take that long because it was based on the older R300. Don't you remember the Doom 3 fix replacing the texture lookup table with math? Or the Far Cry fix removing a bottleneck in the vertex shader? The R600 (and even the R5xx) series have a programmable memory controller which the driver can manipulate to improve efficiency in different games; after all, not all games access the RAM in the same way. The Radeon HD 2900XT is faster than a GTS 640MB and slower than a GTX, period.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
*snip*
FACT: i AM biased against HardOCP reviews ... and it comes from reading their reviews and conclusions very carefully. i just can't "prove" their bias ... yet

. . . and maybe i can get my CPU to near 4GHz

i DID get the Thermalright Ultra-120 Extreme CPU cooler and the Scythe S-FLEX SFF21F 120mm fan ... so hopefully at least 3GHz ... i can easily manage 2.81GHz with my crippled MB and stock Intel HSF

EDIT: you're right ... HardOCP does things no other reviewer does ... like messing with sun-shadows and shadows on grass ... they have a LOT less effect than they make it out to be ... i *think* they took the worst possible combination for the HD-XT and compared it with the best possible combination of the 320 GTS to deliberately and totally screw [not just skew] AMD and paint nvidia in the best possible light ... just like Wreckage does.

*ME* ... i just want the best damn [bang-for-buck] GPU for my rig ... HD-XT or GTS/X

*strangely* my x1950p runs STALKER with *everything maxed* better than the HD-XT or the GTS ... and i also have everything maxed [grass distance] in Oblivion for better fps than they get
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: ShadowOfMyself

And just for the record, when Firingsquad, .... come up with a review showing similar results, Ill just shut up about HOCP for once, but we all know thats not gonna happen

http://www.firingsquad.com/har...t_performance_preview/

Battlefield 2142 - 1600x1200 4xAA 8xAF
GTS640 - 72.80
HD2900 - 56.50

STALKER - 1600x1200 0xAA 8xAF
GTS640 - 41.50
HD2900 - 35.7

Oblivion HDR Mountains Area 1600x1200 0xAA 8xAF
GTS640 - 76.9
HD2900 - 59.4

So even though FiringSquad tried to help the 2900 by not using AA in some of its tests, the 2900 lost badly to the GTS, JUST LIKE IN THE HARDOCP REVIEW.

So you can shut up now.


 

MarcVenice

Moderator Emeritus <br>
Apr 2, 2007
5,664
0
0
@wreckage, did you at all check the date that article was posted? I'll enlighten you: May 14. That's not June 13.
 

ND40oz

Golden Member
Jul 31, 2004
1,264
0
86
Originally posted by: Wreckage
*snip*

And once again, they used XP... time to get with the present and bench in Vista. I love my nVidia cards in XP, but they have terrible driver support in Vista. nVidia released their cards in November with working XP drivers; ATI released theirs in May with working XP and Vista drivers, and I'm willing to bet they leaned more heavily to the Vista side, since that's where things are going. XP is done; in another 6 months you won't be able to buy it. nVidia has had 8 months to get their Vista drivers on the same page as their XP drivers, and they still aren't anywhere close. There's something to be said about Crossfire working out of the gate with Vista, and yet I still don't have SLI support for my Quadros: 3k worth of video card, where you pay a premium for driver support, and nVidia has dropped the ball. It's time to hold them accountable for the "made for Vista" stickers they're putting on their packaging. Stop benching in XP: if you're buying DX10 cards, you're expecting to use them with DX10 as games trickle out, and you're going to be doing that with Vista, not XP.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: MarcVenice
@wreckage, did you at all check the date that article was posted at ? I'll enlighten you, may 14. That's not june 13.

I posted one from June 13th... it's what this thread is about (there, I enlightened you).


That's the trend for defending the HD2900: as long as you test it with a few games, next year, with no AA, and using some magical unseen driver, it will perform better. What dreams may come.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: MarcVenice
@wreckage, did you at all check the date that article was posted at ? I'll enlighten you, may 14. That's not june 13.

nope ... the original article was May

Wednesday, June 13, 2007

Introduction

We are going to make this evaluation short, sweet and to the point since we have already performed a major evaluation of the ATI Radeon HD 2900 XT. Please read that evaluation first to get the lowdown on the new ATI Radeon HD 2900 XT video card and all the specifications therein.
by Kyle 'nvidia rules' Bennett

Yeah, he does know how to Ben[d]it ... better than Beckham
-i believe that is his goal ... since they change their testing methods to something that is *unique* to their site.

i'm out to prove him wrong ... to discredit them if i can

if they are right, i apologize .... and we will all know here what is the truth of the matter ... at least a lot more truth without 'spin'
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: bfdd
Originally posted by: ArchAngel777
Originally posted by: bfdd
Originally posted by: ArchAngel777
I never claimed the site was biased, just the well known fact that Wreckage is biased. This is the *second* review of the 2900XT since the initial release. So we have one review saying it is on par with a 8800GTX and one review saying it cannot even keep up with the 8800GTS. If I am not willing to take the review that shows the 2900XT in a good light, then why would I be willing to accept the review that shows the 2900XT dominated?

This review, in my opinion, was shoddy. That doesn't mean their numbers are false, fake, I just think the review was a P.O.S. regardless of their numbers. So they are stating the card runs like crap - ok, lets wait and see what AT/TH have to say about it when they retest.

I NEVER read a review saying it was on par with an 8800 GTX; in fact, I've read the opposite. I've read it's on par with, if not slightly faster than, an 8800 GTS, but it's nowhere close to an 8800 GTX. In fact, AT's review clearly shows a SINGLE 8800 GTX is faster than it in Crossfire, so wtf?

1) http://www.techspot.com/review...asus-radeon-hd-2900xt/

2) Were not talking about Initial reviews. Reading comprehension FTW.

3) So lets wait and see?

Two reviews show you're wrong, and one has your "hard facts". Don't be a douche; I can just as easily say that one is biased and leans towards ATI. What's to prove me wrong? Because you believe it? BTW, those are awfully low 3DMark06 scores; I got higher at 2.4GHz stock, 2GB Corsair 1066, and the GTX at stock. I'm pretty sure, at least. I could run it again if you like; want me to set it all stock so we can see? I even have slower RAM, but 4GB of it now. Want to run a bench side by side on Vista? Are you running 64-bit?

Obviously you are a 'noob' so there isn't even much sense in addressing this nonsense. You can't address the issue at hand, but instead change the subject and go off on anything else. Whatever, not worth my time.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Have you guys checked out Bit Tech's review?

It tests the cards at higher resolutions with AA, which shows how well each card scales with increased load (this is important: as games become more intensive, we can see how increased load affects each card TODAY). In a lot of cases the average framerates are very similar between the 8800GTS 640MB and 320MB versions, but the difference in minimum framerates is very significant.

I realize this is a May review, but when we are trying to discuss the performance of the two cards in question, it is important to review more than one source and not assign 100% of the weight to HardOCP's review, regardless of whether NV or ATI prevailed in it.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
By Bit-tech.net http://www.bit-tech.net/hardwa...i_radeon_hd_2900_xt/22

First off, the card is obviously late. Very late. And normally when you're late, you have to do something special. Unfortunately for AMD, R600 just isn't that special, because not only is Nvidia's performance crown still intact, the card AMD has chosen to attack, the GeForce 8800 GTS 640MB, has come away with all but a few chinks in its armour.

By Guru3d.com http://www.guru3d.com/article/Videocards/431/26/

It is what it is, and the HD 2900 XT performance wise ended up in the lower to mid part of the high-end segment. Sometimes it has a hard time keeping up with a 320MB 8800 GTS, and in other scenarios we see performance close or equal to the GeForce 8800 GTX. Now that would be weird if we all had to pay 600 USD/EUR for it. AMD knows this, and knows it very well. This is why, and honestly this is fantastic, the product is launched at a 399 MSRP. Now I have been talking with a lot of board partners already and here in Europe the product will launch at 350 EUR; and that's just golden.

So we need to leave the uber-power-frag-meister-performance idea behind us and mentally position the product where it is and compare it with the money you have to pay for it. For your mental picture; performance wise I'd say GeForce 8800 GTS 640 MB is a good comparative product (performance wise). Then the opinion will change as you'll receive absolutely a lot of bang for your bucks here. At 350 EUR you'll have a good performing DirectX 10 compatible product, new programmable tessellation unit. It comes with 512 megs of memory. It comes with a state of the art memory controller, offers HDCP straight out of the box, all cards have HDMI connectivity with support for sound and if that alone is not enough, you receive a Valve game-bundle with some very hip titles in there for free. So yeah, you really can't go wrong there.

Beyond3d.com http://www.beyond3d.com/content/reviews/16/16

With a harder-to-compile-for shader core (although one with monstrous floating point peak figures), less per-clock sampler ability for almost all formats and channel widths, and a potential performance bottleneck with the current ROP setup, R600 has heavy competition in HD 2900 XT form. AMD pitch the SKU not at (or higher than) the GeForce 8800 GTX as many would have hoped, but at the $399 (and that's being generous at the time of writing) GeForce 8800 GTS 640MiB. And that wasn't on purpose, we reckon. If you asked ATI a year ago what they were aiming for with R600, the answer was a simple domination over NVIDIA at the high end, as always.

While we take it slow with our analysis -- and it's one where we've yet to heavily visit real world game scenarios, DX10 and GPGPU performance, video acceleration performance and quality, and the cooler side facets like the HDMI solution -- the Beyond3D crystal ball doesn't predict the domination that ATI will have done a year or more ago. Early word from colleagues at HEXUS, The Tech Report and Hardware.fr in that respect is one of mixed early performance that's 8800 GTS-esque or thereabouts overall, but also sometimes less than Radeon X1950 XTX in places. Our own early figures there show promise for AMD's new graphics baby, but not everywhere.

Tech Report http://www.techreport.com/revi...d-2900xt/index.x?pg=16

Ultimately, though, we can't overlook the fact that AMD built a GPU with 700M transistors that has 320 stream processor ALUs and a 512-bit memory interface, yet it just matches or slightly exceeds the real-world performance of the GeForce 8800 GTS. The GTS is an Nvidia G80 with 25% of its shader core disabled and only 60% of the memory bandwidth of the Radeon HD 2900 XT. That's gotta be a little embarrassing. At the same time, the Radeon HD 2900 XT draws quite a bit more power under load than the full-on GeForce 8800 GTX, and it needs a relatively noisy cooler to keep it in check. If you ask folks at AMD why they didn't aim for the performance crown with a faster version of the R600, they won't say it outright, but they will hint that leakage with this GPU on TSMC's 80HS fab process was a problem. All of the telltale signs are certainly there.

Hexus.net http://www.hexus.net/content/item.php?item=8687&page=11

AMD's had to revise RRP pricing for precisely these reasons and the consumer will benefit as a result. Thinking now about the card's pricing bracket (£250) and immediate competitor - the GeForce 8800 GTS 640 - AMD does looks better. Its card is late but not late enough to miss the DX10 gaming party which is just about to begin. The HD 2900 XT is also speedy enough to give the GTS 640 better than it receives and should sell well.

I see that most website reviews state that the Radeon HD 2900XT is as fast as or a bit faster than a GTS 640MB. So that indicates that you are an nVidia fan. Thanks, next case.



 

superbooga

Senior member
Jun 16, 2001
333
0
0
Hardocp put the x1950xtx over the 7950gx2, when most other sites said otherwise.

Now personally I think it's odd that a top of the line 2900 card can't play games with max quality settings (shadows, grass, lighting, etc). However there's no question that the GTS320/640 will almost always be able to achieve a higher level of AA thanks to the 2900's craptastic AA performance.

Then again, I think AA is overrated.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: keysplayr2003
*snip*

The problem with that review style is that the author gets to decide what's playable, and which settings to adjust to make the game "playable". Different cards will show different performance deltas from adjusting those settings, so the author is free to choose settings that favor one card over the other. Why did he only use 2xAA in Oblivion, but choose to max out shadow settings? I played Oblivion, and the IQ difference between shadow settings is much more subtle than the difference between 2x and 4x AA.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
No real news here. In the few 2900xt reviews that I've seen, it tends to perform about on part with the GTS 640. Given the often minuscule differences between the two GTS models, why is the idea that the 2900xt occasionally performs worse than the 320 such a shocking idea? If one thinks about the matter at all, it would be hard to come to any other conclusion than that the GTS 320 provides better value than both the GTS 640 and the 2900xt.

Indeed, the GTS 320 has been the choice for value since its release. It's boast was GTS 640 performance for less cost. The 8500/8600 series are flat out underwhelming, the extra memory of the 640 doesn't seem all that useful at mainstream resolutions, and the GTX is nearly double the cost. Enter the 2900xt, slated to compete against the GTS 640. It's going to change the value in the GTS 320 how?

Who knows where the 2900xt will end up placing after it's all said and done. I'll be curious to hear apoppin's experiences, but it will be just that: curiosity. We all have been the creators of our own problems: we get to be beta testers because we reward companies for rushing products out the door. As with any product, buyer beware for the 2900xt (or to a lesser extent the 8800 series depending on your setup).

Keys - In general, it's human nature to question results we don't like and accept those we do. I'm not sure anyone on this board is exempt from human nature (any aliens out there?). While the initial impetus for asking questions may be based on human nature, it doesn't mean that it is the sole motive, nor that the questions have no validity.
 

solofly

Banned
May 25, 2003
1,421
0
0
Well I'm very happy NV is kicking ass and all, but it's time to move on and get Vista involved. I already upgraded half of my rigs to Vista, so XP benches seem outdated, or have little meaning if any.
 

will889

Golden Member
Sep 15, 2003
1,463
5
81
I can tell you from seeing a 320 and 640 side by side (with my own testing) that the difference in minimum framerate between the 320MB and 640MB versions grows sharply as the resolution increases. There's not much of a difference until you hit 1600*1200 and beyond that with widescreen resolutions, but gaming on a normal CRT at, say, 12*10 there's not much difference, most especially at 10*76 or 11*86.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: evolucion8
By Bit-tech.net http://www.bit-tech.net/hardwa...i_radeon_hd_2900_xt/22

First off, the card is obviously late. Very late. And normally when you're late, you have to do something special. Unfortunately for AMD, R600 just isn't that special because not only is Nvidia's performance crown still intact, the card AMD has chosen to attack – the GeForce 8800 GTS 640MB – has come away with all but a few chinks in its armour.

By Guru3d.com http://www.guru3d.com/article/Videocards/431/26/

It is what it is, and the HD 2900 XT performance wise ended up in the lower to mid part of the high-end segment. Sometimes it has a hard time keeping up with a 320MB 8800 GTS, and in other scenarios we see performance close or equal to the GeForce 8800 GTX. Now that would be weird if we all had to pay 600 USD/EUR for it. AMD knows this, and knows it very well. This is why, and honestly this is fantastic, the product is launched at a 399 MSRP. Now I have been talking with a lot of board partners already and here in Europe the product will launch at 350 EUR; and that's just golden.

So we need to leave the uber-power-frag-meister-performance idea behind us and mentally position the product where it is and compare it with the money you have to pay for it. For your mental picture; performance wise I'd say GeForce 8800 GTS 640 MB is a good comparative product (performance wise). Then the opinion will change as you'll receive absolutely a lot of bang for your bucks here. At 350 EUR you'll have a good performing DirectX 10 compatible product, new programmable tessellation unit. It comes with 512 megs of memory. It comes with a state of the art memory controller, offers HDCP straight out of the box, all cards have HDMI connectivity with support for sound and if that alone is not enough, you receive a Valve game-bundle with some very hip titles in there for free. So yeah, you really can't go wrong there.

Beyond3d.com http://www.beyond3d.com/content/reviews/16/16

With a harder-to-compile-for shader core (although one with monstrous floating point peak figures), less per-clock sampler ability for almost all formats and channel widths, and a potential performance bottleneck with the current ROP setup, R600 has heavy competition in HD 2900 XT form. AMD pitch the SKU not at (or higher than) the GeForce 8800 GTX as many would have hoped, but at the $399 (and that's being generous at the time of writing) GeForce 8800 GTS 640MiB. And that wasn't on purpose, we reckon. If you asked ATI a year ago what they were aiming for with R600, the answer was a simple domination over NVIDIA at the high end, as always.

While we take it slow with our analysis -- and it's one where we've yet to heavily visit real world game scenarios, DX10 and GPGPU performance, video acceleration performance and quality, and the cooler side facets like the HDMI solution -- the Beyond3D crystal ball doesn't predict the domination that ATI will have done a year or more ago. Early word from colleagues at HEXUS, The Tech Report and Hardware.fr in that respect is one of mixed early performance that's 8800 GTS-esque or thereabouts overall, but also sometimes less than Radeon X1950 XTX in places. Our own early figures there show promise for AMD's new graphics baby, but not everywhere.

Tech Report http://www.techreport.com/revi...d-2900xt/index.x?pg=16

Ultimately, though, we can't overlook the fact that AMD built a GPU with 700M transistors that has 320 stream processor ALUs and a 512-bit memory interface, yet it just matches or slightly exceeds the real-world performance of the GeForce 8800 GTS. The GTS is an Nvidia G80 with 25% of its shader core disabled and only 60% of the memory bandwidth of the Radeon HD 2900 XT. That's gotta be a little embarrassing. At the same time, the Radeon HD 2900 XT draws quite a bit more power under load than the full-on GeForce 8800 GTX, and it needs a relatively noisy cooler to keep it in check. If you ask folks at AMD why they didn't aim for the performance crown with a faster version of the R600, they won't say it outright, but they will hint that leakage with this GPU on TSMC's 80HS fab process was a problem. All of the telltale signs are certainly there.

Just highlighted a few things for ya.

Seems to be at or below a 640 all things considered.

I know there are certain people who support this card blindly, but I don't think it exceeds the 640 overall, and once you consider price, heat, noise, power draw, and IQ, it falls far behind.

Comparing the HardOCP review to other sites, it is accurate. Other sites confirm that the 2900 does not do well in those games. I really don't see what their complaint is, other than that their own opinion is not based on fact.
 