Competition is good!

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: dadach
ah...so many nvidiot trolls can't take one review part that shows ati better than nvidia...do you guys even have your video cards to play games, or just to show off and talk smack?

I really hope you are not referring to what I posted above you. Because that is just dumb.
I have an 8800GTS and yes I do in fact play games. As for any smack talk, I only pointed out something I had noticed about scaling in R6V. Nothing more, nothing less.
If you have some sort of problem with what I posted, then I can safely say who the legitimate fanboy might be.

Now, instead of attacking nvidia fans, do you have anything to offer in this thread?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: dadach
ah...so many nvidiot trolls can't take one review part that shows ati better than nvidia...do you guys even have your video cards to play games, or just to show off and talk smack?

I really hope you are not referring to what I posted above you. Because that is just dumb.
I have an 8800GTS and yes I do in fact play games. As for any smack talk, I only pointed out something I had noticed about scaling in R6V. Nothing more, nothing less.
If you have some sort of problem with what I posted, then I can safely say who the legitimate fanboy might be.

Now, instead of attacking nvidia fans, do you have anything to offer in this thread?

you are an nvidia fan?
:Q

i guess ... now ...
... after HD2900xt
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
$340 is still too much. MIRs (mail-in rebates) should not be included because not only are they not guaranteed to come back to you, but you're still paying the full price up front, AND they often require you to send in the original UPC, which with some manufacturers can jeopardize your warranty claims. Lots of shady stuff surrounds MIRs.

Give me a real price that I pay at checkout of $320 or less and I'll be a lot more interested in the 640MB GTS.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: vadp
Originally posted by: vadp
Originally posted by: Wreckage
If you look at the Lost Planet DX10 demo you will see the $300 GTS stomping the shizzle out of the $400 2900.
"As you can imagine, AMD had something to say about the DX10 demo because it was co-developed by NVIDA under its The Way Its Meant To Be Played program.
This is what AMD had to say:
Before you begin testing, there are a few points I want to convey about ?Lost Planet?.
?Lost Planet? is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for.
The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion.
Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game."

http://www.hardocp.com/news.html?news=MjU4MjYsLCxoZW50aHVzaWFzdCwsLDE=

so basically what you're saying is that Lost Planet DX10 is irrelevant, as it was "co-developed" by nvidia, yet you use R6 Vegas, a game based on early ue3 tech and developed specifically for an ati gfx card (xbox) and ported to pc, to claim the superiority of an ati gfx card?

LMAO. talk about hypocrisy...
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: CaiNaM
Originally posted by: vadp
Originally posted by: vadp
Originally posted by: Wreckage
If you look at the Lost Planet DX10 demo you will see the $300 GTS stomping the shizzle out of the $400 2900.
"As you can imagine, AMD had something to say about the DX10 demo because it was co-developed by NVIDA under its The Way Its Meant To Be Played program.
This is what AMD had to say:
Before you begin testing, there are a few points I want to convey about ?Lost Planet?.
?Lost Planet? is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for.
The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion.
Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game."

http://www.hardocp.com/news.html?news=MjU4MjYsLCxoZW50aHVzaWFzdCwsLDE=

so basically what you're saying is that Lost Planet DX10 is irrelevant, as it was "co-developed" by nvidia, yet you use R6 Vegas, a game based on early ue3 tech and developed specifically for an ati gfx card (xbox) and ported to pc, to claim the superiority of an ati gfx card?

LMAO. talk about hypocrisy...

Agreed.

I wish they would call these games what they actually are. None of this "DX10" demo crap when all they are is DX9 foundations with a splash of DX10 functionality here or there. Can any game developer actually step up to the plate and truthfully claim a FULL DX10 title (even a little piddling demo)?

 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: yacoub
$340 is still too much. MIRs (mail-in rebates) should not be included because not only are they not guaranteed to come back to you, but you're still paying the full price up front, AND they often require you to send in the original UPC, which with some manufacturers can jeopardize your warranty claims. Lots of shady stuff surrounds MIRs.

Give me a real price that I pay at checkout of $320 or less and I'll be a lot more interested in the 640MB GTS.

I don't agree with your reasoning. The MIR should only be discounted if you have a strong belief that it won't be honored. It's not guaranteed, and some manufacturers are worse than others, but with some work most MIRs are fulfilled.

Also, which manufacturers require a UPC for warranty claims? I thought at most only the product itself and a receipt were required. I've never heard of the UPC being required... maybe for returns, but that's not a warranty claim.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: keysplayr2003
"Stop the press! The HD 2900XT is top dog at all resolutions in Rainbow Six: Vegas which was a very pleasant surprise." -quote from legit reviews.

All resolutions huh? 1024x768 - 1600x1200 does not all resolutions make.

All resolutions that they tested. Which, sadly, is all most reviews test, although more are coming around.

Originally posted by: Wreckage
Originally posted by: Ackmed

Are there DX10 numbers out? All I've seen is FS, and they only did DX9.

http://www.legitreviews.com/article/505/1/

That doesn't line up with FS. Brandon says, "The game doesn't load under DX10 with R600 unfortunately." So I don't know what Legit did to get it to work with DX10.

All in all, I don't think it's bad, especially since they didn't have the game to optimize their drivers for like NV did.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Cookie Monster
Originally posted by: Matt2
Originally posted by: lyssword
Originally posted by: Matt2
Originally posted by: vadp
Originally posted by: apoppin
but then who is the GTS 640 competing with that they think they need to drop prices?
Confused?
Check out the Unreal 3 engine benchmarks.
The 2900XT kills the 8800GTX

http://www.legitreviews.com/article/503/8/

And yes, competition is always good.
It will only get better after the launch of the rest of the ATI cards.

Bad ports.

These are in no way indicative of how a finely tuned PC version of the UE3 engine will perform.

When the 8800GTX scores 1 fps higher than the 8800GTS 320mb, I would say the engine is borked.

you mean a 320gts that is overclocked to GTX speeds

Yeah, the same one that has 42% of the RAM, 83% of the memory bandwidth, and only 75% of the shader power of a stock 8800GTX.

Yeah, that one.

You sure? The only thing different between the 8800GTS 320 and the 8800GTS 640 is the framebuffer size.

Yes I am sure. I was referring to lyssword's comment about the 8800GTS 320mb XXX vs the stock 8800GTX.
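
Those ratios do check out as quick arithmetic. Here is a minimal sketch (Python); every spec in it is my own assumption for an XXX-clocked GTS 320 vs. a stock GTX, with the XXX's memory assumed to run at the GTX's 1800MHz effective, not figures taken from the thread:

# Rough ratio check: overclocked 8800 GTS 320 vs. stock 8800 GTX.
# All specs below are assumed, not quoted from the thread.
gts = {"ram_mb": 320, "bus_bits": 320, "mem_mhz_eff": 1800, "shaders": 96}
gtx = {"ram_mb": 768, "bus_bits": 384, "mem_mhz_eff": 1800, "shaders": 128}

def bandwidth_gbs(card):
    # bus width (bytes per transfer) x effective memory clock (MHz) -> GB/s
    return card["bus_bits"] / 8 * card["mem_mhz_eff"] / 1000

print(f"RAM:       {gts['ram_mb'] / gtx['ram_mb']:.0%}")            # ~42%
print(f"Bandwidth: {bandwidth_gbs(gts) / bandwidth_gbs(gtx):.0%}")  # ~83%
print(f"Shaders:   {gts['shaders'] / gtx['shaders']:.0%}")          # 75%

Those come out to 42%, 83%, and 75%, matching the percentages above.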
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: golem
Originally posted by: yacoub
$340 is still too much. MIRs (mail-in rebates) should not be included because not only are they not guaranteed to come back to you, but you're still paying the full price up front, AND they often require you to send in the original UPC, which with some manufacturers can jeopardize your warranty claims. Lots of shady stuff surrounds MIRs.

Give me a real price that I pay at checkout of $320 or less and I'll be a lot more interested in the 640MB GTS.

I don't agree with your reasoning.

You're welcome to disagree; I am primarily interested in the price I am being charged at checkout, because that is the amount of money I am paying. The MIR may come in 6-8 weeks. If it does, great.

Either way you're paying $340 for the card, not $305. When it hits closer to $305 at checkout I'll start giving it more thought.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: yacoub
You're welcome to disagree; I am primarily interested in the price I am being charged at checkout, because that is the amount of money I am paying. The MIR may come in 6-8 weeks. If it does, great.

Either way you're paying $340 for the card, not $305. When it hits closer to $305 at checkout I'll start giving it more thought.

Guess, like most things, it depends on the person then. To some it's $340, to others it's $305; nothing wrong with that.
 

imported_thefonz

Senior member
Dec 7, 2005
244
0
0
Originally posted by: Wreckage
Originally posted by: vadp
Originally posted by: apoppin
but then who is the GTS 640 competing with that they think they need to drop prices?
Confused?
Check out the Unreal 3 engine benchmarks.
The 2900XT kills the 8800GTX

http://www.legitreviews.com/article/503/8/

And yes, competition is always good.
It will only get better after the launch of the rest of the ATI cards.

Unreal 3 - DX10

R6:V = DX9 Bad Console port

If you look at the Lost Planet DX10 demo you will see the $300 GTS stomping the shizzle out of the $400 2900.

Uh, Lost Planet is a bad console port too; it runs in both DX10 and DX9. True DX10 titles will be DX10-only, from what I have read on the subject.

EDIT: The 2900 also destroys the GTX in Marvel Ultimate Alliance:
http://www.legitreviews.com/article/503/11/

With driver improvements the 2900 could end up "stomping" the GTX across the board.
 

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Originally posted by: thefonz
With driver improvements the 2900 could end up "stomping" the GTX across the board.

With no driver improvements at all the 2900 does "stomp" the GTX across the board as a space heater. :heart:
 

lyssword

Diamond Member
Dec 15, 2005
5,630
25
91
Originally posted by: ss284
Originally posted by: thefonz
With driver improvements the 2900 could end up "stomping" the GTX across the board.

With no driver improvements at all the 2900 does "stomp" the GTX across the board as a space heater. :heart:

It may eat more power than the GTS (about equal to the GTX), but it creates less heat. See http://www.legitreviews.com/article/506/3/
I think the reason the XT feels hotter than the GTX/GTS is that hot air is dumped outside the case with the XT, whereas the GTX dumps hot air inside the case, which is not always good. Either way, they both run pretty hot.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: lyssword
Originally posted by: ss284
Originally posted by: thefonz
With driver improvements the 2900 could end up "stomping" the GTX across the board.

With no driver improvements at all the 2900 does "stomp" the GTX across the board as a space heater. :heart:

It may eat more power than the GTS (about equal to the GTX), but it creates less heat. See http://www.legitreviews.com/article/506/3/
I think the reason the XT feels hotter than the GTX/GTS is that hot air is dumped outside the case with the XT, whereas the GTX dumps hot air inside the case, which is not always good. Either way, they both run pretty hot.


Its not "may" eat more power, it just flat out does. It never did enter your mind that the exhaust air is warmer because the HSF on the XT is not as efficient as the HSF on the GTX? Look at core temps of both cards. The XT is always higher than the GTX. Maybe that's the reason??? Poorer cooling solution on the XT? Could be. The HSF cannot pull heat away from the R600 core and GDDR3 fast enough, hence higher core temps. At least I'm open to it. You're not gonna hear any of this, are ya... And, if the GTX dumps hot air into the case, what exhaust were they measuring exactly on the GTX???
 

dadach

Senior member
Nov 27, 2005
204
0
76
ok, m8 we get it...nvidia is best in everything, now go away...this is my contribution to this thread
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: dadach
ok, m8 we get it...nvidia is best in everything, now go away...this is my contribution to this thread

So far, most of your "contributions" have been attacks with very little substance. Why don't you go away?
 

lyssword

Diamond Member
Dec 15, 2005
5,630
25
91
Originally posted by: keysplayr2003
Originally posted by: lyssword
Originally posted by: ss284
Originally posted by: thefonz
With driver improvements the 2900 could end up "stomping" the GTX across the board.

With no driver improvements at all the 2900 does "stomp" the GTX across the board as a space heater. :heart:

It may eat more power than the GTS (about equal to the GTX), but it creates less heat. See http://www.legitreviews.com/article/506/3/
I think the reason the XT feels hotter than the GTX/GTS is that hot air is dumped outside the case with the XT, whereas the GTX dumps hot air inside the case, which is not always good. Either way, they both run pretty hot.


Its not "may" eat more power, it just flat out does. It never did enter your mind that the exhaust air is warmer because the HSF on the XT is not as efficient as the HSF on the GTX? Look at core temps of both cards. The XT is always higher than the GTX. Maybe that's the reason??? Poorer cooling solution on the XT? Could be. The HSF cannot pull heat away from the R600 core and GDDR3 fast enough, hence higher core temps. At least I'm open to it. You're not gonna hear any of this, are ya... And, if the GTX dumps hot air into the case, what exhaust were they measuring exactly on the GTX???

Maybe if you had read the link you would see how they did it (hint: they pulled the mobo out and measured at the GTX exhaust). Better or worse cooling, the fact is it creates less heat. I'm not defending how much power it eats, but I myself am surprised at some of the 2900XT results. And personally, I'm waiting for the 2600XT since I don't have $400.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Little surprised to see the OP trolling here with silly fish bait. Competition is nice and

don't worry, be happy!

Seems to me that the HD 2900XT is looking very good, but power consumption is not great. Still haven't checked idle use (too lazy to look), but it likely even beats the 8800GTX for the power pig award. Maybe even beats that phantom ultra OC. For me, the passively cooled HD 2600XT looks like a possible winner for my movie-watching rig. Hope they get it out soon. :beer:
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: dadach
ok, m8 we get it...nvidia is best in everything, now go away...this is my contribution to this thread

You should wait until you have 100 posts before telling someone to go away.

Originally posted by: lyssword
Maybe if you had read the link you would see how they did it (hint: they pulled the mobo out and measured at the GTX exhaust). Better or worse cooling, the fact is it creates less heat. I'm not defending how much power it eats, but I myself am surprised at some of the 2900XT results. And personally, I'm waiting for the 2600XT since I don't have $400.

So what does a lower exhaust temperature mean?

I'm more interested in which GPU stays cooler, and that is definitely not the HD2900XT.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: keysplayr2003


Its not "may" eat more power, it just flat out does.

Except not all reviews show the same thing. Case in point: http://www.bit-tech.net/hardware/2007/05/16/r600_ati_radeon_hd_2900_xt/21

It shows the 2900XT using less power on a 975X mobo, by about 20W, and about 30W more on a 680i mobo, both under load. At idle (where the card is the vast majority of the time) the 2900XT consumes less power than the 8800GTX and both flavors of the GTS. By a lot.

Personally I don't care about idle (or load for that matter, I've got a man's PSU), but some people are crying about how much more money a month one video card will cost to run than another. It also shows that on a 975X mobo, the 2900XT uses less power than the GTX, and barely more than either GTS.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Xbitlabs power consumption table

IMHO, this is more accurate than other power readings since the testbed itself is specially modified to read the amount of power the card pulls, instead of recording total system power, which can vary pretty badly. (Those generic tests give you a rough idea of power consumption, but they aren't specific enough to compare real power draw.)

161W (XT) at load compared to 131.5W (GTX). Note: 225W is the max you can pull from two PCI-e connectors plus the x16 PCI-e slot.
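
For reference, that 225W ceiling is just the PCI-e power budgets added up; the per-source limits below are the standard spec figures (75W per 6-pin connector, 75W from the x16 slot), not numbers from the review:

75W (x16 slot) + 75W + 75W (two 6-pin connectors) = 225W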

You may not care, but there's a LOT of people out there who do care about this. Why do you think major IHVs in the industry are pushing the idea of "performance per watt"?

I'm also not sure why people are jumping to conclusions from one review site (Legit Reviews) showing the XT winning against the GTX when a whole bunch out there show the direct opposite.

It's too soon to judge the performance of this card because the 8.38 driver (or whatever it's named) replaces AAA with EATM. This is one reason the XT suddenly climbs to GTX performance, since Oblivion has lots and lots of alphas outdoors. (Note that AAA for the XT is like TRSS.)

Not to mention that the drivers are buggy for Lost Planet, where they don't render a whole lot of stuff, including some DX10 effects in the demo itself.

With so many reviews conflicting with one another, I'm kind of skeptical about judging the performance of the 2900XT.
 

Atty

Golden Member
Aug 19, 2006
1,540
0
76
Originally posted by: vadp
Originally posted by: apoppin
but then who is the GTS 640 competing with that they think they need to drop prices?
Confused?
Check out the Unreal 3 engine benchmarks.
The 2900XT kills the 8800GTX

http://www.legitreviews.com/article/503/8/

And yes, competition is always good.
It will only get better after the launch of the rest of the ATI cards.
Yeah, a badly ported game built to run on ATI hardware, great way to compare the two. ;-)

 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Wreckage
Originally posted by: vadp
Originally posted by: apoppin
but then who is the GTS 640 competing with that they think they need to drop prices?
Confused?
Check out the Unreal 3 engine benchmarks.
The 2900XT kills the 8800GTX

http://www.legitreviews.com/article/503/8/

And yes, competition is always good.
It will only get better after the launch of the rest of the ATI cards.

Unreal 3 - DX10

R6:V = DX9 Bad Console port

If you look at the Lost Planet DX10 demo you will see the $300 GTS stomping the shizzle out of the $400 2900.

Lost Planet is just as much a bad DX9 console port as R6:V, except it has a virtually meaningless DX10 patch.
I'd be interested in seeing how the HD2900 runs the DX9 version. I know the X1950XT gets higher framerates than what is shown for the DX9 version, and I think the only thing the DX10 version adds is supposed to be performance.
Expect the 8800GTX to win either way (Lost Planet is about as DX9 as you can get; we've yet to see a single DX10 benchmark that actually stresses DX10), but the 2900 should be performing better than it is regardless. I think drivers can be blamed for this one, that or just a bad port.

If a 7900GTX outperforms an X1950XT in Lost Planet, then I'd say Lost Planet is a bad port that was optimized for NVIDIA hardware and not ATI.
 