Competition is good!


golem

Senior member
Oct 6, 2000
838
3
76
It's too soon to judge the performance of this card because the 8.38 driver (or whatever it's named) replaces AAA with EATM. This is one reason why the XT suddenly climbs to GTX performance, since Oblivion has lots and lots of alpha textures outdoors. (Note that AAA on the XT is like TRSS.)

What's EATM? So this was the reason for the jump in Oblivion performance that was posted a while back? What are the advantages and drawbacks of EATM?
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Ackmed
Originally posted by: keysplayr2003


Its not "may" eat more power, it just flat out does.

Except not all reviews show the same thing. Case in point: http://www.bit-tech.net/hardware/2007/05/16/r600_ati_radeon_hd_2900_xt/21

It shows the 2900XT using about 20W less power on a 975X mobo, and about 30W more on a 680i mobo, both under load. At idle (where the card spends the vast majority of its time) the 2900XT consumes less power than the 8800GTX and both flavors of the GTS. By a lot.

Personally I don't care about idle (or load for that matter, I've got a man's PSU); some people are crying about how much more money a month a video card will cost to run than another. It also shows that on a 975X mobo, the 2900XT uses less power than the GTX, and barely more than either GTS.

LOL. If you didn't care, you wouldn't post. LOL again? A man's PSU? Let me guess, you got your PSU from TestosteroneTech?

Any other sites showing less power consumption for a 2900XT?

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
New Benchmark - Old Version

By now, we've all been beaten to death with DirectX 9 benchmarking on DirectX 10 capable cards because no DirectX 10 game titles were available to benchmark. I've been saying for months that the NVIDIA and ATI battle doesn't mean anything until we see some performance numbers under DirectX 10. Before the launch of the ATI Radeon HD 2000 Series, AMD and ATI seeded the press with benchmark copies of the upcoming DirectX 10 game title Call of Juarez. Finally, we had a benchmark that we could use to run DirectX 10 material, and it was from a real game that consumers were able to purchase.

Once we fired up the benchmark and started running it on the NVIDIA GeForce 8800 GTX, we noticed the application wouldn't run when any level of anti-aliasing was enabled. I contacted both ATI and NVIDIA, letting them know that a benchmark had been given to the media that didn't run correctly on one of the brands. It's here that things started to get a bit interesting!

I contacted several people at AMD, NVIDIA, Techland and Ubisoft and quickly found out that AMD wanted the benchmark used, NVIDIA said they had a newer build that worked, Techland was scared to upset AMD, and Ubisoft was smart enough to talk off the record and admit we had early code that we more than likely shouldn't have gotten. No one ever did help us get the latest benchmark utility, even though no one denied one being out there. Before jumping into the benchmark, let's take a look at what some of the companies said.

AMD/ATI's Statement:

"The game?s developer, Techland has found that this was caused by an application issue when MSAA is enabled. The benchmark can be run on both ATI Radeon and NVIDIA hardware with MSAA disabled. This application issue will be fixed in an upcoming patch available in the next couple of weeks. The full DirectX 10 version of Call of Juarez will be available to the public sometime in June with full functionality for all DirectX 10 capable graphics cards."

NVIDIA's Statement:

"NVIDIA has a long standing relationship with Techland and their publisher Ubisoft. In fact, the original European version of Call Of Juarez that was launched in September 2006 is part of the "The Way Its Meant To Be Played" program. As a result of the support Techland and Ubisoft receives for being part of the "The Way Its Meant To Be Played" program, NVIDIA discovered that the early build of the game that was distributed to the press has an application bug that violates DirectX 10 specifications by mishandling MSAA buffers, which causes DirectX 10 compliant hardware to crash. Our DevTech team has worked with Techland to fix that and other bugs in the Call of Juarez code. Benchmark testing for reviews should be performed with the latest build of the game that incorporates the DirectX 10 fixes. Previous builds of the game should not be used for benchmarking with or without AA. For details on how to get the patch and for further information please contact Techland or Ubisoft."

Techland's Statement from CEO Pawel Marchewka:

"We?re not really sure which version was supplied to you. For the newest version you need to contact your contact at AMD, they have exclusive rights to distribute this benchmark. I?m not sure about why you were given an older version. I guess it probably was the newest version at that time. Our priority is making Call of Juarez the first and the best optimized DX10 game. The benchmark is updated at the same time more or less frequently. But as far as I know AMD is going to send the newest version to the Press very soon."

Basically it seemed that we were being run in a circle, as each company told us to contact the other and no one would give up the latest benchmark build. At first I wasn't going to use the benchmark utility: when a company hands you a disc and says "try this out," it raises every red flag in my book, not to mention the fact that the benchmark was based on an old build with bugs that had been fixed in newer versions. After running the benchmark and taking screenshots to compare image quality, I couldn't resist running this benchmark, as the findings were actually not in AMD's favor. Yes, the benchmark that AMD and ATI handed out to the press for the launch of the ATI Radeon HD 2900 XT actually played out to favor NVIDIA, even though the latest build was not handed out to the media.

Hmm..

EATM is something along the lines of an advanced alpha-to-coverage mode that produces effects similar to TRAA and AAA.

Here's a clearer definition from razor1 at B3D:

Alpha blend is a basic algorithm that just uses the depth buffer with the alpha test to reduce or eliminate, in this case, overlapping objects. EATM is a shader (pretty sure it's a shader) that acts as a filter on top of this, blurring the alpha channel and color channel, which to some degree decreases detail but gives a better output with fewer jaggies. The textures this is used on are already very low-res compared to what is used for other objects, so the detail loss is almost unnoticeable.
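For anyone trying to picture what "turning alpha into coverage" means in practice, here is a minimal, purely illustrative sketch in Python (not ATI driver or EATM code; the sample count and threshold are assumptions). It contrasts a hard alpha test, which keeps or kills the whole fragment, with an alpha-to-coverage style approach, which spreads the alpha value across the MSAA sample mask so the resolve step produces blended, less jagged foliage edges:

# Illustrative sketch only -- not actual driver or EATM code.
# Contrasts a hard alpha test (whole fragment kept or discarded)
# with alpha-to-coverage (alpha mapped onto an MSAA sample mask,
# so the multisample resolve yields intermediate edge values).

NUM_SAMPLES = 4  # assume 4x MSAA


def alpha_test(alpha, threshold=0.5):
    """Classic alpha test: a binary keep/discard decision per fragment."""
    return alpha >= threshold


def alpha_to_coverage(alpha, num_samples=NUM_SAMPLES):
    """Map alpha onto a per-sample coverage mask: the number of MSAA
    samples written is proportional to alpha, so the resolved pixel
    ends up partially covered instead of fully on or off."""
    covered = round(alpha * num_samples)
    return [i < covered for i in range(num_samples)]


if __name__ == "__main__":
    for a in (0.1, 0.4, 0.6, 0.9):
        kept = "keep" if alpha_test(a) else "discard"
        print(f"alpha={a:.1f}  alpha test: {kept:7}  coverage mask: {alpha_to_coverage(a)}")

Real hardware typically dithers the coverage pattern rather than filling samples in order, but the idea is the same: partially transparent texels end up partially covered, which is why alpha-heavy scenes like Oblivion foliage look smoother (and can run faster than supersampled AAA) at the cost of some blurring.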
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: keysplayr2003
Originally posted by: Ackmed
Originally posted by: keysplayr2003


Its not "may" eat more power, it just flat out does.

Except not all reviews show the same thing. Case in point: http://www.bit-tech.net/hardware/2007/05/16/r600_ati_radeon_hd_2900_xt/21

It shows the 2900XT using about 20W less power on a 975X mobo, and about 30W more on a 680i mobo, both under load. At idle (where the card spends the vast majority of its time) the 2900XT consumes less power than the 8800GTX and both flavors of the GTS. By a lot.

Personally I don't care about idle (or load for that matter, I've got a man's PSU); some people are crying about how much more money a month a video card will cost to run than another. It also shows that on a 975X mobo, the 2900XT uses less power than the GTX, and barely more than either GTS.

LOL. If you didn't care, you wouldn't post. LOL again? A man's PSU? Let me guess, you got your PSU from TestosteroneTech?

Any other sites showing less power consumption for a 2900XT?

I said I don't care how much it consumes, or worry about how many pennies more it will cost me a month compared to another card. I don't see why that is so "LOL"... And no, I didn't get it from your made-up site. I was forward-thinking years ago and got a PC P&C PSU, one still capable of handling any two cards out there with ease today. Not some cheap crap with an LED fan and rainbow cables.

I don't know, I didn't go looking for any. And it doesn't show it using less under load, just at idle. And on a 975X chipset. I don't recall seeing any others using a 975X chipset when testing.
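For what it's worth, the "pennies a month" framing is easy to sanity-check. Here's a back-of-the-envelope sketch; the wattage delta, gaming hours, and electricity rate below are illustrative assumptions, not figures from any review in this thread:

# Back-of-the-envelope power-cost sketch; all inputs are assumptions
# chosen for illustration, not measured review figures.

WATT_DELTA = 30        # assumed extra draw under load, in watts
HOURS_PER_DAY = 4      # assumed gaming hours per day
DAYS_PER_MONTH = 30
PRICE_PER_KWH = 0.11   # assumed electricity price, $/kWh

extra_kwh = WATT_DELTA / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
extra_cost = extra_kwh * PRICE_PER_KWH

print(f"~{extra_kwh:.1f} kWh/month extra, roughly ${extra_cost:.2f}/month")
# -> ~3.6 kWh/month extra, roughly $0.40/month

So a 30W load-time difference at a few hours of gaming a day really does come out to change-jar money, which is arguably why idle draw, where the card spends most of its time, matters more for the monthly bill.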
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Wreckage
Originally posted by: apoppin
Originally posted by: Wreckage
http://www.hardforum.com/showthread.php?t=1195161
$295?

It seems the price keeps dropping.

what's a good SLI MB that will 'fit' my system and allow for QC expansion ?


:Q

http://www.newegg.com/Product/ProductLi...=ENE&DEPA=0&Description=650i&x=30&y=30

650i cheap, stable, SLI and quad support.

cheap?!?


i paid $25 for my current MB

how is that Abit Fatality SLI MB ?
--can i still use PC 6400?

EDIT: i see the ASUS recommended ... same question ... need to upgrade my DDR2 ... again?
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: apoppin
Originally posted by: Wreckage
Originally posted by: apoppin
Originally posted by: Wreckage
http://www.hardforum.com/showthread.php?t=1195161
$295?

It seems the price keeps dropping.

what's a good SLI MB that will 'fit' my system and allow for QC expansion ?


:Q

http://www.newegg.com/Product/ProductLi...=ENE&DEPA=0&Description=650i&x=30&y=30

650i cheap, stable, SLI and quad support.

cheap?!?


i paid $25 for my current MB

how is that Abit Fatality SLI MB ?
--can i still use PC 6400?

$109 is dirt cheap for a performance SLI mobo.

Yes you can still use PC6400

IMO, that 650i mobo is the best deal for you.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
OK ... ASUS ... not Abit ...

still pretty expensive ... anything *new* coming down the pipe?

some reason to hold off ... ?

just looking for the most options ... i am interested in Multi-GPU

withOUT "upgrading" my PS ...
... again
:roll:
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
OK ... ASUS ... not Abit ...

still pretty expensive ... anything *new* coming down the pipe?

some reason to hold off ... ?

just looking for the most options ... i am interested in Multi-GPU

withOUT "upgrading" my PS ...
... again
:roll:
Just get a single 8800GTS 320MB. You won't need more than that unless you game on a 30" LCD at some mad resolution, trust me.

Why do you want SLI all of a sudden?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
Originally posted by: apoppin
OK ... ASUS ... not Abit ...

still pretty expensive ... anything *new* coming down the pipe?

some reason to hold off ... ?

just looking for the most options ... i am interested in Multi-GPU

withOUT "upgrading" my PS ...
... again
:roll:
Just get a single 8800GTS 320MB. You won't need more than that unless you game on a 30" LCD at some mad resolution, trust me.

Why do you want SLI all of a sudden?
640MB
[and i could stick a GTS in this MB right now and get a hefty performance increase]

... and "why SLI?" ... because it doesn't look like HD2900xt is practical for anyone
[and i don't want to xfire x1900]

there is a contagion right now in the Video forum and i am also a "carrier" and infected
--i just want to experiment with either SLI or Xfire
:Q

masochism?

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
Originally posted by: SickBeast
Originally posted by: apoppin
OK ... ASUS ... not Abit ...

still pretty expensive ... anything *new* coming down the pipe?

some reason to hold off ... ?

just looking for the most options ... i am interested in Multi-GPU

withOUT "upgrading" my PS ...
... again
:roll:
Just get a single 8800GTS 320MB. You won't need more than that unless you game on a 30" LCD at some mad resolution, trust me.

Why do you want SLI all of a sudden?
640MB
[and i could stick a GTS in this MB right now and get a hefty performance increase]

... and "why SLI?" ... because it doesn't look like HD2900xt is practical for anyone
[and i don't want to xfire x1900]

there is a contagion right now in the Video forum and i am also a "carrier" and infected
--i just want to experiment with either SLI or Xfire
:Q

masochism?

They say that in DX10 the cards can directly access main system memory, reducing the need for more graphics memory. IMO 640MB is a waste. Time will tell when Crysis comes out.

If you're gonna xfire anything, make it the 8800GTX.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast

They say that in DX10 the cards can directly access main system memory, reducing the need for more graphics memory. IMO 640MB is a waste. Time will tell when Crysis comes out.

please explain THIS:

http://www.xbitlabs.com/articles/video/display/msi8800gts-640_22.html#sect0
Unfortunately, the technically promising graphics card, which differs from the more expensive version in the amount of memory only, is sometimes much slower than the GeForce 8800 GTS 640MB not only in games that demand a lot of graphics memory (e.g. Serious Sam 2), but also in applications that didn't reveal any difference between graphics cards with 512MB and 256MB of memory before. Particularly, these are TES IV: Oblivion, Neverwinter Nights 2, and F.E.A.R. Extraction Point. 320MB is considerably more than 256MB, so this is a memory management problem, probably a driver issue.
a memory management problem in the 320MB vs. 640MB versions?!?!?!

--the 320MB version is what ? ... artificially slower than the 640MB version?
--a driver issue ?

WtF?

did this get fixed ... or is nvidia "holding back" [or did hold back] performance for pricing?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: apoppin
Originally posted by: SickBeast

They say that in DX10 the cards can directly access main system memory, reducing the need for more graphics memory. IMO 640MB is a waste. Time will tell when Crysis comes out.

please explain THIS:

http://www.xbitlabs.com/articles/video/display/msi8800gts-640_22.html#sect0
Unfortunately, the technically promising graphics card, which differs from the more expensive version in the amount of memory only, is sometimes much slower than the GeForce 8800 GTS 640MB not only in games that demand a lot of graphics memory (e.g. Serious Sam 2), but also in applications that didn't reveal any difference between graphics cards with 512MB and 256MB of memory before. Particularly, these are TES IV: Oblivion, Neverwinter Nights 2, and F.E.A.R. Extraction Point. 320MB is considerably more than 256MB, so this is a memory management problem, probably a driver issue.
a memory management problem in the 320MB vs. 640MB versions?!?!?!

--the 320MB version is what ? ... artificially slower than the 640MB version?
--a driver issue ?

WtF?

did this get fixed ... or is nvidia "holding back" [or did hold back] performance for pricing?

*anyone* please?

did this memory management problem in the 320MB GTS get fixed?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
kinda ridiculously expensive ... don't you think?

$1100?

Yeah well if you SLI anything else it makes no sense.

A single 8800GTX will beat just about anything in SLI for less money, with less power, and far fewer driver issues.

Actually the X1950's are supposed to be pretty good in xfire. Might be fun to play with. Why don't you just do that?

IMO you're crazy buying a new motherboard just to do SLI. Wait for the DDR3 prices to fall then get one of the new 775 boards. :light:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
Originally posted by: apoppin
kinda ridiculously expensive ... don't you think?

$1100?

Yeah well if you SLI anything else it makes no sense.

A single 8800GTX will beat just about anything in SLI for less money, with less power, and far fewer driver issues.

Actually the X1950's are supposed to be pretty good in xfire. Might be fun to play with. Why don't you just do that?

IMO you're crazy buying a new motherboard just to do SLI. Wait for the DDR3 prices to fall then get one of the new 775 boards. :light:

i *originally* was thinking about HD2900xt crossfire ... that will beat any single card
... but then there are "issues" ... big issues with noise, power and heat

then i just got *stuck* on wanting to try SLI/xfire

you are 100% right ... "wait" is the sane and sound thing to do ... absolutely NOTHING requires me to upgrade anything for 14x9 gaming right now ... but in case you haven't noticed ... an "upgrade fever" has gripped this forum ....

even though i already upgraded .. i still can't stop thinking about doing it again


perhaps it is because i am just needing a new game ... and somehow RE-4 isn't quite satisfying [i don't really like controllers] ... so i ordered Gothic2 and its expansion from GoGamer for $11.10 shipped ... maybe that will work while my fever subsides

thanks for the sanity lecture, bro
[but i just can't seem to put that CC away]
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: SickBeast
A single 8800GTX will beat just about anything in SLI for less money

i had the opportunity to compare SLI vs a GTX (only had 320MB GTS available tho) and found that at the res my monitor runs (1680x1050) the SLI'd 320MB cards were at least as fast as or faster than the single GTX, and the price would be about the same.

the system really wasn't any noisier with 2x GTS over 1. power consumption certainly would vary, but as i didn't have anything to measure the draw, i couldn't say by how much. at any rate, my corsair hx620 didn't have a problem with any of it.

perhaps at 1920x1200 and above the picture may change, but my monitor isn't capable of that.

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: CaiNaM
Originally posted by: SickBeast
A single 8800GTX will beat just about anything in SLI for less money

i had the opportunity to compare SLI vs a GTX (only had 320MB GTS available tho) and found that at the res my monitor runs (1680x1050) the SLI'd 320MB cards were at least as fast as or faster than the single GTX, and the price would be about the same.

the system really wasn't any noisier with 2x GTS over 1. power consumption certainly would vary, but as i didn't have anything to measure the draw, i couldn't say by how much. at any rate, my corsair hx620 didn't have a problem with any of it.

perhaps at 1920x1200 and above the picture may change, but my monitor isn't capable of that.
Why on earth would you need that much shader power at such a low resolution? My 320MB 8800GTS is overkill for me at 1920x1200 in most scenarios!
 