So I just got my 2900 Pro


gizbug

Platinum Member
May 14, 2001
2,621
0
76
This card did not impress me at all
I went with the 8800gts, couldn't be happier
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: gizbug
This card did not impress me at all
I went with the 8800gts, couldn't be happier

so you just felt compelled to tell us? This thread is not about your 8800gts. Thanks :roll:
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: cmdrdredd
Originally posted by: Azn
Originally posted by: cmdrdredd
Originally posted by: Azn
Originally posted by: bryanW1995
Originally posted by: Azn
Originally posted by: cmdrdredd
Originally posted by: Azn
ATI might have lowered the voltage compared to the 2900 XT. Who knows. Make sure your fan is @ 100% when you test for artifacts.

Not necessary because the fan will never be at 100% during games. If you test at 100% fan rotation and get 0 artifacts there is still a chance to get some during play because the fan rarely goes above 45%.

Why not? Fan is adjustable is it not?
you don't want to run that fan at anything close to 100% for an extended period of time. well, unless you work around jet engines on a daily basis. and you're deaf.

LOL. Just for testing overclocks. Some people don't mind the noise.

Yeah but then if you have to run 100% fan speed to maintain that overclock, what is the damn point?

To find your max overclock limit.

Don't you test your hardware when you first get it and then back off some for headroom? How would you know what would be a good overclock if you don't know the limits? :roll:

Later on you can also buy a better after market cooler for it to get those overclocks. No?

Not necessary. Anyone can run 880/1100 at 100% fan speed and say "oh look, 3DMark got 1289371238971289378912 points" - that is pointless.

What can you get while running the fan at usable levels? Anyone who calls 100% fan speed usable must be deaf. I'm not gonna lie, it's loud.

It's like running dry ice on your C2D for 5 min to get the fastest Super Pi score. What is the point? You won't use it like that every day.

So you might be one of those who doesn't test their hardware to know its limits before setting an overclock. I, for one, test my hardware before settling on a clock that I feel comfortable with.

For example, I have an E6300 @ 3.05GHz at default voltage. I could easily do 3.5+GHz at 1.55 volts with the case off. Right now I have a $30 mid-range cooler, which is silent BTW. If I wanted to run 3.5GHz I could easily get a top-end air cooler to run it at that clock. You can do the same with video cards. Knowing your limits also tells you what's WORTHWHILE to shoot for when you get an aftermarket cooler.
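
In rough pseudo-Python, the test-then-back-off idea looks something like the sketch below. None of these functions talk to real hardware; they are made-up stand-ins for whatever overclocking/stress tool you actually use (ATITool, RivaTuner, etc.), and the numbers are only placeholders.

def set_core_clock(mhz: int) -> None:
    print(f"core clock -> {mhz} MHz")      # placeholder for the real tuning tool

def set_fan_speed(percent: int) -> None:
    print(f"fan -> {percent}%")            # placeholder for the real tuning tool

def stress_test_passes(mhz: int) -> bool:
    return mhz < 745                       # pretend the card artifacts above ~745 MHz

def find_max_stable_clock(start_mhz: int = 600, step_mhz: int = 10) -> int:
    # Step the clock up until the stress test fails; return the last good clock.
    clock = start_mhz
    set_core_clock(clock)
    while stress_test_passes(clock):
        clock += step_mhz
        set_core_clock(clock)
    return clock - step_mhz

set_fan_speed(100)                 # max cooling only while probing the limit
limit = find_max_stable_clock()
set_core_clock(limit - 25)         # then back off a margin for everyday use
set_fan_speed(45)                  # at a fan speed you can actually live with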
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
you probably don't know this, but the fan speed on an ATI card increases automatically as the temp goes up. Therefore, there is no need to ever force the fan to 100%.
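
(For what it's worth, that stock behaviour is just a temperature-to-duty-cycle curve. A hypothetical sketch of such a mapping is below; the breakpoints are invented for illustration and are not ATI's actual values, which live in the card's BIOS.)

# Hypothetical fan curve: GPU temperature (deg C) -> fan duty cycle (%).
CURVE = [(40, 25), (60, 35), (75, 45), (85, 70), (95, 100)]

def fan_duty(temp_c: float) -> float:
    # Linearly interpolate a duty cycle from the made-up curve above.
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(70))   # ~42% - the fan only ramps up when the core actually gets hot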

Are you still harping on that nvidia/ati knowing each other's specs issue? Do you think that ford doesn't know 6 months before the general public what gm is going to release? How naive are you? A person of slightly above-average intelligence who has spent a few days on this or any other decent computer tech site usually has a pretty good idea of upcoming releases from nvidia/intel/DAAMIT/etc well before they reach the general public. Do you honestly think that ati and nvidia bury their heads in the sand and ignore all rumors/rumblings about future products that their competition is working on? What about when one person quits/gets fired and hires on with the competition? Tell me, honestly, you're in college, right?
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
After the initial quirks related to the driver installer packages, I am happy with my 2900 Pro 1GB. This is just a preliminary statement, though, because I want to test out Blu-Ray playback on a 30" monitor. A few observations:

  - I could see where the arguments regarding noise come from. It's definitely loud in 3D, but the noise tone has changed from many of ATI's previous cards (such as the X850 XT). In the past the noise was more like 'chrrrr' or 'krrrrr', but with the 2900 it has a 'brrrrr' or 'frrrr' tone. Under 2D inside a case, it's not really noticeable.

  - I'm kinda confident that these 1GB GDDR4 version 2900 Pros are rebadged XTs. Mine overclocks easily to 750/1000.

  - AA really kills the performance of this card. Interestingly, if I don't use any AA, the card performs very well even @2560x1600 (tested with Oblivion).

  - There is a subtle difference in image quality between ATI and NV. I can't describe it clearly, but the 2900 gives a shinier/lighter feel while the 8800 gives a heavier/more saturated feel. It was especially noticeable under default driver settings.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: lopri
After the initial quirks related to the driver installer packages, I am happy with my 2900 Pro 1GB. This is just a preliminary statement, though, because I want to test out Blu-Ray playback on a 30" monitor. A few observations:

  - I could see where the arguments regarding noise come from. It's definitely loud in 3D, but the noise tone has changed from many of ATI's previous cards (such as the X850 XT). In the past the noise was more like 'chrrrr' or 'krrrrr', but with the 2900 it has a 'brrrrr' or 'frrrr' tone. Under 2D inside a case, it's not really noticeable.

  - I'm kinda confident that these 1GB GDDR4 version 2900 Pros are rebadged XTs. Mine overclocks easily to 750/1000.

  - AA really kills the performance of this card. Interestingly, if I don't use any AA, the card performs very well even @2560x1600 (tested with Oblivion).

  - There is a subtle difference in image quality between ATI and NV. I can't describe it clearly, but the 2900 gives a shinier/lighter feel while the 8800 gives a heavier/more saturated feel. It was especially noticeable under default driver settings.

AA doesn't kill the card's performance like you make it out to be. The card has a lot of potential, and I think with a few driver updates it will be the best price vs. performance GPU. When the X1900 XT 256MB came out it was the best right off the bat, and it only took one driver update to fix the performance issues with games like Oblivion.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: bryanW1995
you probably don't know this, but the fan speed on an ATI card increases automatically as the temp goes up. Therefore, there is no need to ever force the fan to 100%.

Are you still harping on that nvidia/ati knowing each other's specs issue? Do you think that ford doesn't know 6 months before the general public what gm is going to release? How naive are you? A person of slightly above-average intelligence who has spent a few days on this or any other decent computer tech site usually has a pretty good idea of upcoming releases from nvidia/intel/DAAMIT/etc well before they reach the general public. Do you honestly think that ati and nvidia bury their heads in the sand and ignore all rumors/rumblings about future products that their competition is working on? What about when one person quits/gets fired and hires on with the competition? Tell me, honestly, you're in college, right?

Why wouldn't I know that some fans are controlled by temps? You do realize that some overclocks need cooler temps to be stable right?

No I like messing with my own settings. I don't need ATI, Nvidia, Ford, or even you telling me how I can use my machine.

When I'm browsing the internet and doing non-3D stuff, of course I want the fan as quiet as possible. When gaming I wouldn't care if the fan was at 100%, because my speakers would be kicking and the fan noise wouldn't bother me if it gave me a better overclock.

WOW, you're still shooting yourself in the foot about Nvidia knowing ATI's specs before designing a comparable card? Do you have proof of this? And no, I don't want to hear your conspiracy theories. :roll:

What does someone being in college have to do with the subject at hand?

 

n7

Elite Member
Jan 4, 2004
21,303
4
81
Originally posted by: lopri
- I'm kinda confident that these 1GB GDDR4 version 2900 Pro's are rebadged XT's. Mine overclocks easily to 750/1000.

- AA really kills the performance of this card. Interestingly, if I don't use any AA, the card performs very well even @2560x1600 (tested with Oblivion).

Very very nice.

I spent $600 on my HD 2900 XT 1 GB back then, and got to compare it to my 8800 GTX.

You basically just got that for half that price :Q :thumbsup:
(Though granted, i could do something like 850/1150 IIRC...)

And AA...the reason i've been disappointed with AMD this round.
Well, that & the fact they seem to have to release driver updates to be able to perform decently in all the DX10 games being released.
Except that still isn't working for some...
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Getting closer.. currently looping 3DMark06 @800/1000.. (2560x1600)
 

n7

Elite Member
Jan 4, 2004
21,303
4
81
Originally posted by: lopri
Getting closer.. currently looping 3DMark06 @800/1000.. (2560x1600)

Only crappy thing is that i found DX10 (at least the Call of Juarez benchmark) couldn't handle the OC that DX9 could.

Back when i had that card, i didn't really have anything DX10 though.

Now, different story.
 

Oyeve

Lifer
Oct 18, 1999
21,946
839
126
Well, I installed my 2900 Pro this weekend. Went from an ATI X1950 XTX using Cat ver 7.9, and the 2900 Pro went urp. Uninstalled all the ATI crap, reinstalled the 7.9 drivers, and CCC would crap out. Uninstalled all the ATI crap again and used the CD drivers that came with the card; all worked great, so I went and installed the 7.9 drivers again and all is working great. All my games run perfectly with everything turned all the way up. Looks fantastic. BUT, I cannot run any standard benchmarks on this sucker. 3DMark05 and 06 just hang. What are people using as a bench for this card? I installed the latest beta ATITool so I can control the fan, but I wanna see 3DMark scores! I love this card BTW.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
install the *latest* version of 3DMarkXX

http://www.futuremark.com/comp...m/pressreleases/49968/
Selected update details are listed below; a full list of fixes and technical data can be found on the Futuremark website. 3DMark05 1.3.0 is not Windows Vista enabled and should only be used on 32 bit Windows XP systems with the latest updates.

3DMark06 1.1.0:

* Windows Vista enabled;
* Internet Explorer 7 compatibility;
* Improved start-up speed by optimizing the SystemInfo component;
* Updated SystemInfo component with support for the latest CPU and graphics hardware;
* Fixes all reported & reproduced issues.

PCMark05 1.2.0:

* Windows Vista enabled;
* Internet Explorer 7 compatibility;
* Fixed Windows Media Player version detection;
* Improved start-up speed by optimizing the SystemInfo component;
* Updated SystemInfo component with support for the latest CPU and graphics hardware;
* Fixes all reported & reproduced issues.
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Originally posted by: n7
Only crappy thing is that i found DX10 (at least the Call of Juarez benchmark) couldn't handle the OC that DX9 could.
That I didn't know. I do have a copy of Call of Juarez (came with an X2 3800+) so I can check it out. But it's very possible that DX9 code and DX10 code stress different parts of the silicon, I guess.

So far Blu-Ray playback over dual-link DVI has been flawless in windowed or letterbox'ed mode (1920x1080). It's one of the features I wanted which wasn't viable with G80, so I'm pretty happy with that. With 8800 GTX I couldn't even get the HD playback to work on a 30" monitor - only on 24" via single-link DVI. It's probably not a big deal for many, but I'm planning to get rid of my aging 2405FPW so this dual-link HDCP is a very nice feature to have.

Contrary to AT's review, I can attest that the quality of HD playback is absolutely superior with the 2900 than with an 8600/8800. I have played:

  1. The Departed (VC-1, Avg. 20~25Mbps)
  2. Disturbia (H.264, Avg. 24~35Mbps)
  3. The Devil Wears Prada (MPEG-2, Avg. 15~20Mbps)
  4. The Planet Earth (VC-1, Avg. 15~25Mbps)
The difference was incredible. The fudginess, banding, and tearing I would see with an 8600 GT disappeared completely. I don't understand where Derek Wilson's conclusion comes from. (For that matter, I'm unsure about the HQV benchmark as well as the titles he used for comparison.) This is from an image quality point of view - CPU usage is hard to compare on my end because previously I used an X2 3800+ (@3.0GHz) with an 8600 GT, and this 2900 Pro is sitting next to a Pentium 2140 (@2.67GHz). Also it's worth mentioning that manipulating playback (fast-forward/rewind, speed-search, resizing/positioning the window, etc.) was unstable with the 8600 GT from time to time. All this with PowerDVD, which is known to be more optimized for G84 than R600.

With my own experience as well as Scott Wasson's review here, I have to question Derek Wilson's article as a whole. If anything it shows once again where his stance belongs. I now regret my 8600 GT purchases (2 of them) based on his review and wish I had tried a 2600 XT.
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Originally posted by: Oyeve
Well, I installed my 2900 Pro this weekend. Went from an ATI X1950 XTX using Cat ver 7.9, and the 2900 Pro went urp. Uninstalled all the ATI crap, reinstalled the 7.9 drivers, and CCC would crap out. Uninstalled all the ATI crap again and used the CD drivers that came with the card; all worked great, so I went and installed the 7.9 drivers again and all is working great. All my games run perfectly with everything turned all the way up. Looks fantastic. BUT, I cannot run any standard benchmarks on this sucker. 3DMark05 and 06 just hang. What are people using as a bench for this card? I installed the latest beta ATITool so I can control the fan, but I wanna see 3DMark scores! I love this card BTW.

How do I get 3Dmark to work with the HD 2900XT?
Go to the c:\windows\system32\futuremark\msc directory and rename or delete the file "direcpll.dll". If you install another 3DMark, like 03 or 05, you will have to do this again.
http://www.overclock.net/ati/2...900xt-info-thread.html
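
If you'd rather script that rename than dig through Explorer, a minimal sketch is below. It assumes the path from the quoted workaround, that you run it with admin rights, and it renames rather than deletes so the change is easy to undo.

import os

msc_dir = r"c:\windows\system32\futuremark\msc"   # path from the quoted workaround
target = os.path.join(msc_dir, "direcpll.dll")

if os.path.exists(target):
    os.rename(target, target + ".bak")   # keep a backup instead of deleting outright
    print("renamed", target)
else:
    print("direcpll.dll not found - nothing to do")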
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SniperDaws
This isn't a bad review http://www.guru3d.com/article/review/463/1

But all in all the 8800 GTS 320MB beats the 2900 Pro AND the 2900 XT - when none of the cards concerned are overclocked, that is.

actually their graphs are really annoying ... they change the positions and colors of the GPU bars in their graphs - it isn't consistent - as though there are several testers and no single editor.

also ... their conclusion disagrees with yours:

The Verdict ...

So for 249 USD you have one exceptional deal here. Get it, grab RivaTuner, find your maximum OC (there's a lot of headroom to play around with) and you'll have a rather extensive smile on your face. I figure that if you flash an XT BIOS into this card, you're good to go as well *coughs* did I just say that out loud? And I have to include this ... even if your card couldn't even overclock 1 single MHz, this product still would be an excellent performer for the money. Now, the GeForce 8800 GTS 320MB is not far away from this product performance and pricing wise. You've seen the results and you've seen differences. It's not a win/lose situation, the cards are competitive with each other.
 

Oyeve

Lifer
Oct 18, 1999
21,946
839
126
Originally posted by: lopri
Originally posted by: Oyeve
Well, I installed my 2900 Pro this weekend. Went from an ATI X1950 XTX using Cat ver 7.9, and the 2900 Pro went urp. Uninstalled all the ATI crap, reinstalled the 7.9 drivers, and CCC would crap out. Uninstalled all the ATI crap again and used the CD drivers that came with the card; all worked great, so I went and installed the 7.9 drivers again and all is working great. All my games run perfectly with everything turned all the way up. Looks fantastic. BUT, I cannot run any standard benchmarks on this sucker. 3DMark05 and 06 just hang. What are people using as a bench for this card? I installed the latest beta ATITool so I can control the fan, but I wanna see 3DMark scores! I love this card BTW.

How do I get 3Dmark to work with the HD 2900XT?
Go to the c:\windows\system32\futuremark\msc directory and rename or delete the file "direcpll.dll". If you install another 3DMark, like 03 or 05, you will have to do this again.
http://www.overclock.net/ati/2...900xt-info-thread.html

This worked, thanks for the info. Only got 8895. No OC and the card at default settings. Dunno if this is average or not, as the online stats said I had the lowest score so far for this card.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Azn
Originally posted by: bryanW1995
you probably don't know this, but the fan speed on an ATI card increases automatically as the temp goes up. Therefore, there is no need to ever force the fan to 100%.

Are you still harping on that nvidia/ati knowing each other's specs issue? Do you think that ford doesn't know 6 months before the general public what gm is going to release? How naive are you? A person of slightly above-average intelligence who has spent a few days on this or any other decent computer tech site usually has a pretty good idea of upcoming releases from nvidia/intel/DAAMIT/etc well before they reach the general public. Do you honestly think that ati and nvidia bury their heads in the sand and ignore all rumors/rumblings about future products that their competition is working on? What about when one person quits/gets fired and hires on with the competition? Tell me, honestly, you're in college, right?

Why wouldn't I know that some fans are controlled by temps? You do realize that some overclocks need cooler temps to be stable right?

No I like messing with my own settings. I don't need ATI, Nvidia, Ford, or even you telling me how I can use my machine.

When I'm browsing the internet and doing non-3D stuff, of course I want the fan as quiet as possible. When gaming I wouldn't care if the fan was at 100%, because my speakers would be kicking and the fan noise wouldn't bother me if it gave me a better overclock.

WOW, you're still shooting yourself in the foot about Nvidia knowing ATI's specs before designing a comparable card? Do you have proof of this? And no, I don't want to hear your conspiracy theories. :roll:

What does someone being in college have to do with the subject at hand?
You must have a REALLY good speaker system if you don't mind running a 2900xt at 100% fan while gaming. Do you have some of those cool bose noise-cancelling headphones?

Sorry for accusing you of spending time in college. You were just acting like we live in an idealized world where white is white, black is black, and industrial espionage only happens in the movies or a history book. I should have known better.
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Overall, for the pure gaming experience G80 is the way to go. Even with the price parity between the 2900 Pro and the GTS 320, the latter seems to give better performance overall as well as better image quality. Unfortunately for the 2900 Pro, AA isn't really an option if you were to play modern games at 1600x1200 and up. Even AF is very taxing from my observations. G80 takes very little hit from enabling 16xAF and the quality of filtering is magnificent. With R600, enabling 16xAF takes a ~20% performance hit (a very crude approximation) and even then I see less dense textures and moire effects. It's such a shame because the 2900 XT/Pro design has a lot of other things going for it.

There is one extremely positive exception, however. I don't know why this hasn't been mentioned yet, but if you play recent PC ports of Xbox 360 games (i.e. Ubisoft's recent offerings) this card (2900 XT/Pro) stretches its legs like there is no tomorrow. I was almost shocked, even though I sort of expected a better showing from the 2900 in these titles. Splinter Cell: Double Agent, Rainbow Six: Vegas, GRAW, and so on - in these games, the 2900 Pro is way ahead of the 8800 GTS, be it compatibility or performance. For example, I played SC: DA at its default wide-screen resolution of 1280x760 and the FPS stayed at 100 FPS (the game's limit) practically throughout the game. I then configured the .ini file to set the resolution to 1920x1200. The FPS hovers around 70~90 FPS with the in-game settings maxed. It was amazing because this game has been quite notorious for various reasons. You can check AT's own review of this game here.

So this card is definitely a mixed bag. More on this later.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: bryanW1995
Originally posted by: Azn
Originally posted by: bryanW1995
you probably don't know this, but the fan speed on an ATI card increases automatically as the temp goes up. Therefore, there is no need to ever force the fan to 100%.

Are you still harping on that nvidia/ati knowing each other's specs issue? Do you think that ford doesn't know 6 months before the general public what gm is going to release? How naive are you? A person of slightly above-average intelligence who has spent a few days on this or any other decent computer tech site usually has a pretty good idea of upcoming releases from nvidia/intel/DAAMIT/etc well before they reach the general public. Do you honestly think that ati and nvidia bury their heads in the sand and ignore all rumors/rumblings about future products that their competition is working on? What about when one person quits/gets fired and hires on with the competition? Tell me, honestly, you're in college, right?

Why wouldn't I know that some fans are controlled by temps? You do realize that some overclocks need cooler temps to be stable right?

No I like messing with my own settings. I don't need ATI, Nvidia, Ford, or even you telling me how I can use my machine.

When I'm browsing the internet and doing non-3D stuff, of course I want the fan as quiet as possible. When gaming I wouldn't care if the fan was at 100%, because my speakers would be kicking and the fan noise wouldn't bother me if it gave me a better overclock.

WOW, you're still shooting yourself in the foot about Nvidia knowing ATI's specs before designing a comparable card? Do you have proof of this? And no, I don't want to hear your conspiracy theories. :roll:

What does someone being in college have to do with the subject at hand?
You must have a REALLY good speaker system if you don't mind running a 2900xt at 100% fan while gaming. Do you have some of those cool bose noise-cancelling headphones?

Sorry for accusing you of spending time in college. You were just acting like we live in an idealized world where white is white, black is black, and industrial espionage only happens in the movies or a history book. I should have known better.

Nope, I have crappy speakers and crappy headphones. I have crap. Not everyone cares about noise, especially when gaming with the speakers up.

What's wrong with people in college? Maybe you watched too many James Bond movies while you were growing up and think about conspiracies on a daily basis. These corporations are legitimate businesses. If they got caught they could be fined or put in jail. I'm not saying it doesn't happen, but there are people who would do the right thing before selling their soul.
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Originally posted by: tuteja1986
AA doesn't kill the card's performance like you make it out to be. The card has a lot of potential, and I think with a few driver updates it will be the best price vs. performance GPU. When the X1900 XT 256MB came out it was the best right off the bat, and it only took one driver update to fix the performance issues with games like Oblivion.
I agree that the card has a lot of potential with shader heavy games. Learned this while playing Splinter Cell: Double Agent. While the game is fubar'ed by Ubisoft's horrendous porting as well as their notorious usage of copy protection, I'm confident that R600 will come out ahead of G80 on this title. (gotta borrow a GTX again to test this out)

But in the meantime, the performance hit caused by AA on 2900 is quite REAL. Also it's got little to do with memory controllers or bandwidth, IMO. I say this because despite the lower FPS, I experience less stutters with 2900 than with 8800. It's the core itself that doesn't have enough power to process any degree of AA, and I don't think the situation will be rectified via drivers. Basically ATI designed the R600 with the mindset that they designed R500 with, IMO.

Check how performance is affected by AA in HL2: Episode One, a title known to be ATI-driver friendly.

4xAA results in almost half the performance of no AA.
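
(To put "almost half" in rough numbers - the FPS figures below are made-up placeholders, not actual benchmark results - the hit works out like this:)

no_aa_fps = 80.0   # assumed frame rate with AA off
aa4x_fps = 42.0    # assumed frame rate with 4xAA on
hit = 1.0 - aa4x_fps / no_aa_fps
print(f"4xAA performance hit: {hit:.0%}")   # ~48%, i.e. nearly half the frame rate gone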
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: lopri
Overall, for the pure gaming experience G80 is the way to go. Even with the price parity between the 2900 Pro and the GTS 320, the latter seems to give better performance overall as well as better image quality. Unfortunately for the 2900 Pro, AA isn't really an option if you were to play modern games at 1600x1200 and up. Even AF is very taxing from my observations. G80 takes very little hit from enabling 16xAF and the quality of filtering is magnificent. With R600, enabling 16xAF takes a ~20% performance hit (a very crude approximation) and even then I see less dense textures and moire effects. It's such a shame because the 2900 XT/Pro design has a lot of other things going for it.

There is one extremely positive exception, however. I don't know why this hasn't been mentioned yet, but if you play recent PC ports of Xbox 360 games (i.e. Ubisoft's recent offerings) this card (2900 XT/Pro) stretches its legs like there is no tomorrow. I was almost shocked, even though I sort of expected a better showing from the 2900 in these titles. Splinter Cell: Double Agent, Rainbow Six: Vegas, GRAW, and so on - in these games, the 2900 Pro is way ahead of the 8800 GTS, be it compatibility or performance. For example, I played SC: DA at its default wide-screen resolution of 1280x760 and the FPS stayed at 100 FPS (the game's limit) practically throughout the game. I then configured the .ini file to set the resolution to 1920x1200. The FPS hovers around 70~90 FPS with the in-game settings maxed. It was amazing because this game has been quite notorious for various reasons. You can check AT's own review of this game here.

So this card is definitely a mixed bag. More on this later.

Did you get a 2900p to compare with your 320GTS?
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
No, unfortunately I've sold my GTS 320 in anticipation of the high-end SKU in November (which turned out to be a stupid decision). I am trying to talk my friend into selling me back my original GTX, which I sold him about 5 months ago. He is computer-illiterate and believes that a $500 video card helps Photoshop performance, so I'm sure I can work something out with him.

Regardless of the pure gaming performance, this 2900 Pro 1GB is a keeper for me due to quite a few important features. I haven't tested the audio pass-through via the DVI-HDMI converter yet, so that's going to be next (along with performance tests).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
the *weird* thing is ... we might still get that High-end GPU - no one knows for sure
[even though i am also inclined to believe it will be both late and evolutionary].

i think the 2900p "price-performance" is good ... and if you game at 16x10 or below you will suffer no disadvantage compared with GTS whatsoever ... maybe even some advantages over the 320MB version. Especially with a decent OC. But of course, not in the GTX' class.
--that'd take an insane core OC.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: lopri
I am trying to talk my friend into selling me back my original GTX, which I sold him about 5 months ago. He is computer-illiterate and believes that a $500 video card helps Photoshop performance, so I'm sure I can work something out with him.

:laugh:

Ok that made my day for the third time!

There's a review on how bandwidth affects R600/G80 performance. The conclusion is that AA is hardly bottlenecked by bandwidth on R600, and that G80 would have been better off with a 512-bit bus and 512MB instead of the 384-bit/768MB combination we are seeing today.

AA performance will never be fixed, because it's hardware related. R600 does 4xAA in 2 cycles, unlike G80 with its 1 cycle. Not to mention the lack of ROP resolve - AA being done through the shaders is what results in such a dismal performance drop on R600 when AA is enabled (the bottleneck isn't the bandwidth here but something else). The only thing ATi could do is possibly improve performance by other means. This shader AA is a design choice, and so far the cons (horrid performance loss when using AA compared to the competition) outweigh the pros (some newer titles not needing AA at all, like UT3-based games).
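
As a very rough back-of-the-envelope picture of why that cycle difference matters: if you only model ROP count, core clock and resolve cycles (the ROP counts and clocks below are approximate public figures, and everything else in the pipeline is ignored), the peak 4xAA resolve rates come out something like this:

def resolve_rate(rops, clock_mhz, cycles_per_resolve):
    # Peak pixels resolved per second under this toy model.
    return rops * clock_mhz * 1e6 / cycles_per_resolve

r600 = resolve_rate(rops=16, clock_mhz=742, cycles_per_resolve=2)   # HD 2900 XT-ish figures
g80 = resolve_rate(rops=24, clock_mhz=575, cycles_per_resolve=1)    # 8800 GTX-ish figures
print(f"R600 ~{r600 / 1e9:.1f} Gpix/s vs G80 ~{g80 / 1e9:.1f} Gpix/s peak 4xAA resolve")

The real bottleneck (shader resolve, as noted above) is messier than this, but it shows how a 2-cycle resolve alone halves the ceiling relative to an otherwise identical 1-cycle design, before anything else is counted.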
 