Originally posted by: gizbug
This card did not impress me at all. I went with the 8800 GTS; couldn't be happier.
Originally posted by: cmdrdredd
Originally posted by: Azn
Originally posted by: cmdrdredd
Originally posted by: Azn
Originally posted by: bryanW1995
You don't want to run that fan at anything close to 100% for an extended period of time. Well, unless you work around jet engines on a daily basis. And you're deaf.
Originally posted by: Azn
Originally posted by: cmdrdredd
Originally posted by: Azn
ATI might have lowered the voltage compared to the 2900 XT. Who knows. Make sure your fan is @ 100% when you test for artifacts.
Not necessary, because the fan will never be at 100% during games. If you test at 100% fan rotation and get 0 artifacts, there is still a chance to get some during play, because the fan rarely goes above 45%.
Why not? The fan is adjustable, is it not?
LOL. Just for testing overclocks. Some people don't mind the noise.
Yeah but then if you have to run 100% fan speed to maintain that overclock, what is the damn point?
To find your max overclock limit.
Don't you test your hardware when you first get it and then back off some for headroom? How would you know what would be a good overclock if you don't know the limits? :roll:
Later on you can also buy a better aftermarket cooler for it to get those overclocks. No?
Not necessary. Anyone can run 880/1100 at 100% fan speed and say "oh look, 3DMark got 1289371238971289378912 points." That is pointless.
What can you get while running the fan at usable levels? Anyone who calls 100% fan speed usable must be deaf. I'm not gonna lie, it's loud.
It's like running dry ice on your C2D for 5 min to get the fastest Super Pi score. What is the point? You won't use it like that every day.
Originally posted by: lopri
After the initial quirks related to the driver installer packages, I am happy with my 2900 Pro 1GB. This is just a preliminary statement, though, because I want to test out Blu-Ray playback on a 30" monitor. A few observations:
- I can see where the arguments regarding noise come from. It's definitely loud in 3D, but the noise tone has changed from many of ATI's previous cards (such as the X850 XT). In the past the noise was more like 'chrrrr' or 'krrrrr', but with the 2900 it's got a 'brrrrr' or 'frrrr' tone. Under 2D, inside a case, it's not really noticeable.
- I'm kinda confident that these 1GB GDDR4 2900 Pros are rebadged XTs. Mine overclocks easily to 750/1000.
- AA really kills the performance of this card. Interestingly, if I don't use any AA, the card performs very well even @2560x1600 (tested with Oblivion).
- There is a subtle difference in image quality between ATI and NV. I can't describe it clearly, but the 2900 gives a shinier/lighter feel while the 8800 gives a heavier/more saturated feel. It was very noticeable, especially under default driver settings.
Originally posted by: bryanW1995
You probably don't know this, but the fan speed on an ATI card increases as the temp goes up. Therefore, there is no need to ever force the fan to 100%.
Are you still harping on that nvidia/ati knowing each other's specs issue? Do you think that Ford doesn't know 6 months before the general public what GM is going to release? How naive are you? A person of slightly above average intelligence who has spent a few days on this or any other decent computer tech site usually has a pretty good idea of upcoming releases from nvidia/intel/DAAMIT/etc. well before they reach the general public. Do you honestly think that ATI and Nvidia bury their heads in the sand and ignore all rumors/rumblings about future products that their competition is working on? What about when one person quits/gets fired and hires on with the competition? Tell me, honestly, you're in college, right?
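The temperature-driven fan control described above can be sketched as a simple interpolation over a duty curve. The breakpoints below are invented for illustration; they are not ATI's actual firmware values:

```python
# Sketch of a temperature-driven fan curve like the one the stock cooler
# applies automatically. The (temp_C, duty_%) breakpoints are made up for
# illustration only.
def fan_duty(temp_c, curve=((40, 25), (70, 45), (90, 100))):
    """Linearly interpolate fan duty (%) between (temp, duty) breakpoints."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])          # below the curve: idle floor
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])             # past the top breakpoint: pegged

print(fan_duty(30))   # idle: stays at the 25.0% floor
print(fan_duty(55))   # mid-load: 35.0, between the 25% and 45% breakpoints
print(fan_duty(95))   # hot: pegged at 100.0
```

This is also why a quiet card under load implies the core is staying cool: the duty only climbs when the temperature does.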
Originally posted by: lopri
Getting closer.. currently looping 3DMark06 @800/1000.. (2560x1600)
Selected update details are listed below; a full list of fixes and technical data can be found on the Futuremark website. 3DMark05 1.3.0 is not Windows Vista enabled and should only be used on 32 bit Windows XP systems with the latest updates.
3DMark06 1.1.0:
* Windows Vista enabled;
* Internet Explorer 7 compatibility;
* Improved start-up speed by optimizing the SystemInfo component;
* Updated SystemInfo component with support for the latest CPU and graphics hardware;
* Fixes all reported & reproduced issues.
PCMark05 1.2.0:
* Windows Vista enabled;
* Internet Explorer 7 compatibility;
* Fixed Windows Media Player version detection;
* Improved start-up speed by optimizing the SystemInfo component;
* Updated SystemInfo component with support for the latest CPU and graphics hardware;
* Fixes all reported & reproduced issues.
That I didn't know. I do have a copy of Call of Juarez (came with an X2 3800+), so I can check it out. But it's very possible that DX9 code and DX10 code stress different parts of the silicon, I guess.
Originally posted by: n7
The only crappy thing is that I found DX10 (at least the Call of Juarez benchmark) couldn't handle the OC that DX9 could.
Originally posted by: Oyeve
Well, I installed my 2900 Pro this weekend. Went from an ATI X1950XTX using Cat 7.9, and the 2900 Pro went urp. Uninstalled all the ATI software, reinstalled the 7.9 drivers, and CCC would crap out; uninstalled all the ATI software again and used the CD drivers that came with the card, and all worked great, so I installed the 7.9 drivers again and all is working great. All my games run perfectly with everything turned all the way up. Looks fantastic. BUT, I cannot run any standard benchmarks on this sucker. 3DMark05 and 06 just hang. What are people using as a bench for this card? I installed the latest beta ATITool so I can control the fan, but I wanna see 3DMark scores! I love this card BTW.
http://www.overclock.net/ati/2...900xt-info-thread.html
How do I get 3DMark to work with the HD 2900XT?
Go to the c:\windows\system32\futuremark\msc directory and rename or delete the file "direcpll.dll". If you install another 3DMark, like 03 or 05, you will have to do this again.
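As an illustration of the workaround above, the rename step looks like this. It runs against a throwaway directory standing in for c:\windows\system32\futuremark\msc, so it is safe to try anywhere; renaming rather than deleting keeps the file restorable:

```python
import os
import shutil
import tempfile

# Illustration of the quoted 3DMark workaround: move direcpll.dll out of
# the way. A temp directory stands in for the real Futuremark MSC folder.
msc_dir = tempfile.mkdtemp()
dll = os.path.join(msc_dir, "direcpll.dll")
open(dll, "wb").close()                # stand-in for the real DLL
shutil.move(dll, dll + ".bak")         # rename instead of delete, so it can be restored
print(sorted(os.listdir(msc_dir)))    # ['direcpll.dll.bak']
```

On the real machine you would just rename the file in Explorer or from a command prompt; the point is that 3DMark recreates or skips the component it needs when that DLL is out of the way.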
Originally posted by: SniperDaws
This isn't a bad review: http://www.guru3d.com/article/review/463/1
But all in all, the 8800 GTS 320MB beats the 2900 Pro AND the 2900 XT, when none of the cards concerned are overclocked, that is.
The Verdict ...
So for 249 USD you have one exceptional deal here. Get it, grab RivaTuner, find your maximum OC (there's a lot of headroom to play around with), and you'll have a rather extensive smile on your face. I figure that if you flash an XT BIOS onto this card, you're good to go as well *coughs* did I just say that out loud? And I have to include this: even if your card couldn't overclock a single MHz, this product would still be an excellent performer for the money. Now, the GeForce 8800 GTS 320MB is not far from this product performance- and pricing-wise. You've seen the results and you've seen the differences. It's not a win/lose situation; the cards are competitive with each other.
You must have a REALLY good speaker system if you don't mind running a 2900 XT at 100% fan while gaming. Do you have some of those cool Bose noise-cancelling headphones?
Originally posted by: Azn
Originally posted by: bryanW1995
You probably don't know this, but the fan speed on an ATI card increases as the temp goes up. Therefore, there is no need to ever force the fan to 100%.
Are you still harping on that nvidia/ati knowing each other's specs issue? Do you think that Ford doesn't know 6 months before the general public what GM is going to release? How naive are you? A person of slightly above average intelligence who has spent a few days on this or any other decent computer tech site usually has a pretty good idea of upcoming releases from nvidia/intel/DAAMIT/etc. well before they reach the general public. Do you honestly think that ATI and Nvidia bury their heads in the sand and ignore all rumors/rumblings about future products that their competition is working on? What about when one person quits/gets fired and hires on with the competition? Tell me, honestly, you're in college, right?
Why wouldn't I know that some fans are controlled by temps? You do realize that some overclocks need cooler temps to be stable, right?
No, I like messing with my own settings. I don't need ATI, Nvidia, Ford, or even you telling me how I can use my machine.
When I'm browsing the internet and doing non-3D work, of course I want the fan as quiet as possible. When gaming, I wouldn't care if it was at 100% fan, because my speakers would be kicking and the fan noise wouldn't bother me if it gave me a better overclock.
WOW, you're still shooting yourself in the foot about Nvidia knowing ATI's specs before designing a comparable card? Do you have proof of this? And no, I don't want to hear your conspiracy theories. :roll:
What does someone being in college have to do with the subject at hand?
Originally posted by: bryanW1995
Sorry for accusing you of spending time in college. You were just acting like we live in an idealized world where white is white, black is black, and industrial espionage only happens in the movies or a history book. I should have known better.
I agree that the card has a lot of potential with shader-heavy games. Learned this while playing Splinter Cell: Double Agent. While the game is fubar'ed by Ubisoft's horrendous porting as well as their notorious use of copy protection, I'm confident that R600 will come out ahead of G80 on this title. (Gotta borrow a GTX again to test this out.)
Originally posted by: tuteja1986
AA doesn't kill the card's performance like you make it out to be. The card has a lot of potential, and I think that with a few driver updates it will be the best price vs. performance GPU. When the X1900 XT 256MB came out it was the best right off the bat; it took one driver update to fix performance issues with games like Oblivion.
Originally posted by: lopri
Overall, for pure gaming experience G80 is the way to go. Even with price parity between the 2900 Pro and the GTS 320, the latter seems to give better overall performance as well as image quality. Unfortunately for the 2900 Pro, AA isn't really an option if you want to play modern games at 1600x1200 and up. Even AF is very taxing from my observations. G80 takes very little hit from enabling 16xAF, and the quality of its filtering is magnificent. With R600, enabling 16xAF costs roughly 20% in performance (a very crude approximation), and even then I see less dense textures and moire effects. It's such a shame, because the 2900 XT/Pro design has a lot of other things going for it.
There is one extremely positive exception, however. I don't know why this hasn't been mentioned yet, but if you play recent PC ports of Xbox 360 games (i.e. Ubisoft's recent offerings), this card (2900 XT/Pro) stretches its legs like there is no tomorrow. I was almost shocked, even though I sort of expected a better showing from the 2900 in these titles. Splinter Cell: Double Agent, Rainbow Six: Vegas, GRAW, and so on: in these games the 2900 Pro is way ahead of the 8800 GTS, be it in compatibility or performance. For example, I played SC: DA at its default widescreen resolution of 1280x760 and the FPS stayed at 100 FPS (the game's limit) practically throughout the game. I edited the .ini file to set the resolution to 1920x1200, and the FPS hovers around 70~90 FPS with in-game settings maxed. It was amazing, because this game has been quite notorious for various reasons. You can check AT's own review of this game here.
So this card is definitely a mixed bag. More on this later.
Originally posted by: lopri
I am trying to talk my friend into selling me back my original GTX, which I sold him about 5 months ago. He is computer-illiterate and believes that a $500 video card helps Photoshop performance, so I'm sure I can work something out with him.