Originally posted by: Matthias99
ATI has features like HDR (not FP16, of course) + AA
The X1800/X1900 cards have OpenEXR HDR support.
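For context, "OpenEXR HDR" here means 16-bit floating-point (FP16) color channels, the "half" format that OpenEXR popularized. A minimal sketch (not from any driver, just an illustration) of how a half decodes, showing why it has the dynamic range an 8-bit integer channel lacks:

```
// Decoding the 16-bit "half" float used by OpenEXR and FP16 HDR targets.
// Layout: 1 sign bit, 5 exponent bits, 10 mantissa bits.
#include <cmath>
#include <cstdint>
#include <cstdio>

float half_to_float(uint16_t h) {
    int sign = (h >> 15) & 0x1;
    int exp  = (h >> 10) & 0x1F;
    int mant = h & 0x3FF;
    float value;
    if (exp == 0)            // subnormal: no implicit leading 1
        value = std::ldexp(static_cast<float>(mant), -24);
    else if (exp == 31)      // all-ones exponent: infinity or NaN
        value = mant ? NAN : INFINITY;
    else                     // normal: implicit leading 1, exponent bias 15
        value = std::ldexp(1.0f + mant / 1024.0f, exp - 15);
    return sign ? -value : value;
}

int main() {
    // The largest finite half is 65504 -- far beyond the 0..1 range of an
    // 8-bit integer framebuffer, which is what makes HDR rendering possible.
    printf("max half = %g\n", half_to_float(0x7BFF));  // prints 65504
}
```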
Originally posted by: TecHNooB
The two are pretty even. The biggest problem with the X1900 is that it runs super hot (load temps can easily exceed 80C). The 7900GTX is still based on the G70 architecture, so it can't do HDR+AA, which sucks, because who wants to spend $500+ on a video card and be unable to use HDR and AA at the same time?
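For what it's worth, this limitation shows up directly in Direct3D 9's capability checks: a game asks whether an FP16 render target can be multisampled, and G70-era parts say no. A sketch of that check, assuming a D3D9 app (SupportsFP16MSAA is just an illustrative helper name):

```
#include <d3d9.h>

// Returns true if the default adapter can apply 4x MSAA to a 64-bit
// FP16 RGBA render target -- i.e., whether HDR+AA is possible at all.
bool SupportsFP16MSAA(IDirect3D9* d3d) {
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // FP16 RGBA render target format
        FALSE,                     // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}
```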
Originally posted by: DeathReborn
Originally posted by: TecHNooB
The two are pretty even. The biggest problem with the X1900 is that it runs super hot (load temps can easily exceed 80C). The 7900GTX is still based on the G70 architecture, so it can't do HDR+AA, which sucks, because who wants to spend $500+ on a video card and be unable to use HDR and AA at the same time?
That "who" is people like me. I at present have not seen a implementation of HDR that looks like they claim "more realistic". The closest to "realistic HDR" I have seen in games if HL2's implementation.
For me HDR is not a feature i'm going to be using in the near future. Oblivion may have HDR but i'll turn it off unless it produces the realistic effect it's supposed to. AA is a much more attractive feature to me, something both NV & ATI can do.
7900GTX = X1900 imo
Originally posted by: Extelleron
Originally posted by: DeathReborn
That "who" is people like me. I at present have not seen a implementation of HDR that looks like they claim "more realistic". The closest to "realistic HDR" I have seen in games if HL2's implementation.
For me HDR is not a feature i'm going to be using in the near future. Oblivion may have HDR but i'll turn it off unless it produces the realistic effect it's supposed to. AA is a much more attractive feature to me, something both NV & ATI can do.
7900GTX = X1900 imo
As far as I know, the way Oblivion renders HDR allows for HDR + AA even on nVidia cards.
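Whatever Oblivion actually does internally, one general way an engine can pair FP16 HDR with a form of AA on hardware that can't multisample FP16 targets is supersampling: render the HDR scene at 2x2 the display resolution and average each block during the resolve. A rough CPU-side sketch of that resolve step (illustrative only, not Oblivion's code; HDRPixel and downsample2x2 are made-up names):

```
#include <vector>

struct HDRPixel { float r, g, b; };

// Average each 2x2 block of the oversized HDR buffer into one output pixel.
// The averaging is what smooths edges, standing in for hardware MSAA.
std::vector<HDRPixel> downsample2x2(const std::vector<HDRPixel>& src,
                                    int srcW, int srcH) {
    std::vector<HDRPixel> dst((srcW / 2) * (srcH / 2));
    for (int y = 0; y < srcH / 2; ++y) {
        for (int x = 0; x < srcW / 2; ++x) {
            const HDRPixel& a = src[(2 * y)     * srcW + 2 * x];
            const HDRPixel& b = src[(2 * y)     * srcW + 2 * x + 1];
            const HDRPixel& c = src[(2 * y + 1) * srcW + 2 * x];
            const HDRPixel& d = src[(2 * y + 1) * srcW + 2 * x + 1];
            dst[y * (srcW / 2) + x] = {
                (a.r + b.r + c.r + d.r) * 0.25f,
                (a.g + b.g + c.g + d.g) * 0.25f,
                (a.b + b.b + c.b + d.b) * 0.25f };
        }
    }
    return dst;
}
```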
Originally posted by: EuR Fr3nCh T3rRoR
Here's a quick question: in my second rig I had two EVGA 256MB 7800GTXs. I sold them and can now purchase an X1900XTX and still have a little money in my pocket. If I do this, will I see a decrease in performance? I mean, am I doing the wrong thing here? I am never getting rid of the two 512MB 7800GTXs in my first rig, but the two 256MB cards seemed a little old to me, so I was kind of tempted to get rid of them. Let me know what you think, mate. By the way, I like to play my games at 1600x1200 with at least 4xAA and 8xAF. Will I see a decrease in performance by making this trade?
Originally posted by: DeathReborn
Originally posted by: Extelleron
As far as I know, the way Oblivion renders HDR allows for HDR + AA even on nVidia cards.
If that is the case, then I'll still turn it off if, to my eyes, it is of substandard quality. If it looks good enough to me, I'll use it. On the other hand, if it looks good and HDR+AA can't run on NV cards, then I have an X1800XT Crossfire setup on my other PC to run it with.
Originally posted by: Dethfrumbelo
Hmm, it was my understanding that X1800/X1900 cards generally use FX10 when AA is applied to HDR, because the performance hit would be too big with FP16.
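Some back-of-the-envelope numbers for why FP16 costs more (whether drivers actually drop to FX10 is, as stated above, just an understanding): FP16 RGBA (D3DFMT_A16B16G16R16F) is 64 bits per pixel versus 32 bits for the 10-10-10-2 integer format (D3DFMT_A2B10G10R10), so the multisampled color buffer doubles in size:

```
#include <cstdio>

int main() {
    // Color buffer size at 1600x1200 with 4x MSAA, per format.
    const long long w = 1600, h = 1200, samples = 4;
    const long long fp16_bytes = w * h * samples * 8;  // 8 bytes per sample
    const long long fx10_bytes = w * h * samples * 4;  // 4 bytes per sample
    printf("FP16 4xAA color buffer: %lld MB\n", fp16_bytes >> 20);  // ~58 MB
    printf("FX10 4xAA color buffer: %lld MB\n", fx10_bytes >> 20);  // ~29 MB
}
```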
Originally posted by: darXoul
Like I said many times before, I don't like HDR. I really don't know what's so special about it. The bloom effect makes scenes look almost surreal. Maybe HL2 is decent in terms of HDR implementation. Other than that, also considering the performance drop, I'd stay away from HDR. Of course, this is my subjective, personal opinion.
Originally posted by: EuR Fr3nCh T3rRoR
Like I said many times before, I don't like HDR. I really don't know what's so special about it. The bloom effect makes scenes look almost surreal. Maybe HL2 is decent in terms of HDR implementation. Other than that, also considering the performance drop, I'd stay away from HDR. Of course, this is my subjective, personal opinion.
I think the Source engine's HDR implementation is the best. I was blown away when I first played Lost Coast.
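As I understand Valve's approach, Source tone-maps in the pixel shader so the framebuffer can stay an ordinary integer format that MSAA can resolve, which is why HDR+AA works there on both vendors. A toy sketch (not Valve's shader; tonemap_channel is a made-up name) of the classic Reinhard operator that squeezes unbounded HDR values into displayable range:

```
#include <algorithm>
#include <cstdio>

// Map an HDR channel value in [0, inf) to a displayable 0..255 byte.
// Reinhard's operator L / (1 + L) compresses highlights instead of clipping.
unsigned char tonemap_channel(float hdr, float exposure = 1.0f) {
    float v = hdr * exposure;
    float mapped = v / (1.0f + v);  // [0, inf) -> [0, 1)
    return static_cast<unsigned char>(
        std::min(255.0f, mapped * 255.0f + 0.5f));
}

int main() {
    // Paper white (1.0) and a bright light (50.0) both fit on screen,
    // but keep distinct intensities instead of both clamping to 255.
    printf("%d %d\n", tonemap_channel(1.0f), tonemap_channel(50.0f));  // 128 250
}
```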