BFG10K
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an X1900XTX?
If I had the money I'd be all over an X1900XT like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me, as they're going for 460 AUD (about 330 USD, IIRC) at one place here...
Why do they test at 'quality' rather than 'high quality'? I'm not dropping that kind of money to not use 'high quality'.
Originally posted by: Cookie Monster
The conclusion is that the X1 series will give you a better experience in Oblivion, unless we see further performance enhancements from each company's upcoming drivers.
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an X1900XTX?
If I had the money I'd be all over an X1900XT like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me, as they're going for 460 AUD (about 330 USD, IIRC) at one place here...
Why do they test at 'quality' rather than 'high quality'? I'm not dropping that kind of money to not use 'high quality'.
Originally posted by: otispunkmeyer
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an X1900XTX?
If I had the money I'd be all over an X1900XT like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me, as they're going for 460 AUD (about 330 USD, IIRC) at one place here...
Why do they test at 'quality' rather than 'high quality'? I'm not dropping that kind of money to not use 'high quality'.
It would be a perfect excuse to try out the new VF900... that thing looks like a beast. Get your conductive pen out too, up the volts, and scream along at over 700MHz on the core.
Did you see the VR-Zone article? He got about 800MHz out of a 7900GT. Simply amazing. It scored 7000 in 3DMark06... imagine two of them!
Originally posted by: keysplayr2003
I still subscribe to the theory that ATI renders a scene's AA only out to a limited distance, while Nvidia renders AA across the entire scene with no depth restrictions. It is said ATI has much more efficient AA methods; well, this would explain why. I have seen too many users mention that they noticed distance AA not being done on their ATI card but being done on their Nvidia card. These are usually people who already owned a 7800GT/X and then went and bought an X1800/X1900. Since they spent $500+ on these cards, I tend to believe them. I wish I had the extra cash to have my own little test lab here, but not this year.
Originally posted by: Cookie Monster
Well, what do you think?
Originally posted by: nib95
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an X1900XTX?
If I had the money I'd be all over an X1900XT like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me, as they're going for 460 AUD (about 330 USD, IIRC) at one place here...
Why do they test at 'quality' rather than 'high quality'? I'm not dropping that kind of money to not use 'high quality'.
No it's not. Take it from an owner of both.
My single XTX was louder and hotter than both my 7900 GTs.
I know it sounds crazy, but honestly it's true.
Originally posted by: nib95
Originally posted by: keysplayr2003
I still subscribe to the theory that ATI renders a scene's AA only out to a limited distance, while Nvidia renders AA across the entire scene with no depth restrictions. It is said ATI has much more efficient AA methods; well, this would explain why. I have seen too many users mention that they noticed distance AA not being done on their ATI card but being done on their Nvidia card. These are usually people who already owned a 7800GT/X and then went and bought an X1800/X1900. Since they spent $500+ on these cards, I tend to believe them. I wish I had the extra cash to have my own little test lab here, but not this year.
Well, as an owner of both, I have to disagree with that.
I expressly remember trees and objects in the far distance being AA'd on ATI cards.
Granted, I was using Adaptive AA.
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?
I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.
Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?
Originally posted by: BFG10K
Alpha textures. Why would adaptive AA make any difference?
Originally posted by: keysplayr2003
Originally posted by: BFG10K
Alpha textures. Why would adaptive AA make any difference?
And that means......???? If you have a link or something explaining Alpha textures, that would be cool. Thanks.
Originally posted by: nib95
Originally posted by: keysplayr2003
Originally posted by: BFG10K
Alpha textures. Why would adaptive AA make any difference?
And that means......???? If you have a link or something explaining Alpha textures, that would be cool. Thanks.
Well, I have no idea what it means.
All I know is that with Adaptive AA enabled on my old X1900 XTX, I saw AA even in the distance, with my own two eyes.
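For what it's worth, both observations fit how transparency ("adaptive") AA works: ordinary MSAA only smooths polygon edges, so the hard edges baked into alpha-tested textures (leaves, fences, distant foliage) pass or fail the alpha test once per whole pixel and stay jagged, while adaptive AA re-runs the texture lookup and alpha test per sub-sample. A rough one-dimensional sketch of the difference (a toy illustration only, not real GPU code; the `alpha` function is a made-up stand-in for a texture's alpha channel):

```python
# Illustration: why plain MSAA can't smooth an alpha-tested texture edge,
# while adaptive/transparency AA can. Purely a toy model in one dimension.

def alpha(u):
    """Stand-in for a texture's alpha channel: opaque left of 0.45, clear after."""
    return 1.0 if u < 0.45 else 0.0

def msaa_pixel(px, width=0.1):
    # Plain MSAA samples the texture once, at the pixel centre; the alpha
    # test then passes or fails for the whole pixel -> a hard 0/1 edge.
    return alpha(px + width / 2)

def adaptive_pixel(px, width=0.1, subsamples=4):
    # Adaptive AA re-evaluates the alpha test at several points inside the
    # pixel and averages them -> fractional coverage where the edge falls.
    hits = [alpha(px + width * (i + 0.5) / subsamples) for i in range(subsamples)]
    return sum(hits) / subsamples

msaa = [msaa_pixel(i / 10) for i in range(10)]
adaptive = [adaptive_pixel(i / 10) for i in range(10)]
# msaa is all 0s and 1s; adaptive gets 0.5 at the pixel straddling the edge.
```

That would explain seeing distant trees smoothed only when Adaptive AA was enabled, regardless of which card was in the machine.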
Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700 MHz core / 1.8 GHz memory and costs $600, versus a standard 7900 GTX with 650 MHz / 1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:
CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif
X1900 XTX: 49 fps vs.
XFX 7900 GTX XXX: 45 fps
FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif
X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps
So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied, when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:
Xbit's Results: HDR Pure Speed (min/avg fps)
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps
1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps
Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps
1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps
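Putting the figures quoted above into percentages (a throwaway check using only the numbers cited in this thread, nothing independently measured):

```python
# Quick arithmetic on the benchmark numbers quoted in this post. All
# figures come from the posts above; this just expresses the gaps as
# percentages, including the "80% in the minimum frame rates" claim.

def pct_lead(a, b):
    """How much faster a is than b, as a percentage."""
    return (a / b - 1.0) * 100.0

# Xbit, Oblivion HDR, 1280x1024: minimum fps (7900 GTX 15, X1900 XTX 27)
print(f"XTX min-fps lead: {pct_lead(27.0, 15.0):.0f}%")   # prints 80%

# Firingsquad, Oblivion HDR + 8x AF, 1600x1200 (7900 GTX 20.3, X1900 XTX 27.3)
print(f"XTX lead with AF: {pct_lead(27.3, 20.3):.1f}%")   # prints 34.5%
```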
Originally posted by: hemmy
Xbit has the best reviews out there, tons of games
Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700 MHz core / 1.8 GHz memory and costs $600, versus a standard 7900 GTX with 650 MHz / 1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:
CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif
X1900 XTX: 49 fps vs.
XFX 7900 GTX XXX: 45 fps
FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif
X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps
So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied, when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:
Xbit's Results: HDR Pure Speed (min/avg fps)
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps
1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps
Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps
1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps
Go suck up to your mighty ATI some more, maybe? Don't be so upset if your great company's card loses a benchmark.
Originally posted by: keysplayr2003
Shhhh... Joker's hard at work here...
Originally posted by: hemmy
Xbit has the best reviews out there, tons of games
Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700 MHz core / 1.8 GHz memory and costs $600, versus a standard 7900 GTX with 650 MHz / 1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:
CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif
X1900 XTX: 49 fps vs.
XFX 7900 GTX XXX: 45 fps
FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif
X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps
So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied, when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:
Xbit's Results: HDR Pure Speed (min/avg fps)
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps
1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps
Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps
1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps
Go suck up to your mighty ATI some more, maybe? Don't be so upset if your great company's card loses a benchmark.
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?
I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.
Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?
Driver revision improvements in performance are always a possibility, and a frequently used excuse from both sides in this and a bazillion other forums. Much like the way folks are saying the 6.4 Cats will correct the Crossfire problem in Oblivion. So yes, Nvidia can improve almost anything with driver revisions, and they have well proven this.
ATI can do the same thing. So I don't really see any merit in your post here.
Originally posted by: 5150Joker
Originally posted by: keysplayr2003
Shhhh... Joker's hard at work here...
It just shows that Xbit's newest review results are inconsistent with other reviews out there, including their own earlier ones. Furthermore, they failed to test Oblivion with AF because, as I pointed out, if they had, nVidia's numbers would be even lower.
Originally posted by: keysplayr2003
Originally posted by: 5150Joker
Originally posted by: keysplayr2003
Shhhh... Joker's hard at work here...
It just shows that Xbit's newest review results are inconsistent with other reviews out there, including their own earlier ones. Furthermore, they failed to test Oblivion with AF because, as I pointed out, if they had, nVidia's numbers would be even lower.
I saw no mention of them not using AF. I did see that they disabled FSAA on both cards so they could use HDR.
Quote: "We decided not to test our solutions with enabled FSAA, because the HDR support gets disabled even on ATI cards in this case and the graphics quality drops down significantly."
I read through the Oblivion page twice and saw no mention of AF, unless I am missing something. Are you just assuming they aren't using AF? Or did you actually read that somewhere and I missed it?
Originally posted by: Barkotron
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?
I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.
Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?
Driver revision improvements in performance are always a possibility, and a frequently used excuse from both sides in this and a bazillion other forums. Much like the way folks are saying the 6.4 Cats will correct the Crossfire problem in Oblivion. So yes, Nvidia can improve almost anything with driver revisions, and they have well proven this.
ATI can do the same thing. So I don't really see any merit in your post here.
The "merit in my post", as you would have it, is to point out that trying to explain away what is a pretty hefty beating by saying that future hypothetical driver improvements could produce different results is pointless and, frankly, desperate. The results are what they are. Whether they'd be different with drivers that don't currently exist is neither here nor there, especially as - as you agree - there's no reason to believe that both sides can't improve their performance.
I'm not saying that there wouldn't be performance improvements. I'm saying that using hypothetical future improvements to attempt to explain away what is a bad result in one game for one manufacturer is a bogus and desperate tactic which I, personally, "don't really see any merit" in.