Xbit Labs' G71 review!


nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an x1900xtx?

If i had the money i'd be all over a X1900xt like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me as they are going for 460AUD (about 330 USD iirc) here at one place...

why do they test at 'quality' rather than 'high quality'? i'm not dropping that kinda money to not use 'high quality' "|

No it's not. Take it from an owner of both.
My single XTX was louder and hotter than both my 7900 GTs.

I know it sounds crazy, but honestly it's true.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: Cookie Monster
The conclusion is that the X1 series will give you a better experience in Oblivion unless we see more performance enhancements from each company's upcoming drivers.

That's a bit desperate, isn't it?
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an x1900xtx?

If i had the money i'd be all over a X1900xt like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me as they are going for 460AUD (about 330 USD iirc) here at one place...

why do they test at 'quality' rather than 'high quality'? i'm not dropping that kinda money to not use 'high quality' "|


would be a perfect excuse to try out the new VF900.....looks like a beast, that thing....get your conductive pen out too, up the volts and scream along at over 700MHz on the core.

did you see the VRzone article? he got about 800MHz out of a 7900GT. simply amazing. it scored 7000 in 3DMark06....imagine 2 of them!
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: otispunkmeyer
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an x1900xtx?

If i had the money i'd be all over a X1900xt like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me as they are going for 460AUD (about 330 USD iirc) here at one place...

why do they test at 'quality' rather than 'high quality'? i'm not dropping that kinda money to not use 'high quality' "|


would be a perfect excuse to try out the new VF900.....looks like a beast, that thing....get your conductive pen out too, up the volts and scream along at over 700MHz on the core.

did you see the VRzone article? he got about 800MHz out of a 7900GT. simply amazing. it scored 7000 in 3DMark06....imagine 2 of them!


Well, on Monday I'm doing exactly that!
Ordering my two VF900s and a new PSU (Fortron 700W) tomorrow.
I already have my conductive pen and AS5, so hopefully on Monday I'll have results and benchmarks for you. Can't wait!
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I still subscribe to the theory that ATI renders a scene's AA only out to limited distances, while Nvidia renders AA across the entire scene with no depth restrictions. It is said ATI has much more efficient AA methods; well, this would explain why. I have seen too many users mention that distant AA wasn't being applied on their ATI card but was being applied on their Nvidia card. These are usually people who already owned a 7800GT/GTX and then went and bought an X1800/X1900. When people spend $500+ on these cards, I tend to believe them. I wish I had the extra cash to have my own little test lab here, but not this year.
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: keysplayr2003
I still subscribe to the theory that ATI renders a scene's AA only out to limited distances, while Nvidia renders AA across the entire scene with no depth restrictions. It is said ATI has much more efficient AA methods; well, this would explain why. I have seen too many users mention that distant AA wasn't being applied on their ATI card but was being applied on their Nvidia card. These are usually people who already owned a 7800GT/GTX and then went and bought an X1800/X1900. When people spend $500+ on these cards, I tend to believe them. I wish I had the extra cash to have my own little test lab here, but not this year.


Well, as an owner of both I have to disagree with that.
I expressly remember trees and objects at long distances being AA'd with ATI cards.
Granted, I was using Adaptive AA.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: Cookie Monster
Well, what do you think?

I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.

Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: nib95
Originally posted by: dug777
I wonder if two 7900GTs whining away in SLI would be louder than an x1900xtx?

If i had the money i'd be all over a X1900xt like flies over a pile of particularly seductive manure, but it looks like it will be a 7900GT for me as they are going for 460AUD (about 330 USD iirc) here at one place...

why do they test at 'quality' rather than 'high quality'? i'm not dropping that kinda money to not use 'high quality' "|

No it's not. Take it from an owner of both.
My single XTX was louder and hotter than both my 7900 GTs.

I know it sounds crazy, but honestly it's true.

do the 7900GTs whine tho? I'd prefer a roar to a whine any day
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: nib95
Originally posted by: keysplayr2003
I still subscribe to the theory that ATI renders a scene's AA only out to limited distances, while Nvidia renders AA across the entire scene with no depth restrictions. It is said ATI has much more efficient AA methods; well, this would explain why. I have seen too many users mention that distant AA wasn't being applied on their ATI card but was being applied on their Nvidia card. These are usually people who already owned a 7800GT/GTX and then went and bought an X1800/X1900. When people spend $500+ on these cards, I tend to believe them. I wish I had the extra cash to have my own little test lab here, but not this year.


Well, as an owner of both I have to disagree with that.
I expressly remember trees and objects at long distances being AA'd with ATI cards.
Granted, I was using Adaptive AA.

Why would adaptive AA make any difference? Just asking.

 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
I have been a fan of Xbit's video card reviews for a while. They don't rush them out the door on launch day and are pretty extensive in their testing. This review, though, as 5150 mentioned, has some discrepancies. Maybe driver changes since past reviews caused some of that. And what happened with CrossFire!?! Did ATI finally find some magical driver that Xbit used for this review? I dunno, a lot of it just doesn't follow past reviews.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?

I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.

Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?

Driver revision improvements in performance are always a possibility and a frequently used excuse from both sides in this and a bazillion other forums, much like the way folks are saying the 6.4 Cats will correct the problem for CrossFire and Oblivion. So yes, Nvidia can improve almost anything with driver revisions, and they have proven this well.
ATI can do the same thing. So I don't really see any merit in your post here.

 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: keysplayr2003
Originally posted by: BFG10K
Why would adaptive AA make any difference?
Alpha textures.

And that means......???? If you have a link or something explaining Alpha textures, that would be cool. Thanks.

Well, I have no idea what it means.
All I know is that, with my own two eyes and with Adaptive AA enabled on my old X1900 XTX, I saw AA even in the distance. That's all I know.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: nib95
Originally posted by: keysplayr2003
Originally posted by: BFG10K
Why would adaptive AA make any difference?
Alpha textures.

And that means......???? If you have a link or something explaining Alpha textures, that would be cool. Thanks.

Well, I have no idea what it means.
All I know is that, with my own two eyes and with Adaptive AA enabled on my old X1900 XTX, I saw AA even in the distance. That's all I know.

Well, that's a good thing. And is your performance different with and without adaptive AA?

 

hemmy

Member
Jun 19, 2005
191
0
0
Xbit has the best reviews out there, tons of games

Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700/1.8 GHz and costs $600 vs. a standard 7900 GTX with 650/1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:

CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif

X1900 XTX: 49 fps vs.
XFX 7900 GTX XXX: 45 fps

FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif

X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps


So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:


Xbit's Results: HDR Pure Speed
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps

1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps

Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps

1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps

Go suck your mighty ATI some more maybe??? Don't be so upset if your great company's card loses a benchmark
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: hemmy
Xbit has the best reviews out there, tons of games

Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700/1.8 GHz and costs $600 vs. a standard 7900 GTX with 650/1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:

CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif

X1900 XTX: 49 fps vs.
XFX 7900 GTX XXX: 45 fps

FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif

X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps


So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:


Xbit's Results: HDR Pure Speed
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps

1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps

Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps

1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps

Go suck your mighty ATI some more maybe??? Don't be so upset if your great company's card loses a benchmark

Shhhh.... Jokers is hard at work here..

 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Originally posted by: keysplayr2003


Shhhh.... Jokers is hard at work here..


It just shows Xbit's newest review results are inconsistent with other reviews out there including their own. Furthermore they failed to test Oblivion with AF because as I pointed out, if they had, nVidia's numbers would be even lower.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: hemmy
Xbit has the best reviews out there, tons of games

Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700/1.8 GHz and costs $600 vs. a standard 7900 GTX with 650/1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:

CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif

X1900 XTX: 49 fps vs.
XFX 7900 GTX XXX: 45 fps

FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif

X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps


So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:


Xbit's Results: HDR Pure Speed
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps

1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps

Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps

1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps

Go suck your mighty ATI some more maybe??? Don't be so upset if your great company's card loses a benchmark

Umm... you are a joke.

When did ATI even lose the Oblivion benchmark at Xbit Labs? Look at the average frame rates and Nvidia is way below...

 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?

I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.

Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?

Driver revision improvements in performance are always a possibility and a frequently used excuse from both sides in this and a bazillion other forums, much like the way folks are saying the 6.4 Cats will correct the problem for CrossFire and Oblivion. So yes, Nvidia can improve almost anything with driver revisions, and they have proven this well.
ATI can do the same thing. So I don't really see any merit in your post here.


The "merit in my post", as you would have it, is to point out that trying to pass off what is a pretty hefty beating by saying that future hypothetical driver improvements could produce different results is pointless and, frankly, desperate. The results are what they are. Whether they'd be different with drivers that don't currently exist or not is neither here nor there, especially as - as you agree - there's no reason to believe that both sides can't improve their performance.

I'm not saying that there wouldn't be performance improvements. I'm saying that using hypothetical future improvements to attempt to explain away what is a bad result in one game for one manufacturer is a bogus and desperate tactic which I, personally, "don't really see any merit" in.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: 5150Joker
Originally posted by: keysplayr2003


Shhhh.... Jokers is hard at work here..


It just shows Xbit's newest review results are inconsistent with other reviews out there including their own. Furthermore they failed to test Oblivion with AF because as I pointed out, if they had, nVidia's numbers would be even lower.

I saw no mention of them not using AF. I did see that they disabled FSAA on both cards so they could use HDR.

Quote: "We decided not to test our solutions with enabled FSAA, because the HDR support gets disabled even on ATI cards in this case and the graphics quality drops down significantly."

I read through the Oblivion page twice and saw no mention of AF unless I am missing something. Are you just assuming they aren't using AF? Or did you actually read that somewhere and I missed it?

 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Originally posted by: keysplayr2003
Originally posted by: 5150Joker
Originally posted by: keysplayr2003


Shhhh.... Jokers is hard at work here..


It just shows Xbit's newest review results are inconsistent with other reviews out there including their own. Furthermore they failed to test Oblivion with AF because as I pointed out, if they had, nVidia's numbers would be even lower.

I saw no mention of them not using AF. I did see that they disabled FSAA on both cards so they could use HDR.

Quote: "We decided not to test our solutions with enabled FSAA, because the HDR support gets disabled even on ATI cards in this case and the graphics quality drops down significantly."

I read through the Oblivion page twice and saw no mention of AF unless I am missing something. Are you just assuming they aren't using AF? Or did you actually read that somewhere and I missed it?



They used their Pure Speed setting which has no AF.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Barkotron
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?

I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.

Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?

Driver revision improvements in performance are always a possibility and a frequently used excuse from both sides in this and a bazillion other forums, much like the way folks are saying the 6.4 Cats will correct the problem for CrossFire and Oblivion. So yes, Nvidia can improve almost anything with driver revisions, and they have proven this well.
ATI can do the same thing. So I don't really see any merit in your post here.


The "merit in my post", as you would have it, is to point out that trying to pass off what is a pretty hefty beating by saying that future hypothetical driver improvements could produce different results is pointless and, frankly, desperate. The results are what they are. Whether they'd be different with drivers that don't currently exist or not is neither here nor there, especially as - as you agree - there's no reason to believe that both sides can't improve their performance.

I'm not saying that there wouldn't be performance improvements. I'm saying that using hypothetical future improvements to attempt to explain away what is a bad result in one game for one manufacturer is a bogus and desperate tactic which I, personally, "don't really see any merit" in.

We don't even know how long that minimum framerate was sustained. It could have been a nanosecond drop and Fraps recorded that minimum. So for now, unless someone can show the duration of that drop in framerate on the 7900GTX card, it is open to interpretation and opinion.

And what did you make of Joker's very informative post about various review site discrepancies?

 