Xbitlab's G71 review!


Wreckage

Banned
Jul 1, 2005
5,529
0
0
The X-bit article shows how well NVIDIA's balanced architecture works in a broad range of games. Since most people play more than one or two games a year, it's nice to see a wider variety benchmarked.

Our tests have shown that Nvidia's approach is justifiable even in modern games with their abundance of pixel shader-based visual effects because the Radeon X1900 often slows down under high textural load.
 

FalllenAngell

Banned
Mar 3, 2006
132
0
0
Originally posted by: Cookie Monster
Noise:
For comparison, the Radeon X1900 XTX produces some noise even when its blower is working at a reduced speed, and when the speed is high, the noise is hardly bearable due to the irritating tone produced by the resonating plastic casing.

What the heck? I couldn't find the X1900XT in stock at my favorite store?

They just had some comparable items......




JK
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: FalllenAngell
Originally posted by: Cookie Monster
Noise:
For comparison, the Radeon X1900 XTX produces some noise even when its blower is working at a reduced speed, and when the speed is high, the noise is hardly bearable due to the irritating tone produced by the resonating plastic casing.

What the heck? I couldn't find the X1900XT in stock at my favorite store?

They just had some comparable items......




JK

That is enough evidence in that post for me to believe that you're none other than Rollo... why don't you just use your Rollo alias? I am over the whole AEG thing..
 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: 5150Joker
Originally posted by: Cookie Monster
If I had the money for a single card, I would go buy the HIS X1900XTX. Absolutely nothing to worry about with temps or noise thanks to their AC cooling solution.

However the 7900GT consuming 48W at load is just amazing!!

Yes but how much does the power consumption increase once everyone OC's it? How much power do the factory overclocked GT's consume? nV fans touting factory OC'd GT's while quoting stock GT power consumption numbers is very deceptive.

http://www.vr-zone.com/?i=3437&s=4

Gives some numbers.

Frankly speaking, we don't think the GeForce 7900 GT cooler is half as good as the GeForce 7900 GTX one. It is small and has a tiny fan, so it is rather noisy, yet not very efficient.

Funny how it's only ATi that is given a hard time about fan noise, when the 7900GT is just as loud. Both are at 50 dB.

but you should be aware that 14x Super AA theoretically produces a lower-quality picture than 16x SLI AA with its honest super-sampling.

Key word: "theoretically". FS did an image comparison, and they came to the conclusion that there was no difference. ATi's 14x is faster than NV's 8x most of the time anyway. SLI AA is horribly slow compared to SuperAA.

A stock 7900GT load of 48W is pretty amazing to me though. A lot of power while not consuming much. Not using high quality settings with high end cards in all games doesn't make much sense to me, however. Am I missing something, or do you have to click each and every page to find what you want? There is no drop-down box? Ugh... I like that they use a lot of games, I just don't care for some of their settings.
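
As a very rough stab at 5150Joker's overclocking question: dynamic power scales roughly with clock speed times voltage squared, so a quick back-of-envelope looks like this (only the 48W figure comes from the review; the clocks and voltages below are made-up illustration numbers, not measurements):

# Rough dynamic-power scaling estimate: P is roughly proportional to f * V^2.
# Only the 48 W load figure comes from the review; every other number here is
# an assumption for illustration, not a measured or official spec.
stock_power_w = 48.0      # measured load power quoted for the stock 7900 GT
stock_clock_mhz = 450.0   # assumed stock core clock
stock_voltage = 1.2       # assumed core voltage (illustrative only)

oc_clock_mhz = 550.0      # assumed factory-OC core clock (illustrative)
oc_voltage = 1.3          # assumed voltage bump for the OC (illustrative)

scale = (oc_clock_mhz / stock_clock_mhz) * (oc_voltage / stock_voltage) ** 2
print(f"Estimated OC load power: {stock_power_w * scale:.0f} W")  # ~69 W with these numbers

Still modest, but clearly above the stock 48W figure that keeps getting quoted alongside factory-OC'd cards.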

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Barkotron
Originally posted by: Finny
Feh. As much as I'd hate to say it, Joker really is becoming the Rollo of ATi at this point, regardless of past sidings.

First off, I won't dismiss possible driver updates. We've seen what's happened with them in the past (ATi's OpenGL enhancement, nVidia's FEAR improvements, etc), so I won't count them out in the future, especially in such popular titles like Oblivion. Sure, ATi can also improve performance here, but hey, better for both companies.

Also, I agree with what Munky said: If you're shelling out 500 dollars, you're not gonna play on anything lower than the highest settings. It seems companies are using the Doom 3 precedent, that being to make it so cards can't (technically) fully utilize a game's maximum settings with current-gen videocards (I believe it was said that 512MB of videocard memory was required for Ultra High in Doom3, although folks seemed to be able to run it just fine with 256 parts), as an excuse to poorly code a game. As good as Oblivion looks, regardless of what videocard, there's no excuse, especially on a HIGH END CARD, for it to drop below 30 AT ALL, even on max settings. Sure, dual-cards may have taken the ultra-highend, but that doesn't mean that single-card users should have to be stuck with sub-par graphical settings.

However, even so, I don't think this review should be disregarded. Results from various other websites have varied in their numbers as well, and since X-bit has proven to be plenty reliable in the past, I think it can be trusted again. Furthermore, I don't believe that nVidia supporters should be castigated just for liking the results here.


1. Nobody's dismissing possible driver updates. I, however, am dismissing the idea that possible future driver updates should be used as an excuse for a poor showing on either side. Well, unless there's a very clear driver bug, as there have been in the past with SLI/Crossfire setups, or the renaming of fear.exe in earlier Cats, etc.

2. Why is there a right to expect a minimum of 30FPS just because someone's spent a lot of money on a videocard? Have you actually seen the amount of stuff that goes on outdoors in Oblivion? It looks incredible, and screenshots just don't do it justice. Frankly I'm impressed that frame rates are as high as they are.

3. Xbit seems to be as reliable as other sites out there. Personally I have no problem with the numbers they're putting out - I'm just pointing out that specifically those Oblivion scores are not numbers "NVidia fans" should be shouting about. The ATI card has 80% better minimum FPS @12x10 and nearly 60% better @16x12. That's a big difference at any level, but when the difference is between 27FPS and 15FPS, or 22FPS and 14FPS, then it's the difference between "just about playable" and "severely affecting gameplay". Those scores are a hammering, and no meaningless drivel about "nanosecond frame rate drops" can hide that.

Sure, driver updates can bring big improvements, but until they've been tested, using possible future driver enhancements as an excuse is pure spin.
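
(Just to sanity-check the arithmetic on the figures quoted above, here's a trivial snippet; 12x10 and 16x12 are read as 1280x1024 and 1600x1200:)

# Quick check of the minimum-FPS gaps quoted above (numbers taken from the post).
ati_min_1280, nv_min_1280 = 27, 15   # min FPS at 1280x1024
ati_min_1600, nv_min_1600 = 22, 14   # min FPS at 1600x1200

print(f"1280x1024: ATI min FPS is {100 * (ati_min_1280 / nv_min_1280 - 1):.0f}% higher")  # 80%
print(f"1600x1200: ATI min FPS is {100 * (ati_min_1600 / nv_min_1600 - 1):.0f}% higher")  # ~57%

So the "80%" and "nearly 60%" figures do check out against the raw numbers.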

Meaningless drivel? Heh. I think you might want to take a step back and check yourself. The first step to falling off your pedestal is belittling other folks' comments, so try to be a bit more congenial with your posts, please. "Sure it could have." Means what? That it could have, and there is nothing meaningless about it. Meaningless to you? For certain.

EDIT: And WTF is this statement about? "If they can make up the 80% minimum frame rate gap at 12x10 without cheating on IQ then the people who wrote them should be given a big ole wodge of cash, and the people who wrote the 84.21s should be taken out back and shot."

This only shows you're predisposed to thinking the only way Nvidia can gain performance in oblivion is to cheat and reduce IQ. This my friend shows the rest of us exactly where you stand. Nothing further your honor.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Can someone please answer this question?
Originally posted by: toyota
How does Xbitlabs do 16x AF in Doom3?? Ultra quality is 8x AF. I can't change the AF setting in the Doom3 profile of the nVidia control panel because it's greyed out. So can someone please tell me how Xbitlabs can do this.

 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: tuteja1986
G80 ain't coming in June Dethfrumbelo ... August 2006 latest :!

Source? Nvidia's current roadmap says June.

Doesn't matter to me whether it's August/September anyway.

The current generation of cards has too many issues to be worth it at this stage.




 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
I find it ironic that they wanted to use HDR for best image quality, yet didn't use the highest quality settings available in the drivers, or even AF.
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: keysplayr2003

ATI card gone?

Yeap, I don't have my X1900 XTX.
Was a great card while it lasted.
But in all honesty, neither that nor the new 7900 GTX is powerful enough to run games at max resolutions (1920x1200, or 1680x1050 at minimum) with max settings and maintain a 60fps average.
My SLI setup now is.

Another thing, I have read that the 7900 GT is as loud as an X1900 XTX.
I have to disagree with these findings, as my XTX, at least to me, seemed much louder than both my cards now. Plus, the XTX ran much hotter. I used to reach 80 degrees at load, but my 7900 GTs never reach more than 55.
BIG difference in temps there, for exactly the same system.
I receive my VF900s on Monday; hopefully my temps will drop further still.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: FalllenAngell

What the heck? I couldn't find the X1900XT in stock at my favorite store?
The irony is that the X1900XT that 5150choker is pimping in his sig is "out of stock".

So much for availability :laugh:

He should also do a price check on the other card he is whoring
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: keysplayr2003

EDIT: And WTF is this statement about? "If they can make up the 80% minimum frame rate gap at 12x10 without cheating on IQ then the people who wrote them should be given a big ole wodge of cash, and the people who wrote the 84.21s should be taken out back and shot."

This only shows you're predisposed to thinking the only way Nvidia can gain performance in oblivion is to cheat and reduce IQ. This my friend shows the rest of us exactly where you stand. Nothing further your honor.

Now who's jumping to conclusions keys? 80% is a huge margin to make up; there's a difference between thinking NVidia would have to cheat to "gain performance" (as you put it, obfuscating the point), and making up an 80% gap!

-----------

Regarding some of the other points, I think it's pretty much academic. The X18xx cards are much noisier at idle than Nvidia offerings, and are (at best) equal at full load (to the noisiest 7900GT's). The 7900GTX cooler is very quiet. But Ackmed brings up a good point that the 7900GT can get noisy and yet people ignore this fact.

Personally I think the stock cooler on the X1xx cards stinks and I would replace it right away if I got any of those cards; same as on the 7900GT.

Originally posted by: Wreckage
The X-bit article shows how well NVIDIA's balanced architecture works in a broad range of games. Since most people play more than one or two games a year, it's nice to see a wider variety benchmarked.

Our tests have shown that Nvidia's approach is justifiable even in modern games with their abundance of pixel shader-based visual effects because the Radeon X1900 often slows down under high textural load.

"Balanced approach" - I like that. Funny way of calling it the same old stuff with higher clockspeeds.

I think both architectures are starting to show that brute strength alone is hitting a wall and more forward thinking is going to be required. G71, with its 24/24 pipes, is sheer brute strength on pipelines, while ATI's R580 shows some good ideas with 48 pixel pipes, but also that focusing on pixel shading alone without any improvement to texturing power (16 texture pipes) is going to be held back by texturing performance.

Basically both companies have taken completely separate routes to come to near identical results (with the usual back-and-forth in different games).
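
To put some rough numbers on that, here's a back-of-envelope sketch; the unit counts and 650MHz stock clocks are the commonly quoted figures for the 7900 GTX and X1900 XTX, and it deliberately ignores ALU issue rates, efficiency, memory bandwidth and everything else, so treat it as illustration only:

# Crude theoretical throughput comparison of G71 vs R580.
# Unit counts and clocks are the commonly quoted stock figures; this ignores
# ALU issue rates, dual-issue, efficiency, memory bandwidth, etc.
cards = {
    # name: (core clock in MHz, pixel shader units, texture units)
    "7900 GTX (G71)":   (650, 24, 24),
    "X1900 XTX (R580)": (650, 48, 16),
}

for name, (clock, ps_units, tmus) in cards.items():
    tex_fill = clock * tmus / 1000.0          # Gtexels/s (bilinear)
    shader_proxy = clock * ps_units / 1000.0  # crude "shader pixels per second" proxy
    print(f"{name}: {tex_fill:.1f} Gtex/s, shader proxy {shader_proxy:.1f}, "
          f"ALU:TEX ratio {ps_units / tmus:.0f}:1")

Which is really the whole story in two lines: R580 runs a 3:1 shader-to-texture ratio while G71 sits at 1:1, so whichever way a given game leans is whichever card comes out looking better.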
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003

EDIT: And WTF is this statement about? "If they can make up the 80% minimum frame rate gap at 12x10 without cheating on IQ then the people who wrote them should be given a big ole wodge of cash, and the people who wrote the 84.21s should be taken out back and shot."

This only shows you're predisposed to thinking the only way Nvidia can gain performance in oblivion is to cheat and reduce IQ. This my friend shows the rest of us exactly where you stand. Nothing further your honor.

Now who's jumping to conclusions keys? 80% is a huge margin to make up; there's a difference between thinking NVidia would have to cheat to "gain performance" (as you put it, obfuscating the point), and making up an 80% gap!

Jiffy, I didn't "put" anything. I quoted the other dude who brought up the word cheating.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: nib95
Originally posted by: keysplayr2003

ATI card gone?

Yeap, I don't have my X1900 XTX.
Was a great card while it lasted.
But in all honesty, neither that nor the new 7900 GTX is powerful enough to run games at max resolutions (1920x1200, or 1680x1050 at minimum) with max settings and maintain a 60fps average.
My SLI setup now is.

Another thing, I have read that the 7900 GT is as loud as an X1900 XTX.
I have to disagree with these findings, as my XTX, at least to me, seemed much louder than both my cards now. Plus, the XTX ran much hotter. I used to reach 80 degrees at load, but my 7900 GTs never reach more than 55.
BIG difference in temps there, for exactly the same system.
I receive my VF900s on Monday; hopefully my temps will drop further still.

I heard the 7900GT stock coolers can get a bit noisy, but if you're saying the XTX was louder than both GT's together, then daaaaaaamn.

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
I still subscribe to the theory that ATI renders a scene's AA to limited distances
I doubt something like this is even possible. Distance rendering is game specific and the only way the GPU has any idea about distance is with mip-mapping. Of course mip-mapping is a purely texture operation while MSAA is a purely geometric operation.

And that means......???
AKA transparency textures. If you're still confused then you need to read what adaptive AA/transparency AA actually does.

If you're seeing aliasing on distance objects with regular MSAA then they could well be alpha textures which is why adaptive AA will fix it. Of course this has nothing at all to do with distance.

And is your performance different with and without adaptive AA?
That depends on how many alpha textures are in the scene. The difference could be nothing, large, or anything in between.
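
(If the alpha-texture point is confusing, here's a tiny toy sketch of the difference; it's pure illustration, not how either vendor's driver actually implements it:)

# Toy illustration of why alpha-tested textures (fences, foliage) still alias
# under plain MSAA, and why supersampling the alpha test inside the pixel
# (adaptive AA / transparency AA) smooths them. Conceptual sketch only.
def alpha_at(u: float) -> float:
    """Fake 1D 'fence' texture: opaque on the left half, transparent on the right."""
    return 1.0 if u < 0.5 else 0.0

def msaa_pixel(u_center: float) -> float:
    # Plain MSAA: the alpha test runs once per pixel, so the pixel ends up
    # either fully opaque or fully transparent -> hard, aliased edge.
    return 1.0 if alpha_at(u_center) > 0.5 else 0.0

def transparency_aa_pixel(u_center: float, samples: int = 4, width: float = 0.1) -> float:
    # Supersampled alpha test: evaluate it at several positions inside the
    # pixel and average the coverage -> smooth gradient across the edge.
    step = width / samples
    hits = sum(alpha_at(u_center - width / 2 + (i + 0.5) * step) > 0.5
               for i in range(samples))
    return hits / samples

for u in (0.40, 0.48, 0.52, 0.60):
    print(f"u={u:.2f}  plain MSAA={msaa_pixel(u):.2f}  transparency AA={transparency_aa_pixel(u):.2f}")

The plain MSAA column snaps straight from 1.00 to 0.00 while the transparency AA column steps through intermediate coverage values; geometry edges never enter into it, which is the point.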

Kind of my point. If the 84.25 drivers enhance oblivion performance, they should be tried and tested. Even if it is a beta, it would be good to see any improvements as a hint of performance gains to come. This goes for both companies mind you. The 6.4 CATs are supposed to fix the Xfire issue with oblivion.
Except 6.4 will be an official driver release and part of the regular monthly driver support so it's quite invalid to compare it to nVidia's unsupported and untested beta drivers.

We should not be using beta drivers to force the likes of nVidia to pull their finger out and start releasing official drivers more often than tri-yearly. nVidia's driver support is a total joke compared to ATi's.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: BFG10K
I still subscribe to the theory that ATI renders a scene's AA to limited distances
I doubt something like this is even possible. Distance rendering is game specific and the only way the GPU has any idea about distance is with mip-mapping. Of course mip-mapping is a purely texture operation while MSAA is a purely geometric operation.

And that means......???
AKA transparency textures. If you're still confused then you need to read what adaptive AA/transparency AA actually does.

If you're seeing aliasing on distance objects with regular MSAA then they could well be alpha textures which is why adaptive AA will fix it. Of course this has nothing at all to do with distance.

And is your performance different with and without adaptive AA?
That depends on how many alpha textures are in the scene. The difference could be nothing, large, or anything in between.

Kind of my point. If the 84.25 drivers enhance oblivion performance, they should be tried and tested. Even if it is a beta, it would be good to see any improvements as a hint of performance gains to come. This goes for both companies mind you. The 6.4 CATs are supposed to fix the Xfire issue with oblivion.
Except 6.4 will be an official driver release and part of the regular monthly driver support so it's quite invalid to compare it to nVidia's unsupported and untested beta drivers.

We should not be using beta drivers to force the likes of nVidia to pull their finger out and start releasing official drivers more often than tri-yearly. nVidia's driver support is a total joke compared to ATi's.


I don't see the problem with releasing Beta Drivers that function well; just because they don't have Microsoft's WHQL seal of approval doesn't mean they are bad...
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
The X-bit article shows how well NVIDIA's balanced architecture works in a broad range of games.
Yeah, until you actually start playing said games at settings that deviate from the standard benchmark settings and/or start playing games that aren't actively benchmarked.

That's when you start discovering how poor nVidia's drivers are in the real world and how many glitches they have, especially when trying to use the SSAA modes in older games.

why don't you just use your Rollo Alias
Because the Mods banned Trollo, so he's forced to come back as Fallllllllen Angelllllllllllllll or whatever the heck he is.

can someone please answer this question
I've already answered it - use the image_anisotropy setting.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
I don't see the problem with releasing Beta Drivers that function well,
But that's just it: they don't function well. Even the official drivers have a lot of problems and it takes months/years to get them fixed, if they're ever fixed.
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Originally posted by: Wreckage
Originally posted by: FalllenAngell

What the heck? I couldn't find the X1900XT in stock at my favorite store?
The irony is that the X1900XT that 5150choker is pimping in his sig is "out of stock".

So much for availability :laugh:

He should also do a price check on the other card he is whoring



5150choker, wow that's clever coming from you; hope you didn't strain your small brain too much with that one. Guess the powercolor card sold out due to high demand. Isn't that the excuse you nvidiots use? Anyway my sig's been updated with two better deals.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003

EDIT: And WTF is this statement about? "If they can make up the 80% minimum frame rate gap at 12x10 without cheating on IQ then the people who wrote them should be given a big ole wodge of cash, and the people who wrote the 84.21s should be taken out back and shot."

This only shows you're predisposed to thinking the only way Nvidia can gain performance in oblivion is to cheat and reduce IQ. This my friend shows the rest of us exactly where you stand. Nothing further your honor.

More meaningless drivel. It doesn't show anything of the sort. Stop reading things into my comments that aren't there - if you're going to read that into my comments, it shows a lot more about your bias than mine.

If they can make that up, then great. I'm dubious that anyone can make up such a massive gap without dubious optimisations, whether it's NV or anyone else.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
How does Xbitlabs do 16x AF in Doom3?? Ultra quality is 8x AF. I can't change the AF setting in the Doom3 profile of the nVidia control panel because it's greyed out. So can someone please tell me how Xbitlabs can do this.
 

Sable

Golden Member
Jan 7, 2006
1,127
99
91
Originally posted by: toyota
How does Xbitlabs do 16x AF in Doom3?? Ultra quality is 8x AF. I can't change the AF setting in the Doom3 profile of the nVidia control panel because it's greyed out. So can someone please tell me how Xbitlabs can do this.
Originally posted by: BFG10K
I've already answered it - use the image_anisotropy setting.

Try reading the thread? :thumbsup:
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Sable
Originally posted by: toyota
How does Xbitlabs do 16x AF in Doom3?? Ultra quality is 8x AF. I can't change the AF setting in the Doom3 profile of the nVidia control panel because it's greyed out. So can someone please tell me how Xbitlabs can do this.
Originally posted by: BFG10K
I've already answered it - use the image_anisotropy setting.

Try reading the thread? :thumbsup:
Well, I didn't see that. I looked at TweakGuides and they said the "image_anisotropy" setting has valid values of 1, 2, 4, and 8. SO HOW THE HELL CAN XBITLABS DO 16x AF??
 