Xbitlabs' G71 review!


BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
Toyota, it's a console command.

image_anisotropy 16 will set AF to 16x.
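To apply it in-game it goes roughly like this (sketching from memory, and assuming the stock console binding):

// pull down the console with Ctrl+Alt+~ and enter:
image_anisotropy 16
// then force the renderer to reload its images so the new filtering actually takes effect:
vid_restart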
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: BFG10K
Toyota, it's a console command.

image_anisotropy 16 will set AF to 16x.
Thanks, but TweakGuides says that only 1, 2, 4 and 8 can be used.

 

Sable

Golden Member
Jan 7, 2006
1,127
99
91
Originally posted by: toyota
Well, I didn't see that. I looked at TweakGuides and they said the "image_anisotropy" setting's valid values are 1, 2, 4, and 8. SO HOW THE HELL CAN XBITLABS DO 16xAF??

I'd guess by using the number 16? I reckon that tweak guide was written before 16xAF was possible on cards. Doom 3 is pretty old now.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Sable
Originally posted by: toyota
Well, I didn't see that. I looked at TweakGuides and they said the "image_anisotropy" setting's valid values are 1, 2, 4, and 8. SO HOW THE HELL CAN XBITLABS DO 16xAF??

I'd guess by using the number 16? I reckon that tweak guide was written before 16xAF was possible on cards. Doom 3 is pretty old now.
Well, of course common sense would say 16. I can't find anything that actually says it will work. I'm pretty sure 16xAF was around before the TweakGuides guide for Doom 3 came out; it's not that new of a setting, is it?

 

yacoub

Golden Member
May 24, 2005
1,991
14
81
HELLO I'm still waiting for ANY manufacturers to put out alternatively cooled X1900XT and 7900GT. This is taking WAY too long to happen. A quieter fan like the 7900GTX has would be ideal. GET WITH IT ALREADY, manufacturers.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
Thanks, but TweakGuides says that only 1, 2, 4 and 8 can be used.
Okay, you can believe what TweakGuides says or you can try it yourself. It's your choice. :roll:

Now for me personally, I've been running 16xAF in Doom 3 since the day it came out.
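It's easy to check, too: entering a cvar in the console with no argument just echoes its current value (exact output format from memory):

image_anisotropy       // prints something like: "image_anisotropy" is "16"
listCvars image_       // lists the image_* cvars along with their current values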
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700/1.8 GHz and costs $600, vs. a standard 7900 GTX with 650/1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:

CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif

X1900 XTX: 49 fps vs.
XFX 7900 GTX XXX: 45 fps

FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif

X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps


So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:


Xbit's Results: HDR Pure Speed
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps

1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps

Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps

1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps


Depends on what they used for the rest of the system, what benches they used, settings... perhaps their Windows install was in better shape overall. There are so many variables to consider when asking why one site got different results from another. I expect different sites to get different results; that's why I read more than one site to form an opinion.

I mean, again, as usual you're on your back foot trying to find a way to make ATI look better. Why bother?

ATI was best in ExtremeTech's test, NV put in a good showing in Xbit's tests... big wow. All that tells me is that you can't really go wrong with either purchase.

I'm not an ExtremeTech frequenter, so I'd probably put a little more trust in Xbit's results. Having said that, the last video card review I read from them was that massive 30+ game roundup, and there were a lot of mistakes.
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: 5150Joker
Originally posted by: Cookie Monster
Link

Lots of visuals.


Edit - after comparing the screenshots, ATi is a lot darker than NV in the screenshots of the F-15C.

Comparing in HL2, why is it that in the ATi shot, compared to the NV one, you see the disappearance of the branches and grass?

I'm not sure if ATi has "better" IQ as some have claimed, because some of the NV screenshots look more detailed compared to ATi's.

Can people with SLI or Crossfire post some screen shots?

Also i noted this.

ATi HDR
NV HDR

ATi HDR
NV HDR

What do you guys think?


Hmm don't have either of those games installed but maybe someone else here can check. I think short FRAPs videos would be a lot more telling than screenshots. I can host the vids if you guys decide to make some.


I have Splinter Cell... I may reinstall it since I'm getting bored, lol. I'll take some video if I do.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: BFG10K
Thanks, but TweakGuides says that only 1, 2, 4 and 8 can be used.
Okay, you can believe what TweakGuides says or you can try it yourself. It's your choice. :roll:

Now for me personally, I've been running 16xAF in Doom 3 since the day it came out.
Thanks. I was just trying to clear this up; I have never messed with anything but the NVIDIA control panel or in-game settings, and I uninstalled Doom 3 a few days ago. After you apply the 16xAF like that, does it stay, or do you have to do it every time you start the game? Also, does it actually look any better? Because I can't even tell the difference between High and Ultra.

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
The setting will stay because it's saved in the config file.

However keep in mind that if you change the global IQ setting (e.g. high quality to ultra) it may change to another value. If it does just change it back to 16 again.
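Concretely, it ends up as a seta line in base\DoomConfig.cfg. If you'd rather never have to fix it by hand again, an autoexec.cfg is run after the config on every launch, so something like this keeps it pinned (path and syntax from memory, so double-check them):

// Doom 3\base\autoexec.cfg
seta image_anisotropy "16"   // reapplied at every startup, even if a quality preset rewrote DoomConfig.cfg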
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: BFG10K
The setting will stay because it's saved in the config file.

However keep in mind that if you change the global IQ setting (e.g. high quality to ultra) it may change to another value. If it does just change it back to 16 again.
OK, last question, lol: does it actually look any better? Because I can't even tell the difference between High and Ultra.


 

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
Yes, it does look better but it may be hard to notice because Doom 3 doesn't tend to have long draw distances indoors. The performance hit is minimal though so just enable it.

Now, the main difference between high and ultra is that ultra does not compress any of the textures, unlike high. I don't think AF actually changes between the two settings.
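As far as I remember the quality presets mostly just flip a few image_* cvars behind the scenes; the compression part is roughly this (values from memory, so treat them as approximate):

// Ultra quality: textures are left uncompressed
seta image_useCompression "0"
// High quality: diffuse/specular maps get DXT-compressed
seta image_useCompression "1"
// the presets can also rewrite other image_* cvars (image_anisotropy included), hence having to set it back to 16 afterwards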
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Barkotron
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003

EDIT: And WTF is this statement about? "If they can make up the 80% minimum frame rate gap at 12x10 without cheating on IQ then the people who wrote them should be given a big ole wodge of cash, and the people who wrote the 84.21s should be taken out back and shot."

This only shows you're predisposed to thinking the only way Nvidia can gain performance in oblivion is to cheat and reduce IQ. This my friend shows the rest of us exactly where you stand. Nothing further your honor.

More meaningless drivel. It doesn't show anything of the sort. Stop reading things into my comments that aren't there - if you're going to read that into my comments, it shows a lot more about your bias than mine.

If they can make that up, then great. I'm dubious that anyone can make up such a massive gap without dubious optimisations. Whether NV or anyone else.

No, I think I read into your comment properly. You're the one who said it, not me. When you hear something you don't like, it's "drivel". Hehe. You have 11 posts in this forum and you've spent 9 of them in this thread defending ATI or belittling other people's comments. It only took you two posts in this thread to call someone desperate. THAT is a fanboy statement. Look, don't worry about it, I have your number, as others soon will too.

 
Finny

Sep 6, 2005
135
0
0
Well, I guess I'll just reply to several points.

I still say that a site somewhere should test games with both the latest official and the latest beta drivers, so as to prove whether or not a "performance increase" is really there.

munky
It's not that Nv has totally taken control of this forum. It's just that some people, including me, got a x1900xt(x) before Nv even released their g71, and as a single card it's still better than the g71, so I really dont care how many trolls spout pro-Nv crap, since I got a card that's faster and technologically more advanced than what they're raving about. Or, maybe it's because a certain AEG shill has been keeping his head low these days, so there's less flaming going on.

Well, we could debate the concept of truly being "better" till land's end, but I won't, for the sake of not wasting my own time. I myself would prefer, for a single card, the 7900GTX, simply because I've had to put up with a loud-as-hell computer for quite a while, and the GTX's cooler is very appealing. Furthermore, the card itself performs well enough, so I wouldn't feel like I wasted money... Of course, as I've said before, right now I still consider the X1900XT the better bang for the buck, simply because it costs much less (and it's generally in stock much more frequently, regardless of the reason).

Speaking of computer noise, I also agree with some others on the 7900GT's cooling design. I mean, c'mon, WTF? It's more powerful than the 7800GTX (Non 512), so it should at least have the same cooling design as it did. That was notable for being very quiet as well, and for a high-end card like the 7900GT, it's only fitting to have a cooler that doesn't make someone think "puny".

Finally, I'd like to make one last statement: Why do you people keep discounting SLI, or any dual-card solution for that matter, as a viable high-end setup? The way I see it, you're basically saying that you can't truly count a dual-card solution as being truly more powerful than a single-card because the pair of cards are less powerful individually than the more powerful single card. It's like saying dual-core processors aren't worth it because you can buy a more powerful single-core processor, despite the advantages of dual-core.

The way I see it, things are going dual (Or just multi) powered now. If one can have a dual-core processor and count themselves as truly "high-end", why can't one do the same with a pair of video cards? A pair of 7900GTs, for example, outperforms an XTX and GTX in pretty much all situations (*Waits for the hunt for the game that disproves this*), so why should the owner not have the privilege to say that he's near the very top of the charts in performance?

Note- I'm not trying to belittle the power of your setup, if I ever came across like that. Quite the opposite, really, but I'm just trying to make a point here.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Finny
Well, I guess I'll just reply to several points.

I still say that a site somewhere should test games with both the latest official and the latest beta drivers, so as to prove whether or not a "performance increase" is really there.

munky
It's not that Nv has totally taken control of this forum. It's just that some people, including me, got a x1900xt(x) before Nv even released their g71, and as a single card it's still better than the g71, so I really dont care how many trolls spout pro-Nv crap, since I got a card that's faster and technologically more advanced than what they're raving about. Or, maybe it's because a certain AEG shill has been keeping his head low these days, so there's less flaming going on.

Well, we could debate the concept of truly being "better" till land's end, but I won't, for the sake of not wasting my own time. I myself would prefer, for a single card, the 7900GTX, simply because I've had to put up with a loud-as-hell computer for quite a while, and the GTX's cooler is very appealing. Furthermore, the card itself performs well enough, so I wouldn't feel like I wasted money... Of course, as I've said before, right now I still consider the X1900XT the better bang for the buck, simply because it costs much less (and it's generally in stock much more frequently, regardless of the reason).

Speaking of computer noise, I also agree with some others on the 7900GT's cooling design. I mean, c'mon, WTF? It's more powerful than the 7800GTX (Non 512), so it should at least have the same cooling design as it did. That was notable for being very quiet as well, and for a high-end card like the 7900GT, it's only fitting to have a cooler that doesn't make someone think "puny".

Finally, I'd like to make one last statement: Why do you people keep discounting SLI, or any dual-card solution for that matter, as a viable high-end setup? The way I see it, you're basically saying that you can't truly count a dual-card solution as being truly more powerful than a single-card because the pair of cards are less powerful individually than the more powerful single card. It's like saying dual-core processors aren't worth it because you can buy a more powerful single-core processor, despite the advantages of dual-core.

The way I see it, things are going dual (Or just multi) powered now. If one can have a dual-core processor and count themselves as truly "high-end", why can't one do the same with a pair of video cards? A pair of 7900GTs, for example, outperforms an XTX and GTX in pretty much all situations (*Waits for the hunt for the game that disproves this*), so why should the owner not have the privilege to say that he's near the very top of the charts in performance?

Note- I'm not trying to belittle the power of your setup, if I ever came across like that. Quite the opposite, really, but I'm just trying to make a point here.

I just don't like dual setups because for twice the cost it does not give you twice the performance, and there are still issues involved with dual cards, like getting vsync to work correctly. Of course, if someone plays at 1600x1200 resolution or higher, then dual card may be necessary to provide smooth gameplay, but since I'm playing at 1280x960 a single high end card seems like a better choice for me.

And while we're on the whole "dual" topic, I recently switched from a single-core CPU to a dual-core CPU, and so far I've only seen a huge improvement when encoding videos. Other than that, it performs about the same as my single core did in the tasks that I do. Games don't run faster, and even Windoze doesn't seem any more responsive than on a single core. So, a high-end rig doesn't necessarily need a dual-core CPU. But it does seem like technology in general is heading that way, with stuff like quad-core CPUs, quad video card setups, and the 7-core Cell processor.
 

CP5670

Diamond Member
Jun 24, 2004
5,524
602
126
Originally posted by: otispunkmeyer
I have Splinter Cell... I may reinstall it since I'm getting bored, lol. I'll take some video if I do.

Does this game even use EXR HDR on ATI cards? I remember that the 1.4 patch (which came out before the X1800 line) added in a shader-based HDR mode for X800 cards that looked dimmer than the normal HDR, so the game may be using that even on the X1x00 cards.

I wonder if this has anything to do with the X1900 XTX's exceptional minimum framerate compared to the 7900 GTX in this game?

Originally posted by: BFG10K
Yeah, until you actually start playing said games at settings that deviate from the standard benchmark settings and/or start playing games that aren't actively benchmarked.

If you're referring to how well they run old games, it's the other way around from what I have seen. The Catalyst 6.3 drivers seem to have broken something in Freespace 2, a game that I play regularly, and are causing BSODs/reboots in certain situations, even though 6.2 had no problems. This is not the first time this kind of thing has happened; the 4.4 drivers also broke something else a few years ago that was never fixed (although it has become less relevant now). There have never been any such problems with the Nvidia drivers in this game. This recent FS2 issue is really the only reason why I haven't already bought an X1900 XT or XTX.

There was also someone here saying a year ago that System Shock 2 didn't work at all on recent ATI drivers, although I don't know if there is any truth to that. (and Nvidia cards don't work very well in that game either)
 
Finny

Sep 6, 2005
135
0
0
Originally posted by: munky

I just don't like dual setups because for twice the cost it does not give you twice the performance, and there are still issues involved with dual cards, like getting vsync to work correctly. Of course, if someone plays at 1600x1200 resolution or higher, then dual card may be necessary to provide smooth gameplay, but since I'm playing at 1280x960 a single high end card seems like a better choice for me.

And while we're on the whole "dual" topic, I recently switched from a single-core CPU to a dual-core CPU, and so far I've only seen a huge improvement when encoding videos. Other than that, it performs about the same as my single core did in the tasks that I do. Games don't run faster, and even Windoze doesn't seem any more responsive than on a single core. So, a high-end rig doesn't necessarily need a dual-core CPU. But it does seem like technology in general is heading that way, with stuff like quad-core CPUs, quad video card setups, and the 7-core Cell processor.

Well, as far as I can tell, most of the performance gain from dual-core processors is in stuff like multi-tasking & video encoding. That, and Quake 4. I guess it's more "future proofing" oneself than anything.

As for dual-cards, I can see where you're coming from, but this is how I see it:

1. It doesn't have to be double the performance, per se, just enough to warrant the cost. If a pair of 7900GTs ($600) can outperform a single 7900GTX ($500), which they certainly do, then they're plenty worth their keep. (Note- that's at MSRP or whatever it's called- I know the prices actually vary, but this is just an attempt to make a point)

2. As plenty have said, if you're not in a situation where you'd need a pair of cards, there's no reason to go SLI. Again, I'm not belittling you or anything (Hell, I'm lucky if I can get 1024x768 in newer games myself), but that's just how it is, I suppose. At 1280x960, an X1900XT seems perfectly fine.

3. Yeah, there's still issues with SLI; however, I'm sure they'll be ironed out. That, or Crossfire will eventually catch up & give nVidia a rude awakening.

4. Not necessary?! But the e-penis! The E-PENIS! Dual-core FX-60s and Quad-SLI are the viagra of the e-penis!
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Finny
I guess it's more "future proofing" oneself than anything.

But the e-penis! The E-PENIS! Dual-core FX-60s and Quad-SLI are the viagra of the e-penis!

Ya the problem is future proofing doesn't really make sense in the case of dual core processors. If a user can't benefit from dual core today, it makes sense to save $150 instead of getting X2 3800+ for $300. In the future you can then put $150 towards X2 4600+ when it now costs $300 and you decide you need it, or games start to take advantage of it.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: CP5670
Does this game even use EXR HDR on ATI cards?
No, ATI cards run in the apparently ATI-coded "SM2 HDR" mode, which is FX16 (fixed-point, aka integer, 16-bit, to accommodate the X-series, which can't blend FP formats). NV runs in "SM3 HDR" mode, which is FP16 (floating-point 16-bit). I'm not sure why ATI's SM3 parts don't (or can't?) run the NV SM3 mode.

(AFAIK, "EXR" basically refers to FP16, or at least was adopted as FP16 precision by NV et al.)
 

Praxis1452

Platinum Member
Jan 31, 2006
2,197
0
0
Originally posted by: RussianSensation
Originally posted by: Finny
I guess it's more "future proofing" oneself than anything.

But the e-penis! The E-PENIS! Dual-core FX-60s and Quad-SLI are the viagra of the e-penis!

Ya the problem is future proofing doesn't really make sense in the case of dual core processors. If a user can't benefit from dual core today, it makes sense to save $150 instead of getting X2 3800+ for $300. In the future you can then put $150 towards X2 4600+ when it now costs $300 and you decide you need it, or games start to take advantage of it.
Actually, you do benefit from dual cores when multitasking and running several things at once: game on one core and have things running in the background.
 
Oct 4, 2004
10,521
6
81
To summarize the Single-Card Configuration from the Xbitlabs review
(ALL DATA from the 1600x1200 with 4x AA/16x AF graphs; see the note after the list for how the percentage gaps are figured)

1) MAJOR WINS (>10% avg FPS difference)
X1900XTX: Battlefield 2, Elder Scrolls: Oblivion (actually tied with avg. FPS but MUCH higher min. FPS), Far Cry with AA, Half-Life 2, Splinter Cell - Chaos Theory (Radeon has MUCH higher min. FPS)
7900GTX : The Chronicles of Riddick, Far Cry with HDR and NO AA, Half-Life 2: Lost Coast, Pacific Fighters, Warhammer 40K: Dawn of War (Mops the floor with ATi...22% higher avg, and more than DOUBLE the min. FPS), Project Snowblind

2) MODERATE WINS (5-10% avg FPS difference)
X1900XTX: Age of Empires 3 (with noticeably higher min. FPS), F.E.A.R
7900GTX: Doom 3, Serious Sam 2

3) CLOSE WINS (<5% avg FPS difference)
X1900XTX: None
7900GTX: Call of Duty 2

4) TIE: Quake 4
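(On how the percentage gaps are figured: it's just the difference in average FPS relative to the slower card, i.e. (faster - slower) / slower. With purely made-up numbers for illustration, 61 fps vs 50 fps works out to (61 - 50) / 50 = 22%, which would fall in the major-win bucket.)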
 