Thanks, but TweakGuides says that only 1, 2, 4, and 8 can be used.
Originally posted by: BFG10K
Toyota, it's a console command.
image_anisotropy 16 will set AF to 16x.
Originally posted by: toyota
Well, I didn't see that. I looked at TweakGuides and they said the valid values for the image_anisotropy setting are 1, 2, 4, and 8. So how the hell can Xbit Labs do 16x AF??
Well, of course common sense would say 16. I can't find anything that actually says it will work. I'm pretty sure 16x AF was around before the TweakGuides Doom 3 guide came out; it's not that new of a setting, is it?
Originally posted by: Sable
Originally posted by: toyota
Well, I didn't see that. I looked at TweakGuides and they said the valid values for the image_anisotropy setting are 1, 2, 4, and 8. So how the hell can Xbit Labs do 16x AF??
I'd guess by using the number 16? I reckon that tweak guide was written before 16x AF was possible on cards. Doom 3 is pretty old now.
Okay, you can believe what TweakGuides says or you can try it yourself. It's your choice. :roll:
Originally posted by: 5150Joker
The results Xbit got are completely different from what most other sites get in games like FEAR, BF2 and CoD 2. Take, for example, ExtremeTech's recent video card roundup. The XFX 7900 GTX XXX is an overclocked card with its clocks at 700 MHz core / 1.8 GHz memory and costs $600, vs. a standard 7900 GTX with 650 MHz / 1.6 GHz clocks that costs $500. They used the XFX 7900 GTX XXX against a stock X1900 XTX:
CoD 2: 1280x1024 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130954,00.gif
X1900 XTX: 49 fps
XFX 7900 GTX XXX: 45 fps
FEAR results: 1280x960 4xAA/8xAF
http://common.ziffdavisinternet.com/util_get_image/13/0,1425,i=130949,00.gif
X1900 XTX: 72 fps
XFX 7900 GTX XXX: 70 fps
So how is it that the X1900 XTX is losing to the stock GTX in Xbit Labs' review with 16x AF applied, when it's beating an overclocked XFX GTX in ExtremeTech's review with 8x AF? Everyone knows nVidia cards take a bigger hit with AF enabled. Which site is to be believed? Also, Xbit failed to use AF in their Oblivion testing:
Xbit's Results: HDR, Pure Speed (min/avg fps)
1280x1024:
7900 GTX: 15/43.3 fps
X1900 XTX: 27/42.2 fps
1600x1200:
7900 GTX: 14/36 fps
X1900 XTX: 22/36.3 fps
Firingsquad's Results: HDR 8x AF:
1280x1024:
7900 GTX: 33.6 fps
X1900 XTX: 37.8 fps
1600x1200:
7900 GTX: 20.3 fps
X1900 XTX: 27.3 fps
Originally posted by: 5150Joker
Originally posted by: Cookie Monster
Link
Lots of visuals.
Edit: after comparing the screenshots, ATi is a lot darker than NV in the screenshots of the F-15C.
Comparing in HL2, why is it that in the ATi shot, compared to the NV one, the branches and grass disappear?
I'm not sure if ATi has "better" IQ as some have claimed, because some of the NV screenshots look more detailed compared to ATi's.
Can people with SLI or Crossfire post some screenshots?
Also, I noted this.
ATi HDR
NV HDR
ATi HDR
NV HDR
What do you guys think?
Hmm, I don't have either of those games installed, but maybe someone else here can check. I think short FRAPS videos would be a lot more telling than screenshots. I can host the vids if you guys decide to make some.
Thanks. I was just trying to clear this up; I have never messed with anything but the nVidia control panel or in-game settings. I uninstalled Doom 3 a few days ago. After you apply the 16x AF like that, does it stay, or do you have to do it every time you start the game? Also, does it actually look any better? I can't even tell the difference between High and Ultra.
Originally posted by: BFG10K
Now for me personally, I've been running 16xAF in Doom 3 since the day it came out.
OK, last question, lol: does it actually look any better? I can't even tell the difference between High and Ultra.
Originally posted by: BFG10K
The setting will stay because it's saved in the config file.
However, keep in mind that if you change the global IQ setting (e.g. High Quality to Ultra) it may change to another value. If it does, just change it back to 16 again.
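For reference (assuming a default install), the value ends up archived in DoomConfig.cfg under the base folder, which is why it survives restarts:

    seta image_anisotropy "16"

You can set it from the console too (Ctrl+Alt+~ by default, if I remember right) by typing image_anisotropy 16, then a vid_restart if it doesn't seem to take effect, or drop the same seta line into an autoexec.cfg in base. If the global quality preset overwrites it, just re-check that line.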
Originally posted by: Barkotron
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003
EDIT: And WTF is this statement about? "If they can make up the 80% minimum frame rate gap at 12x10 without cheating on IQ then the people who wrote them should be given a big ole wodge of cash, and the people who wrote the 84.21s should be taken out back and shot."
This only shows you're predisposed to thinking the only way Nvidia can gain performance in Oblivion is to cheat and reduce IQ. This, my friend, shows the rest of us exactly where you stand. Nothing further, your honor.
More meaningless drivel. It doesn't show anything of the sort. Stop reading things into my comments that aren't there - if you're going to read that into my comments, it shows a lot more about your bias than mine.
If they can make that up, then great. I'm just dubious that anyone, NV or anyone else, can make up such a massive gap without questionable optimisations.
Originally posted by: munky
It's not that Nv has totally taken control of this forum. It's just that some people, including me, got an X1900 XT(X) before Nv even released their G71, and as a single card it's still better than the G71, so I really don't care how many trolls spout pro-Nv crap, since I got a card that's faster and technologically more advanced than what they're raving about. Or maybe it's because a certain AEG shill has been keeping his head low these days, so there's less flaming going on.
Originally posted by: Finny
Well, I guess I'll just reply to several points.
I still say that a site somewhere should test games with both the latest official and the latest beta drivers, to prove whether or not a "performance increase" is really there.
Originally posted by: munky
It's not that Nv has totally taken control of this forum. It's just that some people, including me, got an X1900 XT(X) before Nv even released their G71, and as a single card it's still better than the G71, so I really don't care how many trolls spout pro-Nv crap, since I got a card that's faster and technologically more advanced than what they're raving about. Or maybe it's because a certain AEG shill has been keeping his head low these days, so there's less flaming going on.
Well, we could debate the concept of truly being "better" till land's end, but I won't, for the sake of not wasting my own time. For a single card, I myself would prefer the 7900 GTX, simply because I've had to put up with a loud-as-hell computer for quite a while, and the GTX's cooler is very appealing. Furthermore, the card itself performs well enough, so I wouldn't feel like I wasted money... Of course, as I've said before, right now I still consider the X1900 XT the better bang for the buck, simply because it costs much less (and it's generally in stock much more frequently, regardless of the reason).
Speaking of computer noise, I also agree with some others on the 7900 GT's cooling design. I mean, c'mon, WTF? It's more powerful than the 7800 GTX (non-512), so it should at least have the same cooling design that card did. That cooler was notable for being very quiet as well, and for a high-end card like the 7900 GT, it's only fitting to have a cooler that doesn't make someone think "puny".
Finally, I'd like to make one last point: why do you people keep discounting SLI, or any dual-card solution for that matter, as a viable high-end setup? The way I see it, you're basically saying that a dual-card solution can't count as more powerful than a single card because each card in the pair is individually weaker than the more powerful single card. It's like saying dual-core processors aren't worth it because you can buy a more powerful single-core processor, despite the advantages of dual-core.
The way I see it, things are going dual (or just multi) now. If one can have a dual-core processor and count oneself as truly "high-end", why can't one do the same with a pair of video cards? A pair of 7900 GTs, for example, outperforms an XTX and a GTX in pretty much all situations (*waits for the hunt for the game that disproves this*), so why shouldn't the owner have the privilege of saying he's near the very top of the charts in performance?
Note- I'm not trying to belittle the power of your setup, if I ever came across like that. Quite the opposite, really, but I'm just trying to make a point here.
Originally posted by: otispunkmeyer
I have Splinter Cell... I may reinstall it since I'm getting bored, lol. I'll take some video if I do.
Originally posted by: BFG10K
Yeah, until you actually start playing said games at settings that deviate from the standard benchmark settings and/or start playing games that aren't actively benchmarked.
Originally posted by: munky
I just don't like dual setups because for twice the cost you don't get twice the performance, and there are still issues involved with dual cards, like getting vsync to work correctly. Of course, if someone plays at 1600x1200 resolution or higher, then dual cards may be necessary for smooth gameplay, but since I'm playing at 1280x960 a single high-end card seems like a better choice for me.
And while we're on the whole "dual" topic, I recently switched from a single-core CPU to a dual-core CPU, and so far I've only seen a huge improvement when encoding videos. Other than that, it performs about the same as my single core did in the tasks that I do. Games don't run faster, and even Windoze doesn't seem any more responsive than on a single core. So a high-end rig doesn't necessarily need a dual-core CPU. But it does seem like technology in general is heading that way, with stuff like quad-core CPUs, quad video card setups, and the 7-core Cell processor.
Originally posted by: Finny
I guess it's more "future proofing" oneself than anything.
But the e-penis! The E-PENIS! Dual-core FX-60s and Quad-SLI are the viagra of the e-penis!
No, ATI cards run in the apparently ATI-coded "SM2 HDR" mode, which is FX16 (fixed-point, a.k.a. integer, 16-bit, to accommodate the X-series, which can't blend FP formats). NV runs in the "SM3 HDR" mode, which is FP16 (floating-point, 16-bit). I'm not sure why ATI's SM3 parts don't (or can't?) run the NV SM3 mode.
Originally posted by: CP5670
Does this game even use EXR HDR on ATI cards?
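To put the "can't blend FP formats" bit in concrete terms: under D3D9 a game typically probes at startup whether it can render to and blend into an FP16 surface, and picks its HDR path from that. This is just a generic sketch of that check (standard D3D9 calls, not any particular game's actual code):

    #include <windows.h>
    #include <d3d9.h>

    // Rough sketch: can this adapter render to and alpha-blend into an FP16
    // (D3DFMT_A16B16G16R16F) surface? That's the usual gate for an FP16 "SM3 HDR" path.
    bool SupportsFp16Hdr(IDirect3D9 *d3d)
    {
        const D3DFORMAT fp16 = D3DFMT_A16B16G16R16F;

        // FP16 texture usable as a render target?
        HRESULT rt = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET,
                                            D3DRTYPE_TEXTURE, fp16);

        // Post-pixel-shader blending into that FP16 target? This is the query the
        // X800-class parts fail, which pushes them onto the integer (FX16) path.
        HRESULT blend = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                               D3DFMT_X8R8G8B8,
                                               D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
                                               D3DRTYPE_TEXTURE, fp16);

        return SUCCEEDED(rt) && SUCCEEDED(blend);
    }

As for why ATI's SM3 parts still get the SM2 path, one guess is that the game also requires FP16 texture filtering (the same call with D3DUSAGE_QUERY_FILTER), which the X1000 series doesn't advertise even though it can blend FP16. That's speculation on my part, though.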
Actually, you do benefit from dual cores when multitasking and running several things at one time: game on one core and have things running in the background.
Originally posted by: RussianSensation
Originally posted by: Finny
I guess it's more "future proofing" oneself than anything.
But the e-penis! The E-PENIS! Dual-core FX-60s and Quad-SLI are the viagra of the e-penis!
Yeah, the problem is that future proofing doesn't really make sense in the case of dual-core processors. If a user can't benefit from dual core today, it makes sense to save $150 instead of getting an X2 3800+ for $300. In the future you can then put that $150 towards an X2 4600+ when it costs $300 and you decide you need it, or when games start to take advantage of it.