Anand's 9800XT and FX5950 review, part 2


Genx87

Lifer
Apr 8, 2002
41,091
513
126
oldfart: I think the point Ben is trying to make is "yes, S3TC1 didn't look pretty, but Nvidia was just following the spec as laid out by the maker of the spec." So the question is: why do you rag on Nvidia for supporting the spec to a T?

What if there was something wacky in DX9 that worked like crap, and ATI supported it to a T? Would you be bashing ATI for supporting the spec to a T?

 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Genx87
oldfart: I think the point Ben is trying to make is "yes, S3TC1 didn't look pretty, but Nvidia was just following the spec as laid out by the maker of the spec." So the question is: why do you rag on Nvidia for supporting the spec to a T?

What if there was something wacky in DX9 that worked like crap, and ATI supported it to a T? Would you be bashing ATI for supporting the spec to a T?
That is why I would like to see the actual spec. Does it say 16 bit only, or does it allow for greater? I have not seen the actual spec, have you? We've all seen different interpretations of specs (in this very thread). It is also worth noting that nVidia finally went to 32 bit in GF4 and up cards. I'm not bashing nVidia for that issue. It is the harping on one company for issues and ignoring or justifying issues of another. ATi's rolling lines, for instance. Yup, that is certainly an issue. Where is the mention that nVidia had this problem on GF2 cards? How about the FX 3D flickering issue that nVidia themselves call a "similar issue" to the ATi rolling lines?

Anyway, I shouldn't even have posted this since I said I'm done with it. This is not about texture compression.

The reason I came into this thread was the accusations of Anandtech doctoring the screenshots in the review and the comment that Valve admitted to rigging the HL2 tests. Both of those are false statements.

I would still like to see the actual MS DXTC1 spec if anyone knows where to get it.


 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
They were cheating in 3DMark2K3, I've said it numerous times.
Wonderful, so you've finally admitted that all eight of FutureMark's findings (which included application detection) are cheats. Thus we can finally move on.

Why do you make that leap?
Because they took great steps to hide it.

List some. List the exact title, exactly what the cheat is and what it causes.
We can start off with the link I gave you at Beyond3D and I will respond to your point later in this post.

They did admit it,
Can I please see a link to a press release or otherwise coming straight from nVidia (ie not just a rumour site saying this is what we heard when we put a glass to the wall of nVidia's conference room) that nVidia admitted to cheating?

You honestly think they could come up with a scheme that the community couldn't hack through inside of a few days?
I don't have the information regarding nVidia's driver strength to make such a guess. However, given that they likely started implementing the cheats after the NV30's lackluster debut, it's quite possible they also planned measures to counteract any applications that could detect the cheats.

Besides, I believe the latest patch scripts from RivaTuner have broken it although they don't work on all versions of the drivers.

If they could, the entertainment industry would gladly pay nVidia's driver team millions and millions of dollars to sort out their piracy issue.
What are you talking about? Methods such as 128-bit encryption form the basis of most modern protection schemes and are pretty much unbreakable unless you apply brute force to them. nVidia obviously didn't implement anything that strong, otherwise Unwinder couldn't have broken it, which further suggests a rush job rather than a long term plan.

Not quite. nVidia's DXTC3 was superior to ATi's DXTC1 in terms of image quality.
Yes, it was slightly superior, and it also came with a 10% performance hit. Also, for titles like UT you were simply SOL on nVidia cards: in order to enjoy the richer textures you had to enjoy rainbow coloured artifacts in coronas, skies and everything else that had a hint of transparency.

That was what S3 did, and they created the standard.
And it doesn't make any sense. The angle I'm approaching it from is that there's a performance gain from doing it, but I can't see why that would be the case.

How many games does it impact?
All Quake 3 engined games, UT, etc.

They knew they had an issue, so they implemented a switch in the registry to force the use of S3TC3 for those that wanted to (unfortunately that would not work for UT, as the textures were all precompressed, which was not the case with the other titles that compressed at run time).
Yes, and the switch was added after the benchmarks had been run and the user had installed the card and found abysmal image quality. How many websites applied the switch before benchmarking? Again, it's just another example of nVidia artificially inflating benchmark results through methods that are essentially not available to end-users.

Same with S3's, why aren't you bashing them about it?
Because S3 were long dead when the Radeon/GTS/Voodoo5 benchmarks were in full swing.

That sure as hell wasn't the case at the Basement.
I never said or implied that it was.

Again, what about S3?
Same deal but not a factor as they weren't in the game.

Using that same logic, the R9700Pro has faulty PS2.0 support.
No, because running PS 2.0 on a Radeon 9700 Pro doesn't create an unusable experience, nor does it inflate benchmark performance.

My issue with nVidia is for including an unusable feature and also enabling it by default to inflate benchmark numbers. 99% of nVidia users would have either applied the DXT3 fix or turned off texture compression, and both methods have a negative impact on performance, throwing out the benchmarks they initially used to make a purchase decision.

If I applied your logic then absolutely. I don't apply your logic however.
I still fail to see how a PS 3.0 application wouldn't work on your 9500 Pro but then worked on your Ti4600. Or is this a hypothetical example?

They aren't using app detection for that, they do it for all D3D games.
So why does the unaltered Direct3D app that Dave initially tested use trilinear until he changed the executable to "UT2003.exe"?

As far as using app detection for optimizations, PowerVR does this an incredible amount. Are they cheating in d@mn near every game you have heard of?
Yes, but again it's like S3 and isn't as much of a factor since they're out of the game. However, if they released a Kyro4 which showed comparable performance to today's high end cards, and further investigation showed application detection, shader substitution and the like, then hell yes, I'd be all over it like a rash.

But yes you're quite right, application detection is cheating no matter who does it. You shouldn't be able to change the behaviour of a driver based on the executable name that it's running.
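To be concrete about what I mean, here is roughly the kind of logic I'm objecting to - a minimal sketch of executable-name detection, not actual nVidia or ATi driver code, and the function name is just made up for illustration:

[code]
// Illustrative only: how a Windows-side component *could* key behaviour off
// the executable name. Not vendor code; ForceReducedFiltering is a made-up name.
#include <windows.h>
#include <algorithm>
#include <cctype>
#include <string>

bool ForceReducedFiltering()
{
    char path[MAX_PATH] = {};
    GetModuleFileNameA(NULL, path, MAX_PATH);   // full path of the running process
    std::string exe(path);
    std::transform(exe.begin(), exe.end(), exe.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });

    // If the process happens to be called ut2003.exe, quietly substitute a
    // cheaper filtering mode for the trilinear AF the user asked for.
    return exe.find("ut2003.exe") != std::string::npos;
}
[/code]

Rename the executable and the branch never fires, which is exactly why Dave's renaming test exposed it.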
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
This doesn't surprise me; this has been drowned out by the fanATIcs.
Nothing has been drowned out. This is the fourth time I'm saying that ATi was found to be relying on application detection and promised to remove it, which they did. And FutureMark sure as hell didn't find eight different forms of cheating like they did with nVidia.

Key here for you fanATIcs: 90, 0, 45, 22.5.


You do know that the NV3x core uses adaptive anisotropic filtering as well? Also, ATi's AF has always incurred negligible performance hits, unlike the awful performance hit that nVidia's pre-NV3x AF causes. On my Ti4600 I could cut my performance to just 50% in certain games just by using 2x AF. I guess that's what happens when the hardware suddenly decides to halve its texel fillrate.

It is also worthy to note that nVidia finally went to 32 bit in GF4 and up cards.
Ironically, no, I don't believe so. I'm not sure about the FX cards, but AFAIK the GF4s still use 16 bit textures/interpolation (if you want to believe Ben), though they apply some form of dithering to hide most of the negative image quality impact. Most of it, but the Radeon still has the edge in the side-by-side screenshots I've compared between my 9700 Pro and Ti4600.
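For reference, here's what the format itself pins down: a DXT1 block stores two 16-bit RGB565 endpoint colours plus 2-bit indices, and the intermediate colours are interpolated from those endpoints. Whether that interpolation happens at 16-bit precision or after expanding the endpoints to 8 bits per channel is exactly the quality difference being argued about. A minimal decode sketch of the 8-bit-per-channel approach (my own illustration, not anyone's actual hardware or driver code):

[code]
// Sketch of decoding one 4x4 DXT1 block, expanding the RGB565 endpoints to
// 8 bits per channel *before* interpolating. Interpolating while the colours
// are still packed as 565 is what gets blamed for the banding/dithering.
#include <cstdint>

struct RGB { uint8_t r, g, b; };

static RGB Expand565(uint16_t c)
{
    RGB out;
    out.r = static_cast<uint8_t>(((c >> 11) & 0x1F) * 255 / 31);
    out.g = static_cast<uint8_t>(((c >> 5)  & 0x3F) * 255 / 63);
    out.b = static_cast<uint8_t>(( c        & 0x1F) * 255 / 31);
    return out;
}

// block = 8 bytes: color0 (16 bit), color1 (16 bit), then 32 bits of 2-bit indices.
void DecodeDXT1Block(const uint8_t block[8], RGB out[16])
{
    uint16_t c0 = static_cast<uint16_t>(block[0] | (block[1] << 8));
    uint16_t c1 = static_cast<uint16_t>(block[2] | (block[3] << 8));
    RGB p[4];
    p[0] = Expand565(c0);
    p[1] = Expand565(c1);
    if (c0 > c1) {                     // opaque 4-colour mode: 2/3 and 1/3 blends
        p[2] = { (uint8_t)((2 * p[0].r + p[1].r) / 3),
                 (uint8_t)((2 * p[0].g + p[1].g) / 3),
                 (uint8_t)((2 * p[0].b + p[1].b) / 3) };
        p[3] = { (uint8_t)((p[0].r + 2 * p[1].r) / 3),
                 (uint8_t)((p[0].g + 2 * p[1].g) / 3),
                 (uint8_t)((p[0].b + 2 * p[1].b) / 3) };
    } else {                           // 3-colour + 1-bit transparency mode
        p[2] = { (uint8_t)((p[0].r + p[1].r) / 2),
                 (uint8_t)((p[0].g + p[1].g) / 2),
                 (uint8_t)((p[0].b + p[1].b) / 2) };
        p[3] = { 0, 0, 0 };            // transparent black
    }
    for (int i = 0; i < 16; ++i) {
        uint32_t bits = block[4 + i / 4] >> ((i % 4) * 2);
        out[i] = p[bits & 0x3];
    }
}
[/code]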
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
DPS-

Is my understanding correct that currently ATI cards support DX9 games essentially the way Microsoft intended DirectX to work as a universal standard, and that the latest Nvidia cards require specific drivers from Nvidia which sort of adapt their card to DX9 games in a way that is game specific?

Not quite. ATi's and nVidia's shader architectures are very different. Because of this, different compiler optimizations affect them in different ways. The big issue between ATi's and nVidia's shaders is performance levels. With ATi getting their DX9 part out the door in the timeframe they did, and nVidia being way late with theirs, ATi has spent quite a bit of time as the lead development platform.

BFG

Because they took great steps to hide it.

They have to be doing something worth hiding first.

And it doesn't make any sense.

(About S3) Why doesn't it make sense?

Because S3 were long dead when the Radeon/GTS/Voodoo5 benchmarks were in full swing.

The Savage2000 came out after the original GeForce.

All Quake 3 engined games, UT, etc.

I have Quake3, RTCW, Alice, JKII and JKIII- Quake3 is the only one that had major issues.

Again, it's just another example of nVidia artificially inflating benchmark results through methods that are essentially not available to end-users.

This is how it went down: nVidia and S3 reached a cross licensing agreement due to their court case. When they reached that agreement, nVidia enabled S3TC in their drivers. Because they enabled it, Quake3 used it by default. It wasn't nVidia forcing S3TC on in Quake3; the game searched for support and enabled it on its own with the builds they were using at the time.
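To illustrate "the game searched for support and enabled it on its own": a GL title of that era just checks the driver's extension string and switches compression on when it finds it, something like this (a rough sketch of the general pattern, not id's actual code):

[code]
// Sketch of how a GL-era game detects and enables S3TC, assuming a current
// GL context already exists. General pattern only, not id Software's code.
#include <GL/gl.h>
#include <cstring>

bool EnableS3TCIfSupported()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (ext && std::strstr(ext, "GL_EXT_texture_compression_s3tc")) {
        // From here the game would upload textures with a compressed internal
        // format instead of plain RGBA. The point is that the decision is the
        // game's, triggered purely by the driver exposing the extension.
        return true;
    }
    return false;   // fall back to uncompressed textures
}
[/code]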

99% of nVidia users would have either applied the DXT3 fix or turned off texture compression, and both methods have a negative impact on performance, throwing out the benchmarks they initially used to make a purchase decision.

Carmack is the one who enabled it by default in Quake3, and then disabled it when people disliked what it did.

Or is this a hypothetical example?

Hypothetical.

So why does the unaltered Direct3D app that Dave initially tested use trilinear until he changed the executable to "UT2003.exe"?

They were using app detection, perhaps I should have clarified on that. With the latest driver builds they have changed over from app specific to API level.

Yes, but again it's like S3 and isn't as much of a factor since they're out of the game. However, if they released a Kyro4 which showed comparable performance to today's high end cards, and further investigation showed application detection, shader substitution and the like, then hell yes, I'd be all over it like a rash.

They did it with the Kyro and Kyro2(app detection), I did not bash them then about it, and I'm not going to bash nVidia now over the same thing(nor would I bash ATi). It has nothing to do with the fact that this is nVidia, it has to do with keeping the same position I have had for all graphics card companies for years. I'm not going to change my stance on issues just because the masses are on a different bandwagon.

But yes you're quite right, application detection is cheating no matter who does it.

I don't think it is. With the Kyro2 some games would have serious image corruption without app detection.

Oldfart-

The reason I came into this thread was the accusations of Anandtech doctoring the screenshots

I didn't say they were doctored, I said they were screwed up. Not only have I linked you to quotes from other people saying the same thing, there have also been screenshots posted in this thread from the exact same location that demonstrate exactly what I was talking about.

the comment that Valve admitted to rigging the HL2 tests.

Have you been following the conversation I've been having with Dave? With everything he has stated Valve was doing, it seems to me that that is what was going on. They were trying to show developers that nVidia had poor shader performance unless you gave them specific optimizations, and since one of those was the MS native compiler, there was no need to use it as it wasn't what they were trying to show.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Genx87
ATI implicated in Game Test 4

This doesnt surprise me this has been drowned out by the fanATIcs.
Please don't bait me with incorrect information. ATi did not lower IQ in GT4; they reordered a shader to better suit their hardware. The reordering produced the same output, and I didn't see one screenshot showing degraded IQ. Did you?

If you actually look at those AM3 screens D-L provided, you'll see nV shows multiple instances of AF inferior to ATi (particularly in the distance, but most noticeable in the shot with the green buggy in the foreground), and D-L was remiss not to note them.

So you are saying the reviewers at the link you provided are wrong? Let me get this correct...

And if you really think ATI's AF is better than Nvidia's, oh boy...

Key here for you fanATIcs: 90, 0, 45, 22.5.
More mindless bait. Did you actually look at the screenshots I linked, or are you simply taking their word over mine because they have a website? Here's a tip for you: check the Serious Sam 2 shot, the one without AF--nV just looks a lot worse. And I don't see ATi's angle-specific AF hurting their IQ more than nV in those AQ3 shots--I just see that nV's AF looks almost nonexistent beyond a certain point, and is clearly inferior on the green buggy.

Here's something else to stretch your mind: did you know that apparently nV doesn't do more than 2x AF beyond a certain texture layer? That nV's AF on the FX architecture is NOT like on the GF3/4 series, but is adaptive (albeit in different ways than ATi)? That the 52.14 driver doesn't do true trilinear filtering, according to 3DCenter's analysis?
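Roughly, the difference 3DCenter is describing looks like this: true trilinear blends between the two mip levels across the whole transition, while the reduced mode only blends in a narrow band around the boundary and does plain bilinear everywhere else. A toy sketch of the idea (my own illustration with made-up numbers, not the driver's actual math):

[code]
// Illustration of full trilinear vs. a "reduced trilinear" blend weight.
// lodFrac is the fractional part of the computed LOD (0..1 between two mips).
// The band width is invented purely to show the shape of the behaviour.
float TrilinearWeight(float lodFrac)
{
    return lodFrac;                 // blend across the entire mip transition
}

float ReducedTrilinearWeight(float lodFrac)
{
    const float band = 0.15f;       // hypothetical blend band around the boundary
    if (lodFrac < 0.5f - band) return 0.0f;   // pure bilinear from the nearer mip
    if (lodFrac > 0.5f + band) return 1.0f;   // pure bilinear from the farther mip
    return (lodFrac - (0.5f - band)) / (2.0f * band);  // blend only near the boundary
}
[/code]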

Genx87, please think or research harder before adding to the already high level of noise in these forums. TIA.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Please don't bait me with incorrect information. ATi did not lower IQ in GT4; they reordered a shader to better suit their hardware. The reordering produced the same output, and I didn't see one screenshot showing degraded IQ. Did you?


I show you proof and you refute it? Wonderful... They did indeed lower the output of the game. If they didn't, there would have been no reason for the shader replacement to be removed.

Since you want a little research, here is a review from Beyond3D.com:

ATI AF Quality

NV30 AF

NV35 AF

While the NV30 had obvious issues with the balanced and aggressive modes, the application mode is much better than the highest quality on the 9800 Pro. The same can be said of the 5900 Ultra.


And while we are on the subject of AF quality Tom had an interesting issue with his latest review of AM3.

Toms AM3 Review

I found this quote interesting...

Regarding the low quality of the ATI drivers, we assume that triangles of the terrain are at an unfavorable angle for ATI's adaptive filtering technology.


Just something to stretch your mind with
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
[b]Ben[/b]
I didn't say they were doctored, I said they were screwed up. Not only have I linked you to quotes to other people saying the same thing, there has also been screenshots posted in this thread from the exact same location that demonstrated exactly what I was talking about.
Well, I guess it is up to Anandtech to answer that one then. How did the shots get screwed up to show AA that doesn't work on ATi cards? How were jaggies added in on nVidia shots? The clarity on some shots could be explained by jpeg compression, but that is for Anandtech to answer.

The 3 AQ3 screen shot sizes:
nv45.23.alloff.jpg
450 x 300
72 DPI
Pixel depth/colors 24/16M
File size: 37.1 KB (38,031 bytes)

nv52.14.alloff.jpg
450 x 300
72 DPI
Pixel depth/colors 24/16M
File size: 38.9 KB (39,892 bytes)

cat3.7.alloff.jpg
450 x 300
72 DPI
Pixel depth/colors 24/16M
File size: 38.7 KB (39,690 bytes)

All basically the same. I don't know of a way to measure the level of JPEG compression.
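One rough way to compare it, though: the quantization tables in each file's DQT segment are what actually set how aggressively a JPEG was compressed, so you can at least read those out and compare shots. A quick-and-dirty sketch (mine; the value is only a relative indicator between files, not an exact quality number):

[code]
// Print the first quantization coefficient from a JPEG's DQT segment.
// Bigger values = coarser quantization = heavier compression.
#include <cstdio>
#include <vector>

int main(int argc, char** argv)
{
    if (argc < 2) { std::fprintf(stderr, "usage: %s file.jpg\n", argv[0]); return 1; }
    std::FILE* f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }
    std::vector<unsigned char> buf;
    for (int c; (c = std::fgetc(f)) != EOF; ) buf.push_back(static_cast<unsigned char>(c));
    std::fclose(f);

    // A DQT segment starts with the marker 0xFF 0xDB, then a 2-byte length and
    // a precision/table-id byte, then the 64 quantization values themselves.
    for (size_t i = 0; i + 5 < buf.size(); ++i) {
        if (buf[i] == 0xFF && buf[i + 1] == 0xDB) {
            std::printf("%s: first quant coefficient = %u\n", argv[1], buf[i + 5]);
            return 0;
        }
    }
    std::printf("%s: no DQT segment found\n", argv[1]);
    return 0;
}
[/code]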
With everything he has stated Valve was doing, it seems to me that that is what was going on.
"Seems to me" and Valve admitting to rigging the test are not the same thing. I haven't seen anyone else say that.



 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Genx87, you obviously know next to nothing about the 3DM03 cheating and FM's paper. You did not show me "proof" that ATi was "lowering IQ." FM prohibits shader replacement, period, and that's why they asked ATi to remove it--not because it reduced IQ. Replacing a shader (or shuffling the instruction order, in ATi's case) doesn't necessarily lead to lower quality. And you don't need to link B3D articles for me, I've read them all, and none of them support you WRT my questions OR your assertions.

Regarding AF quality, don't show me articles that are months old. You know nV has been changing their AF quality for the past few driver releases, right? Look into it.

As for the THG link, did you cast your rose-tinted eyes on the 51.75 screenshot? Consider that a summary of the 3DC article.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
How did the shots get screwed up to show AA that doesn't work on ATi cards?

I have given one possible explanation. Use your head on this: how is the Radeon exhibiting AA on the textures? We know it is not something the R9800 can currently do.

How were jaggies added in on nVidia shots?

When did I say anything remotely resembling that? The screenshots are screwed up. Yet again I will point out that THUGSROOK posted his own screenshot showing the exact same frame, and it looks considerably different than Anand's. nVidia's edge AA isn't as good as ATi's and I've never claimed otherwise (though I'm waiting for you to accuse me of saying it, so I can fit into the biggest-nVidiot-on-the-web illusion you have created for yourself).

All basically the same. I don't know of a way to measure the level of JPEG compression.

They were changed. By that I mean the screenshots up now are not the same as the ones that were posted at first. If they were, someone would have mentioned it five days ago when I brought it up.

"Seems to me" and Valve admitting to rigging the test are not the same thing. I haven't seen anyone else say that.

Valve claimed they did everything they could to get nVidia's performance up. MS has a compiler that is faster for the FX than the default. Valve has this compiler and in fact admitted to using it for mixed mode, but did not use it for the pure path. Dave stated Valve was trying to send a message with the public bench, a message about nVidia's shader performance.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Pete, are you telling me ATI has magically modified their hardware to perform full AF at angles other than 90 and 0?

 

Sazar

Member
Oct 1, 2003
62
0
0
without getting into trolling and flamebaiting

genx... ati applies AF at specific angles that are present in most all games in most all environments... perhaps the only cases this may not be relevant may be flight sims due to the degrees aircraft will turn while in the air..

in rts and 3d fps's... the AF is as good as the competition... be it nvidia or matrox... I may have to give the nod to matrox since they do have exceptional IQ but that's another story...

generally you will not notice the IQ for AF being an issue... and there are many cases where ati's AF is better than nvidia's... selective screenshots from selective games will not show this... but taking the general sum of screenshots will...

on the whole... it is conceded that ati's AF is perhaps a step below nvidia's...

on the other hand... AA... there is NO excuse for one applying 4xaa and still seeing jaggies on VERTICAL lines to the extent you will see on practically any nvidia screenshot...

they do a decent job there... but lets face it... AA incurs a larger performance hit overall...

personally I am looking forward to seeing what the new shader replacement algorithms are capable of... though I feel there should be a disclaimer instead of handing out an apples-to-apples certificate of approval initially... it is my hope that after a few revisions the shader output will be good enough to match the IQ intended by game devs instead of the IQ anticipated by driver devs...
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Ben
Yet again I will point out that THUGSROOK posted his own screenshot showing the exact same frame, and it looks considerably different than Anand's

It should:

Anands:
The 3 AQ3 screen shot sizes:
nv45.23.alloff.jpg
450 x 300
72 DPI
Pixel depth/colors 24/16M
File size: 37.1 KB (38,031 bytes)

nv52.14.alloff.jpg
450 x 300
72 DPI
Pixel depth/colors 24/16M
File size: 38.9 KB (39,892 bytes)

cat3.7.alloff.jpg
450 x 300
72 DPI
Pixel depth/colors 24/16M
File size: 38.7 KB (39,690 bytes)

Thugsrook:
Frame04000.jpg
450 x 300
72 DPI
Pixel depth/colors 24/16M
File size: 62.1 KB (63,603 bytes)

Since it is roughly 60-70% larger in file size, it obviously used a lower level of JPEG compression. How is this even a valid comparison? I think the shots that are all the same file size, done by the same person at the same time with the same program and the same level of JPEG compression, are a wee bit more of a valid comparison than the shot that Thugs did.

What I did find quite interesting about Thugs' shot is that it showed a background, and neither the ATi nor the nVidia standard drivers do. I've been reading good things about the Omega drivers on both nVidia and ATi cards, so I gave them a whirl. So far, they seem good. I now also have the background in AQ3, like the shot Thugs posted. Very interesting.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Genx87
Pete are you telling me ATI has magically modified their hardware to perform full AF at angles other than 90 and 0?

Stop trolling. ATi's R3x0 and RV3x0 series now perform full AF at 0, 45, and 90 degrees. They also allow for full trilinear. This is different from the R(V)250's limitation of bilinear-only AF with maximum samples at 0 and 90 degrees.

nV's FX series do NOT have the same flawless AF as their GF4Ti series, and nV has been decreasing the FXs' AF since that line's release. So you can no longer say nV's AF is better than ATi's without qualification. Both have their pros and cons.

Read what I linked before you pull another post from the past from your troll file. I can understand if you were arguing from an informed perspective or if you were willing to learn, but it's clear all you're doing is trolling. You're only painting yourself more the fool with every clueless protest.
 

muzz

Member
May 17, 2003
27
0
0
NV blows, they will fk you fans right in the ass, so I hope ya have a condom and some KY cuz the big rammin' is coming.

They have porked ya for so damn long, that praying for a bs miracle is beyond hope.....

They don't care about you, they never have and they never will.......

So get over it ya fkn clowns.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Ben:
They have to be doing something worth hiding first.
Then why hide something that they're not doing?

Would you make a bunker for hiding drugs if you didn't have any nor ever planned to have any?
Would you make a secret stash to hide illegal weapons if you didn't have any nor ever planned to have any?

(About S3) Why doesn't it make sense?
I think they did it to inflate performance, which is why nVidia chose to follow their example. ATi and 3dfx obviously, and quite rightly, felt image quality was more important than providing an unusable feature on their cards.

The Savage2000 came out after the original GeForce.
Rarely did I see S3 boards in the standard benchmarking bundle. But yes, I agree that S3's method sucked and I've said this before.

I have Quake3, RTCW, Alice, JKII and JKIII- Quake3 is the only one that had major issues.
Because it's turned off by default. However, on the Radeon I can enable it to get better performance with a negligible loss in image quality. Also, do you have Medal of Honor? Enabling texture compression in that game increases image quality because the engine loads higher quality textures. Of course, on pre-NV25 cards you also get nice rainbow banding to go with those large textures.

Because they enabled it, Quake3 used it by default.
Yes and it did the same on Radeon and VSA-100 based boards and had great image quality as well.

Carmack is the one who enabled it by default in Quake3, and then disabled it when people disliked what it did.
If you have a Radeon or Voodoo you can happily enable it again. That's my point - nVidia (and S3) have created an unusable feature. It's nothing to do with Carmack or Quake III (apart from the lightmap issue of course).

They were using app detection, perhaps I should have clarified on that. With the latest driver builds they have changed over from app specific to API level.
That's still cheating. If the user requests trilinear AF in the drivers then the drivers have no right to override those settings.

They did it with the Kyro and Kyro2(app detection),
Application detection is bad because it's a fragile optimisation that can break with later versions of the program or by simply renaming the executable for whatever reason. It also removes all control from the user.

With the Kyro2 some games would have serious image corruption without app detection.
That's a very valid point. On that issue I propose that there should be options in the driver control panel to both let the user know what is happening and allow them to override it.

It doesn't have to be anything complex, just a simple tick box along the lines of "allow drivers to automatically detect applications"; manufacturers can even enable it by default if they like.

That gives control back to the user instead of just having the drivers run off and do whatever they like without telling anyone. It also takes away the blame of cheating from the manufacturer, since the reviewer can untick it if he/she doesn't like what the option does.

Valve claimed they did everything they could to get nVidia's performance up.
And they did, with a mixed mode, hand-optimised, Microsoft compiled rendering path. If that path can't beat ATi's then there's no way the full precision path is going to do so, with or without Microsoft's compiler. Why waste time and resources on something as useless as that? Valve already spent five times as much time on nVidia's rendering path as they did on ATi's. At some point you just have to accept bad hardware when you see it and move on.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Genx87:
And yes ATI was caught with a lower IQ in 3dmark in game test 4.
No, they weren't. ATi rendered exactly what they were asked (i.e. they didn't degrade image quality or features at all) and they did not perform shader substitution either. The shader in question had genuine optimisations (instruction shuffling is a totally legitimate thing to do), but unfortunately it relied on application detection to work, which is the cheat that FutureMark was reporting. Also, ATi promised to remove it and they did so.

Just something to stretch your mind with
And here's something to stretch yours: nVidia's FX series do something similar, and worse when it suits them.

are you telling me ATI has magically modified their hardware to perform full AF at angles other than 90 and 0?
Yes, but it's called hardware engineering, not magic. The R3xx (and RV3xx) boards apply full strength anisotropic filtering at every 45 degree increment in the 360 degree wheel and 2x AF (8 or 16 tap) at every 22.5 degree increment. Between those angles the applied anisotropy level ranges between the aforementioned highest and lowest values.
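To put that description in concrete terms, the behaviour is roughly an angle-dependent cap on the maximum anisotropy, something like the sketch below (my own approximation from published analyses and screenshots, not ATi's actual hardware logic; the falloff between angles is simplified):

[code]
// Rough model of angle-adaptive AF as described above: full strength at
// 45-degree multiples, 2x at the in-between 22.5-degree steps, and a linear
// falloff elsewhere. Approximation for illustration only.
#include <cmath>

int MaxAnisoForAngle(float surfaceAngleDeg, int requestedAniso)
{
    float a = std::fmod(std::fabs(surfaceAngleDeg), 360.0f);
    float step = std::round(a / 22.5f);                 // nearest 22.5-degree step
    bool fullStrengthStep = (static_cast<int>(step) % 2) == 0;   // 0, 45, 90, ...

    int   capAtStep = fullStrengthStep ? requestedAniso : 2;
    float distance  = std::fabs(a - step * 22.5f);       // 0 .. 11.25 degrees away

    // Fall off toward the 2x floor as the surface angle moves away from a
    // favoured angle; never drop below 2x.
    float t   = distance / 11.25f;
    int   cap = static_cast<int>(capAtStep + t * (2 - capAtStep) + 0.5f);
    return cap < 2 ? 2 : cap;
}
[/code]

The key point is just that the worst case never drops below 2x, while the favoured angles, which dominate the walls and floors in most games, get the full requested amount.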

In addition I've observed that the core has other improvements such as reducing the visible signs of mip-map boundaries when performing bilinear AF, thus in many cases making it look equal to trilinear AF without the aid of coloured mip-map boundaries to show otherwise.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Whew, this thread is a nightmare... bilinear AF, AA, FSAA, 16 bit... whatever... over my head... I just read the conclusions. I think many of you could save yourselves all this bickering by doing the same.

Originally posted by: THUGSROOK
let's put it this way~

9800XT is on my Xmas list
:)
 

muzz

Member
May 17, 2003
27
0
0
Originally posted by: Pete
Originally posted by: muzz


Tripe? Tripe my butt... Talk about magical Det's... gimme a break with that crap.

This is totally unnecessary. Please keep this kind of unconstructive garbage to yourself.

Yes sir!! Is there anything else I can do for you?

I guess telling the truth around here gets the gods a lil pizzed huh.........

 

ginfest

Golden Member
Feb 22, 2000
1,927
3
81
Originally posted by: muzz
Originally posted by: Pete
Originally posted by: muzz


Tripe? Tripe my butt... Talk about magical Det's... gimme a break with that crap.

This is totally unnecessary. Please keep this kind of unconstructive garbage to yourself.

Yes sir!! Is there anything else I can do for you?

I guess telling the truth around here gets the gods a lil pizzed huh.........


No, not the "truth-telling", just the trolls.



Mike G
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
"are you telling me ATI has magically modified their hardware to perform full AF at angles other than 90 and 0?"

Within the footprint (which composes over 90% of the POV in any given game) it does, and it does it at 128-tap, while nVidia's AF does 64-tap within a footprint that the algo determines is best for the FOV. I'd rather not have more adaptive AF, thank you.

rogo
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
I'd be careful flinging the "troll" word about, ginfest - your bias shows up very conspicuously in PMs and your system specs.

rogo
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: muzz
NV blows, they will fk you fans right in the ass, so I hope ya have a condom and some KY cuz the big rammin' is coming.

They have porked ya for so damn long, that praying for a bs miracle is beyond hope.....

They don't care about you, they never have and they never will.......

So get over it ya fkn clowns.



The "big ramming"?! Spoken like a man who's yet a virgin....
 