EVGA GeForce GTX 295+ Review

Page 2

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SlowSpyder
If these are the new prices, then you're right; the price is close enough that the GTX295 could well be the better buy. What I meant by 'on average' is that whenever I looked in the past, the gap was almost always very near $100. I'm not sure if these are the new prices or just some hot deals that we won't see again. Like I mentioned earlier, there was just a deal, now dead, where the 4870x2 was $345.
No, it's not the new retail price; the quoted price on that GTX 295 is a sale price from a national retailer, advertised to run for at least a month. But it makes no sense to factor in the best sale price for one part without factoring in the best sale price for the other. You either compare the best prices for both, or you compare the retail prices of both. What you can't do is say "GTX 295 is $500 and 4870X2 is $449 with $50 MIR, so the 4870X2 is $100 cheaper" when people have linked a $440 GTX 295 from a national retailer.
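To make that apples-to-apples point concrete, here is a minimal sketch using the figures quoted in this thread; the comparison itself is purely illustrative.

```python
# A minimal sketch of the pricing argument above, using figures quoted in
# this thread ($500/$440 GTX 295, $449 4870X2 with a $50 MIR).
# "Out of pocket" ignores mail-in rebates; "effective" assumes the rebate
# actually arrives.
cards = [
    ("GTX 295 (retail)", 500, 0),
    ("GTX 295 (sale)",   440, 0),
    ("4870X2 (w/ MIR)",  449, 50),
]

for name, price, rebate in cards:
    print(f"{name}: ${price} out of pocket, ${price - rebate} effective")
```

Retail vs. retail is a $51 out-of-pocket gap ($101 after the rebate); best price vs. best price makes the $440 GTX 295 the cheaper out-of-pocket card.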

Originally posted by: Elfear
Interesting article. They seem to come to a different conclusion than a lot of other review sites, but I like that they test the 4870X3 along with the other cards.
There are myriad reviews done with the latest drivers from both Nvidia and ATI (181.20 WHQL and 8.12 Hot Fix) comparing the GTX 295 and 4870X2, clearly showing the GTX 295 is the faster part, even in the majority of 8xAA or 2560+AA benches. Not all of the original launch GTX 295 reviews have the latest ATI hot fix, but almost all of the GTX 285 reviews use the latest drivers and include both the GTX 295 and 4870X2.

FiringSquad
TechReport
PCGamesHardware
AnandTech

Certainly a bit surprising how Nvidia seems to have caught up in high-bandwidth/VRAM situations since the initial set of previews done with beta drivers. It really comes down to whether or not the additional $50 or so is worth it.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Elfear

I think Physx is a cool concept but it needs more implementation in games before it becomes "significantly more important" and even then I'm dubious. AA can be applied in almost any game.

AA really adds nothing to gameplay. It is truly pure eye candy. I like eye candy, but not over gameplay. For example, 3DMark has plenty of eye candy.


Originally posted by: SlowSpyder
Like I mentioned earlier, there was just a deal, now dead, where the 4870x2 was $345.

Which is of course irrelevant.

 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
There may be a new king when it comes to single graphics card performance, but it remains the same in IQ: ATi still owns that segment. Case in point: the Far Cry 2 pics. If you look to the top left (I circled it) you can see the ATi card renders the indentation correctly while the NVIDIA card makes it look like a flat surface.

NVIDIA GTX295

ATi 4870x2

For the record, I didn't read the review, only looked at the benches and IQ comparisons, so if it's mentioned in the article, forgive me. The only reason I can think of for NVIDIA not rendering this correctly is boosting framerates. What other reason could there be?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: PC Surgeon
There may be a new king when it comes to single graphics card performance, but it remains the same in IQ: ATi still owns that segment. Case in point: the Far Cry 2 pics. If you look to the top left (I circled it) you can see the ATi card renders the indentation correctly while the NVIDIA card makes it look like a flat surface.

NVIDIA GTX295

ATi 4870x2

For the record, I didn't read the review, only looked at the benches and IQ comparisons, so if it's mentioned in the article, forgive me. The only reason I can think of for NVIDIA not rendering this correctly is boosting framerates. What other reason could there be?

It's not a good comparison.

The angle of the screenshots is different, so the shadowing could be too.

Also, the clouds in the sky are totally different, so the sun could be at a different angle, or obscured.

Last, you could also say the ATi screenshot isn't accurately showing the reflection of the glare off the roof while the NVIDIA card does.

When the screenshots are this sloppy, no comparison is possible. Whoever took them should have saved the game at that point, loaded, snapped; then changed cards, loaded, snapped.

This is readily apparent if you load both pictures and toggle back and forth.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: PC Surgeon
There may be a new king when it comes to single graphics card performance, but it remains the same in IQ: ATi still owns that segment. Case in point: the Far Cry 2 pics. If you look to the top left (I circled it) you can see the ATi card renders the indentation correctly while the NVIDIA card makes it look like a flat surface.

NVIDIA GTX295

ATi 4870x2

For the record, I didn't read the review, only looked at the benches and IQ comparisons, so if it's mentioned in the article, forgive me. The only reason I can think of for NVIDIA not rendering this correctly is boosting framerates. What other reason could there be?
Interesting you'd come to this conclusion, given most review sites have given the nod to Nvidia for IQ since the 8800 launch, particularly when it comes to AF. Reviewers also comment on greater texture shimmering due to under-sampled AF on ATI parts, and that's before you get into any rendering errors as seen in games like Assassin's Creed, UT3 and most recently FC2. Some sites have guessed it's due to ATI's over-aggressive Z-cull algorithms, resulting in some of the errors seen in the games mentioned above.

Here's a pretty good recent comparison between Nvidia and ATI IQ; they also note the FC2 differences, with heavier shadowing in ATI's 8.12 driver:

PCGH Nvidia vs. ATI Image Quality Comparison with mouse-over pictures

Author Marc Sauter:
But in a direct comparison between a current GeForce and a current Radeon, differences cannot be dismissed. While Call of Duty: World at War, Fallout 3, Need for Speed: Undercover, Left 4 Dead and Race Driver GRID look almost identical, other titles reveal small differences. In Assassin's Creed, Far Cry 2 and especially Crysis Warhead the textures are indeed sharper on the Radeon, but they also flicker more than on a GeForce when the player is moving. Furthermore, it seems the Radeon has rendered more shadows in Far Cry 2 since the 8.12 driver.

So which vendor offers the better visual quality in current games? AMD tends toward sharper but flickering textures, while Nvidia's textures are more settled. But since a GeForce has the option to force superior HQ anisotropic filtering, its AF is better.


Personal opinion of PCGH editor Raffael:
With the drivers' default settings, the following applies to both ATI and Nvidia: sampling rules that exist for good reason are treated as unimportant and ignored. The algorithms under-filter, i.e. they cut samples/calculations. Under-filtering means flickering. This may look great in screenshots (see Crysis on the Radeon), but it flickers heavily in motion.



 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
It would have been nice for them to try SLI AA as well. IMHO, I prefer nVIDIA's AA flexibility: you have CSAA, hybrid AA (xS), SLI AA (if you own a dual-GPU setup), SSAA (OGSS only) and so forth.

Anyway, I still think the GTX295 isn't a clear winner. It runs out of memory at 2560x1600 with AA applied in most modern titles. Bandwidth isn't the problem, but the framebuffer size is.
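As a rough back-of-envelope sketch of why the framebuffer fills up: the 4-byte color/depth sizes below are standard assumptions, not measured numbers, and real usage adds textures, geometry and driver overhead on top (the GTX 295 exposes 896 MB per GPU).

```python
# Back-of-envelope render-target memory at a given resolution and MSAA
# level. A rough sketch only; real usage adds textures, geometry and
# driver overhead on top of this.
def render_target_mb(width, height, msaa_samples, color_bytes=4, depth_bytes=4):
    pixels = width * height
    # MSAA stores color and depth per sample, plus a resolved color buffer.
    sampled = pixels * msaa_samples * (color_bytes + depth_bytes)
    resolve = pixels * color_bytes
    return (sampled + resolve) / (1024 ** 2)

for samples in (1, 4, 8):
    print(f"2560x1600 @ {samples}xAA: ~{render_target_mb(2560, 1600, samples):.0f} MB")
```

At 8xAA that is already roughly 266 MB of render targets before a single texture is loaded, which is a big bite out of 896 MB.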
 

dadach

Senior member
Nov 27, 2005
204
0
76
Originally posted by: Wreckage
Originally posted by: nitromullet
Originally posted by: Elfear
Like Slowspyder mentioned, on average the price difference has been ~$100 and with Darkrage's link the difference is actually $120.

The largest price difference on Newegg is $100: HIS 4870 X2 for $409.99 AMIR and Asus GTX 295 for $509.99. On average the price difference is not even $100, and at $439.99 the GTX 295 Rollo linked to is the cheaper out-of-pocket card of the two.

Yeah I think their cost argument is irrelevant at this point.

The AA argument has also been done to death. Most people don't notice a difference beyond 4X, or they don't care. It is funny when some people say that PhysX is unimportant eye candy but will argue that one level of AA or another is a game-changing, life-altering event. Ha!


In how many games can one use 8xAA? And in how many games can one use PhysX? Once more PhysX games come out it will become as important, but currently it is really a non-factor.

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: dadach


In how many games can one use 8xAA? And in how many games can one use PhysX? Once more PhysX games come out it will become as important, but currently it is really a non-factor.

You missed the point entirely. :roll:
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Cookie Monster
It would have been nice for them to try SLI AA as well. IMHO, I prefer nVIDIA's AA flexibility: you have CSAA, hybrid AA (xS), SLI AA (if you own a dual-GPU setup), SSAA (OGSS only) and so forth.

Anyway, I still think the GTX295 isn't a clear winner. It runs out of memory at 2560x1600 with AA applied in most modern titles. Bandwidth isn't the problem, but the framebuffer size is.

It can run out at 1920 w/AA in Crysis.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: nRollo
Originally posted by: PC Surgeon
There may be a new king when it comes to single graphics card performance, but it remains the same in IQ: ATi still owns that segment. Case in point: the Far Cry 2 pics. If you look to the top left (I circled it) you can see the ATi card renders the indentation correctly while the NVIDIA card makes it look like a flat surface.

NVIDIA GTX295

ATi 4870x2

For the record, I didn't read the review, only looked at the benches and IQ comparisons, so if it's mentioned in the article, forgive me. The only reason I can think of for NVIDIA not rendering this correctly is boosting framerates. What other reason could there be?

It's not a good comparison.

The angle of the screenshots is different, so the shadowing could be too.

Also, the clouds in the sky are totally different, so the sun could be at a different angle, or obscured.

Last, you could also say the ATi screenshot isn't accurately showing the reflection of the glare off the roof while the NVIDIA card does.

When the screenshots are this sloppy, no comparison is possible. Whoever took them should have saved the game at that point, loaded, snapped; then changed cards, loaded, snapped.

This is readily apparent if you load both pictures and toggle back and forth.

If you look at the screenshots, the ATi screen capture is a bit further away and yet it still renders the indentation. The NVIDIA card, on the other hand, shows the indentation up close but ceases to render it a few steps back. A friend of mine took these shots with an EVGA Superclocked GTX 260 (216) at 1680x1050 with all settings maxed.

Close

A little further away

Further still, you can see the depth fading

Farthest shot, almost non-existent indentation
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: jaredpace
Originally posted by: Cookie Monster
It would have been nice for them to try SLI AA as well. IMHO, I prefer nVIDIA's AA flexibility: you have CSAA, hybrid AA (xS), SLI AA (if you own a dual-GPU setup), SSAA (OGSS only) and so forth.

Anyway, I still think the GTX295 isn't a clear winner. It runs out of memory at 2560x1600 with AA applied in most modern titles. Bandwidth isn't the problem, but the framebuffer size is.

It can run out at 1920 w/AA in Crysis.

Oh yeah, well, Crysis is... well, Crysis.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: chizow
Originally posted by: PC Surgeon
There may be a new king when it comes to single graphics card performance, but it remains the same in IQ: ATi still owns that segment. Case in point: the Far Cry 2 pics. If you look to the top left (I circled it) you can see the ATi card renders the indentation correctly while the NVIDIA card makes it look like a flat surface.

NVIDIA GTX295

ATi 4870x2

For the record, I didn't read the review, only looked at the benches and IQ comparisons, so if it's mentioned in the article, forgive me. The only reason I can think of for NVIDIA not rendering this correctly is boosting framerates. What other reason could there be?
Interesting you'd come to this conclusion, given most review sites have given the nod to Nvidia for IQ since the 8800 launch, particularly when it comes to AF. Reviewers also comment on greater texture shimmering due to under-sampled AF on ATI parts, and that's before you get into any rendering errors as seen in games like Assassin's Creed, UT3 and most recently FC2. Some sites have guessed it's due to ATI's over-aggressive Z-cull algorithms, resulting in some of the errors seen in the games mentioned above.

Here's a pretty good recent comparison between Nvidia and ATI IQ; they also note the FC2 differences, with heavier shadowing in ATI's 8.12 driver:

PCGH Nvidia vs. ATI Image Quality Comparison with mouse-over pictures

Author Marc Sauter:
But in a direct comparison between a current GeForce and a current Radeon, differences cannot be dismissed. While Call of Duty: World at War, Fallout 3, Need for Speed: Undercover, Left 4 Dead and Race Driver GRID look almost identical, other titles reveal small differences. In Assassin's Creed, Far Cry 2 and especially Crysis Warhead the textures are indeed sharper on the Radeon, but they also flicker more than on a GeForce when the player is moving. Furthermore, it seems the Radeon has rendered more shadows in Far Cry 2 since the 8.12 driver.

So which vendor offers the better visual quality in current games? AMD tends toward sharper but flickering textures, while Nvidia's textures are more settled. But since a GeForce has the option to force superior HQ anisotropic filtering, its AF is better.


Personal opinion of PCGH editor Raffael:
With the drivers' default settings, the following applies to both ATI and Nvidia: sampling rules that exist for good reason are treated as unimportant and ignored. The algorithms under-filter, i.e. they cut samples/calculations. Under-filtering means flickering. This may look great in screenshots (see Crysis on the Radeon), but it flickers heavily in motion.

Review sites don't speak for my own eyes. If I see a difference in the IQ, it doesn't matter if the reviewer said God himself was in the game; seeing is believing.
 

Elfear

Diamond Member
May 30, 2004
7,126
738
126
Originally posted by: chizow

Originally posted by: Elfear
Interesting article. They seem to come to a different conclusion than a lot of other review sites, but I like that they test the 4870X3 along with the other cards.
There are myriad reviews done with the latest drivers from both Nvidia and ATI (181.20 WHQL and 8.12 Hot Fix) comparing the GTX 295 and 4870X2, clearly showing the GTX 295 is the faster part, even in the majority of 8xAA or 2560+AA benches. Not all of the original launch GTX 295 reviews have the latest ATI hot fix, but almost all of the GTX 285 reviews use the latest drivers and include both the GTX 295 and 4870X2.

FiringSquad
TechReport
PCGamesHardware
AnandTech

Certainly a bit surprising how Nvidia seems to have caught up in high-bandwidth/VRAM situations since the initial set of previews done with beta drivers. It really comes down to whether or not the additional $50 or so is worth it.

Just out of curiosity, I averaged out the percentage difference between the X2 and the 295 at 1920x1200 and 2560x1600, using all tested quality levels from the reviews you linked to. I only considered those two resolutions because these cards are made for high-res gaming and are largely a waste at anything lower. I also threw in Hardware France's review because A) I like their review style and B) they used the WHQL 181.20 and 8.12 + Hotfix drivers.

Here's what I came up with, expressed as the percentage by which the GTX 295 leads the 4870X2 (negative means the 4870X2 was faster):

Review Site            1920x1200    2560x1600
Anandtech                  2.10%        3.40%
FiringSquad                6.90%       -2.70%
Tech Report               13.40%       44.70%
PC Games Hardware          4.96%          n/a
Hardware France            3.21%      -64.71%

*It should be noted that Tech Report tested far fewer games and resolutions than the others.

Even the review sites you linked to don't come to the same conclusion as Xbit Labs, which was "[At 1920x1200] the new card has an average advantage of 20% over the Radeon HD 4870 X2" and "The new card has an average advantage of only 8% over the Radeon HD 4870 X2 at this resolution (2560x1600)."

Part of that could be the games they tested, but the other review sites covered a wide variety of games and didn't show those results. Another factor was the GTX 295 being overclocked in the Xbit review, but that shouldn't make such a big difference considering the paltry increases of 4.3%/3.1%/2.6% on the core/shader/memory respectively.

Now there is no doubt that overall the GTX 295 is the faster part, but the disparity is actually pretty small and not nearly as large as Xbit makes it out to be, at least compared to other review sites.
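For anyone who wants to redo the math, here's a minimal sketch of that averaging; the FPS pairs below are placeholders, not the reviews' actual numbers.

```python
# A sketch of the averaging above: percent advantage of the GTX 295 over
# the 4870 X2 per benchmark, averaged per resolution. The FPS pairs are
# placeholders; substitute each review's actual numbers.
def pct_advantage(fps_295, fps_x2):
    return (fps_295 / fps_x2 - 1.0) * 100.0

results = {  # {resolution: [(gtx295_fps, 4870x2_fps), ...]}
    "1920x1200": [(60.0, 58.0), (45.0, 44.0), (90.0, 85.0)],
    "2560x1600": [(38.0, 37.0), (25.0, 26.0)],
}

for res, pairs in results.items():
    diffs = [pct_advantage(a, b) for a, b in pairs]
    print(f"{res}: {sum(diffs) / len(diffs):+.2f}% average across {len(diffs)} tests")
```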
 

Sliceup

Junior Member
Jan 21, 2009
10
0
0
Yeah, I took those screens on my machine, and yeah, I am running an Nvidia card. But the ATi clearly wins. And yeah, I agree with PC Surgeon: seeing is believing.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: Wreckage


Originally posted by: SlowSpyder
Like I mentioned earlier, there was just a deal, now dead, where the 4870x2 was $345.

Which is of course irrelevant.

How so? It shows that someone in the market for either card can find great deals, and the 4870x2 is close to $100 cheaper when you look at the best deals.

It's funny that you make statements like using AA over 4x isn't important. In other threads you claim that if someone doesn't care for PhysX they might as well stick to integrated graphics, as they don't care about what high-end PC gaming is all about. Yet you decide you can set the level of AA that is important to people across the board, so long as it supports your pro-Nvidia views. Maybe people buying this level of card want more than 4xAA.

And again, tell me what is so much more immersive about PhysX in this comparison?

http://www.youtube.com/watch?v=w0xRJt8rcmY

The GTX295 is a great card, no doubt about it, but I just don't get how anyone can make a blanket statement about how much AA someone needs or which features are more important than others.
 

SirPaulie

Member
Jan 23, 2009
36
0
0
Interesting comparisons, but what may be needed are identical shots from both IHVs, taking steps back as well, to gauge this properly for both.







 

SirPaulie

Member
Jan 23, 2009
36
0
0
Originally posted by: SlowSpyder

The GTX295 is a great card, no doubt about it, but I just don't get how anyone can make a blanket statement about how much AA someone needs or which features are more important than others.


I'm with ya -- I have trouble understanding the blanket-view mind-set. When I read blanket views, it's like anti-choice views: "My view is the only view, it's all that matters, and the world revolves around my thinking and my tastes." Sounds silly, doesn't it? But that's what a blanket view sounds like to me at times. Choice is my bag, because our tastes, wallet sizes and mind-sets differ so much; having choices allows many mind-sets to have fun and enjoy. Some may like no AA, others PhysX, others Stereo3D, others 8xAA -- budget, mainstream, performance, enthusiast, higher-end flagship enthusiasts -- it's all good for the many.


 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: PC Surgeon
Review sites don't speak for my own eyes. If I see a difference in the IQ, it doesn't matter if the reviewer said God himself was in the game; seeing is believing.
So wait, we're supposed to ignore what review sites are clearly showing and saying, based on hardware/software limitations that we know to be true, and instead believe what your eyes see? We wouldn't need God himself to show us ATI still uses inferior angle-dependent AF and drivers that lack an LOD bias clamp; it's blatantly obvious:

These comparison shots were actually done by Lopri here on AT, and clearly show the lower IQ of ATI's angle-dependent AF. Look at either of the first pillars on the right or left, then zoom in, then click between the two. There are plenty of other sites showing this as well, if your eyes don't believe it...

Nvidia AF
ATI AF

Texture shimmering can't be shown in a screenshot, but it's going to happen any time a game attempts to use a negative LOD bias to sharpen textures with AF enabled. Nvidia allows the LOD bias to be clamped to 0 so that a game can't set a negative LOD bias with AF enabled, which significantly reduces texture shimmering on Nvidia cards.
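As a rough sketch of the mip-selection math behind that claim (the textbook lambda = log2(footprint) + bias formula, not actual driver code; the "Clamp" setting is modeled here as simply forcing the bias to be non-negative):

```python
# Mip selection is roughly lambda = log2(texel footprint) + bias. A
# negative bias picks a finer mip than the pixel footprint warrants, so
# texels get skipped as the camera moves (shimmering). The driver-side
# "Clamp" is modeled as forcing bias >= 0.
import math

def mip_level(texel_footprint, lod_bias, clamp=False):
    if clamp:
        lod_bias = max(0.0, lod_bias)
    return max(0.0, math.log2(texel_footprint) + lod_bias)

footprint = 8.0  # a pixel covering ~8x8 texels; mip 3 is the "right" level
for bias in (0.0, -1.0, -2.0):
    print(f"bias {bias:+.1f}: mip {mip_level(footprint, bias):.1f} unclamped, "
          f"{mip_level(footprint, bias, clamp=True):.1f} clamped")
```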
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Elfear
Now there is no doubt that overall the GTX 295 is the faster part, but the disparity is actually pretty small and not nearly as large as Xbit makes it out to be, at least compared to other review sites.
Actually, I was just showing the 295 was generally the faster part even at the resolutions and settings the previews showed it losing badly at: 8xAA, and 2560 and/or AA. I'd agree it's not much faster than the 4870X2 at 1920/2560 with AA, but those reviews do show it's faster where many of the previews did not.
 

SirPaulie

Member
Jan 23, 2009
36
0
0
Static shots have their place in gauging IQ -- and are very welcome -- as many dissect screenshots. It's a good thing.

Another good thing, and a very important one, is to dissect a moving screen because, and I might be stretching things here a bit, the last time I played a game it wasn't static but dynamic in nature.

The point is that both static and dynamic are very important to help gauge IQ for the end-user.

Some may disagree and that's what makes forums so much fun.

 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Here is a really good image comparison.

If you look at Crysis Warhead, look towards the tracks on the ground: the AMD card shows a lot more detail than Nvidia, and the 'detail' picture of the ground makes it especially clear.

I'm sure there are other differences, but Crysis Warhead stands out the most in my opinion.

 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: chizow
Originally posted by: PC Surgeon
Review sites don't speak for my own eyes. If I see a difference in the IQ, it doesn't matter if the reviewer said God himself was in the game; seeing is believing.
So wait, we're supposed to ignore what review sites are clearly showing and saying, based on hardware/software limitations that we know to be true, and instead believe what your eyes see? We wouldn't need God himself to show us ATI still uses inferior angle-dependent AF and drivers that lack an LOD bias clamp; it's blatantly obvious:

These comparison shots were actually done by Lopri here on AT, and clearly show the lower IQ of ATI's angle-dependent AF. Look at either of the first pillars on the right or left, then zoom in, then click between the two. There are plenty of other sites showing this as well, if your eyes don't believe it...

Nvidia AF
ATI AF

First of all, let's get one thing straight before you make another false claim. I never said or implied that my view was the only one to consider. You did that, and I'll wait for you to retract such an implication. What you didn't comprehend from my previous statement was that reviewers' opinions matter little to me; it is the images that I'm shown, or see with my own eyes, that count.

Regarding the pics you posted, I see a few things both ways. NVIDIA seems to render the pillars and the floors better, but looking dead center at the top (the center wooden piece of the multi-glass roof), the ATi is clearer.

EDIT: For the record, I've never played GoW, so that discrepancy in rendering may be the sun shining at an angle, distorting the image in the NVIDIA screenshot.

Texture shimmering can't be shown in a screenshot, but it's going to happen any time a game attempts to use a negative LOD bias to sharpen textures with AF enabled. Nvidia allows the LOD bias to be clamped to 0 so that a game can't set a negative LOD bias with AF enabled, which significantly reduces texture shimmering on Nvidia cards.

Looking at the screenshots posted from Xbit and the added pics from Sliceup shows it wasn't a random error by some ATi-loving fool, but rather the consistent reality of NVIDIA's rendering in FC2.

 

SirPaulie

Member
Jan 23, 2009
36
0
0
Originally posted by: SlowSpyder
Here is a really good image comparison.

If you look at Crysis Warhead, look towards the tracks on the ground: the AMD card shows a lot more detail than Nvidia, and the 'detail' picture of the ground makes it especially clear.

I'm sure there are other differences, but Crysis Warhead stands out the most in my opinion.

There is a difference, but the prudent thing to do now is to investigate this in motion, to see if both offer the same quality -- shimmering, texture aliasing attributes, etc.

To me, one needs to investigate both.





 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: PC Surgeon
First of all, let's get one thing straight before you make another false claim. I never said or implied that my view was the only one to consider. You did that, and I'll wait for you to retract such an implication. What you didn't comprehend from my previous statement was that reviewers' opinions matter little to me; it is the images that I'm shown, or see with my own eyes, that count.
No, your first erroneous comment was:

There may be a new king when it comes to single graphics card performance but it remains the same in IQ. ATi still owns that segment. Case in point is the Far Cry 2 pics.

My reply to that was that the general consensus from reviewers shows Nvidia has had superior IQ since G80, and I linked to a specific review showing and explaining exactly why they had come to that conclusion.

You then went on to say it didn't matter what reviewers were saying, as that's not what your eyes are seeing? There is no need for any retraction on my part if you're going to claim two years of reviewer experience are secondary to your first-hand experience. If anyone needs to make a retraction it's you, for claiming ATI has had the IQ crown for the last two years when, based on the general consensus from reviews, they have not.

Regarding the pics you posted, I see a few things both ways. NVIDIA seems to render the pillars and the floors better, but looking dead center at the top (the center wooden piece of the multi-glass roof), the ATi is clearer.
And again, these are limitations of angle-dependent AF and the lack of an LOD bias clamp on ATI parts. It's really simple: if Nvidia owners want to sharpen textures at the risk of texture shimmering, they just remove the LOD clamp.

Looking at the screenshots posted from Xbit and the added pics from Sliceup shows it wasn't a random error by some ATi-loving fool, but rather the consistent reality of NVIDIA's rendering in FC2.
Actually, it just looks as if ATI has higher shadow contrast and a longer viewing distance for shadows; as the GTX screenshots show, the shadowing is there but disappears at a closer distance than on the Radeon. If you want to see some real rendering errors in FC2, just Google sunken tires or half-rocks with 8.10 WHQL.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SlowSpyder
Here is a really good image comparison.

If you look at Crysis Warhead, look towards the tracks at the ground. The AMD card has a lot more detail then Nvidia. Look at the 'detail' picture of the ground. The AMD card has much more detail when you look at the ground.

I'm sure there are other differences, but the Crysis Warhead stands out pretty large in my opinion.
Yep, and I've already linked to that, but of course you seem to be ignoring their conclusion:

The algorithms under-filter, i.e. they cut samples/calculations. Under-filtering means flickering. This may look great in screenshots (see Crysis on the Radeon), but it flickers heavily in motion.

I guess Radeon owners are just used to texture shimmering, so they think it's normal and a by-product of good image quality.


 