Crysis 2 Tessellation Article


ocre

Golden Member
Dec 26, 2008
1,594
7
81
Wowwww! Seero, I had a hard time believing nVidia sabotaged me. By the way, you know your stuff; I am inclined to believe most every poster claiming foul is really lacking any knowledge of how things actually work.

I am also pretty sure that AMD will be able to get some decent gains through their driver, well at least on their 6900 models.

It also seems there may be even more patches for Crysis 2. I have yet to finish this game, and I think I will shelve it for a while so that I can play it in all its glory. I have a couple of games to finish up anyway.
I think I might wait and not buy a 6950 just yet. I really am not sure, though. It seems a little close to the next-gen GPUs, so I think I could wait. I will surely kick myself in the butt; I mean, it's the first die shrink in forever. I am antsy, but it might be worth the wait.

Thanks all!
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
It's not like the DX11 patch(es) were all about tessellation, either.
Besides tessellation, high-resolution textures, and normal maps, the patch(es) add new effects:

Soft shadows with variable penumbra
Improved water rendering
Particle motion blur
Particle shadowing
Realtime Local Reflections
Screen Space Directional Occlusion
Sprite-based Bokeh depth of field effects

And as Seero's post shows, it is really complex mathematics/code, and hard to boil down to a "101" for laymen.
 

mosox

Senior member
Oct 22, 2010
434
0
0
So, Crytek and Ubisoft have found a new way of implementing the DX11 feature, unlike all the other game developers? How many polygons does one need in order to create a flat surface?

There are lots of DX11 games, this kind of strange behavior happens only in new TWIMTBP games like HAWX and Crysis 2.

The company famous for disabling features like AA or PhysX when a card made by the competition is detected, and which paid Crytek $2 million, has nothing to do with this.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So what?

I think nVidia was going to win in benches based on how nVidia cards handle tessellation more efficiently than AMD's. That's a given to me. What do you expect nVidia to do -- not try to add tessellation in titles and wait for AMD? Of course they're going to add more tessellation for their customers and try to raise the bar as a showcase. Why wouldn't they try?

It's not the above idea, but some of the areas may be deemed too subtle, with not enough image-quality gain for the resources spent, which to me is a valid point; I agree with Behardware's conclusion. I think some of the constructive negatives are valid and worth investigating more. We don't live in a perfect or ideal world, and I really enjoyed Behardware's wording.



I realize some like to blame, but some of this is just one side of the blame -- their tessellation is not as efficient, and their drivers are not using the doubled throughput in the 69XX series, according to Behardware. I couldn't believe that point wasn't raised.

How about some balance?


You are exaggerating my position to make it invalid. I never said anything about nVidia waiting for AMD. Nor did I insinuate it. I have no problem with tessellation in games. I welcome it. That would be a stupid position and you are attempting to make me look stupid. If you aren't going to actually address what I say at face value then don't reply to my posts, please.

I have been complaining about useless tessellation that can only serve one possible purpose... To exploit nVidia's advantage in tessellation. It would have been different if they actually used the large amounts of tessellation properly. That would have been a lot of work though. So, they took the easy way out and applied large amounts of tessellation to simple models. It does nothing to IQ that in any way justifies the resources used.

When you subdivide a model's geometry, which is what tessellation does, you have to do what they call "weighting the vertices". This makes the model maintain its same shape, just with more detail. The more elaborate the model is, the more weighting has to be done. This can be a lot of work. Models that have no defined shape, like the ocean, don't have to be carefully weighted. Also, models that are basically boxes, like the road barriers, have very few edges where there are vertices to be affected. So they are very easy to "weight". In the end, they got their tessellation and they got their exaggerated performance difference in the easiest ways possible. That's what makes it so blatant.
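A quick sketch can make the subdivision step concrete. This is a hypothetical illustration in Python (not Crytek's code, and not the actual hull/domain-shader pipeline): one level of midpoint subdivision quadruples the triangle count, and without any displacement ("weighting") applied afterwards, a flat surface gains nothing visually.

```python
# Hypothetical sketch of what tessellation does geometrically:
# midpoint subdivision of triangles, before any "weighting"
# (displacement) is applied.

def midpoint(a, b):
    """Average two 3D points component-wise."""
    return tuple((a[i] + b[i]) / 2 for i in range(3))

def subdivide(tri):
    """Split one triangle into four by inserting edge midpoints."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    """Apply `levels` rounds of subdivision: count grows 4x per round."""
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

# A flat triangle in the z = 0 plane, standing in for a road barrier face.
flat = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
out = tessellate(flat, 3)

print(len(out))  # 64 triangles (4^3) instead of 1
# Every generated vertex still has z == 0: without displacement, the
# shape is identical; the GPU just processes 64x the geometry.
print(all(v[2] == 0.0 for t in out for v in t))  # True
```

The point of the sketch: the amplification is exponential in the number of subdivision levels, so spending it on geometry whose silhouette cannot change buys no image quality at all.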
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
So, Crytek and Ubisoft have found a new way of implementing the DX11 feature, unlike all the other game developers? How many polygons does one need in order to create a flat surface?

Just enough so that AMD hardware runs it poorly.

There are lots of DX11 games, this kind of strange behavior happens only in new TWIMTBP games like HAWX and Crysis 2.

Behavior? As in running better on Nvidia hardware after the TWIMTBP team worked with the dev? I can't imagine that.

The company famous for disabling features like AA or PhysX when a card made by the competition is detected and who paid Crytek 2 millions has nothing to do with this.

As opposed to the company that does next to nothing but condemn the other for continually bringing its customers a better gaming experience, more features, and better performance. Although I haven't heard a peep from AMD about the tessellation in Crysis 2. Unless they have and I'm just not aware.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
As opposed to the company that does next to nothing but condemn the other for continually bringing its customers a better gaming experience, more features, and better performance. Although I haven't heard a peep from AMD about the tessellation in Crysis 2. Unless they have and I'm just not aware.


Who would have thought Nvidia, or you, would pick the "victim" argument and rally behind it when confronted with such easy-to-see examples of foul play...

We have BFG, a member/moderator whom I would describe as nothing but neutral/objective, saying there IS a performance penalty with the way it's been done. And yet there are still some people here who argue that this is to be expected... I'm laughing my ass off at the apparent lack of understanding some members display.

Again, simply put: Crysis 2 has (among other failings) graphical detail that you can't see, yet which lowers performance on both AMD and Nvidia cards. What part of this do you think gives you a better gaming experience?



And as a reply in particular to Keysplayr (don't know why I bother...): AMD introduced Eyefinity as one of the biggest things to hit PC gaming in a VERY long time. I'd go as far as to say that before Eyefinity, multi-monitor gaming was not even considered by PC gamers.

Looking at AMD's Gaming Evolved program, I can't help but think that AAA games favor AMD's Gaming Evolved over Nvidia's "too hard to remember all the words" thing, even though Nvidia still has many times more games in their program.

Please don't make more such blatantly inaccurate or false statements, ty.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
You are exaggerating my position to make it invalid. I never said anything about nVidia waiting for AMD. Nor did I insinuate it. I have no problem with tessellation in games. I welcome it. That would be a stupid position and you are attempting to make me look stupid. If you aren't going to actually address what I say at face value then don't reply to my posts, please.

I have been complaining about useless tessellation that can only serve one possible purpose... To exploit nVidia's advantage in tessellation. It would have been different if they actually used the large amounts of tessellation properly. That would have been a lot of work though. So, they took the easy way out and applied large amounts of tessellation to simple models. It does nothing to IQ that in any way justifies the resources used.

When you subdivide a model's geometry, which is what tessellation does, you have to do what they call "weighting the vertices". This makes the model maintain its same shape, just with more detail. The more elaborate the model is, the more weighting has to be done. This can be a lot of work. Models that have no defined shape, like the ocean, don't have to be carefully weighted. Also, models that are basically boxes, like the road barriers, have very few edges where there are vertices to be affected. So they are very easy to "weight". In the end, they got their tessellation and they got their exaggerated performance difference in the easiest ways possible. That's what makes it so blatant.

The road barriers don't bother me, because there is subtle detail, and I really enjoy the angle from which they were first represented -- gotta love it. Even the bricks are subtle rather than in-your-face tessellation, and I believe the game wasn't designed from the ground up for tessellation; it was simply added. Most of their attention was on the console market, so I wasn't expecting ideal results in all circumstances.

The only nit-pick I have so far is the road curbing.

http://maldotex.blogspot.com/2011/09/tesselation-myth-in-crysis-2-el-mito-de.html

This article actually did more to help me than any other. Obviously nVidia sabotaged the GTX 470 too, hehe, because I can't get the magical 40 frames with everything set to ultra. There are so many settings and configurations for a gamer to use to achieve a nice frame rate and a good experience.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
So, Crytek and Ubisoft have found a new way of implementing the DX11 feature, unlike all the other game developers? How many polygons does one need in order to create a flat surface?

There are lots of DX11 games, this kind of strange behavior happens only in new TWIMTBP games like HAWX and Crysis 2.

The company famous for disabling features like AA or PhysX when a card made by the competition is detected and who paid Crytek 2 millions has nothing to do with this.

What is odd is that the developers of the original HAWX and Ubisoft did offer DX10.1 because they thought it was good for their customers, and it was an AMD-sponsored game. The game was playable on both, but AMD had an advantage based on hardware -- good for AMD. The same developers decided tessellation would really make their title shine and offer more realism, and it is nVidia-sponsored. The game is playable on both, but nVidia has an advantage based on hardware -- good for nVidia.

I hear about the AA in Batman again, but it was nVidia's work, just like AMD's native Stereo 3D support for Deus Ex -- which only worked on AMD GPUs. No big deal; it was good for AMD to try to create Stereo 3D awareness and bring this to their customers. So, what does one do? Offer constructive comments and try to see if nVidia's developer relations can get 3D Vision in there as well -- and it is going to be added. I'm glad they're both trying to improve the experience and trying to get titles to take advantage of their hardware strengths.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,980
595
126
I hear about the AA in Batman again, but it was nVidia's work, just like AMD's native Stereo 3D support for Deus Ex -- which only worked on AMD GPUs. No big deal; it was good for AMD to try to create Stereo 3D awareness and bring this to their customers. So, what does one do? Offer constructive comments and try to see if nVidia's developer relations can get 3D Vision in there as well -- and it is going to be added. I'm glad they're both trying to improve the experience and trying to get titles to take advantage of their hardware strengths.
Load. Of. Crap.

AA works on AMD hardware in Batman -- oh, unless you purposely check for a Radeon card and stop the final rendering pass from taking place. Stereoscopic 3D is done differently between AMD and Nvidia; your comparison is way off. And I think you know this.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Deus Ex can work on nVidia with an S3D hack, but it sucks.

Edit:

Love it: how it is okay for AMD to lock a feature to their GPUs, but nVidia gets demonized for it. See how some mindsets work.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Who would have thought Nvidia, or you, would pick the "victim" argument and rally behind it when confronted with such easy-to-see examples of foul play...

We have BFG, a member/moderator whom I would describe as nothing but neutral/objective, saying there IS a performance penalty with the way it's been done. And yet there are still some people here who argue that this is to be expected... I'm laughing my ass off at the apparent lack of understanding some members display.

Again, simply put: Crysis 2 has (among other failings) graphical detail that you can't see, yet which lowers performance on both AMD and Nvidia cards. What part of this do you think gives you a better gaming experience?


There is a performance penalty, and there are questionable areas, but what amazes me is how quickly it must be foul play designed to undermine AMD at the expense of nVidia's own products. This has supposedly been going on for some time, based on the zeal of some mindsets -- so where is the evidence? Developers airing their dirty laundry? Where? It gets old, so very old.


And as a reply in particular to Keysplayr (don't know why I bother...): AMD introduced Eyefinity as one of the biggest things to hit PC gaming in a VERY long time. I'd go as far as to say that before Eyefinity, multi-monitor gaming was not even considered by PC gamers.

It's not new and you're misinformed:

http://www.nvnews.net/reviews/matrox_parhelia/page_18.shtml

For me, when nVidia offered their Big Bang drivers, I was really hoping for surround gaming with multi-monitor and was disappointed. With AMD, one of the prominent players, evangelizing Eyefinity (which was fantastic, by the way), it raised the bar, and hopefully nVidia would also offer the feature for their customers. But to me it was: what took you so long?
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
There is a performance penalty, and there are questionable areas, but what amazes me is how quickly it must be foul play designed to undermine AMD at the expense of nVidia's own products. This has supposedly been going on for some time, based on the zeal of some mindsets -- so where is the evidence? Developers airing their dirty laundry? Where? It gets old, so very old.




It's not new and you're misinformed:

http://www.nvnews.net/reviews/matrox_parhelia/page_18.shtml

For me, when nVidia offered their Big Bang drivers, I was really hoping for surround gaming with multi-monitor and was disappointed. With AMD, one of the prominent players, evangelizing Eyefinity (which was fantastic, by the way), it raised the bar, and hopefully nVidia would also offer the feature for their customers. But to me it was: what took you so long?


Why did you quote me and reply "as if", when all you do is talk in circles around yourself?

Please read what you wrote and edit it so others can understand what you are on about; I'd like that as well.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Why did you quote me and reply "as if", when all you do is talk in circles around yourself?

Please read what you wrote and edit it so others can understand what you are on about; I'd like that as well.

It's called debating your points, because what you say doesn't match what actually is.
 

MutantGith

Member
Aug 3, 2010
53
0
0
We are talking about Crysis 2, programmed by the same company that pretty famously programmed Crysis... without the 2. That game was specifically designed such that the (arguably) high-end visuals ran at the expense of code that NO GPU of the time could possibly run at max settings at anything resembling truly playable frame rates. Efficiency and optimization do not seem to be high on their list, especially when it comes to a rushed, post-release patch specifically designed to make the graphics in the game more "extreme" for the fanbase that noted how consolized the original release was. People wanted "extreme" DX11 image quality, and this is how Crytek decided to give it to them.

No one is arguing that it makes sense, or is efficient, for the tessellation to be applied in the slapdash way that it is. It doesn't. However, it's much easier for me, personally, to believe that it's because of mistakes or a hurried application at the developer, rather than some secret conspiracy to hinder more than half of the installed user base's experience.

It's a pretty easy leap for me to imagine that when rolling out code for the patch, someone ran a report, generated a list of the most commonly occurring model meshes, then took a team into a room and told them, "Take these and tessellate until your eyes bleed. Tessellation is DX11, and therefore is good," and those coders did so. Water got tessellated because water was under all of these maps and therefore commonly occurred. No one checked to see if tessellation would hinder or slow geometry calculations; no one thought about there being water under the map mesh; they just did their little job. Someone should have put two and two together, but they didn't. It's even possible that people forgot, or didn't know, that Z-culling might not stop the performance hit on hidden or distant objects. Not hard to imagine at all. They screwed up.

As I recall, the same thing happened back in the day with Morrowind. It was something of a PC-killing performance hog. Vivec, I think, had a sewer and river underlayer. It turned out that that city was a major frame-rate killer, and people found that changing the water-reflection parameters changed your frame rate, even if you were on the top floor, in an area that didn't have line of sight to the water.

It's sloppy, inefficient coding, sure. But given the context, it's not too likely that it's intentionally sloppy coding. When you push the envelope with new techniques, it's never going to be as efficient as when someone's been working with the code for years, and it will always disproportionately murder hardware not on the bleeding edge.

I can't see a reason other than confirmation bias to really assume that there's a desire on the part of anyone to directly hamper any paying customer.
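The Z-culling point above can be made concrete with a toy cost model (all names and numbers here are invented for illustration, not measured from the game): tessellation amplifies triangles in the geometry stages, before any depth or occlusion rejection, so a hidden, heavily tessellated water plane still consumes geometry throughput even though it contributes zero visible pixels.

```python
# Toy model of geometry vs. pixel cost for tessellated meshes.
# Invented numbers; the point is structural, not quantitative.

def tessellated_tris(base_tris, factor):
    """Triangle count after tessellation at a given edge factor.
    Amplification grows roughly with the square of the factor."""
    return base_tris * factor * factor

def geometry_cost(meshes):
    """Sum post-tessellation triangles. Visibility is ignored on
    purpose: depth/occlusion rejection happens later in the pipeline,
    so every mesh pays this cost."""
    return sum(tessellated_tris(m["tris"], m["factor"]) for m in meshes)

def pixel_cost(meshes):
    """Only visible meshes produce shaded pixels (simplified to a
    per-mesh triangle count here)."""
    return sum(m["tris"] for m in meshes if m["visible"])

scene = [
    {"name": "street",       "tris": 5000, "factor": 4,  "visible": True},
    {"name": "hidden water", "tris": 2000, "factor": 16, "visible": False},
]

print(geometry_cost(scene))  # 5000*16 + 2000*256 = 592000
print(pixel_cost(scene))     # 5000 -- the hidden water adds no pixels
```

In this sketch the invisible water dominates the geometry workload (512,000 of the 592,000 triangles) while contributing nothing on screen, which is exactly the kind of asymmetry the thread is arguing about.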
 

mosox

Senior member
Oct 22, 2010
434
0
0
wowwww! seero, i had a hard time believing nvidia sabotaged me.

Nvidia sabotaged me and many others when I couldn't use my Nvidia card for PhysX together with my 5000 series ATI card. Nvidia sabotages those who want to run SLI on AMD rigs. Nvidia hobbles CPU PhysX.

Nvidia dumped XFX when they dared to make AMD cards.

Nvidia cuts off HW review sites when they dare to breach their review guidelines. Nvidia (not Ubisoft) demanded that the HAWX 2 benchmark be used in reviews.

Nvidia hates 3DMark (makes them look bad) so many HW sites including this one dumped it. Anand uses only Unigine and "compute performance" (mainly Civilization V) for the reviewed video cards.

Nvidia hates video quality benchmarks like HQV (those also make them look bad) so sites like Anand, Techpowerup, etc dumped that too even for the reviews of HTPC cards.


Here's, as far as I know, the last video-quality test done by Anand (anisotropic filtering only):

http://www.anandtech.com/show/2977/...tx-470-6-months-late-was-it-worth-the-wait-/7


Here's an example of a review for a HTPC card not tested for image quality (HQV) but tested instead in...Metro 2033 and other demanding games

http://www.anandtech.com/show/4263/amds-radeon-hd-6450-uvd3-meets-htpc

Nvidia is a bully, and this has to stop. If AMD were anything like them, they would have made some DX11-only games two years ago. If AMD were anything like them, they would create some games that work really badly on Nvidia cards when they release the 7000 series and split the gaming market for good. Nvidia started this.
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Nvidia sabotaged me and many others when I couldn't use my Nvidia card for PhysX together with my 5000 series ATI card. Nvidia sabotages those who want to run SLI on AMD rigs. Nvidia hobbles CPU PhysX.

Nvidia dumped XFX when they dared to make AMD cards.

Nvidia cuts off HW review sites when they dare to breach their review guidelines. Nvidia (not Ubisoft) demanded that the HAWX 2 benchmark be used in reviews.

Nvidia hates 3DMark (makes them look bad) so many HW sites including this one dumped it. Anand uses only Unigine and "compute performance" (mainly Civilization V) for the reviewed video cards.

Nvidia hates video quality benchmarks like HQV (those also make them look bad) so sites like Anand, Techpowerup, etc dumped that too even for the reviews of HTPC cards.


Here's, as far as I know, the last video-quality test done by Anand (anisotropic filtering only):

http://www.anandtech.com/show/2977/...tx-470-6-months-late-was-it-worth-the-wait-/7


Here's an example of a review for a HTPC card not tested for image quality (HQV) but tested instead in...Metro 2033 and other demanding games

http://www.anandtech.com/show/4263/amds-radeon-hd-6450-uvd3-meets-htpc

Nvidia is a bully, and this has to stop. If AMD were anything like them, they would have made some DX11-only games two years ago. If AMD were anything like them, they would create some games that work really badly on Nvidia cards when they release the 7000 series and split the gaming market for good. Nvidia started this.

Poor AMD apparently has a global conspiracy against them ranging from game developers to review sites. Tin foil is pretty cheap these days. Add an extra layer in case Nvidia tries to mind control you like they have the rest of us.
 

amenx

Diamond Member
Dec 17, 2004
4,107
2,376
136
Nvidia sabotaged me and many others when I couldn't use my Nvidia card for PhysX together with my 5000 series ATI card. Nvidia sabotages those who want to run SLI on AMD rigs. Nvidia hobbles CPU PhysX.

Nvidia dumped XFX when they dared to make AMD cards.

Nvidia cuts off HW review sites when they dare to breach their review guidelines. Nvidia (not Ubisoft) demanded that the HAWX 2 benchmark be used in reviews.

Nvidia hates 3DMark (makes them look bad) so many HW sites including this one dumped it. Anand uses only Unigine and "compute performance" (mainly Civilization V) for the reviewed video cards.

Nvidia hates video quality benchmarks like HQV (those also make them look bad) so sites like Anand, Techpowerup, etc dumped that too even for the reviews of HTPC cards.


Here's, as far as I know, the last video-quality test done by Anand (anisotropic filtering only):

http://www.anandtech.com/show/2977/...tx-470-6-months-late-was-it-worth-the-wait-/7


Here's an example of a review for a HTPC card not tested for image quality (HQV) but tested instead in...Metro 2033 and other demanding games

http://www.anandtech.com/show/4263/amds-radeon-hd-6450-uvd3-meets-htpc

Nvidia is a bully, and this has to stop. If AMD were anything like them, they would have made some DX11-only games two years ago. If AMD were anything like them, they would create some games that work really badly on Nvidia cards when they release the 7000 series and split the gaming market for good. Nvidia started this.
None of that is sufficient reason to sway me from one card maker to another. In fact, as a satisfied Nvidia user, it looks like I will continue to buy their cards. Once I experience flaws in my gaming experience, or issues that hamper it, I will consider switching. So far I am a happy camper. Same reason I stick with Intel CPUs.
 

mosox

Senior member
Oct 22, 2010
434
0
0
To test an HTPC card in Metro 2033 is ludicrous. To NOT test the video quality of an HTPC card is also ludicrous.

So the review sites who do this are either ..... or.....

take your pick.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
To test an HTPC card in Metro 2033 is ludicrous. To NOT test the video quality of an HTPC card is also ludicrous.

So the review sites who do this are either ..... or.....

take your pick.

This is right from the review.

Due to how little time we’ve had with the 6450 we haven’t been able to run it through our full suite of HTPC tests, but so far it’s looking very good. Between the doubling of memory bandwidth and doubling of shaders, the 6450 is now able to run all of AMD’s post-processing features at full speed—that is they all work with Enforce Smooth Video Playback enabled and without dropping any frames in the process. AMD reports an HD HQV 2.0 score of 188, while we recorded 189 on the 5570 last year (keep in mind scoring is inherently subjective to some degree). We need to do further testing, but with our limited time it looks like the 6450 is as equally capable as the 5570 when it comes to post-processing, which is to say it’s at the top of the charts.

But by all means continue...
 