DX11


digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
no shit. He said the visual difference in the pic was "between DX9 and DX11". It's from DX11 running tessellation. Again, IF you don't have tessellation on, then it will look like DX9.

Pull the trigger man.

You are arguing over the visuals that DX11 provides; tessellation is one of those visuals.

Also, Crysis was not well optimized. Warhead was a huge gain by comparison, and Crytek has pretty much said Crysis 2 will be better optimized. The graphics do not look worse to me...

Any game from the makers of STALKER is not an optimized game, and Metro 2033 is definitely not optimized.

Sorry, there is no reason to expect any tier of graphics card to run a game that has been pushed out the door without proper optimization. Some part of the blame lies with the developers.

One can only hope that, with Sandy Bridge and Fusion DX11 on-die graphics, more attention will be paid to optimization and the market for PC games will grow. If the lowest-end PCs could run games like BFBC2 even at low settings, it would expand our market, probably more than 20x. Maybe then companies would care about porting over our controls, or optimizing our graphics.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The problem is the games you are comparing.

DX10 over DX9:
1. Much faster at performing the same task (many games are slower because they actually improve the graphics in DX10, or implement it as a badly hacked layer on top of DX9).
http://www.anandtech.com/show/2267/5
Observe how it is better looking AND slightly faster in terms of FPS (at least on NVIDIA cards).
2. Vast improvement in the ability to render realistic shadows.
http://www.anandtech.com/show/2267/4
Observe how shadows in Company of Heroes became real-looking with DX10.
3. Cheaper, faster, and easier to program in than DX9.
This will not be useful until DX9 support can be dropped entirely.

DX11 over DX10:
1. Tessellation: it lets you do the same thing much faster, because you can procedurally generate geometry on the GPU instead of shipping it across the PCIe interface, whose bandwidth it would exceed by orders of magnitude (rough numbers sketched below).
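To make the bandwidth point concrete, here is a rough back-of-envelope sketch in plain C++. All of the numbers (vertex size, control-mesh size, tessellation factor, usable PCIe 2.0 x16 throughput) are illustrative assumptions, not measurements:

```cpp
// Back-of-envelope: how much geometry the tessellator can generate on-GPU
// versus what actually has to cross the PCIe bus. All inputs are assumed.
#include <cstdio>

int main() {
    const double bytesPerVertex  = 32.0;     // position + normal + UV (assumed layout)
    const double controlVertices = 10000.0;  // coarse control mesh uploaded once
    const double tessFactor      = 16.0;     // assumed edge tessellation factor
    // Splitting each patch edge 16 ways yields roughly 16 * 16 = 256x the vertices.
    const double amplification   = tessFactor * tessFactor;
    const double generatedVerts  = controlVertices * amplification;

    const double uploadedMB  = controlVertices * bytesPerVertex / 1e6;
    const double generatedMB = generatedVerts  * bytesPerVertex / 1e6;

    // PCIe 2.0 x16 is ~8 GB/s per direction on paper; assume ~6 GB/s usable.
    const double usableMBps = 6000.0;
    const double msPerFrameIfStreamed = generatedMB / usableMBps * 1000.0;

    std::printf("uploaded control mesh: %.2f MB\n", uploadedMB);
    std::printf("GPU-generated mesh:    %.2f MB (%.0fx larger)\n", generatedMB, amplification);
    std::printf("time to stream the generated mesh over PCIe: ~%.1f ms per frame\n",
                msPerFrameIfStreamed);
    return 0;
}
```

With those (made-up) numbers, streaming the pre-tessellated mesh would eat most of a 60 fps frame budget (~16.7 ms) every frame it changes, while the control mesh that actually gets uploaded is a fraction of a megabyte.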

The thing to understand is that a lot of those "DX11" and "DX10" games are just hacked-together ports of DX9 that see absolutely no benefit from DX10/11 because they don't use its unique features. They are just trying to get it to run in DX10/11 as a marketing gimmick of "look at us, we have the biggest numbers".

You should compare the right games... besides which, it's not exactly a fair comparison to pit the GTX 460 against the 4870.
http://www.anandtech.com/bench/Product/304?vs=313

The GTX 460 is much more powerful (and thus allows higher max graphics settings) than the 4870, and the 4870 is capable of DX10, so you should be comparing the two in DX10.
You could do a same-game comparison with the GTX 460 in DX9 and DX11... but again, such games are typically DX9 games with a little add-on.
To be fair, you should find the best-looking DX11 game and compare it to its DX9 compatriots.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Pull the trigger man.

You are arguing over the visuals that DX11 provides; tessellation is one of those visuals.


Also, Crysis was not well optimized. Warhead was a huge gain by comparison, and Crytek has pretty much said Crysis 2 will be better optimized. The graphics do not look worse to me...

Any game from the makers of STALKER is not an optimized game, and Metro 2033 is definitely not optimized.

Sorry, there is no reason to expect any tier of graphics card to run a game that has been pushed out the door without proper optimization. Some part of the blame lies with the developers.

One can only hope that, with Sandy Bridge and Fusion DX11 on-die graphics, more attention will be paid to optimization and the market for PC games will grow. If the lowest-end PCs could run games like BFBC2 even at low settings, it would expand our market, probably more than 20x. Maybe then companies would care about porting over our controls, or optimizing our graphics.
Pull the trigger? Really? I am not arguing about what DX11 can or cannot give. See if you can follow along and not get confused... mike posted a pic of DX9 and DX11 shots from Heaven and claimed that was the difference between DX9 and DX11. I was saying that the reason DX11 looked better was that it had the optional tessellation on. He said NO, the pic was just showing the difference between DX9 and DX11. He was WRONG. A game doesn't look like that just from using DX11 alone, and that was the POINT.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Pull the trigger? Really? I am not arguing about what DX11 can or cannot give. See if you can follow along and not get confused... mike posted a pic of DX9 and DX11 shots from Heaven and claimed that was the difference between DX9 and DX11. I was saying that the reason DX11 looked better was that it had the optional tessellation on. He said NO, the pic was just showing the difference between DX9 and DX11. He was WRONG. A game doesn't look like that just from using DX11 alone, and that was the POINT.
You're not making any sense here. Tessellation is optional, but it is an option only available in DX11; what does its being optional have to do with anything else?

On a side note, isn't the implementation of tessellation one of the biggest changes from DX10 to DX11?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You're not making any sense here. Tessellation is optional, but it is an option only available in DX11; what does its being optional have to do with anything else?

On a side note, isn't the implementation of tessellation one of the biggest changes from DX10 to DX11?


He said that the pic was the difference between DX11 and DX9 even after I said that it was tessellation that was the actual improvement in the pic. Does running in DX11 automatically use tessellation like that? NO. Does every DX11 game even have optional tessellation like that? NO.

So AGAIN, the pic just shows the improvement from an optional feature that DX11 games sometimes have. DX11 alone does not give the graphical difference over DX9 that he was implying it did.
 

zagood

Diamond Member
Mar 28, 2005
4,102
0
71
Hey look! Old people f@cking!

Oh, sorry, that's just you guys arguing.

Toyo pointed out an issue of semantics. Everyone understood what endlessmike133 was saying, he just wanted to clarify. Then y'all got stupid. Please see sig for more information.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
endlessmike133 did not know what he was saying, though. He clearly thought that simply running DX11 would give that graphical improvement. When I told him it was tessellation making the difference, he said "lol no thats the diff between DX9 and DX11". All I was trying to do was let him know DX11 by itself doesn't do that.

Now all I keep getting are ridiculous comments saying tessellation is a feature of DX11. Well, no kidding, but that has nothing to do with me trying to let him know that just running DX11 by itself doesn't make a game look like that. And again, not every DX11 game will have that level of tessellation, if it has it at all. It is a benchmark made to show off tessellation, after all.
 

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
DX11 over DX10: Tessellation lets you do the same thing much faster, because you can procedurally generate geometry on the GPU instead of shipping it across the PCIe interface, whose bandwidth it would exceed by orders of magnitude. [...]
PCIe bandwidth will never be fully reached because the headline number is theoretical. It's not a raw stream; it works like a network: packets are wrapped and sent to the other side with overhead, at something like 71 percent efficiency.

When you place middleware on top of middleware, that is, game engines + the DX API, you get one bug on top of another bug. Don't blame the hardware design because of buggy software.
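For anyone wondering where a figure like that comes from, here is a rough sketch of the arithmetic in plain C++. The per-packet overhead and payload size below are assumptions (the real values depend on the chipset's configured max payload size), so treat the output as ballpark only:

```cpp
// Rough PCIe 2.0 x16 efficiency estimate. Link-layer overhead figures are
// assumptions for illustration, not measured values.
#include <cstdio>

int main() {
    const double lanes           = 16.0;
    const double transfersPerSec = 5.0e9;      // PCIe 2.0: 5 GT/s per lane
    const double lineCoding      = 8.0 / 10.0; // 8b/10b encoding on PCIe 1.x/2.0

    // Raw byte rate per direction after line coding.
    const double rawBytesPerSec = lanes * transfersPerSec * lineCoding / 8.0; // ~8 GB/s

    // Per-packet (TLP) overhead: framing + sequence number + header + LCRC,
    // assumed ~24 bytes against a 128-byte payload (a common consumer setting).
    const double payloadBytes  = 128.0;
    const double overheadBytes = 24.0;
    const double packetEfficiency = payloadBytes / (payloadBytes + overheadBytes);

    std::printf("raw rate after 8b/10b: %.1f GB/s\n", rawBytesPerSec / 1e9);
    std::printf("packet efficiency:     %.0f%%\n", packetEfficiency * 100.0);
    std::printf("effective throughput:  ~%.1f GB/s (before ACK/flow-control traffic)\n",
                rawBytesPerSec * packetEfficiency / 1e9);
    return 0;
}
```

Add ACK/NAK and flow-control packets going the other way, plus smaller payloads, and you land somewhere around the 70-80% of the headline number that gets quoted.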
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
PCIe bandwidth will never be fully reached because the headline number is theoretical. It's not a raw stream; it works like a network: packets are wrapped and sent to the other side with overhead, at something like 71 percent efficiency.

When you place middleware on top of middleware, that is, game engines + the DX API, you get one bug on top of another bug. Don't blame the hardware design because of buggy software.

http://www.youtube.com/watch?v=QMgofmIQElE
The NVIDIA tessellation demo uses tessellation to procedurally generate 3D models which, if you wanted to transmit them pre-created, would take more than 10x the bandwidth of PCIe 2.0 x16.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
http://www.youtube.com/watch?v=QMgofmIQElE
The NVIDIA tessellation demo uses tessellation to procedurally generate 3D models which, if you wanted to transmit them pre-created, would take more than 10x the bandwidth of PCIe 2.0 x16.

Yeah, but that's more of a worst-case scenario; I doubt any reasonable amount of tessellation would need that much.

Even then, I was getting 10 fps in a hacked demo with my 5770, so with something that has 3x the tessellation power and some optimization, that could be playable. So I just think it's developers being lazy. The hardware is there: quad-core CPUs, massive amounts of RAM (most people here have 4 GB or more), and powerful graphics cards.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yeah, but that's more of a worst-case scenario; I doubt any reasonable amount of tessellation would need that much.

You are looking at it from the wrong perspective. It's not a worst-case scenario (for general rendering) but rather a best-case scenario of what you can do when you build a game knowing that you have tessellation as an option.

Things that you wouldn't even bother trying before suddenly become a possibility, and new engines can be written to take advantage of those features.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
You are looking at it from the wrong perspective. It's not a worst-case scenario (for general rendering) but rather a best-case scenario of what you can do when you build a game knowing that you have tessellation as an option.

Things that you wouldn't even bother trying before suddenly become a possibility, and new engines can be written to take advantage of those features.


:hmm: When you put it that way, I do see your point. I guess it puts into perspective how much devs really aren't taking advantage of what's available.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
I've been playing BC2 for over a month now and my son keeps telling me that the graphics on both of my rigs look identical.

I tried to explain the differences between the two cards and between DX9 and DX11, but the difference is SO small it's HARDLY noticeable.

We have the two rigs below side by side so it's easy to compare... and it seems like he is right.

It's almost impossible to tell the difference between DX11 and DX9. We went to a number of different maps on empty servers (to the same spot) and MINIMAL is the only word I can use.

You really have to look for it. For example, some of the details might look better in DX11 (characters seem to be the most noticeable), but in general I think the upgrade to DX11 cannot be justified for BC2.

Have you guys made any side-by-side comparisons?

I'm a bit surprised, because DX11 was one of the main reasons for getting the 460 (besides the fact that I needed the 4870 in the 2nd rig... the 7900 wasn't cutting it anymore).

Again, I think the difference is VERY minimal AT BEST. Very hard to spot unless you really look hard.

Could it be just BC2? I do remember that it was one of the first implementations of DX11.
First, DX9 can do whatever DX11 can do, but Unigine fooled a lot of people into thinking DX11 is much better than DX9/DX10 in terms of image quality, which it is not.

From the programmer's perspective, DX9 is not complex, but to make the same effect in a game work on both NVIDIA and AMD, two different sets of code are required. In fact, different NVIDIA cards may require different code paths, and so may AMD cards. In the end, programmers need to write 30-some-odd different code paths to make an effect work on all video cards. Yes, the number 30 is just a random number I picked out of thin air.

Coding in DX10 is far simpler in theory: video cards are made to support DX10, so the programmer can have one code path and it will work on all DX10 video cards. That was the intention at least, but things written for DX9 can't simply be recompiled for DX10; they have to be rewritten for DX10, and vice versa. Since the market is still a mix of DX9 video cards + XP, games now need to keep a DX9 version and a DX10/DX11 version, which defeats the purpose. On top of that, DX10 has its bugs, but they can't be fixed because the fixes require hardware support, which ended up in nightmares.

DX11 is technically DX10 + all the bug fixes + dynamic tessellation. The purpose of dynamic tessellation isn't better image quality, but smaller asset size. DX11 is capable of producing quality identical to, or even slightly better than, DX10 (identical for a stationary image, but better on things like cloth simulation) with much smaller file size and memory usage. Unigine is built on DX11, so it looks good; running Unigine in DX10 is just to show that it is backward compatible, but people misunderstood that as DX11 having better IQ than DX10.

That means, in terms of IQ, DX9 = DX10 = DX11, given that they are coded properly. However, in terms of size and resource utilization, DX11 is better than DX10, and DX10 is better than DX9.

Again, things that are coded in DX11 look like crap on DX10 cards, and won't even run on DX9 cards. Things that are coded in DX10 can retrofit some DX11 features by having something small coded in DX11, so DX10 cards still run the game with good IQ most of the time; it still won't run on DX9 cards, however, and DX11 cards only get small bits of eye candy because the game is really written mostly in DX10. Things that are written in DX9 work on all cards, with the same quality.

If there exists a game that is natively built on DX11, then you can't see it in DX9 because it isn't backward compatible. To make a game that will run under both DX9 and DX10/11, it must have independent executables: one supporting DX11 (which is backward compatible with DX10 cards) and one supporting DX9. That way, the game can be made to look alike under DX9 and DX11. The only differences the user can see are a) performance and b) memory usage. However, these two variables are also affected by the optimization of the source code itself.
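(Side note on the "which cards can run which code path" question: the Direct3D 11 runtime exposes this through feature levels, so a DX11-style code path can ask at startup what class of hardware it actually got. A minimal sketch, assuming a Windows build with the standard d3d11 headers and d3d11.lib linked; this only probes the device and says nothing about how a studio chooses to ship its executables:)

```cpp
// Minimal sketch: create a Direct3D 11 device and report which feature level
// (11_0, 10_x, or 9_x-class hardware) the installed GPU supports.
#include <d3d11.h>
#include <cstdio>

int main() {
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,  D3D_FEATURE_LEVEL_9_1
    };
    ID3D11Device*        device   = nullptr;
    ID3D11DeviceContext* context  = nullptr;
    D3D_FEATURE_LEVEL    achieved = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, &device, &achieved, &context);

    if (SUCCEEDED(hr)) {
        // 0xb000 = 11_0, 0xa100 = 10_1, 0xa000 = 10_0, 0x9300 = 9_3, 0x9100 = 9_1
        std::printf("highest supported feature level: 0x%x\n", (unsigned)achieved);
        context->Release();
        device->Release();
    } else {
        std::printf("no D3D11 runtime / compatible GPU (hr = 0x%08x)\n", (unsigned)hr);
    }
    return 0;
}
```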
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
First, DX9 can do whatever DX11 can do, but Unigine fooled a lot of people into thinking DX11 is much better than DX9/DX10 in terms of image quality, which it is not. [...]

Awesome post :thumbsup:
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
That second pic is horribly unrealistic to me. Has anyone ever seen a cobblestone road that is that bumpy? Talk about bent ankles and jaw-shuddering wagon rides.

The technology allows new things to be created, but sometimes it is misused to create stupid, unrealistic things.

There are tons of examples... a lot of designers think realistic means "everything is brown", or "too much bloom", or "the slightest movement causes motion blur", and many other stupid misuses of effects. In a lot of games you need to disable depth of field and bloom because they are overused in such unrealistic ways that the game looks better without them.
 

Vdubchaos

Lifer
Nov 11, 2009
10,408
10
0
First, DX9 can do whatever DX11 can do, but Unigine fooled a lot of people into thinking DX11 is much better than DX9/DX10 in terms of image quality, which it is not. [...]

Great post, Thanks!!!

That explains why the Unreal 4 devs said no current console will be able to run it. I'm assuming it will be native DX11.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Great post, Thanks!!!

That explains why the Unreal 4 devs said no current console will be able to run it. I'm assuming it will be native DX11.

That is a good question... The owner of the company is also their lead designer; 10 years ago he predicted the trajectory of GPUs with amazing accuracy, including that they would become programmable around a certain year (he was off by 2 years)...
He also stated that, now that it has happened, their next-gen Unreal engine will be written 100% in C code, utilizing either the CPU or GPGPU.

This is very ambitious, but it could allow them to far surpass what DX allows, since modern video cards are actually capable of running such arbitrary C code (CUDA for NVIDIA, and I forget what it's called for AMD). That, however, was several years ago, when DX10 had just come out... so I wonder if they stuck to it.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
If there exists a game that is natively built on DX11, then you can't see it in DX9 because it isn't backward compatible. To make a game that will run under both DX9 and DX10/11, it must have independent executables: one supporting DX11 (which is backward compatible with DX10 cards) and one supporting DX9. That way, the game can be made to look alike under DX9 and DX11. The only differences the user can see are a) performance and b) memory usage. However, these two variables are also affected by the optimization of the source code itself.

Civilization V
 