cusideabelincoln
Diamond Member
- Aug 3, 2008
Crysis isn't poorly optimized dude, it's future-proofed.
Crysis is both unoptimized and "future-proofed". Warhead proves the original Crysis could have been better optimized.
Crysis isn't poorly optimized dude, it's future-proofed.
No shit. He said the visual difference in the pic was "between DX9 and DX11". It's from DX11 running tessellation. Again, IF you don't have tessellation on, then it will look like DX9.
Pull the trigger? Really? I am not arguing about what DX11 can or cannot give. See if you can follow along and not get confused... mike posted a pic of DX9 and DX11 shots from Heaven and claimed that was the difference between DX9 and DX11. I was saying that the reason DX11 looked better is that it had the optional tessellation on. He said NO, that the pic was just showing the difference between DX9 and DX11. He was WRONG. A game doesn't look like that just from using DX11 alone, and that was the POINT. Pull the trigger, man.
You are arguing over the visuals that DX11 provides; tessellation is one of those visuals.
Also, Crysis was not well optimized. Warhead showed a huge performance gain by comparison, and Crytek have pretty much said Crysis 2 will be better optimized - the graphics do not look worse to me...
Any game from the makers of STALKER is not an optimized game, and Metro 2033 is definitely not optimized.
Sorry, there is no reason to expect any tier of graphics card to run a game that has been pushed out the door without proper optimization. Some of the blame lies with developers.
One can only hope that with Sandy Bridge and Fusion putting DX11 graphics on-die, more attention will be paid to optimization, which will grow the market for PC games. If the lowest-end PCs could run games like BFBC2, even at low settings, it would expand our market probably more than 20x. Maybe companies would then care about porting over our controls, or optimizing our graphics.
You're not making any sense here. Tessellation is optional, but it is an option only available in DX11 - what does it being optional have to do with anything else?
On a side note, isn't the implementation of tessellation one of the biggest changes from DX10 to DX11?
PCI-E bandwidth will never be reached because it's theoretical. It's not streaming; it works like a network: packets get wrapped and sent to the other side with overhead, at about 71 percent efficiency. The problem is the games you are comparing.
DX10 over DX9:
1. Much faster at performing the same task (many games are slower because they actually improve the graphics in DX10, or implement it as a badly hacked layer on top of DX9).
http://www.anandtech.com/show/2267/5
Observe how it is better looking AND slightly faster in terms of FPS (at least on NVIDIA cards).
2. Vast improvement in the ability to render realistic shadows.
http://www.anandtech.com/show/2267/4
Observe how shadows in Company of Heroes became realistic-looking with DX10.
3. Cheaper, faster, and easier to program in than DX9.
This will not be useful until DX9 support can be dropped entirely.
DX11 over DX10:
1. Tessellation: it allows you to do the same thing much faster, because you can procedurally generate content that would exceed the bandwidth of the PCIe interface by orders of magnitude.
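The bandwidth argument can be made concrete with rough arithmetic. This is a sketch; the vertex counts and layout sizes are illustrative assumptions, not measured figures:

```python
# Rough comparison: shipping a fully detailed mesh over the bus versus
# shipping a coarse control mesh and letting the GPU tessellate it.
# All sizes below are illustrative assumptions.

BYTES_PER_VERTEX = 32  # position + normal + UV, a common compact layout

def mesh_bytes(vertex_count):
    """Raw vertex-buffer size for a mesh with the layout above."""
    return vertex_count * BYTES_PER_VERTEX

coarse_verts = 10_000           # low-poly control cage uploaded once over PCIe
tessellated_verts = 10_000_000  # detail the GPU generates on-chip instead

sent_over_bus = mesh_bytes(coarse_verts)
generated_on_gpu = mesh_bytes(tessellated_verts)

print(f"uploaded over PCIe: {sent_over_bus / 1e6:.2f} MB")
print(f"equivalent detail:  {generated_on_gpu / 1e6:.1f} MB")
print(f"amplification:      {generated_on_gpu // sent_over_bus}x")
```

With these (made-up) numbers the GPU renders 1000x more geometry data than ever crossed the bus, which is the whole point of the feature.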
The thing to understand is that a lot of those "DX11" and "DX10" games are just hacked-together ports of DX9 which see absolutely no benefit from DX10/11 because they don't use its unique features. They are just trying to get it to run in DX10/11 as a marketing gimmick of "look at us, we have the biggest numbers".
You should compare the right games... besides which, it's not exactly a fair comparison to pit the GTX 460 against the 4870.
http://www.anandtech.com/bench/Product/304?vs=313
The GTX 460 is much more powerful (and thus allows higher max graphics settings) than the 4870. And the 4870 is capable of DX10, so you should be comparing the two in DX10.
You could do a same-game comparison with the GTX 460 in DX9 and DX11... but again, such games are typically DX9 games with a little add-on.
To be fair, you should find the best-looking DX11 game and compare it to its DX9 compatriots.
PCI-E bandwidth will never be reached because it's theoretical. It's not streaming; it works like a network: packets get wrapped and sent to the other side with overhead, at about 71 percent efficiency.
You place middleware on top of middleware - that is, game engine + DX API - and you get one bug on top of another. Don't blame hardware design for buggy software.
http://www.youtube.com/watch?v=QMgofmIQElE
The NVIDIA tessellation demo uses tessellation to procedurally generate 3D models which, if you wanted to transmit them pre-created, would take more than 10x the bandwidth of 16x PCIe v2.
Yeah, but that's more of a worst-case scenario; I doubt any reasonable amount of tessellation would need that much.
You are looking at it from the wrong perspective. It's not a worst-case scenario (for general rendering) but rather a best-case scenario of what you can do when you build a game knowing that you have tessellation as an option.
Things that you wouldn't even bother trying before suddenly become a possibility and new engines can be written to take advantage of those features.
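To put numbers on that amplification: a quad patch with a uniform tessellation factor f is diced into roughly an f x f grid of quads, i.e. about 2 * f^2 triangles. This is a simplified model; the real fixed-function tessellator's partitioning schemes differ slightly at the edges:

```python
# Triangle amplification from the tessellation factor, using a
# simplified model of a quad domain with a uniform factor.

def triangles_per_patch(factor):
    """Approximate triangles produced for one quad patch at this factor."""
    return 2 * factor * factor

for f in (1, 8, 64):  # DX11 allows tessellation factors up to 64
    print(f"factor {f:2d}: ~{triangles_per_patch(f):,} triangles per patch")
```

At the maximum factor of 64, a single uploaded patch expands to roughly 8,000 triangles on-chip, which is why engines built around this look nothing like retrofitted DX9 ports.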
Here's a road in DX9:
and a road in DX11:
First, DX9 can do whatever DX11 can do, but Unigine fooled a lot of people into thinking DX11 is much better than DX9/DX10 in terms of image quality, which it is not. I've been playing BC2 for over a month now, and my son keeps telling me that the graphics on both of my rigs look identical.
I tried to explain the differences between the two cards and DX9 vs DX11, but the difference is SO little it's HARDLY noticeable.
We have the two rigs below side by side, so it's easy to compare... and it seems like he is right.
It's almost impossible to tell the difference between DX11 and DX9. We went to a number of different maps on empty servers (to the same spots), and MINIMAL is the only word I can use.
You really have to look for it. For example, some of the details might look better in DX11 (characters seem to be most noticeable), but in general I think the upgrade to DX11 cannot be justified for BC2.
Have you guys made any side by side comparisons?
I'm a bit surprised, because DX11 was one of the main reasons for getting the 460 (besides the fact that I needed the 4870 in the 2nd rig... the 7900 wasn't cutting it anymore).
Again, I think the difference is VERY minimal AT BEST. Very hard to spot unless you really look hard.
Could it be just BC2? I do remember that it's one of the first implementations of DX11.
First, DX9 can do whatever DX11 can do, but Unigine fooled a lot of people into thinking DX11 is much better than DX9/DX10 in terms of image quality, which it is not.
From a programmer's perspective, DX9 is not complex, but to make the same effect in a game work on both NVIDIA and AMD, two different sets of code are required. In fact, different NVIDIA cards may require different code paths, and so may AMD cards. In the end, programmers need to write 30-some-odd different code paths to make an effect work on all video cards. Yes, the number 30 is just a number I picked out of thin air.
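The combinatorial blow-up described here can be sketched as a lookup table keyed by effect and hardware quirk. The effect and quirk names below are made up for illustration; the point is only that path count multiplies:

```python
# A caricature of the DX9-era problem: each effect needs a hand-tuned
# variant per vendor/shader-model quirk, so code paths multiply.
# All names here are hypothetical.

EFFECTS = ["shadow", "water", "bloom"]
VENDOR_QUIRKS = {
    "nvidia_sm2": "fp16_partial_precision",
    "nvidia_sm3": "full_precision",
    "amd_sm2":    "24bit_precision",
    "amd_sm3":    "full_precision",
}

def build_code_paths():
    """One hand-tuned shader variant per (effect, hardware quirk) pair."""
    return {(effect, hw): f"{effect}_{quirk}"
            for effect in EFFECTS
            for hw, quirk in VENDOR_QUIRKS.items()}

paths = build_code_paths()
print(f"{len(paths)} code paths for just {len(EFFECTS)} effects")
```

Three effects times four hardware variants already means twelve paths to write and test; real engines had far more of both, which is where numbers like "30-some-odd" come from.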
Coding in DX10 is far simpler in theory: video cards are made to support DX10, so a programmer can write one code path and it will work on all DX10 video cards. That was the intention, at least, but things written in DX9 can't simply be compiled for DX10 - they must be rewritten, and vice versa. Since the market is still mixed with DX9 video cards + XP, games now need to keep a DX9 version and a DX10/DX11 version, which defeats the purpose. On top of that, DX10 has its bugs, which can't be fixed because fixes require hardware support - which ended up in nightmares.
DX11 is technically DX10 + all the bug fixes + dynamic tessellation. The purpose of DT isn't better image quality, but smaller assets. DX11 is capable of producing quality identical to, or even slightly better than, DX10 (identical for a stationary image, but better for things like cloth simulation) with much smaller file size and memory usage. Unigine is built on DX11, so it looks good; running Unigine in DX10 is just to show that it is backward compatible, but people misunderstood it as DX11 IQ being better than DX10.
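The size argument is worth a quick back-of-the-envelope check: instead of storing every bump as geometry, you store a coarse mesh plus a one-channel displacement texture. All sizes below are illustrative assumptions:

```python
# Why displacement-mapped tessellation shrinks assets: a dense mesh that
# bakes every bump into geometry versus a coarse mesh plus an 8-bit
# displacement texture. Sizes are illustrative assumptions.

BYTES_PER_VERTEX = 32  # position + normal + UV

dense_mesh = 2_000_000 * BYTES_PER_VERTEX   # every bump stored as vertices
coarse_mesh = 20_000 * BYTES_PER_VERTEX     # silhouette-only control mesh
displacement_map = 1024 * 1024 * 1          # 1024x1024 8-bit height texture

tessellated_asset = coarse_mesh + displacement_map
print(f"dense geometry asset:       {dense_mesh / 1e6:.1f} MB")
print(f"coarse mesh + displacement: {tessellated_asset / 1e6:.2f} MB")
```

Under these assumptions the displacement-mapped version is dozens of times smaller on disk and in memory while the GPU reconstructs comparable detail at draw time.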
That means, in terms of IQ, DX9 = DX10 = DX11, given that they are coded properly. However, in terms of size and resource utilization, DX11 is better than DX10, and DX10 is better than DX9.
Again, things coded in DX11 look like crap on DX10 cards and won't even run on DX9 cards. Things coded in DX10 can retrofit some DX11 features by having something small coded in DX11, so DX10 cards still run the game with good IQ most of the time; it still won't run on DX9 cards, and DX11 cards will only get small eye candies because the game is really written mostly in DX10. Things written in DX9 work on all cards, with the same quality.
If there were a game natively built on DX11, you couldn't see it in DX9, because it isn't backward compatible. To make a game that will run under DX9 and DX10/11, it must have independent executables: one supporting DX11 (which is backward compatible with DX10 cards), and one supporting DX9. That way, the game can be made to look alike under DX9 and DX11. The only differences the user can see are a) performance and b) memory usage. However, these two variables can be affected by the optimization of the source code itself.
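The "independent executables" split amounts to a launcher-side dispatch on the detected hardware capability. A minimal sketch, with entirely hypothetical backend names:

```python
# Sketch of the two-renderer split described above: a launcher picks the
# DX9 or DX10/11 backend from the detected hardware feature level.
# The backend names are hypothetical, not a real engine's files.

def pick_renderer(feature_level):
    """Map a detected hardware feature level to a renderer backend."""
    if feature_level >= 10:
        return "renderer_dx11.dll"  # one backend serves DX10 and DX11 cards
    return "renderer_dx9.dll"       # legacy path, works on DX9/XP machines

for level in (9, 10, 11):
    print(f"feature level {level}: {pick_renderer(level)}")
```

Note that a single backend covers both DX10 and DX11 cards (matching the backward compatibility described above), while DX9 hardware needs its own separately maintained path.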
That second pic looks horribly unrealistic to me. Has anyone ever seen a cobblestone road that bumpy? Talk about bent ankles and jaw-shuddering wagon rides.
Great post, Thanks!!!
Which explains why the Unreal Engine 4 devs said no current console will be able to run it. I'm assuming it will be native DX11.