ShintaiDK
Shader model 3.0 really started to get the ball rolling for ATI and nVidia, imho!
We got a winner!
Unified shaders as such have nothing to do with the STREAM capabilities seen on the R580.
You are telling me a non-unified shader is compatible with a unified shader?
STREAM-wise? Yes.
Unified isn't about capabilities. It's about efficiency of utilization.
I don't even think the Xbox 360, with its unified shaders, is STREAM capable.
Take it to PM?
Shaming is pointless unless it's public.
Double or triple performance. Dream on.
DirectX has massive overhead compared to low-level coding. This isn't debatable; you're either aware of it or not. Simply looking at how consoles perform compared to a PC with similar number-crunching power running DX or OpenGL makes it overwhelmingly obvious that it's a pretty plausible best-case prediction. A computer running Windows with double the CPU and GPU power of an Xbox 360 will run Skyrim or BioShock Infinite at the same settings at a lower FPS than the console does, entirely because of DirectX inefficiencies.
We don't really know just how much control Mantle will give game programmers, but if it is as much as they get on consoles, then double performance is definitely possible if they put the time in.
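To make the overhead claim concrete, here's a minimal C++ sketch, with placeholder names and no particular engine assumed, of why per-draw cost in D3D11 is paid on the CPU, and why fewer, fatter submissions (the thing a console-style or Mantle-style API makes easier) win back that time:

```cpp
#include <d3d11.h>

// Sketch only: each DrawIndexed call crosses the D3D11 runtime and
// user-mode driver, so 10,000 tiny draws burn CPU time regardless of
// how fast the GPU is. All names here are placeholders.
void DrawNaive(ID3D11DeviceContext* ctx,
               ID3D11Buffer* const* perObjectCB, // one constant buffer per object
               UINT objectCount, UINT indexCount)
{
    for (UINT i = 0; i < objectCount; ++i) {
        ctx->VSSetConstantBuffers(0, 1, &perObjectCB[i]);
        ctx->DrawIndexed(indexCount, 0, 0);
    }
}

// The batched path submits the same triangles in one call, so the
// per-call runtime/driver overhead is paid once. A thinner API cuts
// that per-call cost everywhere, which is where the big-gain claims
// for consoles and Mantle come from.
void DrawBatched(ID3D11DeviceContext* ctx,
                 UINT objectCount, UINT indexCount)
{
    ctx->DrawIndexedInstanced(indexCount, objectCount, 0, 0, 0);
}
```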
Console games never, ever have the same settings as PC games (excluding performance-independent games like, say, Tetris).
The people who peddle these lies really need to stop. DirectX's inefficiencies are all CPU-related. Any performance increases would be a result of relieving CPU bottlenecks. If none are present in the specific test system, the difference would be negligible.
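One way to actually check that, rather than argue about it: a rough sketch using D3D11 timestamp queries (placeholder names, error handling trimmed). If the measured GPU frame time is well below the wall-clock frame time, the frame is CPU/API-bound and a thinner API has headroom; if they match, it's GPU-bound and wouldn't gain much.

```cpp
#include <d3d11.h>

// Rough sketch: measure how long the GPU spends on a frame.
struct GpuTimer {
    ID3D11Query *disjoint = nullptr, *begin = nullptr, *end = nullptr;

    void Init(ID3D11Device* dev) {
        D3D11_QUERY_DESC qd = { D3D11_QUERY_TIMESTAMP_DISJOINT, 0 };
        dev->CreateQuery(&qd, &disjoint);
        qd.Query = D3D11_QUERY_TIMESTAMP;
        dev->CreateQuery(&qd, &begin);
        dev->CreateQuery(&qd, &end);
    }
    void FrameBegin(ID3D11DeviceContext* ctx) {
        ctx->Begin(disjoint);
        ctx->End(begin);              // timestamp queries use End(), not Begin()
    }
    void FrameEnd(ID3D11DeviceContext* ctx) {
        ctx->End(end);
        ctx->End(disjoint);
    }
    // Call a few frames later; returns GPU milliseconds, or -1 if not ready.
    double ReadMs(ID3D11DeviceContext* ctx) {
        D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj;
        UINT64 t0, t1;
        if (ctx->GetData(disjoint, &dj, sizeof(dj), 0) != S_OK ||
            ctx->GetData(begin, &t0, sizeof(t0), 0) != S_OK ||
            ctx->GetData(end, &t1, sizeof(t1), 0) != S_OK || dj.Disjoint)
            return -1.0;
        return double(t1 - t0) * 1000.0 / double(dj.Frequency);
    }
};
```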
Another advantage for the consoles is that their APIs can expose every feature the GPU supports.
For example, both the Xbox 360 and AMD's DX10 cards have support for tessellation. Viva Piñata, Banjo Kazooie and Halo Wars are known to use it on the 360.
But since it wasn't part of the DX9/DX10.1 spec, it was never used on the PC.
Except no one can tell a difference in gameplay from this "huge" lead in a strategy game where a last-generation card is getting 70 fps. A strategy game is not like a racing game or an FPS, where you can actually tell 90 fps apart from 138 fps. Where are the other DX11 games where NV badly beats AMD in a similar price bracket as a result of its multi-threaded DX11 driver?
What difference does it make that AC3 uses driver command lists? The GTX 770 cannot beat the R9 280X/7970 GHz Edition in that title.
With NV refusing to lower prices on the 770, the 4GB version has become a laughing stock, priced $150 more than R9 280X cards while offering no performance advantage in DX9-11 games.
Right now a PC gamer can buy GTX 770-level performance for only $280 and set aside $100-170 for the next GPU upgrade. Considering the 770 cannot win in Crysis 3, Metro: Last Light or Tomb Raider, NV are absolutely mad if they think PC gamers are so brand-brainwashed/ignorant as to pay 42-61% more over the HD 7970 GHz for the same average gaming performance.
Having 64 UAV slots is definitely the biggest gaming-related feature in DX11.1. I wonder why they don't mention it.
UAVOnlyRenderingForcedSampleCount and UAVs at all shader stages are also gaming-related features.
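For reference, a minimal sketch of how those caps are queried on the DX11.1 runtime (assumes an already-created device; `device` and the function name are placeholders):

```cpp
#include <d3d11_1.h>

// Sketch: query the DX11.1 gaming-related caps mentioned above.
void ReportDx111Caps(ID3D11Device* device)
{
    // The 64 UAV slots (D3D11_1_UAV_SLOT_COUNT == 64) and UAVs at
    // every shader stage require feature level 11.1 hardware.
    bool fl111 = device->GetFeatureLevel() >= D3D_FEATURE_LEVEL_11_1;

    // Optional caps such as UAVOnlyRenderingForcedSampleCount are
    // reported through CheckFeatureSupport on the 11.1 runtime.
    D3D11_FEATURE_DATA_D3D11_OPTIONS opts = {};
    HRESULT hr = device->CheckFeatureSupport(
        D3D11_FEATURE_D3D11_OPTIONS, &opts, sizeof(opts));

    bool uavOnlyForcedSamples =
        SUCCEEDED(hr) && opts.UAVOnlyRenderingForcedSampleCount;

    (void)fl111; (void)uavOnlyForcedSamples; // log/report as needed
}
```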
Uhm, your image is for a different resolution... so it's not really an apples-to-apples comparison. Although it's obvious driver changes have been made on both sides.
Got any source on AC3 using driver command lists?
All I know is that DICE has repeatedly said it's not working for them.
Far Cry 3 had it initially, but it was disabled in a patch because it was reported to cause stability issues (people have been able to re-enable it manually in the config file, though).
Here's a PDF about deferred contexts which mentions AC3 using them.
DICE said they couldn't get it working properly in BF3, but for BF4, who knows?
I don't think Far Cry 3 had it. Far Cry 3 had a DX11 multithreading option, which isn't the same thing as far as I know. I tried enabling it and it didn't make any difference.
Thanks for that PDF
It's a good read.
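Since driver command lists came up repeatedly above, here's a hedged sketch (placeholder names, error handling trimmed) of the D3D11 deferred-context path being discussed, including the cap bit that says whether the driver supports command lists natively or the runtime is emulating FinishCommandList behind the scenes:

```cpp
#include <d3d11.h>

// Sketch: record work on a deferred context, then play it back on
// the immediate context. Names are placeholders.
void RecordAndPlay(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    // DriverCommandLists == TRUE means the driver builds command
    // lists natively; FALSE means the runtime emulates them, which
    // is why the feature helps on some vendors' drivers and not others.
    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                &caps, sizeof(caps));

    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ... record state changes and draw calls on `deferred` here,
    // typically from a worker thread ...

    ID3D11CommandList* list = nullptr;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &list))) {
        immediate->ExecuteCommandList(list, TRUE);
        list->Release();
    }
    deferred->Release();
}
```

The design intent is the one the thread circles around: move draw-call recording off the main thread so the CPU side stops being the bottleneck, which only pays off when the driver reports native command-list support.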