- Oct 19, 2013
- 601
- 120
- 106
It seems every game release with DirectX 12 has been problematic or divisive in some way. Since the onus is now entirely on developers to do things that were traditionally handled by the driver, a certain level of skill is required just to get any benefit out of the API, and not every developer has the skills or resources.
Is it really better to have a low-level API that shovels more work onto developers, and hope they get everything right and come out with a performance boost? Or to have the vendor do most of the work in their driver, giving the developer more time for other things?
Another concern is architecture. GPUs do eventually change architecture, and when they do, will all the optimizations developers made for the then-current GPUs carry over to the new architectures? There is already evidence suggesting no: GCN 1.2 didn't do so hot in Mantle, with BF4 defaulting to D3D instead of Mantle on GCN 1.2 cards. I'm not sure if that was ever fixed, but even if it was, I doubt that will happen for every game that uses a low-level API.
Will DirectX 12 be relegated to DirectX 10-like status: used, but not the main API?