While the old vs. new Doom results are odd indeed, it's been quite a long time since we saw that video.
It is actually possible that parts of the new hardware features may not work and would require major reworking, which is not feasible for AMD for a variety of reasons. What if their entire tile-based rasterization approach doesn't work as expected, for example due to major bugs?
Of course we don't know that, but we don't have any facts pointing to a major increase in performance, which would be necessary to make Vega worthwhile. A 10% increase wouldn't be enough, and AMD's statements are ambiguous in light of the current controversy. Vega is bad PR at the moment, and if they knew about a major performance increase (which they should by now), they had better communicate it.
What I find strange, though, is that major features such as the new rasterization method do not seem to work as of now. You can't debug what does not work.
@Glo.
Nvidia gradually moved from largely hardware-based scheduling to a software-based method over the course of years, and they have far more resources than AMD to implement new features such as tile-based rasterization. Apparently, AMD has not yet produced a generalized software package that makes some significant features work in a stable fashion, which is worrying to say the least.
Maybe, and just maybe, it needs major work for each game title at the driver level (and maybe from the game devs themselves), which their wording could imply ("...much better optimized for all the top gaming titles..."). But that would not be good either, since AMD's resources are limited, meaning that a host of games could be stuck in slow "fallback" mode for ages.
Or...some hardware features are bugged, forcing AMD to disable them, which would be another kind of disaster.
Something else: did anyone try to test the same scene from Prey that they showed with dual Vega on a 1080 Ti? That was a more recent showing involving games.