Azix
You may want to check the 780TI numbers again.
A stock TI then.
You know it totally looks like CDPR purposely made non-hairworks hair look worse than it could have. There is no reason colors and such should be different between them.
I think everyone is forgetting one thing. nV has already released the drivers for the game, so NV cards are optimised for it. AMD, on the other hand, hasn't. So, we should see at least a small improvement on most AMD cards in the next few weeks, and the comparison is not exactly apples to apples... I think that once that happens, we will see just how crippled Kepler cards are compared to all of AMD's cards.
Just smoke and mirrors.
Neither Kepler nor Fermi has been dumped yet though; for example, they will both get DX12 (the latest WHQL Windows 10 driver already supports Kepler).
I have Kepler cards in my PCs so I hope they will continue to get performance drivers for a long time.
Perhaps Kepler will get more game-specific optimizations now that the driver team has finished the DX12 driver.
Way to go nvidia. :thumbsdown:
This is Nvidia you are talking about. "The way the consumer is meant to be played." :whistle:
That's all good and well, but is it really acceptable to have to wait for a driver release weeks after a game as big as this one is launched? Not saying it will take that long, but AMD should have game-ready drivers day one for this game. They still have time, let's hope they deliver.
And since I know people will say AMD probably never got the game because GameWorks and blah, blah, blah... Unless someone in the know says something like that, I don't believe it.
Well, they have time. As for the rest, I believe that they don't get access, or at least not the same level of access, but I have no way of knowing for sure. It's just a hunch.
Just smoke and mirrors.
Way to go nvidia. :thumbsdown:
Why am I not surprised that shintaiDK is just dodging the bullets in this thread and downplaying all of this.
Rather than linking that crap clickbait site, go to the source: http://www.pcper.com/news/Graphics-Cards/NVIDIA-Under-Attack-Again-GameWorks-Witcher-3-Wild-Hunt
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
This is in contrast to AMD's Gaming Evolved program, which the company claims puts no restrictions on developers to optimize game code for any of its competitors. In fact, AMD actively worked to optimize its TressFX technology for Nvidia hardware, to the point where it performed equally well on both. TressFX is a hair physics simulation technology from AMD that's comparable to Nvidia's HairWorks, which produces hair and fur effects. However, unlike HairWorks, it performs equally well on both AMD and Nvidia hardware, partly because the source code is publicly available and has been optimized for Nvidia as mentioned above, and partly because the code base is quite efficient to begin with.
In contrast to GameWorks code which Nvidia only provides to its co-marketing game developer partners, AMD makes the source code to all visual effects in its library available publicly and to everyone for free, including Nvidia.
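For anyone curious what "hair simulation" actually involves under the hood: TressFX-style systems treat each strand as a chain of points that a compute shader integrates every frame, then re-applies distance constraints so segments keep their rest length. Here's a minimal CPU-side Python sketch of that constraint-relaxation scheme (this is just the general idea, not TressFX's actual code; all names and constants here are illustrative):

```python
# Illustrative sketch of one simulation step for a single hair strand,
# using verlet integration plus distance-constraint relaxation.

GRAVITY = -9.8
SEG_LEN = 0.1    # rest length of each hair segment

def step_strand(pos, prev, dt=0.016, iters=4):
    """Verlet-integrate a strand (list of [x, y] points), then relax
    distance constraints so segments keep their rest length. The root
    (index 0) stays pinned to the scalp."""
    n = len(pos)
    # 1. Verlet integration: x' = x + damping*(x - x_prev) + a*dt^2
    for i in range(1, n):                      # root stays fixed
        vx = (pos[i][0] - prev[i][0]) * 0.95   # 0.95 = simple damping
        vy = (pos[i][1] - prev[i][1]) * 0.95
        prev[i] = pos[i][:]
        pos[i] = [pos[i][0] + vx, pos[i][1] + vy + GRAVITY * dt * dt]
    # 2. Constraint relaxation: pull each segment back toward SEG_LEN
    for _ in range(iters):
        for i in range(n - 1):
            dx = pos[i + 1][0] - pos[i][0]
            dy = pos[i + 1][1] - pos[i][1]
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = (d - SEG_LEN) / d
            if i == 0:                         # root pinned: move tip only
                pos[1][0] -= dx * corr
                pos[1][1] -= dy * corr
            else:                              # split correction evenly
                pos[i][0] += dx * corr * 0.5
                pos[i][1] += dy * corr * 0.5
                pos[i + 1][0] -= dx * corr * 0.5
                pos[i + 1][1] -= dy * corr * 0.5
    return pos

# A 4-point strand starting out horizontal, rooted at the origin
strand = [[0.0, 0.0], [0.1, 0.0], [0.2, 0.0], [0.3, 0.0]]
history = [p[:] for p in strand]
for _ in range(60):
    step_strand(strand, history)
```

After 60 steps the strand has drooped under gravity while the segments stay close to their rest length. The point is that this is almost pure compute work, which is why how well a card runs it tracks its compute performance rather than its game benchmarks.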
Member call-outs are against the rules on this forum. But the answer should be really obvious: NVIDIA will prioritize and optimize for Maxwell as it's the current and best-selling architecture. Furthermore, Kepler has most likely already been maxed out, and there's not much more to extract from it without a lot of dedicated resources. AMD, on the other hand, continues to optimize for its current GCN because it's all they have available. What else are they going to optimize for in their drivers?
So you are saying it's perfectly fine for a GTX 780 Ti to perform on par with a mid-level AMD card (i.e. the R9 285) in The Witcher 3, but blow a 285 out of the water in any other game?
How can you defend a company that is screwing over its OWN CUSTOMERS by trying to make them think they need to upgrade their 1.5-year-old card, when they actually do not, as the poor performance is a driver issue only?
I wouldn't be surprised if GCN eventually edges out Maxwell as well, despite losing in current games.
Generally, AMD's architectures have been more forward looking than nvidia. Nvidia optimizes for the games that are out now and coming in the near future, and their proprietary effects work around the strengths of their latest architecture.
Kepler sucked at compute. GameWorks' effects are all compute-based now, leveraging Maxwell's strength there. AMD is pretty good at compute too, so they're not as badly impacted as Kepler.
When Pascal comes out, I'm sure nvidia will create effects optimized for that architecture, and Maxwell will fall short in some new way. The Witcher 3 and other games make use of DirectCompute.
Take a look at this and look at the directcompute results:
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/20
Does the rough performance of the nvidia cards look familiar? The little Kepler results are all well behind the early GCN cards. Only big Kepler is able to beat the 7970.
Nvidia just doesn't build forward looking cards. They have enough influence on games to dictate when new features become prevalent, and heavily optimize their drivers around the deficiencies of their cards.
I wouldn't be surprised if GCN eventually edges out Maxwell as well, despite losing in current games. It's been a trend since the start of programmable cards between ATI and Nvidia. Radeon 8500 was slower than Geforce3/4 but did better in games later (and DirectX8.1 made a very noticeable visual difference, closer to DX9 in the games it was used in than DX8). Radeon 9700 pro started out faster than Geforce FX, and completely destroyed it once FP24/32 shaders were used. Radeon X1800/X1900 series held up better than the Geforce 7 series, and now GCN is beating Kepler.
if anyone here is the owner of a 780 ti, you have my utmost sympathy. you got screwed over hard.
Wow, look at how much the 780 tanks; the 960 is quite a bit faster.
Kepler is dead, long live Maxwell!
The R9 290X matching a highly boosted 970 model is also very nice to see; PrjRed did not neglect AMD performance.
Here's the bench at ultra minus NV features (Hairworks & HBAO+). Orange = 1080p. Yellow = 1440p.
The 780 & 770 are cut off; both are below the 960!!
Here's the bigger chart 1080p only:
That 780 Jetstream is no slouch; reviews show its in-game boost clocks are ~1.175GHz out of the box. It's sad to see it destroyed by a lowly 960/285.
The Phantom 780Ti also has a very high factory OC.
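Just to put numbers on how wrong that result looks on paper: peak FP32 throughput is roughly shader cores × 2 FLOPs (one FMA) × clock. A quick back-of-the-envelope comparison using the cards' public core counts and roughly those boost clocks (this deliberately ignores architectural differences, which is exactly the point being argued):

```python
def peak_tflops(cores, clock_ghz):
    """Theoretical FP32 peak in TFLOPS: cores * 2 FLOPs (FMA) per clock."""
    return cores * 2 * clock_ghz / 1000.0

# GTX 780 (Kepler): 2304 CUDA cores, ~1.175 GHz boost on that Jetstream card
gtx780 = peak_tflops(2304, 1.175)
# GTX 960 (Maxwell): 1024 CUDA cores, ~1.18 GHz typical boost
gtx960 = peak_tflops(1024, 1.18)

print(f"GTX 780: {gtx780:.2f} TFLOPS")   # ~5.41
print(f"GTX 960: {gtx960:.2f} TFLOPS")   # ~2.42
print(f"ratio:   {gtx780 / gtx960:.2f}x")
```

On raw throughput the 780 has over twice the 960's theoretical peak, so a 960 finishing ahead of it in this one game comes down to software and per-architecture optimization, not the hardware.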
GTX960 faster than GTX780
LOL
I am pretty curious how Hawaii will fare versus Maxwell in DX12 games