Guys, running this game @ max detail (maybe because I enabled Native Upscaling?) gives me 60 FPS no problem with my config.
i7-4770K @ 4.2
GTX 780
16GB
SBZ
running @ 1680x1050
Not to stir the pot, but compared to the Xbone version there is not a huge difference.
Better lighting and shadows and slightly better textures.
http://www.eurogamer.net/articles/digitalfoundry-2014-ryse-pc-face-off
[It's] at this higher resolution that we began to run into performance issues that brought our frame-rate down, necessitating a 30fps lock for a consistent update. It's clear now why Crytek went out of its way to note that the 4K experience is designed for 30fps when using high-end GPUs, but thanks to the beautiful post-processing and a superb motion blur implementation, it still looks excellent at the 'cinematic' frame-rate.
I wonder how much blowback Digital Foundry will get for this quote:
^^ I see this Ubi PR is sticking!
Is it the first game to use compute (double precision?) for rendering?
most games that Nvidia puts their GameWorks crap into turn out to be unoptimized garbage. And you can't always look at just framerate, as games can play like crap even if the average framerate looks OK.

You are right. How could I expect a great-performing game as an Nvidia user? My bad. :hmm:

So I guess all this talk about Watch Dogs and GameWorks was just FUD? Okay, I will remember this.
Not to stir the pot, but compared to the Xbone version there is not a huge difference.
Better lighting and shadows and slightly better textures.
http://www.eurogamer.net/articles/digitalfoundry-2014-ryse-pc-face-off
There's AMD biased, and then there's this. Holy crap, this game is one-sided.
Are you? The 290X is beating the 980 by 15-18%, whereas in most other games even the 970 beats the 290X. A 290X is doing about 40% better in this game than it does on average.

You're looking at the same benches I am? I see a 290X and GTX 980 tied for the top, then the GTX 780 Ti and GTX 970 close after, followed by the GTX 780 and then GTX 770... and so on. Nothing sounds way out of order. The GTX 980 is generally faster than a 290X, but not so much so that some games being faster on the 290X is unreasonable.
Are you? The 290X is beating the 980 by 15-18%, whereas in most other games even the 970 beats the 290X. A 290X is doing about 40% better in this game than it does on average.
That is an overclocked 980 too. And yes, this is way out of the ordinary.

Aren't those overclocked 290X's? I dunno, maybe the GTX 980 is faster than I thought, but things don't look that out of the ordinary to me.
That is an overclocked 980 too. And yes, this is way out of the ordinary.
What part of "out of the ordinary" are you confused about?

Because it's a given that the 980 should be systematically better in all games and resolutions under all possible settings..?
http://www.hardware.fr/getgraphimg.php?id=77&n=7
http://www.hardware.fr/articles/928-16/benchmark-hitman-absolution.html
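For what it's worth, the "~40% better than it does on average" figure is easy to sanity-check with a couple of lines. The average-lead number below is an assumption for illustration, not taken from any benchmark:

```python
# Hypothetical sanity check of the "~40% above its usual standing" claim.
# ASSUMPTION: on average across games, the GTX 980 leads the 290X by ~20%.
avg_290x_vs_980 = 1 / 1.20   # 290X at ~0.83x the 980's framerate on average
ryse_290x_vs_980 = 1.17      # 290X ~17% ahead of the 980 in Ryse (per the posts above)

swing = ryse_290x_vs_980 / avg_290x_vs_980
print(f"290X is running {swing:.2f}x its usual relative performance")
```

With those assumed numbers the swing comes out around 1.4x, which is where the "about 40%" in the posts above would come from; a different assumed average lead shifts it accordingly.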
If it were compute, then the 780 Ti would be doing worse. Driver issues or bandwidth, I would say.
We are making heavy use of some DX11 features like Compute Shaders, which, however, perform better on some hardware architectures than others, so there will be some noticeable performance gaps between different desktop GPUs.