Sniper Ghost Warrior 3: the RX 580 beats the GTX 1060 by 10-15 fps across the board. Prey: the 1060 is slightly ahead of the RX 580. Both are CryEngine titles.
One thing is for sure: NVIDIA's dominance in CryEngine is not what it used to be.
Can confirm this game runs fantastic. Also super immersive, to me at least. It's the same kind of suspenseful, thrilling story and gameplay that BioShock 1 nailed, and before that, System Shock 2. Would definitely recommend.
Looks like at 1:37 Nvidia is not rendering most of the lighting in the room (table, floor, ...). Bug or cheating?
Whoa... Looks like it was missed entirely. There's some odd behaviour too, it looks like, in relation to frame pacing.
Normally I'd say accidentally missed, but Nvidia always specifically says "game ready", so I have to go with missed on purpose. Curious where else stuff might be dropped in complicated scenes to help ease the GPU strain.
It might explain why carfax83 finds the game looking dated. Maybe he's missing a lot of graphical eye candy...
Loved the original Prey, and this one looks awesome. Runs awesome too. Can anyone confirm surround / eyefinity / ultrawide compatibility?
LoL...
On a more serious note, now I'm A LOT more curious whether other effects are not being rendered. This wasn't just badly rendered, it just wasn't there! As in discarded by the driver, because it obviously was there in the game code, since the AMD driver picked it up.
Need at least a second confirmation, because drivers discarding effects is a big deal; it reminds me of the early days!
Applications (including games) do not call driver methods directly. Applications use APIs (such as DirectX and OpenGL). API calls made by the software can be implemented by calling (one or more) driver methods. Drivers are software just like everything else. GPU driver software basically translates high-level concepts into a format understood by the GPU hardware, and performs the required bit transfers to the GPU over the PCI Express bus (it might call another driver to perform the actual transfer).
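To make that layering concrete, here is a minimal C++ sketch using plain Direct3D 11 calls (Windows/MSVC assumed, error handling mostly omitted). It's only an illustration of the point above: the application speaks entirely in API calls, and the vendor's driver is loaded and invoked behind them.

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")   // MSVC-specific; link d3d11 explicitly otherwise

int main() {
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;

    // The vendor's user-mode driver is selected and loaded behind this one
    // API call; the application never names or loads the driver itself.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, &context);
    if (FAILED(hr)) return 1;

    // API-level work submission: the runtime validates the call and forwards
    // it to the driver, which turns it into GPU-specific command buffers.
    // (With no pipeline state bound this draw does nothing useful; it is
    // only here to show that the app issues API calls, not driver calls.)
    context->Draw(3, 0);

    context->Release();
    device->Release();
    return 0;
}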
The shader compiler is only one part of the driver. It translates platform-independent shader code (HLSL / GLSL) into hardware-specific microcode. DirectX actually has a built-in HLSL -> bytecode compiler. The driver thus doesn't need to parse the HLSL code (text) itself, only translate that general-purpose bytecode into hardware-specific microcode. The DirectX shader compiler does the most generic optimizations for the shader code, but the driver's shader compiler must do additional optimizations on top of them, because every GPU architecture is different.
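As a rough illustration of those two stages, here is a hedged C++ sketch using the stock D3DCompile function from d3dcompiler.h. The entry point "main" and target "ps_5_0" are just example choices; the second stage is not something the app can call directly, it happens inside the driver.

#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

ID3DBlob* CompilePixelShader(const char* hlslSource, size_t sourceLen) {
    ID3DBlob* bytecode = nullptr;   // generic DXBC bytecode, not GPU microcode
    ID3DBlob* errors   = nullptr;

    // Stage 1: HLSL text -> portable bytecode, with generic optimizations.
    D3DCompile(hlslSource, sourceLen,
               nullptr,              // source name (only used in error messages)
               nullptr, nullptr,     // no macros, no #include handler
               "main", "ps_5_0",     // example entry point and shader model
               D3DCOMPILE_OPTIMIZATION_LEVEL3, 0,
               &bytecode, &errors);

    if (errors) errors->Release();

    // Stage 2 happens inside the driver: when this blob is later passed to
    // ID3D11Device::CreatePixelShader, the driver's own shader compiler lowers
    // the bytecode to architecture-specific microcode and optimizes it again.
    return bytecode;
}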
Most GPU APIs hide GPU memory management from the application. The application cannot directly modify GPU memory mappings; usually it's the responsibility of the driver to handle this. APIs support different kinds of resources. Some resources are static and some are temporary (temporary resources often need to have multiple copies in memory to hide CPU<->GPU latency). Memory management is not easy. Fragmentation of a limited GPU memory space is always an issue. Lifetime management (and transfers between CPU and GPU memory spaces) can be quite complicated as well.
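To illustrate the "multiple copies of temporary resources" point, here is a hypothetical, heavily simplified per-frame ring allocator in C++. The structure and names are invented for illustration only; a real driver or engine would add GPU fences, alignment, and defragmentation on top of something like this.

#include <cstddef>
#include <cstdint>

constexpr int kFramesInFlight = 2;    // how far the CPU may run ahead of the GPU

struct PerFrameRing {
    uint8_t* base      = nullptr;     // would point at a GPU-visible, persistently mapped buffer
    size_t   sliceSize = 0;           // bytes reserved per in-flight frame
    size_t   cursor    = 0;           // next free byte within the current frame's slice
    int      frame     = 0;           // index of the slice currently being filled

    // Begin a new frame: switch to the next slice. Its old contents are only
    // safe to overwrite once the GPU has finished that older frame (a fence
    // or wait would enforce this in a real implementation).
    void BeginFrame() {
        frame  = (frame + 1) % kFramesInFlight;
        cursor = 0;
    }

    // Grab 'bytes' of temporary memory (e.g. per-draw constants), valid for
    // this frame only. Returns nullptr if the frame's slice is exhausted.
    void* Allocate(size_t bytes) {
        if (cursor + bytes > sliceSize) return nullptr;
        void* ptr = base + frame * sliceSize + cursor;
        cursor += bytes;
        return ptr;
    }
};

With two slices, the CPU can fill frame N+1's temporary data while the GPU is still reading frame N's, which is exactly the latency-hiding trick described above.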
Doesn't the game have different stages of day/night? Could it be possible that one was in daylight, and therefore the "sun" lighting was there, and the other one was at night?
A new video from BoostClock confirms that the missing lighting is a day/night difference:
https://youtu.be/r5jmvavX7xI?t=78
I noticed that there was definitely a momentary stutter in some instances on the GTX 1060. There was even a brief freeze at 2:05.
EDIT: Upon closer inspection, stutters are very frequent on the GTX 1060 - any camera movement makes it very obvious. As an example, see the part where the player goes through the gate labelled "Security" at the beginning of the first sequence. This corroborates PCGH's findings.
http://www.pcgameshardware.de/Prey-2017-Spiel-57339/Specials/Benchmark-Test-Review-1227151/
Google translated:
According to our results, the GeForce driver 381.89 WHQL is the one to recommend: with the "Game Ready" driver adapted for Prey, there was heavy stuttering on all three of our test platforms. It is fairly inconspicuous at the start of the game, but in the later, larger areas it provokes very disruptive hitches every few seconds, which strongly spoil the fun of playing - the hitches are so large that they negatively affect the minimum fps in our measurements. Although the 382.05 WHQL delivers somewhat higher frame rates, we have therefore decided to use the older GeForce driver. A comparison with a GTX 1080 is shown in the following frametime graph. Nvidia has already been informed by us and is investigating the phenomenon. The Radeon driver, which is also optimized for Prey, shows no such problems.
That is an interesting read - even on a GTX 1080 there is stutter. It's strange that the driver made specifically for the game causes more stutter than the old one. The advice is to use the GeForce driver 381.89 WHQL.
Interesting, I haven't experienced any stutter so far and I'm on the latest driver. I've only played it for an hour or so though.