https://developer.oculus.com/blog/asynchronous-timewarp-on-oculus-rift/
http://techreport.com/news/29914/oculus-sdk-1-3-brings-asynchronous-timewarp-to-windows
tl;dr: The Oculus PC SDK v1.3 implements Asynchronous Timewarp (ATW) on Windows. With the latest drivers and hardware, we reduce judder, deliver consistent low latency, and improve efficiency. All apps benefit from this without having to do anything special.
Asynchronous Timewarp (ATW)
We can deal with varying workloads and background apps by decoupling timewarp from the rendering loop and running it independently and asynchronously. By relying on CPU and GPU preemption, we should be able to pick up the latest available application frame and timewarp it before each display scanout at 90Hz.
Interestingly they use the CPU for what is essentially a GPU compute task.
Windows and several recent GPUs have had basic support for GPU preemption, but when we first tried to use it, it did not work well in practice. Over the last year, we have worked closely with Microsoft, NVIDIA, and AMD to change OS GPU scheduling, GPU command processor microcode, and GPU kernel driver design to enable ATW. Graphics driver VR extensions in the form of AMD’s Liquid VR and NVIDIA’s VRWorks were developed to support this.
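The decoupling the quote describes boils down to a "latest frame" mailbox: the app renders at whatever rate it manages, while an independent compositor loop grabs the newest finished frame just before each display scanout. Here is a minimal sketch of that pattern; all names and timings are illustrative assumptions, not the Oculus SDK's actual implementation.

```python
import threading
import time

class FrameMailbox:
    """Single-slot mailbox: readers always see the newest published frame."""
    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def publish(self, frame):
        with self._lock:
            self._frame = frame

    def latest(self):
        with self._lock:
            return self._frame


def app_render_loop(mailbox, n_frames, frame_time):
    # Simulated app: rendering may be slower than the display rate.
    for i in range(n_frames):
        time.sleep(frame_time)          # pretend to render a frame
        mailbox.publish({"id": i})


def atw_compositor(mailbox, scanouts, vsync):
    # Runs independently at the display rate (90 Hz on the Rift).
    shown = []
    for _ in range(scanouts):
        time.sleep(vsync)
        frame = mailbox.latest()
        if frame is not None:
            # A real compositor would reproject ("timewarp") this frame
            # with the latest head pose; here we just record which
            # frame id was scanned out.
            shown.append(frame["id"])
    return shown


mailbox = FrameMailbox()
app = threading.Thread(target=app_render_loop,
                       args=(mailbox, 10, 0.02))      # ~50 Hz app
app.start()
shown = atw_compositor(mailbox, scanouts=15, vsync=1 / 90)  # ~90 Hz display
app.join()

# Frame ids appear in order; when the app misses a vsync, the
# compositor simply re-warps the previous frame instead of juddering.
print(shown)
```

The key point is that the compositor never blocks on the app: if no new frame arrived since the last scanout, it reprojects the old one, which is why every app benefits without code changes.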
I don't know how NVIDIA GPUs achieve parallel compute execution while graphics is rendering, but this is AMD's implementation in their latest drivers for the Oculus Rift.
http://techreport.com/news/29917/radeon-software-crimson-edition-16-3-2-is-rift-ready
https://community.amd.com/community/gaming/blog/2016/03/28/asynchronous-shaders-evolved
Time for some tech sites to go in-depth and measure motion-to-photon latency on the Rift, now that both LiquidVR and VRWorks are functional!
Ars released their Rift review today:
http://arstechnica.com/gaming/2016/...ift-expands-pc-gaming-past-the-monitors-edge/
Mastering nausea. It wouldn’t matter how comfortable the Rift is on your head if it made your stomach feel uncomfortable.
Oculus has spent years hammering a list of best practices for keeping players’ stomachs happy in virtual reality, and developers have by and large taken them to heart. For some games, this means only moving the player’s virtual perspective slowly or quickly “blinking” from one position to another with a quick fade to black.
AP journalists at CES noted that some Rift apps were more of an on-rails experience, to prevent rapid head motion. The above is another trick: fading scenes in and out when players move their heads quickly.
Simple air hockey game Shufflepuck Cantina, for instance, featured a distracting stairstep judder as I moved my head around its alien environs. Running around the world of Albino Lullaby felt a little disorienting as well without any fixed focal point to stay centered on. This effect will probably vary from person to person, too—if you’re the type to get sick watching first-person games on a 2D screen, you might be in trouble.