Asynchronous Spacewarp
'''September 2018''': Oculus announced ASW 2.0 at Oculus Connect 5, introducing [[Positional Timewarp]] (PTW), which leverages application-provided [[depth buffer]]s to account for the user's translational movement in addition to rotational movement.<ref name="ovr-news">[https://www.ovrnews.com/asynchronous-spacewarp-2-0-will-help-vr-run-dramatically-better-on-rift/ Asynchronous Spacewarp 2.0 Will Help VR Run Dramatically Better on Rift - OVR News]</ref>
'''April 2019''': ASW 2.0 was released to all Rift users via a software update, promising better accuracy and lower latency in reprojected frames, with far fewer motion-estimation artifacts in VR scenes.<ref name="roadtovr-asw2">[https://www.roadtovr.com/oculus-launches-asw-2-0-asynchronous-spacewarp/ Oculus Launches ASW 2.0 with Positional Timewarp to Reduce Latency - Road to VR]</ref>
==How Does Asynchronous Spacewarp Work?==
===Eight-Step Pipeline===
ASW operates through an eight-step pipeline executed asynchronously by the Oculus runtime:<ref name="asw2-guide">[https://developers.meta.com/horizon/blog/developer-guide-to-asw-20/ Developer Guide to ASW 2.0 - Meta Horizon OS Developers]</ref>
# Captures textures from previous and current frames
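The later steps of the pipeline center on comparing those two captured frames to estimate per-block motion vectors, then extrapolating that motion forward to synthesize the in-between frame. A heavily simplified, hypothetical sketch of the block-matching idea (illustration only: the real runtime uses GPU hardware motion estimation, not per-pixel Python loops):

```python
# Simplified sketch of ASW-style motion estimation and extrapolation.
# Hypothetical illustration only; not the Oculus runtime's actual code.

def best_match(prev, curr, bx, by, size, search):
    """Find the (dx, dy) displacement that best maps a block of `prev` onto `curr`."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            err = 0
            for y in range(size):
                for x in range(size):
                    err += abs(curr[by + dy + y][bx + dx + x] - prev[by + y][bx + x])
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best

# Two 8x8 grayscale frames: a bright 2x2 square moves 2 px right between frames.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in (3, 4):
    for x in (2, 3):
        prev[y][x] = 255
    for x in (4, 5):
        curr[y][x] = 255

dx, dy = best_match(prev, curr, 2, 3, 2, 2)
# ASW inserts the synthetic frame midway between real frames, so it
# advances content by roughly half the estimated motion vector.
half_step = (dx / 2, dy / 2)
print((dx, dy), half_step)  # → (2, 0) (1.0, 0.0)
```

The search here is exhaustive over a tiny window; production motion estimators work on encoded macroblocks with far cheaper hardware-assisted search.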
==ASW 2.0==
ASW 2.0, released in April 2019, represents a significant upgrade over the original ASW (now referred to as ASW 1.0). It combines ASW's frame extrapolation with Positional Timewarp (PTW), which uses depth information to correct for head translations and parallax effects, resulting in better temporal stability and lower latency.<ref name="asw2-guide"/><ref name="4experience">[https://4experience.co/welcome-to-oculus-asw-2-0/ Oculus ASW 2.0 is finally with us! | 4Experience.co]</ref>
===Key Improvements===
With access to the depth buffer, ASW 2.0 performs depth-aware frame generation that correctly handles [[parallax]] compensation, ensuring near objects and distant objects move at appropriate relative speeds. The system uses depth discontinuities to detect [[disocclusions]] (newly visible areas behind moving objects) and achieves geometric accuracy by warping pixels in 3D space rather than 2D screen space.<ref name="asw2-guide"/>
The depth buffer enables true 3D reprojection through the mathematical transformation P_new = K × [R|t] × D × K^(-1) × P_old, where K represents the camera intrinsic matrix, R the rotation matrix, t the translation vector, and D the depth value from the [[z-buffer]].<ref name="asw2-guide"/>
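The transformation reads right to left: K^(-1) unprojects the old pixel to a ray, D scales it to the 3D point, [R|t] applies the head-pose change, and K reprojects to the new pixel. A worked sketch with illustrative pinhole-camera values (hypothetical numbers, not SDK code) shows why depth matters, since near points shift on screen far more than distant ones under the same head translation:

```python
# Sketch of depth-aware reprojection: P_new = K · [R|t] · D · K⁻¹ · P_old.
# Intrinsics are hypothetical illustrative values; a real runtime reads
# per-pixel depth D from the application's z-buffer.

def matvec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

f, cx, cy = 500.0, 320.0, 240.0                      # pinhole intrinsics
K     = [[f, 0, cx], [0, f, cy], [0, 0, 1]]
K_inv = [[1/f, 0, -cx/f], [0, 1/f, -cy/f], [0, 0, 1]]

def reproject(u, v, depth, R, t):
    ray = matvec(K_inv, [u, v, 1.0])                 # K⁻¹ · P_old: pixel -> ray
    p = [depth * c for c in ray]                     # D: scale to the 3D point
    p = [matvec(R, p)[i] + t[i] for i in range(3)]   # [R|t]: new head pose
    q = matvec(K, p)                                 # K: project back to pixels
    return q[0] / q[2], q[1] / q[2]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]                # pure translation, no rotation
# Head moves 10 cm right: the point 1 m away shifts 50 px on screen,
# the point 10 m away only 5 px -- the parallax that ASW 2.0 can
# reproduce and depth-less ASW 1.0 cannot.
near = reproject(320, 240, 1.0,  I, [-0.1, 0, 0])
far  = reproject(320, 240, 10.0, I, [-0.1, 0, 0])
print(near, far)  # → (270.0, 240.0) (315.0, 240.0)
```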
{| class="wikitable sortable" style="text-align: center"
! Aspect !! ASW 1.0 !! ASW 2.0
|-
| Compatibility || Any eye buffer submission || Needs depth and metadata
|}<ref name="asw2-guide"/>
For ASW 2.0, applications must provide depth buffers in compatible formats (e.g., OVR_FORMAT_D32_FLOAT without [[MSAA]]) and match resolutions between color and depth buffers. [[Unity (game engine)|Unity]] integration requires only a single checkbox to enable depth buffer submission, while [[Unreal Engine]] enables it by default through Oculus integrations.<ref name="asw2-guide"/> If depth data is unavailable, the runtime falls back to ASW 1.0 behavior.<ref name="4experience"/>
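The requirements above amount to a simple gate in the runtime: a usable depth buffer upgrades the frame to ASW 2.0, anything else falls back to ASW 1.0. A hypothetical helper mirroring those stated checks (illustrative only; the real decision logic lives inside the Oculus runtime and is not published in this form):

```python
# Hypothetical validation mirroring the stated ASW 2.0 requirements:
# a D32_FLOAT depth buffer without MSAA whose resolution matches the
# color buffer. Not SDK code.

def select_asw_mode(depth_format, msaa_samples, color_res, depth_res):
    """Return the ASW mode the runtime could use for this frame submission."""
    if depth_format is None:
        return "ASW 1.0"                      # no depth data submitted at all
    if depth_format != "OVR_FORMAT_D32_FLOAT":
        return "ASW 1.0"                      # incompatible depth format
    if msaa_samples > 1:
        return "ASW 1.0"                      # MSAA depth buffers not accepted
    if color_res != depth_res:
        return "ASW 1.0"                      # color/depth resolutions must match
    return "ASW 2.0"

print(select_asw_mode("OVR_FORMAT_D32_FLOAT", 1, (1344, 1600), (1344, 1600)))  # → ASW 2.0
print(select_asw_mode(None, 1, (1344, 1600), (1344, 1600)))                    # → ASW 1.0
```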
==Requirements==
| '''Object Disocclusion''' || A shimmering, stretched, or trailing "halo" appears on the trailing edge of a moving object || As an object moves, it reveals the background behind it. The ASW algorithm has no information about what this newly revealed area should contain, so it stretches nearby background pixels to fill the void || A character running past a wall, the user's hand moving in front of their face, or looking at a cockpit frame while turning<ref name="meta-blog-asw"/>
|-
| '''Warping / Wobble''' || Edges of objects, particularly high-contrast ones, appear to ripple, warp, or "wobble" like jelly during movement || The motion vector estimation is imperfect. The algorithm incorrectly identifies the movement of a block of pixels, causing it to be reprojected to a slightly wrong position in the synthetic frame || Looking out the side of a fast-moving vehicle, rapid head movements, edges of a cockpit canopy against the sky<ref name="dcs-forum">[https://forum.dcs.world/topic/209496-40fps-with-rift-s-is-this-normal/ DCS Forums - ASW Discussion]</ref>
|-
| '''Pattern Mismatch''' || Fine, repeating patterns (like fences, grilles, or tight textures) can break up or shimmer during motion || The algorithm struggles to track individual elements within a repeating pattern. It may incorrectly match a part of the pattern from one frame with a different but similar-looking part in the next frame || Running alongside a chain-link fence, looking at a finely-grated floor while moving<ref name="meta-blog-asw"/>
#'''Provide quality settings that are easy for users to understand''': Simple Low/Medium/High quality settings allow users to find a preferred quality sweet spot. Esoteric or hard-to-understand settings will result in users poorly tuning their application settings and having a negative experience with your software.<ref name="meta-blog-asw"/>
#'''For ASW 2.0, submit depth buffers''': Use the ''ovrLayerEyeFovDepth'' layer type instead of the basic ''ovrLayerEyeFov'', along with projection matrix parameters via ''ovrTimewarpProjectionDesc'' and world scale through the ''ovrViewScaleDesc'' struct.<ref name="asw2-guide"/>
===Testing Recommendations===
The sample projects OculusRoomTiny and OculusWorldDemo provide reference implementations for depth buffer submission. OculusWorldDemo includes tools to toggle depth submission and adjust frame time to decrease the frame rate, navigated via the Tab-key menu → Timewarp → Layers → Main Layer. Performance monitoring displays FPS with '''Ctrl+F''', showing green numbers at the native refresh rate and a red "45 FPS" when ASW activates.<ref name="asw2-guide"/>
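The green/red readout reflects a simple relationship: once ASW engages, the application locks to half the native refresh rate. A hypothetical helper expressing that check (the demo's actual display logic is internal; the tolerance value here is an assumption):

```python
# Hypothetical sketch of the relationship behind the FPS readout:
# green at the native refresh rate, red at half rate once ASW engages.

def classify_fps(measured_fps, refresh_hz, tolerance=2.0):
    if abs(measured_fps - refresh_hz) <= tolerance:
        return "native"         # shown in green
    if abs(measured_fps - refresh_hz / 2) <= tolerance:
        return "asw-half-rate"  # shown in red, e.g. "45 FPS" on a 90 Hz Rift
    return "unlocked"           # neither locked rate: ASW off or app struggling

print(classify_fps(90.0, 90.0))  # → native
print(classify_fps(45.0, 90.0))  # → asw-half-rate
print(classify_fps(72.0, 90.0))  # → unlocked
```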
==As a Consumer, How to Take Advantage of ASW==
* '''Normal operation without ASW''': 20-22ms motion-to-photon latency
* '''ASW 1.0 active''': Approximately doubles latency to 40-44ms
* '''ASW 2.0 with PTW''': Significantly reduces both rotational and positional latency compared to ASW 1.0<ref name="asw2-guide"/>
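The ASW 1.0 figure follows from half-rate rendering: the application renders only every other refresh, so displayed content can be roughly one extra real-frame interval old. A back-of-the-envelope check, assuming a 90 Hz Rift display:

```python
# Back-of-the-envelope check of the latency figures above, assuming
# a 90 Hz display. With ASW 1.0 the app renders at 45 Hz, so content
# is extrapolated across one extra ~22.2 ms interval, roughly doubling
# the 20-22 ms baseline to the quoted 40-44 ms.

refresh_hz = 90.0
frame_ms = 1000.0 / refresh_hz            # ~11.1 ms per displayed frame
baseline_ms = (20.0, 22.0)                # normal motion-to-photon latency

asw_render_hz = refresh_hz / 2            # app renders every other frame
asw_interval_ms = 1000.0 / asw_render_hz  # ~22.2 ms between real frames

doubled_ms = tuple(2 * b for b in baseline_ms)
print(round(frame_ms, 1), round(asw_interval_ms, 1), doubled_ms)
```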
Meta claims that "even apps using ASW 2.0 have lower head latency than any app on Quest without ASW."<ref name="asw2-guide"/>
==Related Technologies==
===Synchronous Spacewarp (SSW)===
[[Synchronous Spacewarp]] (SSW) is a proprietary frame-smoothing technology developed for the [[Virtual Desktop]] application, which streams PC VR games wirelessly to Quest headsets. Its key innovation is performing frame synthesis calculations on the Quest headset itself, leveraging the [[Qualcomm]] [[Snapdragon XR2]] chip.<ref name="qualcomm">[https://www.qualcomm.com/developer/blog/2022/09/virtual-boost-vr-rendering-performance-synchronous-space-warp Virtual Boost in VR Rendering Performance - Qualcomm Developer Blog]</ref>
===Other Platform Technologies===
* '''[[SteamVR]] Motion Smoothing''': [[Valve Corporation|Valve's]] implementation for headsets like the [[Valve Index]] and [[HTC Vive]]. Operates similarly to ASW 1.0, forcing half-rate rendering and synthesizing intermediate frames.<ref name="roadtovr-smoothing">[https://www.roadtovr.com/steamvr-motion-smoothing-asw-reprojection/ SteamVR Motion Smoothing - Road to VR]</ref>
* '''[[Windows Mixed Reality]] Motion Reprojection''': Microsoft's technology combining spatial reprojection (correcting for head pose) and temporal reprojection (synthesizing new frames based on motion vectors).<ref name="wmr-docs">[https://docs.microsoft.com/en-us/windows/mixed-reality/ Windows Mixed Reality Documentation - Microsoft]</ref>
==References==
<references/>
[[Category:Terms]]
[[Category:Technical Terms]]
[[Category:Virtual Reality Technologies]]