Asynchronous Spacewarp
- See also: Timewarp and Application SpaceWarp
Introduction
Asynchronous Spacewarp (ASW) is a frame interpolation technology developed by Oculus VR (now Meta Platforms) for PC VR headsets such as the Oculus Rift. It applies animation detection, camera translation, and head translation to previous frames in order to predict and generate intermediate frames, thereby smoothing motion and allowing applications to run on lower-performance hardware.[1] ASW analyzes the differences between successive frames to extrapolate new frames, thus almost halving the CPU/GPU workload required for a given VR experience.[2]
ASW extends upon Asynchronous Timewarp (ATW), which only applies to the rotational tracking of the head. While ATW handles head rotation perfectly for static or distant scenes, ASW addresses scene animation, object movement, and head translation that ATW cannot handle.[2] The Rift operates at 90Hz. When an application fails to submit frames at 90Hz, the Rift runtime drops the application down to 45Hz with ASW providing each intermediate frame. This enables users to experience VR on hardware below the recommended specifications, as ASW can halve the required workload while preserving much of the visual quality.[1]
The introduction of ASW was pivotal for the consumer VR market: by allowing the Rift to run on lower-specification hardware than the original recommended specification, it reduced entry-level VR PC costs from roughly $1,500 to under $500 and expanded VR accessibility to mainstream audiences.[3] ASW tends to predict linear motion better than non-linear motion. If an application is dropping frames, developers can either reduce the rendering resolution or simply allow ASW to take over.
ASW became available with the release of the Oculus 1.10 runtime in November 2016. It is enabled on all hardware and systems that support the feature and is activated automatically for all applications, engaging whenever an application needs extra time for rendering. For developers, the Oculus Debug Tool provides controls for ASW for development purposes.
History
October 6-7, 2016: CEO Brendan Iribe unveiled Asynchronous Spacewarp at Oculus Connect 3 in San Jose, California, explaining that the technology analyzes the difference between the application's two previous frames to calculate spatial transforms for extrapolating new synthetic frames. The announcement coincided with a dramatic reduction in minimum VR system requirements, from NVIDIA GTX 970/AMD R9 290 and Intel i5-4590 processors down to GTX 960/RX 470 and Intel i3-6100/AMD FX-4350 chips.[4]
November 10, 2016: ASW 1.0 became available with Oculus Runtime 1.10 as an automatic feature requiring no developer implementation. The release enabled $499 "Oculus Ready" PCs from partners such as CyberPowerPC, addressing cost as the primary barrier to VR adoption.[5]
September 2018: Oculus announced ASW 2.0 at Oculus Connect 5, introducing Positional Timewarp (PTW), which leverages application-provided depth buffers to account for the user's translational movement in addition to rotational movement.[6]
April 2019: ASW 2.0 was released to all Rift users via a software update. This update promised better accuracy and lower latency in reprojected frames, resulting in far fewer motion estimation artifacts in VR scenes.[7]
How Does Asynchronous Spacewarp Work?
Core Mechanism
ASW extrapolates new frames from previous frames submitted by the VR application. When the Oculus Runtime detects that an application is consistently dropping frames, it automatically activates ASW. The runtime then instructs the application to render at half the headset's refresh rate (e.g., 45 FPS for a 90 Hz display), doubling the time budget the application has to render each frame from approximately 11.1 ms to 22.2 ms.[1]
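The timing arithmetic can be illustrated with a short sketch. The activation rule below is a hypothetical simplification for illustration only; the heuristic the Oculus runtime actually uses to decide when to engage ASW is internal to the compositor.

```cpp
#include <cstdio>

// Frame budgets at full rate versus ASW half rate for a 90 Hz headset.
// The activation rule below is a hypothetical simplification, not the
// Oculus runtime's actual internal heuristic.
int main() {
    const double refreshHz  = 90.0;
    const double fullBudget = 1000.0 / refreshHz;         // ~11.1 ms per frame
    const double halfBudget = 1000.0 / (refreshHz / 2.0); // ~22.2 ms per frame
    std::printf("Full-rate budget: %.1f ms, ASW half-rate budget: %.1f ms\n",
                fullBudget, halfBudget);

    // Hypothetical rule: if the app misses too many recent frame deadlines,
    // drop it to half rate and let ASW synthesize every other frame.
    const int missedFrames  = 7;   // missed deadlines in some recent window
    const int missThreshold = 5;
    const bool aswActive = missedFrames > missThreshold;
    std::printf("ASW engaged: %s\n", aswActive ? "yes" : "no");
    return 0;
}
```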
ASW works in tandem with ATW to cover all visual motion within the virtual reality experience. This includes character movement, camera movement, Touch controller movement, and the player's own positional movement. If the application falls behind the display's frame rate, the experience typically remains smooth and enjoyable.[2]
Eight-Step Pipeline
ASW operates through an eight-step pipeline executed asynchronously by the Oculus runtime:[8]
- Captures textures from previous and current frames
- Generates a "pre-warp frame" by applying Timewarp to the previous frame using the current frame's pose
- Converts both frames to GPU-video-encoder-friendly resources
- Sends textures to the GPU video encoder for correspondence analysis
- Extracts motion vectors that describe pixel movement between frames
- Post-processes motion vectors for frame extrapolation
- Packages and injects motion vectors into compositor layers
- Applies standard Timewarp and barrel distortion using ASW-injected layer content
The motion vector generation relies on block-based optical flow analysis, where the GPU video encoder hardware compares blocks of pixels between frames to establish correspondence. This creates a tiled vector field representing displacement (Δx, Δy) of image features between time t and t+1.[9]
As an example of this extrapolation, consider a scene in which a held gun moves from right to left across frames 0 and 1 (generated 1/45th of a second apart), and we want to generate an extrapolated frame from that movement (1/90th of a second after frame 1). ASW detects the movement of the gun and generates a new frame, which is displayed on behalf of the application.
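The two steps involved, motion-vector extraction and forward extrapolation, can be sketched in simplified CPU-side code. This is illustrative only: the real analysis runs on the GPU video encoder and the compositor's warp is considerably more sophisticated, and all function names and parameters below are hypothetical.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdlib>
#include <limits>
#include <vector>

// Simplified block-matching motion estimation and forward extrapolation on
// greyscale images, illustrating the idea behind ASW. The real implementation
// uses the GPU video encoder's motion-estimation hardware.
struct MotionVector { int dx, dy; };

// Sum of absolute differences between a block of `prev` at (px, py) and a
// block of `cur` at (cx, cy).
static double blockSAD(const std::vector<uint8_t>& prev, const std::vector<uint8_t>& cur,
                       int w, int px, int py, int cx, int cy, int block) {
    double sad = 0.0;
    for (int y = 0; y < block; ++y)
        for (int x = 0; x < block; ++x)
            sad += std::abs(int(prev[(py + y) * w + (px + x)]) -
                            int(cur [(cy + y) * w + (cx + x)]));
    return sad;
}

// For each block of the current frame, search a neighborhood of the previous
// frame for the best match and record the displacement (dx, dy) from t to t+1.
std::vector<MotionVector> estimateMotion(const std::vector<uint8_t>& prev,
                                         const std::vector<uint8_t>& cur,
                                         int w, int h, int block = 8, int radius = 8) {
    std::vector<MotionVector> field;
    for (int by = 0; by + block <= h; by += block) {
        for (int bx = 0; bx + block <= w; bx += block) {
            double best = std::numeric_limits<double>::max();
            MotionVector mv{0, 0};
            for (int dy = -radius; dy <= radius; ++dy) {
                for (int dx = -radius; dx <= radius; ++dx) {
                    int px = bx + dx, py = by + dy;
                    if (px < 0 || py < 0 || px + block > w || py + block > h) continue;
                    double sad = blockSAD(prev, cur, w, px, py, bx, by, block);
                    if (sad < best) { best = sad; mv = {-dx, -dy}; }
                }
            }
            field.push_back(mv);
        }
    }
    return field;
}

// Extrapolate a synthetic frame given motion measured between frames 0 and 1:
// push each block of frame 1 forward by a fraction of its displacement.
// Disoccluded regions simply keep frame 1's pixels, which is where the
// stretching artifacts described later come from.
std::vector<uint8_t> extrapolateFrame(const std::vector<uint8_t>& frame1,
                                      const std::vector<MotionVector>& field,
                                      int w, int h, int block = 8, double scale = 0.5) {
    std::vector<uint8_t> out = frame1;            // fallback fill for uncovered pixels
    const int blocksPerRow = w / block;
    for (int by = 0; by + block <= h; by += block) {
        for (int bx = 0; bx + block <= w; bx += block) {
            const MotionVector& mv = field[(by / block) * blocksPerRow + (bx / block)];
            int ox = bx + int(std::lround(mv.dx * scale));
            int oy = by + int(std::lround(mv.dy * scale));
            for (int y = 0; y < block; ++y)
                for (int x = 0; x < block; ++x) {
                    int tx = ox + x, ty = oy + y;
                    if (tx >= 0 && ty >= 0 && tx < w && ty < h)
                        out[ty * w + tx] = frame1[(by + y) * w + (bx + x)];
                }
        }
    }
    return out;
}
```

In this sketch, the scale factor of 0.5 corresponds to extrapolating half of the measured frame-to-frame motion, i.e. 1/90th of a second beyond frame 1 when the source frames are 1/45th of a second apart.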
When it comes to virtual reality, ATW and ASW are siblings and complement each other. Timewarp is great at accommodating head rotation; in fact, it is perfect for static images at a distance. For applications like 360 videos and distant features on the horizon, Spacewarp is unnecessary. Conversely, Spacewarp handles animated objects up close well, but it is not well suited to tracking head rotation.
With ASW, Oculus builds on ATW to produce the best virtual reality experience possible. Effective latency remains low, head tracking stays smooth as before, and moving elements within VR are now kept smooth as well.[2]
ASW 2.0
ASW 2.0, released in April 2019, represents a significant upgrade over the original ASW (now referred to as ASW 1.0). It combines ASW's frame extrapolation with Positional Timewarp (PTW), which uses depth information to correct for head translations and parallax effects, resulting in better temporal stability and lower latency.[8][10]
Key Improvements
With access to the depth buffer, ASW 2.0 performs depth-aware frame generation that correctly handles parallax compensation, ensuring near objects and distant objects move at appropriate relative speeds. The system uses depth discontinuities to detect disocclusions (newly visible areas behind moving objects) and applies geometric accuracy by warping pixels in 3D space rather than 2D screen space.[8]
The depth buffer enables true 3D reprojection through the mathematical transformation: P_new = K × [R|t] × D × K^(-1) × P_old, where K represents the camera intrinsic matrix, R the rotation matrix, t the translation vector, and D the depth value from the z-buffer.[8]
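For illustration, that per-pixel transformation can be written out as follows. This is a generic pinhole-camera sketch of the stated formula rather than Oculus compositor code; the intrinsics (fx, fy, cx, cy) and the convention that depth is the camera-space z value are assumptions of the example.

```cpp
#include <array>

// Illustrative depth-aware reprojection of one pixel:
//   P_new = K * [R|t] * D * K^-1 * P_old
// Unproject the pixel with the inverse intrinsics and its depth, apply the
// head-pose change (rotation R, translation t), then reproject with K.
// A simple pinhole model stands in for the real projection.
struct Vec3 { double x, y, z; };
struct Mat3 { double m[3][3]; };

static Vec3 mul(const Mat3& a, const Vec3& v) {
    return { a.m[0][0]*v.x + a.m[0][1]*v.y + a.m[0][2]*v.z,
             a.m[1][0]*v.x + a.m[1][1]*v.y + a.m[1][2]*v.z,
             a.m[2][0]*v.x + a.m[2][1]*v.y + a.m[2][2]*v.z };
}

// (u, v): pixel in the old frame; depth: z-buffer value in meters.
// Returns the pixel position of the same surface point under the new pose.
std::array<double, 2> reprojectPixel(double u, double v, double depth,
                                     double fx, double fy, double cx, double cy,
                                     const Mat3& R, const Vec3& t) {
    // Unproject: D * K^-1 * P_old -> camera-space point.
    Vec3 pCam{ (u - cx) / fx * depth, (v - cy) / fy * depth, depth };
    // Apply the rigid transform [R|t] between the old and new camera poses.
    Vec3 p = mul(R, pCam);
    p = { p.x + t.x, p.y + t.y, p.z + t.z };
    // Reproject: K * p, then perspective-divide by the new depth.
    return { fx * p.x / p.z + cx, fy * p.y / p.z + cy };
}
```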
| Feature | ASW 1.0 | ASW 2.0 |
|---|---|---|
| Motion Handling | Animation and basic tracking | Combines with PTW for rotation and translation |
| Timewarp Type | OTW (no depth) | PTW (with depth) or OTW fallback |
| Depth Buffer Usage | Not required | Required for full benefits |
| Parallax Correction | Handled by ASW | Mostly by PTW, reducing artifacts |
| Visual Artifacts | More prone to judder | 80% reduction in estimation artifacts |
| Worst-case Scenarios | Severe artifacts | 99% reduction in pattern artifacts |
| Latency | Reduces via ATW | Further lowered with PTW |
| Compatibility | Any eye buffer submission | Needs depth and metadata |
For ASW 2.0, applications must provide depth buffers in compatible formats (e.g., OVR_FORMAT_D32_FLOAT without MSAA) and match resolutions between color and depth buffers. Unity integration requires only a single checkbox to enable depth buffer submission, while Unreal Engine enables it by default through Oculus integrations.[8] If depth data is unavailable, the runtime falls back to ASW 1.0 behavior.[10]
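A hedged sketch of creating a matching depth swap chain with the LibOVR 1.x C API (Direct3D 11 path) is shown below. The helper function is not part of the SDK, and exact field usage should be verified against the Oculus PC SDK headers and samples.

```cpp
#include <d3d11.h>
#include <OVR_CAPI_D3D.h>   // LibOVR 1.x PC SDK, Direct3D 11 path

// Sketch: create a depth texture swap chain that meets the documented ASW 2.0
// constraints (D32_FLOAT, no MSAA, same size as the color buffer). Error
// handling and device creation are omitted; session, d3dDevice, width, and
// height are assumed to come from the surrounding application.
ovrTextureSwapChain createDepthChain(ovrSession session, ID3D11Device* d3dDevice,
                                     int width, int height) {
    ovrTextureSwapChainDesc desc = {};
    desc.Type        = ovrTexture_2D;
    desc.Format      = OVR_FORMAT_D32_FLOAT;  // depth format compatible with PTW
    desc.ArraySize   = 1;
    desc.Width       = width;                 // must match the color buffer size
    desc.Height      = height;
    desc.MipLevels   = 1;
    desc.SampleCount = 1;                     // no MSAA on the submitted depth buffer
    desc.BindFlags   = ovrTextureBind_DX_DepthStencil;

    ovrTextureSwapChain chain = nullptr;
    ovr_CreateTextureSwapChainDX(session, d3dDevice, &desc, &chain);
    return chain;
}
```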
Requirements
ASW requires the following:
Software Requirements
- Oculus Runtime 1.9 or later (1.10 for initial release)
- Windows 8 or later (Windows 10 recommended for best support)
- For NVIDIA, driver 373.06 or later
- For AMD, driver 16.40.2311 or later[1]
Hardware Requirements
Before the updated minimum specification was published, Oculus recommended the following GPUs for ASW testing:
| Manufacturer | Series | Minimum RAM | Minimum Model |
|---|---|---|---|
| NVIDIA | Pascal | 3GB | GTX 1060 |
| NVIDIA | Maxwell | 4GB | GTX 960 |
| AMD | Polaris | 4GB | RX 470 |
Minimum PC Specifications
The introduction of ASW changed the accessibility equation for PC VR:
- CPU: Intel Core i3-6100 or AMD FX-4350
- GPU: NVIDIA GTX 960 or AMD RX 470
- RAM: 8GB
- OS: Windows 8 or newer (Windows 7 lacks ASW support)[1]
These requirements are reduced from the pre-ASW minimums of an Intel i5-4590 and a GTX 970/R9 290, enabling VR-ready systems at $499 versus earlier $1,000+ entry points.[5]
Testing ASW
Developers and users can test ASW using the Oculus Debug Tool, located at C:\Program Files\Oculus\Support\oculus-diagnostics\OculusDebugTool.exe. Set the "Asynchronous Spacewarp" option to:
- Auto: Enables ASW when needed (default)
- Disabled: Turns off ASW
- Force 45fps, ASW disabled: Forces 45Hz without ASW (may cause judder)
- Force 45fps, ASW enabled: Forces 45Hz with ASW active[1]
Keyboard Shortcuts
While testing your application with ASW, you can switch between rendering modes:
- Control+Numpad1: Disables ASW and returns to the standard rendering mode
- Control+Numpad2: Forces apps to 45Hz with ASW disabled. Depending on the application, you are likely to experience judder
- Control+Numpad3: Forces apps to 45Hz with ASW enabled. Toggling between this and the previous mode helps you see the effects of ASW
- Control+Numpad4: Enables ASW. ASW automatically turns on and off, depending on whether the app maintains a 90Hz frame rate. This is the default runtime rendering mode[1]
To disable ASW permanently, users can use third-party tools like the Oculus Tray Tool or edit registry keys (e.g., set HKLM\Software\Oculus VR, LLC\LibOVR\AswEnabled to 0), though this is not officially supported.[11]
Downsides of ASW
Just as with ATW, ASW is active and enabled for all applications without any developer effort. There's no completely free lunch, however. ASW doesn't scale well below half the display's refresh rate. Depending on what's being displayed, there may be visual artifacts present as a result of imperfect extrapolation.[2]
Common Visual Artifacts
| Artifact | Visual Description | Technical Cause | Common Scenarios |
|---|---|---|---|
| Object Disocclusion | A shimmering, stretched, or trailing "halo" appears on the trailing edge of a moving object | As an object moves, it reveals the background behind it. The ASW algorithm has no information about what this newly revealed area should contain, so it stretches nearby background pixels to fill the void | A character running past a wall, the user's hand moving in front of their face, or looking at a cockpit frame while turning[2] |
| Warping / Wobble | Edges of objects, particularly high-contrast ones, appear to ripple, warp, or "wobble" like jelly during movement | The motion vector estimation is imperfect. The algorithm incorrectly identifies the movement of a block of pixels, causing it to be reprojected to a slightly wrong position in the synthetic frame | Looking out the side of a fast-moving vehicle, rapid head movements, edges of a cockpit canopy against the sky[12] |
| Pattern Mismatch | Fine, repeating patterns (like fences, grilles, or tight textures) can break up or shimmer during motion | The algorithm struggles to track individual elements within a repeating pattern. It may incorrectly match a part of the pattern from one frame with a different but similar-looking part in the next frame | Running alongside a chain-link fence, looking at a finely-grated floor while moving[2] |
| Transparency Issues | Transparent or semi-transparent objects (like smoke, glass, or particle effects) may appear to stutter, tear, or have incorrect depth | A single pixel containing a transparent object represents multiple layers of motion. The motion vector field can only store one motion for that pixel | Looking through a cockpit windshield with rain effects, explosions, or volumetric fog[13] |
| Rapid Brightness Changes | Strobe lights, lightning flashes, or rapid fades can cause the scene to waver or appear unstable | The algorithm relies on finding recognizable blocks of pixels between frames. A sudden, drastic change in lighting can make it impossible for the algorithm to find a match | Scenes with flashing lights, lightning effects, or rapid fade-to-black transitions[2] |
| Head-locked Elements | HUDs or menus not using proper layers may judder | When applications attempt head-locked elements without using the proper layer mechanism, ASW can't separate them from the moving background | Cockpit displays, HUD elements, or floating menus[2] |
Outside of head-locked elements, developers should not avoid scenarios that produce these artifacts, but should simply be mindful of where they appear. ASW works well under most, but not all, circumstances to cover sub-90 fps rendering. The experience with ASW is a significant improvement over judder and is largely indistinguishable from full-rate rendering in most circumstances.[2]
As a Developer, How to Make ASW Work Well
ASW enables a class of computers that were previously unable to drive VR. On recommended-specification systems, ASW should therefore rarely engage, if at all; developers should maintain 90 fps rendering on recommended-spec systems. Without any additional effort from the developer, the experience will generally scale down to minimum-spec machines and use ASW as needed, because apps that run at 90 fps on recommended-spec systems can typically run at 45 fps on minimum-spec systems.[2]
Best Practices
- Use layers for head-locked content, HUDs, and menus: These items are tracked more accurately using layers in the Oculus runtime compositor. Using layers correctly will make these elements appear crisp and track correctly even if ASW is unavailable, and also allow improved image quality and readability for text. Use the ovrLayerFlag_HeadLocked flag in the SDK.[2]
- Never assume the display frame time is 1/90th of a second (or any other constant value): Run application simulations including animations and physics based on elapsed real time, not frame counts. There are a number of frame rates found on current VR headsets. The range today is anywhere from 45 fps to 120 fps. Fixing computations to any expected value will guarantee your application runs at the wrong rate on any other hardware.[2]
- Provide quality settings that are easy for users to understand: Simple Low/Medium/High quality settings allow users to tweak a preferred quality sweet spot. Esoteric or hard to understand settings will result in users poorly tuning their application settings and having a negative experience with your software.[2]
- For ASW 2.0, submit depth buffers: Use the ovrLayerEyeFovDepth layer type instead of the basic ovrLayerEyeFov, along with projection matrix parameters via ovrTimewarpProjectionDesc and world scale through the ovrViewScaleDesc struct, as in the sketch after this list.[8]
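The following is a hedged sketch of that depth-layer submission using the LibOVR 1.x C API. It assumes color and depth swap chains, eye poses, FOV ports, viewports, and the sensor sample time are produced elsewhere in the frame loop, and the near/far plane values are placeholders; exact usage should be checked against the OculusRoomTiny and OculusWorldDemo samples referenced below.

```cpp
#include <OVR_CAPI.h>
#include <OVR_CAPI_Util.h>  // ovrMatrix4f_Projection, ovrTimewarpProjectionDesc_FromProjection

// Sketch of submitting an eye-FOV layer with depth, the layer type ASW 2.0 /
// PTW consumes. Swap-chain creation, rendering, and error handling are
// omitted; the near/far planes below are placeholders for the app's own values.
void submitWithDepth(ovrSession session, long long frameIndex,
                     ovrTextureSwapChain colorChain[2], ovrTextureSwapChain depthChain[2],
                     const ovrPosef eyePoses[2], const ovrFovPort fov[2],
                     const ovrRecti viewport[2], double sensorSampleTime) {
    // The projection parameters must match the app's own projection so the
    // compositor can reconstruct linear depth from the submitted z-buffer.
    ovrMatrix4f proj = ovrMatrix4f_Projection(fov[0], 0.1f, 1000.0f, ovrProjection_None);

    ovrLayerEyeFovDepth layer = {};
    layer.Header.Type      = ovrLayerType_EyeFovDepth;
    layer.Header.Flags     = 0;
    layer.ProjectionDesc   = ovrTimewarpProjectionDesc_FromProjection(proj, ovrProjection_None);
    layer.SensorSampleTime = sensorSampleTime;
    for (int eye = 0; eye < 2; ++eye) {
        layer.ColorTexture[eye] = colorChain[eye];
        layer.DepthTexture[eye] = depthChain[eye];
        layer.Fov[eye]          = fov[eye];
        layer.Viewport[eye]     = viewport[eye];
        layer.RenderPose[eye]   = eyePoses[eye];
    }

    ovrLayerHeader* layers[] = { &layer.Header };
    // Passing nullptr for the view-scale descriptor uses the default world
    // scale; a real app may fill in an ovrViewScaleDesc instead.
    ovr_SubmitFrame(session, frameIndex, nullptr, layers, 1);
}
```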
Testing Recommendations
Sample projects OculusRoomTiny and OculusWorldDemo provide reference implementations for depth buffer submission. OculusWorldDemo includes tools to toggle depth submission and adjust frame time to decrease frame rate, with navigation via Tab key menu → Timewarp → Layers → Main Layer. Performance monitoring displays FPS with Ctrl+F, showing green numbers at native refresh rate and red "45 FPS" when ASW activates.[8]
As a Consumer, How to Take Advantage of ASW
For the end user (consumer) with an Oculus Rift, Asynchronous Spacewarp works automatically – there is nothing specific you need to do to enable it under normal circumstances. If your system meets the minimum requirements and is running the Oculus software 1.10 or later, ASW is always on by default at the runtime level.[2]
Consumer Recommendations
- Keep your drivers and software updated: ASW relies on driver support (partnership with GPU vendors) and OS features. Using the latest Oculus PC software and up-to-date NVIDIA/AMD drivers will ensure the best ASW performance and compatibility.[1]
- Ensure your system meets at least the minimum spec: While ASW can cover occasional frame drops, it cannot perform miracles if your hardware is far below the required level. ASW extends the bottom threshold a bit, but there is still a limit.[1]
- Understand when ASW is active: If you notice that a VR game's frame rate seems to cap at exactly half the headset refresh (e.g. 45 FPS on a 90 Hz Rift CV1), that indicates ASW is active. You can use the Oculus Debug Tool or third-party tools (like the Oculus Tray Tool) to display performance stats.[1]
- Tune your VR graphics settings: If you find ASW is activating too often (meaning your app is frequently hitting only 45 FPS), you might improve your experience by lowering graphics settings in the VR app. On a minimum-spec PC, it's expected that ASW will be active quite a lot (and that's okay).[2]
Performance Impact
ASW operation targets 45 FPS application rendering for the Rift's 90 Hz display, with the frame budget expanding from 11.1 ms per frame at native 90 FPS to 22.2 ms per frame when ASW generates synthetic intermediate frames. This effectively doubles the rendering time available per real frame, explaining the approximate 50% GPU workload reduction.[14]
Latency Measurements
Motion-to-photon latency measurements reveal significant tradeoffs:
- Normal operation without ASW: 20-22ms motion-to-photon latency
- ASW 1.0 active: Approximately doubles latency to 40-44ms
- ASW 2.0 with PTW: Significantly reduces both rotational and positional latency compared to ASW 1.0[8]
Meta claims that "even apps using ASW 2.0 have lower head latency than any app on Quest without ASW."[8]
Related Technologies
The "Spacewarp" branding is used for several distinct technologies within the Meta ecosystem, each tailored to a different platform and architecture:
Application SpaceWarp (AppSW) for Meta Quest
Application SpaceWarp (AppSW) is a technologically distinct feature designed for standalone Meta Quest headsets and should not be confused with the PC-based ASW.[1] Unlike the automatic, system-level ASW, AppSW is a developer-driven optimization where the application's developer must explicitly implement support for it.[15]
| Feature | Asynchronous Spacewarp (PC VR) | Application SpaceWarp (Standalone Quest) |
|---|---|---|
| Platform | PC-tethered headsets (Rift, Quest via Link/Air Link) | Standalone headsets (Meta Quest 2, Meta Quest 3, Meta Quest Pro) |
| Activation | Automatic, system-level | Manual, developer-driven |
| Motion Vector Source | Estimation from previous frames | Ground truth from game engine |
| Data Requirement | Color buffers (ASW 1.0); Color + Depth (ASW 2.0) | Color, Depth, and Motion Vector buffers |
| Primary Goal | Safety net for performance drops | Core optimization for mobile hardware |
| Visual Fidelity | Good, but prone to artifacts | Higher quality with fewer artifacts |
Synchronous Spacewarp (SSW)
Synchronous Spacewarp (SSW) is a proprietary frame-smoothing technology developed for the Virtual Desktop application, which streams PC VR games wirelessly to Quest headsets. Its key innovation is performing frame synthesis calculations on the Quest headset itself, leveraging the Qualcomm Snapdragon XR2 chip.[16]
Other Platform Technologies
- SteamVR Motion Smoothing: Valve's implementation for headsets like the Valve Index and HTC Vive. Operates similarly to ASW 1.0, forcing half-rate rendering and synthesizing intermediate frames.[17]
- Windows Mixed Reality Motion Reprojection: Microsoft's technology combining spatial reprojection (correcting for head pose) and temporal reprojection (synthesizing new frames based on motion vectors).[18]
- PlayStation VR Reprojection: Console-optimized reprojection with three rendering modes: native 90Hz, native 120Hz, and 60Hz→120Hz asynchronous reprojection with motion interpolation.[19]
Patents and Academic Foundations
Microsoft's US9514571B2 patent, titled "Late stage reprojection", is a key piece of the intellectual property foundation for this family of techniques; it was filed July 25, 2013 and granted December 6, 2016. The patent describes methods for generating and displaying images associated with virtual objects at frame rates exceeding the rendering frame rate, using homographic transformations and pixel offset adjustments for HMD displays.[20]
The seminal academic paper "The Asynchronous Time Warp for Virtual Reality on Consumer Hardware" by J.M.P. van Waveren appeared at ACM VRST '16 (22nd ACM Conference on Virtual Reality Software and Technology) in November 2016. Van Waveren, an Engineering Manager at Oculus VR who passed away in 2017, authored the foundational work on ATW that became the basis for OpenXR standards.[21]
See Also
References
- [1] Asynchronous SpaceWarp - Meta for Developers
- [2] Asynchronous Spacewarp - Blog | Meta Horizon OS Developers
- [3] Asynchronous Spacewarp - XinReality Wiki
- [4] Oculus Reveals Asynchronous Spacewarp, Lowers VR Minimum Spec - Tom's Hardware
- [5] The Oculus Rift now officially runs on cheaper PCs - The Verge
- [6] Asynchronous Spacewarp 2.0 Will Help VR Run Dramatically Better on Rift - OVR News
- [7] Oculus Launches ASW 2.0 with Positional Timewarp to Reduce Latency - Road to VR
- [8] Developer Guide to ASW 2.0 - Meta Horizon OS Developers
- [9] VR Timewarp, Spacewarp, Reprojection, And Motion Smoothing Explained - UploadVR
- [10] Oculus ASW 2.0 is finally with us! | 4Experience.co
- [11] How do I completely disable ASW? - Reddit
- [12] DCS Forums - ASW Discussion
- [13] Application SpaceWarp - Meta Developers
- [14] NVIDIA VRWorks Developer Documentation
- [15] Introducing Application SpaceWarp - Meta Developers
- [16] Virtual Boost in VR Rendering Performance - Qualcomm Developer Blog
- [17] SteamVR Motion Smoothing - Road to VR
- [18] Windows Mixed Reality Documentation - Microsoft
- [19] PlayStation VR Technical Documentation - Sony Interactive Entertainment
- [20] US Patent 9514571B2 - Late stage reprojection
- [21] The Asynchronous Time Warp for Virtual Reality on Consumer Hardware - ACM Digital Library