{{Infobox VRAR term
| name = Latency
| image =
| image_caption =
| aka = Motion-to-photon latency, End-to-end latency, MTP
| description = The time delay between a user's action (e.g., head movement) and the corresponding visual update displayed in a VR/AR headset
}}


'''Latency''' in [[virtual reality]] (VR) and [[augmented reality]] (AR) refers to the time delay between a user's physical action (such as moving their head) and the corresponding visual feedback displayed in the [[head-mounted display]] (HMD). This delay, more formally known as '''[[motion-to-photon latency]]''' (MTP) or '''end-to-end latency''', measures the complete time from when a user initiates a movement until photons representing the updated view are emitted from the display and reach the user's eyes.<ref name="chioka">Chioka (2015). "What is Motion-To-Photon Latency?" Available: http://www.chioka.in/what-is-motion-to-photon-latency/</ref>


Minimizing latency is arguably the single most critical technical challenge in creating comfortable and believable immersive experiences. Low latency is essential for maintaining [[presence]]—the psychological state of feeling truly "there" in the virtual environment. High latency not only breaks this illusion but can also cause severe physiological effects including [[simulator sickness]] (also known as [[cybersickness]]), characterized by symptoms such as nausea, disorientation, headaches, and general discomfort.<ref name="weech2020">Weech, S., Kenny, S., & Brüngül, E. (2020). "Latency and Cybersickness: Impact, Causes, and Measures. A Review." Frontiers in Virtual Reality, 1, 582204. Available: https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2020.582204/full</ref> In AR specifically, latency causes virtual content to appear misaligned with real-world features, breaking the illusion that digital objects are anchored in physical space.<ref name="unc">University of North Carolina at Chapel Hill. "Managing Latency in Complex Augmented Reality Systems." Available: http://www.cs.unc.edu/Research/us/Latency/ManagingRelativeLatency.html</ref>


== Motion-to-Photon Latency ==
[[Motion-to-photon latency]], also known as end-to-end latency, represents the total system delay from the moment a user moves until the display updates to reflect that movement. This encompasses the entire processing pipeline: sensor detection, tracking computation, application logic, rendering, and display output.<ref name="chioka" /><ref name="wagner2018">Wagner, D. (2018). "Motion-to-Photon Latency in Mobile AR and VR." Medium (DAQRI). Available: https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926</ref>


The human [[vestibular system]] and visual perception are extraordinarily sensitive to temporal mismatches. Research indicates that trained observers can detect latency as low as 3.2-3.4 milliseconds under optimal conditions, though practical detection thresholds for most users fall around 15-17ms.<ref name="jerald2012">Jerald, J. (2012). "The VR Book: Human-Centered Design for Virtual Reality." Morgan & Claypool Publishers.</ref> Industry consensus has established 20ms as the critical threshold—latencies below this are generally imperceptible during normal use, while latencies above begin to degrade the experience.<ref name="carmack2013">Carmack, J. (2013). "Latency Mitigation Strategies." Presentation at GDC 2013. Available: https://www.wired.com/2013/02/john-carmacks-latency-mitigation-strategies/</ref><ref name="varjo">Varjo. "Latency in virtual and mixed reality explained." Available: https://support.varjo.com/hc/en-us/latency</ref>


== Types and Sources of Latency ==


=== Components of the Latency Pipeline ===


{| class="wikitable"
|+ Motion-to-Photon Latency Budget Breakdown
! Pipeline Stage !! Naive/Unoptimized System (ms) !! Modern Optimized System (ms) !! Key Optimization Techniques
|-
! Sensor Sampling ([[IMU]])
| 2–4 ms || < 1 ms || High-frequency sampling (1000Hz+), dedicated sensor fusion
|-
! Tracking Computation
| 15–25 ms (camera-based) || 1–2 ms (with IMU fusion) || [[Kalman filter]]ing, predictive algorithms
|-
! Application/Game Logic ([[CPU]])
| 11–16 ms || 3–5 ms || Multi-threaded rendering, optimized code
|-
! Render Queue (CPU → [[GPU]])
| 16–33 ms || 1–2 ms || [[Late latching]], view bypass techniques
|-
! GPU Rendering
| 11–16 ms || 5–10 ms || [[Foveated rendering]], simplified shaders
|-
! Display Scanout & Persistence
| 16 ms (60Hz) || 2–4 ms (90Hz low-persistence) || High refresh rates, [[low-persistence display]]s
|-
! '''Total Motion-to-Photon'''
| '''~56–107 ms''' || '''~11–23 ms''' || Combined optimizations plus [[timewarp]]/[[reprojection]]
|}
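As a rough illustration of how these stages add up, the sketch below (Python, with hypothetical per-stage figures in the spirit of the optimized column above) sums a strictly sequential pipeline. Real systems overlap stages and apply [[timewarp]] corrections, so this is an upper-bound estimate, not a measurement.

<syntaxhighlight lang="python">
# Hypothetical per-stage latencies (ms), loosely following the
# "Modern Optimized System" column of the table above.
STAGES_MS = {
    "imu_sampling":        0.5,
    "tracking":            1.5,
    "app_logic":           4.0,
    "render_queue":        1.5,
    "gpu_render":          7.0,
    "scanout_persistence": 3.0,
}

def motion_to_photon_ms(stages: dict) -> float:
    """Upper-bound estimate: assumes every stage runs strictly in sequence."""
    return sum(stages.values())

print(f"Estimated motion-to-photon latency: {motion_to_photon_ms(STAGES_MS):.1f} ms")
# -> Estimated motion-to-photon latency: 17.5 ms
</syntaxhighlight>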
=== Sensor and Tracking Latency ===
'''[[Inertial Measurement Unit]] (IMU) tracking''' provides the foundation for low-latency orientation tracking. Modern VR headsets incorporate six-axis IMUs (three-axis gyroscope plus three-axis accelerometer) sampling at 500-1000Hz, providing orientation updates with latency below 2ms. High-end IMUs such as the Bosch BMI085 contribute less than 3ms to total motion-to-photon latency.<ref name="wagner2018" />
'''Visual tracking latency''' varies significantly with implementation. Camera-based [[inside-out tracking]] typically operates at 30-90Hz with additional processing time on top; a camera image can be 10-15ms old by the time it is processed, which is why high-frequency IMU data is essential to compensate.<ref name="wagner2018" />
'''[[Outside-in tracking]]''' using external sensors like [[SteamVR Lighthouse]] base stations can achieve sub-5ms tracking latency, representing the gold standard for precision, though requiring fixed sensor installation.<ref name="brennan2022">Brennan, C., Pike, M., & Humphreys, G. (2022). "Measuring and ameliorating motion-to-photon latency in virtual reality." Available: https://pmc.ncbi.nlm.nih.gov/articles/PMC10616216/</ref>
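To illustrate why IMU fusion cuts tracking latency so dramatically, here is a minimal single-axis complementary-filter sketch (Python; the gain and rates are hypothetical). Production trackers typically run full 6DOF [[Kalman filter]]s, but the principle is the same: integrate the low-latency gyro every millisecond, and let the slower, possibly stale camera pose correct the accumulated drift.

<syntaxhighlight lang="python">
class ComplementaryYawFilter:
    """Single-axis sketch of IMU/camera fusion (illustrative, not production code)."""

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha   # weight given to the gyro integral per update
        self.yaw = 0.0       # current yaw estimate, degrees

    def update(self, gyro_rate_dps: float, camera_yaw_deg: float,
               dt: float = 0.001) -> float:
        # The ~1000 Hz gyro supplies low-latency motion; the camera pose
        # (often 10-15 ms stale) slowly pulls the estimate back toward an
        # absolute reference, cancelling gyro drift.
        integrated = self.yaw + gyro_rate_dps * dt
        self.yaw = self.alpha * integrated + (1.0 - self.alpha) * camera_yaw_deg
        return self.yaw
</syntaxhighlight>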
=== Display Latency ===
Display technology significantly impacts latency through several factors:
'''[[Refresh rate]]''' fundamentally limits temporal resolution. Higher refresh rates reduce the maximum time between frame updates:
* 60 Hz: 16.67 ms per frame
* 72 Hz: 13.9 ms per frame 
* 90 Hz: 11.1 ms per frame (industry standard)
* 120 Hz: 8.3 ms per frame
* 144 Hz: 6.9 ms per frame ([[Valve Index]] maximum)
'''[[Pixel response time]]''' measures how quickly display pixels change state:
* [[OLED]]: < 1 ms (ideal for VR)
* Modern "Fast [[LCD]]": < 5 ms with optimizations
* Traditional LCD: 10+ ms (unsuitable for VR)
'''Display persistence''' refers to how long pixels remain illuminated. [[Low-persistence display]]s flash pixels briefly (0.33-2ms) rather than continuously, dramatically reducing motion blur and perceived latency.<ref name="chioka" />
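The per-frame figures above follow directly from the refresh interval, and the display's average contribution to latency is roughly half that interval, since a pose sample lands at a random point within the current frame. A small sketch (Python) makes the arithmetic explicit:

<syntaxhighlight lang="python">
def frame_time_ms(refresh_hz: float) -> float:
    """Interval between successive display refreshes."""
    return 1000.0 / refresh_hz

def avg_display_wait_ms(refresh_hz: float) -> float:
    """A pose sampled at a random moment waits, on average,
    half a refresh interval before the next scanout starts."""
    return frame_time_ms(refresh_hz) / 2.0

for hz in (60, 72, 90, 120, 144):
    print(f"{hz:>3} Hz: {frame_time_ms(hz):5.2f} ms/frame, "
          f"~{avg_display_wait_ms(hz):4.1f} ms average added latency")
</syntaxhighlight>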
=== Rendering Latency ===
The rendering pipeline contributes significant latency through multiple stages:
'''CPU processing''' includes game logic, physics calculations, and command generation, typically requiring 8-15ms per frame. '''GPU rendering''' depends heavily on scene complexity and resolution—VR demands over 457 million pixels per second for dual displays at 90Hz, compared to 124 million for standard 1080p60.<ref name="abrash">Abrash, M. (2014). "What VR Could, Should, and Almost Certainly Will Be within Two Years." Presentation at Steam Dev Days.</ref>
'''Command buffer depth''' can add substantial latency if frames are queued between CPU and GPU. Default driver behavior often buffers 2-3 frames to maximize GPU utilization, adding 22-48ms. VR-optimized rendering explicitly synchronizes to prevent this buffering.<ref name="arm2016">Jeffries, F. (2016). "Virtual Reality – Blatant Latency and How to Avoid It." ARM Community Blog. Available: https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/virtual-reality-blatant-latency-and-how-to-avoid-it</ref>
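The arithmetic behind those figures is simple: each queued frame delays presentation by one full refresh interval. A short sketch (Python, with hypothetical frame counts and refresh rates) shows how two to three buffered frames bracket the quoted range:

<syntaxhighlight lang="python">
def buffering_latency_ms(queued_frames: int, refresh_hz: float) -> float:
    """Each frame waiting in the command queue adds one refresh interval."""
    return queued_frames * 1000.0 / refresh_hz

print(buffering_latency_ms(2, 90))  # 22.2 ms: two frames queued at 90 Hz
print(buffering_latency_ms(3, 60))  # 50.0 ms: three frames queued at 60 Hz
</syntaxhighlight>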
== Human Perception and Physiological Effects ==
=== Perception Thresholds ===
Human sensitivity to latency varies considerably:
* '''3.2-3.4 ms''': Minimum detectable by most sensitive individuals under controlled conditions<ref name="jerald2012" />
* '''< 17 ms''': Generally imperceptible to most users during normal interaction
* '''20 ms''': Industry standard threshold for comfortable VR
* '''20-50 ms''': Noticeable but tolerable for many applications
* '''> 50 ms''': Clearly perceptible, degrades experience significantly
* '''> 75 ms''': Motor performance and simultaneity perception affected<ref name="waltemate2016">Waltemate, T., et al. (2016). "The Impact of Latency on Perceptual and Motor Performance in Immersive Virtual Environments." IEEE VR 2016.</ref>
For optical see-through AR, where the real world provides a zero-latency reference, these thresholds drop dramatically to 5ms or less.<ref name="wagner2018" />
=== Simulator Sickness and Cybersickness ===
The primary physiological consequence of excessive latency is '''[[cybersickness]]''', caused by sensory conflict between visual and vestibular systems. The [[vestibular system]] in the inner ear detects head rotation instantly (vestibulo-ocular reflex operates at ~15-20ms), while visual feedback lags behind. This mismatch triggers an evolutionary defense mechanism—the brain interprets the sensory conflict as potential poisoning, inducing nausea to expel supposed toxins.<ref name="weech2020" />
{| class="wikitable"
|+ Cybersickness Symptoms and Latency Thresholds
! Latency Range !! Effects on Users !! Symptom Severity
|-
| < 20 ms || Generally imperceptible, comfortable for extended use || None to minimal
|-
| 20-30 ms || Subtle discomfort in sensitive users || Mild eye strain possible
|-
| 30-50 ms || Noticeable lag, increased discomfort || Moderate symptoms begin
|-
| 50-75 ms || Clear degradation of experience || Significant nausea risk
|-
| > 75 ms || Severe discomfort for most users || High probability of sickness
|-
| Variable/Jittery || Worse than constant higher latency || Severe symptoms likely
|}
'''Latency jitter'''—inconsistent frame-to-frame delays—proves more problematic than constant latency, as the brain cannot adapt to unpredictable timing.<ref name="stauffert2020">Stauffert, J.P., Niebling, F., & Latoschik, M.E. (2020). "Latency and Cybersickness: Impact, Causes, and Measures. A Review." Frontiers in Virtual Reality.</ref>
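Because jitter matters as much as the average, latency measurements are usually summarized with distribution statistics rather than a single mean. Below is a minimal sketch (Python with NumPy; the frame times are hypothetical) of the kind of per-frame report a frame-timing tool might produce:

<syntaxhighlight lang="python">
import numpy as np

def jitter_report(frame_times_ms: np.ndarray) -> dict:
    """Summarize frame-timing consistency from measured frame times."""
    return {
        "mean_ms": float(frame_times_ms.mean()),
        "std_ms": float(frame_times_ms.std()),               # jitter magnitude
        "p99_ms": float(np.percentile(frame_times_ms, 99)),  # worst-case tail
        # Frames that took far longer than typical (e.g., dropped frames):
        "spikes": int((frame_times_ms > 1.5 * np.median(frame_times_ms)).sum()),
    }

# 95 clean 90 Hz frames plus 5 dropped frames that doubled to 22.2 ms:
times = np.array([11.1] * 95 + [22.2] * 5)
print(jitter_report(times))
</syntaxhighlight>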
== Latency Reduction Techniques ==
=== Software Techniques ===
==== Asynchronous Timewarp (ATW) ====
'''[[Asynchronous Timewarp]]''' (ATW) represents one of the most effective latency mitigation techniques. When an application fails to render a new frame in time, ATW takes the previously rendered frame and warps it based on the latest head tracking data. This occurs on a high-priority thread separate from main rendering, ensuring consistent frame delivery even when the application struggles.<ref name="oculus_atw">Oculus. "Asynchronous Timewarp Examined." Available: https://developers.meta.com/horizon/blog/asynchronous-timewarp-examined/</ref>
ATW can reduce judder by 20-100x and maintain perceived latency of 2-13ms even when initial measurements show 21-42ms.<ref name="brennan2022" /> However, it only corrects rotational movement—positional movement still shows artifacts on nearby objects.
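The core of rotational timewarp is a depth-independent image warp: for a pure rotation, the rendered frame and the corrected view are related by the homography H = K·R·K⁻¹, where K is the camera intrinsic matrix and R the rotation measured after rendering. The sketch below (Python with NumPy, hypothetical intrinsics) computes that warp; it illustrates the underlying math, not Meta's actual implementation. Because the warp needs no depth, it corrects rotation only, which is exactly the limitation noted above.

<syntaxhighlight lang="python">
import numpy as np

def rotation_homography(K: np.ndarray, R_delta: np.ndarray) -> np.ndarray:
    """Image-space warp for a pure head rotation.

    Rotation-only view changes need no depth information, which is why
    timewarp fixes rotational latency but not positional artifacts.
    """
    return K @ R_delta @ np.linalg.inv(K)

# Hypothetical pinhole intrinsics for a 1000x1000 eye buffer:
K = np.array([[500.0,   0.0, 500.0],
              [  0.0, 500.0, 500.0],
              [  0.0,   0.0,   1.0]])

# Head yawed a further 0.5 degrees while the frame was being rendered:
a = np.radians(0.5)
R_delta = np.array([[ np.cos(a), 0.0, np.sin(a)],
                    [       0.0, 1.0,       0.0],
                    [-np.sin(a), 0.0, np.cos(a)]])

H = rotation_homography(K, R_delta)
# A compositor would apply H to the stale frame (e.g., cv2.warpPerspective).
</syntaxhighlight>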
==== Asynchronous Spacewarp (ASW) ====
'''[[Asynchronous Spacewarp]]''' (ASW) extends ATW by generating synthetic intermediate frames when performance drops. When an application cannot maintain 90fps, ASW forces rendering at 45fps and extrapolates intermediate frames using depth buffers and motion vectors. This maintains smooth 90Hz output while reducing GPU requirements by nearly 50%.<ref name="oculus_asw">Oculus. "Asynchronous Spacewarp." Available: https://developers.meta.com/horizon/blog/asynchronous-spacewarp/</ref>
ASW 2.0 improvements include:
* Full depth buffer utilization for higher quality synthesis
* Better disocclusion handling (areas becoming visible as objects move)
* Reduced artifacts with transparent objects and rapid brightness changes
==== Prediction Algorithms ====
Prediction algorithms anticipate future head positions 20-50ms ahead, compensating for pipeline latency:
* '''[[Kalman Filter]]s''': Industry standard for sensor fusion, combining IMU and camera data while estimating velocity and acceleration
* '''Dead Reckoning''': Simple constant-velocity assumption, suitable only for very short predictions
* '''Alpha-Beta-Gamma Predictors''': Lighter alternative to Kalman filters with tunable responsiveness
Modern systems achieve remarkable results—Valve Index shows 21-42ms initial latency dropping to 2-13ms once prediction stabilizes within 25-58ms.<ref name="brennan2022" />
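As a concrete example of the simplest predictor above, the sketch below (Python, with hypothetical motion values) dead-reckons a single axis over the pipeline horizon. Longer horizons hide more pipeline latency but amplify sensor noise and overshoot when the head changes direction, which is why production systems prefer filtered predictors:

<syntaxhighlight lang="python">
def predict_yaw_deg(yaw: float, rate_dps: float, accel_dps2: float,
                    horizon_s: float = 0.030) -> float:
    """Constant-acceleration dead reckoning over the render-pipeline horizon."""
    return yaw + rate_dps * horizon_s + 0.5 * accel_dps2 * horizon_s ** 2

# Head at 45 deg, turning at 100 deg/s but decelerating at 200 deg/s^2;
# render the pose expected ~30 ms from now rather than the sensed pose:
print(predict_yaw_deg(45.0, 100.0, -200.0))  # 47.91 deg
</syntaxhighlight>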
==== Late Latching ====
'''[[Late latching]]''' updates head pose data at the last possible moment before GPU execution, rather than when rendering commands are first generated. This technique can reduce motion-to-photon latency by 2-5ms without the buffering penalties of traditional pipelines.<ref name="pico_late">Pico Developer. "Bringing Late Latching to Unreal Engine 5." Available: https://developer.picoxr.com/news/LateLatching/</ref>
=== Hardware Solutions ===
==== Display Technologies ====
{| class="wikitable"
|+ Display Technology Comparison for VR/AR
! Technology !! Response Time !! Refresh Rate !! Persistence !! Current Usage
|-
| [[OLED]] || < 1 ms || Up to 120 Hz || 1-2 ms typical || PSVR2, older VR headsets
|-
| Fast [[LCD]] || < 5 ms || Up to 144 Hz || 0.33-2 ms || Valve Index, Quest 3
|-
| [[Micro-OLED]] || < 1 μs || 90+ Hz || Ultra-low || Apple Vision Pro
|-
| [[Waveguide]] || N/A || 60 Hz || N/A || HoloLens 2 (AR)
|}
==== Dedicated Processing Hardware ====
Specialized chips dramatically reduce latency:
* '''Apple R1 chip''': Processes sensor input and streams new images to the displays within 12ms, achieving ~11ms passthrough latency in Vision Pro<ref name="optofidelity" />
* '''Qualcomm XR2 Gen 2''': Powers Quest 3 with hardware acceleration for timewarp
* '''Microsoft HPU 2.0''': Holographic Processing Unit for HoloLens 2, handles spatial mapping and hand tracking
== Measurement Methods ==
=== High-Speed Camera Methods ===
The most system-independent approach uses cameras (240fps or higher) to simultaneously record physical motion and display output. By counting frames between physical movement and visual update, researchers can measure true end-to-end latency.<ref name="stauffert2020" />
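Under those assumptions the computation itself is just frame counting; a minimal sketch (Python, with hypothetical frame indices) also makes the method's resolution limit of one camera frame explicit:

<syntaxhighlight lang="python">
def latency_from_frames(motion_frame: int, photon_frame: int,
                        camera_fps: float) -> float:
    """End-to-end latency from a high-speed recording.

    motion_frame : frame index where the physical motion begins
    photon_frame : frame index where the display first reflects it
    """
    return (photon_frame - motion_frame) * 1000.0 / camera_fps

# Motion starts at frame 120, display reacts at frame 125, 240 fps camera:
print(latency_from_frames(120, 125, 240.0))  # 20.8 ms, resolution +/- 4.2 ms
</syntaxhighlight>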
=== Photodiode-Based Methods ===
Photodiodes detect screen brightness changes with sub-millisecond precision. Combined with motion sensors and oscilloscopes sampling at 25,000Hz, this provides highly accurate per-frame latency measurements including jitter analysis.<ref name="vrmark">UL Solutions. "VRMark Latency Measurement." Available: https://benchmarks.ul.com/vrmark</ref>
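With both traces sampled on a shared clock, the latency estimate reduces to finding the lag that best aligns them. Here is a minimal cross-correlation sketch (Python with NumPy; the synthetic signals stand in for real motion-sensor and photodiode captures):

<syntaxhighlight lang="python">
import numpy as np

def estimate_latency_ms(motion: np.ndarray, photo: np.ndarray,
                        fs_hz: float) -> float:
    """Lag (ms) at which the photodiode trace best matches the motion trace."""
    motion = motion - motion.mean()
    photo = photo - photo.mean()
    corr = np.correlate(photo, motion, mode="full")
    lag_samples = int(corr.argmax()) - (len(motion) - 1)
    return 1000.0 * lag_samples / fs_hz

# Synthetic test: a motion pulse, and a display response delayed by 20 ms
fs = 25_000.0                                  # 25 kHz sampling, as above
t = np.arange(0.0, 0.2, 1.0 / fs)
motion = np.exp(-((t - 0.05) ** 2) / (2 * 0.005 ** 2))
photo = np.roll(motion, int(0.020 * fs))       # delayed brightness response
print(estimate_latency_ms(motion, photo, fs))  # ~20.0
</syntaxhighlight>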
=== Software Tools ===
* '''Oculus Debug Tool''': Real-time Performance HUD showing latency timing
* '''SteamVR Frame Timing''': Visualizes CPU/GPU contributions and dropped frames
* '''Unity Profiler'''/'''Unreal Insights''': Internal timing analysis (not true motion-to-photon)
=== Cognitive Measurement ===
An innovative approach measures latency through user task performance degradation. As artificial delay increases, performance on rapid motor tasks (like catching falling objects) declines predictably, providing a hardware-independent measurement method.<ref name="cognitive">Microsoft Research. "Measuring System Visual Latency through Cognitive Psychophysics." IEEE VR 2020.</ref>
== Industry Standards and Current Achievements ==
=== VR Standards ===
Industry consensus has established clear targets:
* '''< 20 ms''': Target for imperceptible latency (Carmack, Abrash, industry standard)
* '''< 50 ms''': Maximum acceptable for responsive feel
* '''7-15 ms''': Theoretical ideal for perfect presence (Michael Abrash)<ref name="abrash" />
Frame rate minimums:
* '''60 fps''': Absolute minimum (mobile VR)
* '''72 fps''': Early mobile standard (Quest 1)
* '''90 fps''': Industry standard (most PC VR)
* '''120 fps''': Premium tier (Index, PSVR2)
* '''144 fps''': Maximum consumer (Valve Index)
=== AR Requirements ===
AR faces stricter requirements due to real-world reference:
* '''Optical see-through AR''': < 5 ms for unnoticeable misregistration
* '''Video passthrough AR''': 15-20 ms acceptable (similar to VR)
* '''Current best''': Apple Vision Pro at ~11 ms passthrough<ref name="optofidelity" />
=== Current Consumer Headset Performance ===
{| class="wikitable"
|+ Motion-to-Photon Latency: Current Consumer VR/AR Headsets (2024-2025)
! Headset !! VR Mode (ms) !! Passthrough (ms) !! Refresh Rate (Hz) !! Display Type !! Notable Features
|-
| [[Apple Vision Pro]] || 2-4 || ~11 || 90 || Micro-OLED || Dedicated R1 chip for sensor processing
|-
| [[Meta Quest 3]] || 2-4 || 35-39 || 90/120 || Fast LCD || Advanced spacewarp algorithms
|-
| [[Meta Quest Pro]] || 2-4 || 35-38 || 90 || Fast LCD || Eye-tracked foveated rendering
|-
| [[Valve Index]] || 2-13 || N/A || 80/90/120/144 || Fast LCD || Highest refresh rate, outside-in tracking
|-
| [[PSVR2]] || ~20-30 (est.) || N/A || 90/120 || OLED || HDR display, foveated rendering
|-
| [[HTC Vive XR Elite]] || ~15-20 || 35-40 || 90 || LCD || Hybrid tracking support
|-
| [[Microsoft HoloLens 2]] || ~139 (AR) || 0 (optical) || 60 || Waveguide || Zero-latency real world view
|-
| [[Pico 4]] || ~15-25 || 40-45 || 72/90 || Fast LCD || Pancake optics
|}
== Special Considerations ==
=== Differences Between VR and AR ===
VR and AR face fundamentally different latency challenges:
'''VR challenges''':
* Entire view must be rendered
* Uniform latency across visual field
* Brain can partially adapt to consistent delay
* 20ms threshold generally acceptable
'''AR challenges''':
* Real world provides zero-latency reference
* Any virtual lag creates immediate misregistration
* Video passthrough adds camera capture/processing time
* Optical see-through requires < 5ms for stability
=== Network and Cloud Rendering ===
Emerging technologies address network latency for cloud-rendered XR:
* '''[[5G]]''': Promises single-digit millisecond round-trip times
* '''[[Edge computing]]''': Moves processing closer to users
* '''[[Predictive streaming]]''': Pre-renders multiple viewpoints
However, even with 5G, network latency adds to local processing, making sub-20ms total latency challenging for cloud VR.<ref name="tencent">Tencent Cloud. "Network Latency Impact on VR/AR Applications." Available: https://www.tencentcloud.com/techpedia/112184</ref>
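A back-of-the-envelope budget makes the problem concrete. The sketch below (Python, with deliberately optimistic hypothetical figures) stacks encode, transport, and decode on top of an already lean local pipeline and still overshoots the 20ms target:

<syntaxhighlight lang="python">
def cloud_mtp_ms(local_ms: float, encode_ms: float,
                 rtt_ms: float, decode_ms: float) -> float:
    """Cloud rendering adds video encode, network round trip, and decode
    on top of the local sensing/display pipeline."""
    return local_ms + encode_ms + rtt_ms + decode_ms

# Optimistic figures: lean 8 ms local pipeline, 4 ms encode, 8 ms 5G/edge
# round trip, 3 ms decode -- already past the 20 ms comfort threshold:
print(cloud_mtp_ms(local_ms=8.0, encode_ms=4.0, rtt_ms=8.0, decode_ms=3.0))  # 23.0
</syntaxhighlight>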
=== Multimodal Latency ===
Beyond visual latency, other senses have distinct requirements:
* '''[[Haptic feedback]]''': < 1 ms for realistic touch sensation<ref name="haptics">Meta Reality Labs. "Measuring the perception of latency with a haptic glove." Available: https://tech.facebook.com/reality-labs/2019/7/measuring-the-perception-of-latency-with-a-haptic-glove/</ref>
* '''Audio latency''': 20-40 ms acceptable, but delayed audio can make low-latency haptics feel laggy through "auditory capture"
* '''Controller latency''': 30-50 ms typical, masked by visual feedback
== Future Directions ==
=== Emerging Technologies ===
* '''AI-driven prediction''': Machine learning models that adapt to individual user movement patterns
* '''[[Foveated rendering]]''': Can tolerate 50-70ms latency in peripheral vision while maintaining low foveal latency<ref name="nvidia_foveated">NVIDIA Research. "Latency Requirements for Foveated Rendering in Virtual Reality." Available: https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf</ref>
* '''Neural interfaces''': Direct brain-computer interfaces could theoretically predict movement before physical motion begins
* '''Lightfield displays''': Could eliminate certain rendering steps
=== Research Frontiers ===
The University of North Carolina achieved 80 microseconds (0.08ms) latency in 2016 using specialized hardware—demonstrating theoretical possibilities though impractical for consumer devices.<ref name="lincoln2016">Lincoln, P., et al. (2016). "From Motion to Photons in 80 Microseconds." IEEE VR 2016 Best Paper.</ref>
== See Also ==
* [[Motion-to-Photon]]
* [[Simulator sickness]]
* [[Cybersickness]]
* [[Presence]]
* [[Timewarp]]
* [[Asynchronous Spacewarp]]
* [[Predictive tracking]]
* [[Low-persistence display]]
* [[Foveated rendering]]
* [[Head-mounted display]]
* [[Virtual reality]]
* [[Augmented reality]]
== References ==
<references>
<ref name="chioka">Chioka (2015). "What is Motion-To-Photon Latency?" Available: http://www.chioka.in/what-is-motion-to-photon-latency/</ref>
<ref name="weech2020">Weech, S., Kenny, S., & Brüngül, E. (2020). "Latency and Cybersickness: Impact, Causes, and Measures. A Review." Frontiers in Virtual Reality, 1, 582204. Available: https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2020.582204/full</ref>
<ref name="unc">University of North Carolina at Chapel Hill. "Managing Latency in Complex Augmented Reality Systems." Available: http://www.cs.unc.edu/Research/us/Latency/ManagingRelativeLatency.html</ref>
<ref name="wagner2018">Wagner, D. (2018). "Motion-to-Photon Latency in Mobile AR and VR." Medium (DAQRI). Available: https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926</ref>
<ref name="jerald2012">Jerald, J. (2012). "The VR Book: Human-Centered Design for Virtual Reality." Morgan & Claypool Publishers.</ref>
<ref name="carmack2013">Carmack, J. (2013). "Latency Mitigation Strategies." Presentation at GDC 2013. Available: https://www.wired.com/2013/02/john-carmacks-latency-mitigation-strategies/</ref>
<ref name="varjo">Varjo. "Latency in virtual and mixed reality explained." Available: https://support.varjo.com/hc/en-us/latency</ref>
<ref name="brennan2022">Brennan, C., Pike, M., & Humphreys, G. (2022). "Measuring and ameliorating motion-to-photon latency in virtual reality." Available: https://pmc.ncbi.nlm.nih.gov/articles/PMC10616216/</ref>
<ref name="abrash">Abrash, M. (2014). "What VR Could, Should, and Almost Certainly Will Be within Two Years." Presentation at Steam Dev Days.</ref>
<ref name="arm2016">Jeffries, F. (2016). "Virtual Reality – Blatant Latency and How to Avoid It." ARM Community Blog. Available: https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/virtual-reality-blatant-latency-and-how-to-avoid-it</ref>
<ref name="waltemate2016">Waltemate, T., et al. (2016). "The Impact of Latency on Perceptual and Motor Performance in Immersive Virtual Environments." IEEE VR 2016.</ref>
<ref name="stauffert2020">Stauffert, J.P., Niebling, F., & Latoschik, M.E. (2020). "Latency and Cybersickness: Impact, Causes, and Measures. A Review." Frontiers in Virtual Reality.</ref>
<ref name="oculus_atw">Oculus. "Asynchronous Timewarp Examined." Available: https://developers.meta.com/horizon/blog/asynchronous-timewarp-examined/</ref>
<ref name="oculus_asw">Oculus. "Asynchronous Spacewarp." Available: https://developers.meta.com/horizon/blog/asynchronous-spacewarp/</ref>
<ref name="pico_late">Pico Developer. "Bringing Late Latching to Unreal Engine 5." Available: https://developer.picoxr.com/news/LateLatching/</ref>
<ref name="optofidelity">OptoFidelity. "Apple Vision Pro Benchmark Test 2: Angular Motion-to-Photon." Available: https://www.optofidelity.com/insights/blogs/apple-vision-pro-bencmark-test-2.-angular-motion-to-photon-latency-in-vr</ref>
<ref name="vrmark">UL Solutions. "VRMark Latency Measurement." Available: https://benchmarks.ul.com/vrmark</ref>
<ref name="cognitive">Microsoft Research. "Measuring System Visual Latency through Cognitive Psychophysics." IEEE VR 2020.</ref>
<ref name="tencent">Tencent Cloud. "Network Latency Impact on VR/AR Applications." Available: https://www.tencentcloud.com/techpedia/112184</ref>
<ref name="haptics">Meta Reality Labs. "Measuring the perception of latency with a haptic glove." Available: https://tech.facebook.com/reality-labs/2019/7/measuring-the-perception-of-latency-with-a-haptic-glove/</ref>
<ref name="nvidia_foveated">NVIDIA Research. "Latency Requirements for Foveated Rendering in Virtual Reality." Available: https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf</ref>
<ref name="lincoln2016">Lincoln, P., et al. (2016). "From Motion to Photons in 80 Microseconds." IEEE VR 2016 Best Paper.</ref>
</references>


[[Category:Terms]]
[[Category:Technical specifications]]
[[Category:Human factors]]
[[Category:Display technology]]
