Motion-to-photon latency


Motion-to-photon latency (MTP latency), also known as end-to-end latency, is a critical performance metric in virtual reality (VR) and augmented reality (AR) systems. It measures the total time delay from the moment a user initiates a physical movement (e.g., turning their head or moving a controller) to the moment the first photons of light reflecting that movement are emitted from the system's display and reach the user's eyes.[1][2]

Minimizing this latency is paramount for creating a convincing and comfortable immersive experience. High latency creates a perceptible lag between a user's actions and the system's visual feedback. In VR, where users are fully immersed, this breaks the sense of presence and is a primary cause of cybersickness, a form of motion sickness with symptoms including nausea, disorientation, and eye strain.[3][4] In AR and mixed reality (MR), where virtual elements are overlaid onto the real world, the challenge is even greater, as even small delays can cause noticeable misregistration, making virtual objects appear to "swim" or float disconnectedly from their real-world anchors.[5]

The industry-accepted target for high-quality VR is an MTP latency of under 20 milliseconds (ms), a threshold below which the delay is generally imperceptible.[6] For AR, requirements are far stricter, often demanding latencies below 7 ms.[7] Thanks to advances in hardware and software, MTP latency has dropped from 50-60 ms in early VR headsets to as low as 2-3 ms in modern systems, transforming the technology's comfort and viability.[8]

The Motion-to-Photon Pipeline

MTP latency is not a single delay but the sum of many small delays that occur sequentially in a complex hardware and software pipeline. This pipeline begins with physical motion and ends with light being emitted from the head-mounted display (HMD).[9][10] Understanding each stage is crucial for identifying sources of latency and developing effective mitigation strategies.

Pipeline Stages

  1. Motion Detection & Sensor Sampling: Physical motion is captured by sensors, primarily an Inertial Measurement Unit (IMU) containing accelerometers and gyroscopes, often supplemented by optical tracking systems (e.g., cameras). The rate at which these sensors are sampled introduces the first delay. IMUs operate at very high frequencies (e.g., 500-1000 Hz), making their individual sample latency very low (~1 ms).[11][12] However, camera-based tracking systems are much slower (typically 30-90 Hz), which can introduce a significant initial delay of 15-33 ms.[11]
  2. Data Transmission & Processing: The raw sensor data is transmitted from the HMD or controller to the host computer (e.g., via USB-C or wirelessly). This data is then processed by a tracking system, which fuses inputs from multiple sensors (e.g., using visual-inertial odometry) to calculate the device's precise position and orientation in six degrees of freedom (6DoF).[2]
  3. Application & Game Engine Processing: The host CPU receives the pose data. The application or game engine then executes its logic for the upcoming frame based on this new pose. This includes tasks like physics simulations, AI calculations, and processing user inputs.[1][13]
  4. Render Pipeline Execution: The CPU sends drawing commands to the GPU. The GPU then renders the 3D scene from the perspective of the new pose into an off-screen image buffer. This is often the most time-consuming part of the pipeline. To improve performance on traditional benchmarks, graphics drivers may aggressively buffer these commands, which can add one or more full frames of latency.[6]
  5. Compositing and Post-Processing: The rendered image undergoes final processing steps. This includes applying lens distortion correction to compensate for the HMD's optics, chromatic aberration correction, and executing latency-mitigation techniques like reprojection.[1]
  6. Display Refresh & Scan-out: The final image is sent to the HMD's display. Further delay is introduced by the display's refresh rate (e.g., a 90 Hz display introduces a potential wait of up to 11.1 ms) and the physical time it takes for the display to "scan out" the image, typically from top to bottom. On many displays, the bottom of the screen is updated several milliseconds after the top.[6][11] Finally, the display's pixel response time—the time it takes for a pixel to change from one color to another—adds a final small delay. OLED panels, for example, can switch pixels in under 1 ms.[14]

Latency Budget Allocation

To achieve the sub-20 ms target, every millisecond in the pipeline is critical. The following table provides an example breakdown of a typical latency budget, illustrating how quickly delays can accumulate.

Example Motion-to-Photon Latency Budget Breakdown
Stage | Description | Estimated Latency Contribution
1. Sensor Sampling | Time for the IMU or camera to capture physical motion. IMUs are fast (~1000 Hz), while cameras are slower (30-90 Hz). | 1–2 ms (IMU); 15–33 ms (camera)[11]
2. Data Transmission & Processing | Delay in sending sensor data to the host CPU and for the tracking system to compute the 6DoF pose. | 1–4 ms[15]
3. Application / CPU Processing | Time for the game engine/application to process the new pose and prepare the scene for the next frame. | Variable (typically aims for < 5–10 ms)[13]
4. GPU Render Queue | Time the frame waits in the GPU driver's command buffer before rendering begins. Aggressive buffering can increase this delay. | 1–2 frames (can be > 16 ms)[6]
5. GPU Rendering | Time for the GPU to render the scene; dependent on scene complexity and GPU power. For a 90 Hz target, this must be under 11.1 ms. | < 11.1 ms (at 90 Hz)[13]
6. Compositor & Timewarp | Time for the system compositor to apply distortion correction and run latency-hiding techniques like Asynchronous Timewarp. | 1–2 ms[16]
7. Display Scan-out & Response Time | Time for the display to physically draw the frame (scan-out) and for pixels to change color (response time). OLEDs are much faster than LCDs. | 1–16.7 ms (scan-out at 60 Hz) + 1–8 ms (pixel response)[6][17]
Total (Example) | Sum of all stages. Without mitigation, this can easily exceed 50–100 ms. | > 50 ms
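
How quickly an unmitigated pipeline overshoots the comfort target can be shown with a short calculation. The following sketch (Python) sums a set of hypothetical per-stage delays loosely modeled on the table above and compares the total against the 20 ms threshold; the stage names and numbers are illustrative assumptions, not measurements of any particular headset.

    # Illustrative motion-to-photon budget check (hypothetical values, in milliseconds).
    STAGES_MS = {
        "sensor_sampling": 1.0,              # IMU at ~1000 Hz
        "transmission_and_tracking": 2.0,
        "application_cpu": 5.0,
        "gpu_render_queue": 16.7,            # one buffered frame at 60 Hz
        "gpu_rendering": 11.1,               # full frame budget at 90 Hz
        "compositor": 1.5,
        "display_scanout_and_response": 8.0,
    }

    TARGET_MS = 20.0  # widely cited VR comfort threshold

    total = sum(STAGES_MS.values())
    print(f"Unmitigated total: {total:.1f} ms (target {TARGET_MS} ms)")
    for name, ms in sorted(STAGES_MS.items(), key=lambda kv: -kv[1]):
        print(f"  {name:<32} {ms:5.1f} ms ({ms / total:5.1%} of total)")

With these assumed values the total lands well above 20 ms, which is why the mitigation techniques described later in this article attack the largest contributors (render queueing and display wait) first.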

Latency Thresholds and Human Perception

The acceptable limit for MTP latency is not arbitrary; it is dictated by the thresholds of human perception. Extensive research and development have established clear targets that differ significantly between virtual and augmented reality.

Approximate Latency Thresholds for User Comfort
Display Mode | Approximate Latency Threshold (for comfort)
Virtual Reality (VR) | ~20 ms or less[18]
Augmented Reality (AR, optical see-through) | ~5–10 ms[19]
Mixed Reality (MR, video see-through) | ~20 ms[18]

The "20 Millisecond Rule" for Virtual Reality

For VR, a consensus has formed around the 20 millisecond rule: to be generally imperceptible and provide a comfortable, high-quality experience, the end-to-end MTP latency should remain below 20 ms.[1][20][21][22] This figure was largely established and popularized through the foundational work of John Carmack at Oculus.[6] Studies and user feedback indicate that latencies below this threshold are not consciously detected by most people, allowing for a convincing sense of presence.[20] Research by Jason Jerald in 2009 found that trained observers could detect latency differences as small as 3.2 ms.[23] As latency creeps above 20 ms, the lag becomes noticeable, and once it exceeds 50-60 ms, the risk of discomfort and cybersickness increases dramatically.[20][24]

Stricter Requirements for Augmented Reality

Augmented Reality (AR) systems, particularly optical see-through (OST) devices like the Microsoft HoloLens, face a far more demanding latency challenge.[25] In an OST system, the user views the real world directly through transparent optics, which provides an instantaneous, zero-latency visual reference. Any delay in rendering the virtual overlay causes it to lag behind the stable real world during head movements, an effect often described as "swimming" or misregistration.[26]

Because the human visual system is extremely sensitive to this relative motion, the acceptable MTP latency for AR is much lower than for VR. Industry experts and researchers, drawing on work by Ellis and Adelstein at NASA, cite a target of under 7 ms, and ideally under 5 ms, for AR overlays to appear convincingly locked to the real world.[7][27][11][28] This makes latency a significantly greater engineering hurdle for AR than for VR.[25]

Pioneering Perspectives: John Carmack and Michael Abrash

The modern focus on minimizing MTP latency was championed by key industry figures.

  • John Carmack identified latency as "one of the most critical factors" for VR while at Oculus. His extensive writings and talks educated the developer community on the sources of latency throughout the PC graphics pipeline—from sensor sampling to driver buffering and display scan-out—and he pioneered techniques like reprojection to mitigate it.[6][22]
  • Michael Abrash, a leading researcher at Valve and later Meta, famously described latency as the "sine qua non"—the essential, indispensable ingredient—of VR and AR.[29] He argued that while 20 ms was a viable target for VR, the true requirement for believable AR could be as low as 15 ms or even 7 ms.[7] In 2013, he noted that typical systems were achieving around 36 ms, a figure he described as "a long way from 20 ms, and light-years away from 7 ms," highlighting the scale of the challenge ahead.[7]

Measurement Techniques

Accurately quantifying MTP latency is a complex task. Simply instrumenting the software to report its own timing is often unreliable, as the measurement process itself can introduce additional latency and alter the system's behavior.[6] Consequently, the standard for precise measurement relies on external hardware-based systems that can observe both the physical motion and the resulting photonic output without interfering with the device under test.

Hardware-Based Measurement Systems

Several methods have been developed to provide objective, high-precision latency measurements.

High-Speed Camera and Motion Rig Analysis

A widely used and robust method uses a high-speed camera to simultaneously film a physical motion and the corresponding update on the HMD's display.[6][2][30] The process typically involves:

  1. Controlled Motion: The HMD or controller is mounted on a precisely controlled robotic motion rig, such as a rotary platform, a linear rail system, or a pendulum.[10][31]
  2. High-Speed Recording: A camera recording at a high frame rate (e.g., 240 fps, 960 fps, or higher) captures both the physical object and the HMD screen in the same shot.[30][32] Commercial tools like OptoFidelity's BUDDY system represent the state-of-the-art for this method.[33]
  3. Analysis: The recorded video is analyzed frame-by-frame. The latency is calculated by counting the number of frames between the initiation of physical motion and the first visible change on the display.[6][30]
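
The analysis in step 3 reduces to a frame-count difference divided by the camera's frame rate. The sketch below assumes the video frame index at which the rig starts moving and the index at which the display first reacts have already been identified (manually or with simple image differencing); the function name and parameters are illustrative, not part of any specific tool.

    def mtp_latency_ms(motion_frame: int, display_frame: int, camera_fps: float) -> float:
        """Estimate motion-to-photon latency from high-speed video frame indices.

        motion_frame:  frame index where physical motion is first visible
        display_frame: frame index where the HMD image first reacts
        camera_fps:    recording frame rate of the high-speed camera
        """
        if display_frame < motion_frame:
            raise ValueError("display update cannot precede the physical motion")
        return (display_frame - motion_frame) * 1000.0 / camera_fps

    # Example: motion starts at frame 120, display reacts at frame 125, filmed at 240 fps
    # -> 5 frames * (1000 / 240) ms/frame ≈ 20.8 ms
    print(f"{mtp_latency_ms(120, 125, 240):.1f} ms")

Note that the camera's frame interval (about 4.2 ms at 240 fps) bounds the resolution of the measurement, which is why higher recording rates are preferred.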

Photodiode and Oscilloscope Methods

For even higher temporal precision, a method using a photodiode (a high-speed light sensor) and an oscilloscope can be employed.[31][34]

  1. A photodiode is physically attached to a specific area of the HMD screen.
  2. A physical event (e.g., a pendulum swinging past a sensor) triggers the start of a timer on an oscilloscope.[31][34]
  3. The VR application is programmed to change the color of the screen area under the photodiode (e.g., from black to white) as soon as it virtually registers that same event.
  4. The photodiode detects the change in light and sends a signal to the oscilloscope, stopping the timer. The elapsed time displayed on the oscilloscope is a direct measurement of the MTP latency.[31]
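
If the oscilloscope (or a data-acquisition board standing in for one) exports the trigger and photodiode channels as time-stamped samples, the latency is simply the gap between the two threshold crossings. The sketch below is a minimal offline analysis of such a capture; the channel layout, thresholds, and synthetic signal are assumptions for illustration.

    def first_crossing(times, samples, threshold):
        """Return the first timestamp at which the signal rises above threshold."""
        for t, v in zip(times, samples):
            if v >= threshold:
                return t
        raise ValueError("signal never crossed the threshold")

    def photodiode_latency_ms(times, trigger, photodiode,
                              trigger_threshold=1.0, light_threshold=0.5):
        """Latency between the physical trigger event and the screen patch changing color."""
        t_event = first_crossing(times, trigger, trigger_threshold)     # pendulum passes sensor
        t_photon = first_crossing(times, photodiode, light_threshold)   # screen patch turns white
        return (t_photon - t_event) * 1000.0

    # Tiny synthetic capture: trigger fires at t = 0.010 s, photodiode responds at t = 0.032 s.
    times      = [i / 10000.0 for i in range(500)]          # 10 kHz sampling
    trigger    = [1.5 if t >= 0.010 else 0.0 for t in times]
    photodiode = [0.8 if t >= 0.032 else 0.1 for t in times]
    print(f"{photodiode_latency_ms(times, trigger, photodiode):.1f} ms")  # ≈ 22.0 ms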

Differentiating Initial vs. Continuous Latency

A crucial finding from modern latency measurement studies is that MTP latency is not a static value; it changes dynamically depending on the nature of the user's movement.[2][3]

  • Initial Latency (or Unpredicted Latency): This is the latency measured at the very onset of a sudden, abrupt movement from a standstill. In this phase, latency is typically at its highest because latency-mitigation algorithms like predictive tracking have not yet collected enough motion data to build an accurate model of the movement's trajectory.[2][9][32]
  • Continuous Latency (or Predicted Latency): This is the latency measured during a smooth, ongoing, and predictable movement (e.g., the middle of a steady head turn). During such movements, prediction algorithms have sufficient data to accurately extrapolate the future pose of the device, functionally reducing the perceived latency to a much lower value, in some cases to just a few milliseconds.[2][9]

Industry Standards

The importance of latency is reflected in formal standards. IEEE Standard 3079.1 specifies requirements and measurement procedures for "Motion to Photon (MTP) Latency in Virtual Environments."[35] Platform manufacturers also set strict internal targets (e.g., Sony < 18 ms for PSVR, Meta 13 ms at 90 Hz).[35]

Latency Mitigation Strategies

Given the physical and computational constraints that make zero latency impossible, VR and AR system designers have developed a sophisticated suite of software and hardware techniques to reduce or perceptually hide MTP latency.

Software and Algorithmic Solutions

Predictive Tracking

Predictive tracking is a foundational and universally employed technique to combat latency.[36] Instead of rendering the virtual scene based on where the user's head is, the system predicts where it will be at the future time when the frame is actually displayed.[37]

  • Mechanism: The tracking system analyzes the current motion's velocity and acceleration to extrapolate a future pose. The game engine then uses this predicted pose to render the scene.
  • Algorithms: These range from simple dead reckoning to more advanced models like Kalman filters, which are considered the gold standard for fusing noisy sensor data and predicting future states.[37][38]
  • Limitations: Prediction is most effective for smooth movements and less accurate for sudden, unpredictable changes in direction, which is why initial latency is higher than continuous latency.[2]
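
As a concrete illustration of the mechanism described above, the sketch below performs simple dead reckoning of head yaw: it extrapolates the latest measured angle forward by the expected pipeline delay using the current angular velocity (and, optionally, acceleration). Real systems predict full 3-D orientation with quaternions and Kalman-style filters; this scalar version only shows the idea, and all names and numbers are illustrative.

    def predict_yaw_deg(yaw_deg: float,
                        yaw_velocity_dps: float,
                        prediction_horizon_ms: float,
                        yaw_accel_dps2: float = 0.0) -> float:
        """Dead-reckon a single head-yaw angle 'prediction_horizon_ms' into the future.

        yaw_deg:               latest measured yaw (degrees)
        yaw_velocity_dps:      angular velocity from the gyroscope (degrees/second)
        yaw_accel_dps2:        optional angular acceleration (degrees/second^2)
        prediction_horizon_ms: expected remaining pipeline latency (milliseconds)
        """
        dt = prediction_horizon_ms / 1000.0
        return yaw_deg + yaw_velocity_dps * dt + 0.5 * yaw_accel_dps2 * dt * dt

    # A head turning at 200 deg/s with 15 ms of remaining pipeline latency would otherwise
    # be rendered 3 degrees behind where it will actually be at display time.
    print(predict_yaw_deg(yaw_deg=10.0, yaw_velocity_dps=200.0, prediction_horizon_ms=15.0))
    # -> 13.0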

Asynchronous Timewarp (ATW)

Asynchronous Timewarp (ATW) is a powerful reprojection technique that decouples the final presentation of an image from the application's main render loop.[16][39]

  • Mechanism: ATW runs on a separate, high-priority thread. Just before the display refreshes (an event known as vsync), ATW takes the most recently completed frame and "warps" it based on the very latest head-tracking data. This warped image is what gets sent to the display.[16] This process is computationally cheap because basic ATW only corrects for rotational head movement.
  • Benefit: If the main application is slow and fails to render a new frame in time, ATW can still take the previous frame and re-project it with the new head rotation. This prevents the jarring "world judder" that would otherwise occur, making the experience perceptually more comfortable.[13][16]
  • Limitations: Because standard ATW does not account for translational (positional) movement, it can introduce an artifact known as positional judder, where near-field objects appear to stutter when the user moves their head side-to-side.[16]
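
A minimal way to picture the warp step is as a pure rotation correction applied in the compositor: the delta between the orientation the frame was rendered with and the newest tracked orientation becomes a corrective rotation of the view. The quaternion sketch below (plain Python, no graphics API) computes that delta; an actual compositor would convert it to a rotation matrix and feed it to a full-screen re-projection shader. Function names are illustrative, not any runtime's API.

    def quat_mul(a, b):
        """Hamilton product of two quaternions given as (w, x, y, z)."""
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def quat_conjugate(q):
        w, x, y, z = q
        return (w, -x, -y, -z)

    def timewarp_delta(render_orientation, latest_orientation):
        """Rotation mapping the pose the frame was rendered with onto the newest pose.

        Both arguments are unit quaternions (w, x, y, z). Applying the result when
        resampling the finished frame lets the image track late head rotation
        without re-rendering the scene.
        """
        return quat_mul(latest_orientation, quat_conjugate(render_orientation))

    # Example: the frame was rendered for an identity orientation, but by vsync the
    # head has yawed slightly; the delta is exactly that small yaw rotation.
    identity = (1.0, 0.0, 0.0, 0.0)
    small_yaw = (0.9998477, 0.0, 0.0174524, 0.0)   # ~2 degrees about the Y (up) axis
    print(timewarp_delta(identity, small_yaw))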

Asynchronous Spacewarp (ASW) and Reprojection

Asynchronous Spacewarp (ASW) is an evolution of ATW designed to address the limitations of rotational-only reprojection and to handle cases where the application cannot consistently meet the target frame rate.[40]

  • Mechanism: When an application's performance drops, the system can intentionally lock its render rate to half (e.g., 45 Hz). ASW then generates a synthetic, extrapolated frame between each real frame, bringing the frame rate back up to the display's native rate (e.g., 90 Hz).[41]
  • Difference from ATW: ASW analyzes the motion between the last two rendered frames to create motion vectors for objects in the scene. This allows it to handle not just head rotation, but also user translation, controller movement, and animations. Modern versions like ASW 2.0 incorporate the depth buffer to reduce visual artifacts.[40][42]
  • Limitations: ASW is a form of motion estimation and can produce visual artifacts like object warping, ghosting, or shimmering, especially around the edges of moving objects or in areas of disocclusion.[43]
  • Related Techniques: Other platforms have similar technologies, such as Valve's Motion Smoothing.[43][44] More advanced research includes ML-based techniques like PredATW.[45]
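
The core extrapolation step can be sketched as pushing each pixel forward along half of its estimated motion vector to synthesize the in-between frame. The NumPy sketch below assumes per-pixel motion vectors have already been estimated from the last two rendered frames; it ignores depth, disocclusion handling, and the other refinements production implementations add, so it illustrates the principle rather than any shipping algorithm.

    import numpy as np

    def extrapolate_frame(prev_frame: np.ndarray, motion_vectors: np.ndarray) -> np.ndarray:
        """Synthesize a frame half a step ahead of prev_frame.

        prev_frame:     (H, W) grayscale image (the last real frame)
        motion_vectors: (H, W, 2) per-pixel (dy, dx) motion, in pixels, estimated
                        from the previous two real frames
        Each pixel is pushed forward along half of its motion vector; regions the
        forward warp leaves untouched keep the previous frame's content, which is
        one source of the ghosting artifacts mentioned above.
        """
        h, w = prev_frame.shape
        out = prev_frame.copy()
        ys, xs = np.mgrid[0:h, 0:w]
        new_y = np.clip(ys + np.round(motion_vectors[..., 0] * 0.5).astype(int), 0, h - 1)
        new_x = np.clip(xs + np.round(motion_vectors[..., 1] * 0.5).astype(int), 0, w - 1)
        out[new_y, new_x] = prev_frame[ys, xs]
        return out

    # Example: a bright square moving 4 px to the right per frame appears 2 px further
    # to the right in the synthesized half-step frame.
    frame = np.zeros((8, 8))
    frame[3:5, 2:4] = 1.0
    mv = np.zeros((8, 8, 2))
    mv[..., 1] = 4.0
    print(extrapolate_frame(frame, mv))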

System-Level Optimizations

  • Late Latching: A strategy that delays the sampling of the head and controller pose to the last possible moment before the GPU begins rendering.[46] This ensures the pose data is as fresh as possible.
  • Direct Mode Rendering: A driver-level feature that allows the VR runtime to write directly to the headset display, bypassing the operating system's desktop compositor, which could otherwise add one or more frames of latency.[47]
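
The late-latching idea amounts to a change in loop structure: instead of sampling the pose once at the start of simulation and carrying it through the whole frame, the renderer re-reads the freshest pose immediately before GPU submission (and the compositor reads it once more before the final warp). The sketch below is a schematic of that ordering under assumed stand-in functions, not any particular runtime's API.

    import time

    def get_latest_pose():
        """Stand-in for a tracker query (illustrative): returns a timestamped pose."""
        t = time.perf_counter()
        return t, f"pose@{t:.6f}"

    def render_frame(simulate, render, submit):
        # 1. Simulate game state with an early pose sample (fine for gameplay logic).
        _, early_pose = get_latest_pose()
        scene = simulate(early_pose)

        # 2. Late latch: re-sample the pose at the last possible moment before GPU
        #    submission so the view transform uses the freshest tracking data.
        _, late_pose = get_latest_pose()
        commands = render(scene, late_pose)

        # 3. Submit; the compositor will still apply a final timewarp with an even
        #    newer pose just before scan-out.
        submit(commands)

    # Trivial stand-ins so the sketch runs end to end.
    render_frame(simulate=lambda pose: {"scene_built_with": pose},
                 render=lambda scene, pose: {"draw_calls_for": pose, **scene},
                 submit=print)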

Hardware and Display Technologies

High Refresh Rates

A fundamental way to reduce latency is to use a display with a high refresh rate. Increasing the refresh rate shortens the time interval between updates (e.g., 60 Hz = 16.7 ms/frame, 90 Hz = 11.1 ms/frame, 120 Hz = 8.3 ms/frame), directly reducing the maximum "display wait" portion of the latency budget.[20][48]

Low-Persistence Displays: OLED vs. LCD

A critical factor in reducing motion blur, which is perceptually linked to latency, is display persistence. Persistence is the length of time a pixel remains illuminated during a single frame.[49]

  • Full Persistence: On a traditional display, a frame is shown for the entire refresh duration (e.g., 16.7 ms at 60 Hz). As the user's eye moves, the static image is smeared across the retina, creating significant motion blur.[50]
  • Low Persistence: Modern VR displays use a low-persistence technique. They illuminate the pixels for only a very brief portion of the frame time (e.g., 1-3 ms, or ~20% of the cycle) and keep the screen dark for the remainder. This strobing is too fast to be perceived as flicker but ensures the eye sees a sharp, distinct image, similar to the behavior of a CRT display. This dramatically reduces motion blur.[49][51]
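
The link between persistence and perceived blur can be quantified with a common rule of thumb: during smooth pursuit, the image is smeared across the retina by roughly the eye's angular speed multiplied by the time the pixels stay lit. The sketch below applies that rule; the example numbers (eye speed, pixels per degree) are assumptions chosen only for illustration.

    def motion_blur_px(persistence_ms: float, eye_speed_deg_per_s: float,
                       pixels_per_degree: float) -> float:
        """Approximate retinal smear, in display pixels, during smooth pursuit.

        Rule of thumb: blur width ≈ angular speed × persistence, converted to pixels.
        """
        return eye_speed_deg_per_s * (persistence_ms / 1000.0) * pixels_per_degree

    # Assumed example: tracking an object at 60 deg/s on a ~20 pixel-per-degree display.
    print(f"Full persistence (16.7 ms): {motion_blur_px(16.7, 60, 20):.1f} px of blur")
    print(f"Low persistence   (2.0 ms): {motion_blur_px(2.0, 60, 20):.1f} px of blur")

With these assumed numbers, cutting persistence from a full 60 Hz frame to about 2 ms reduces the smear from roughly 20 pixels to about 2, which is why low persistence is treated as essential for sharp imagery during head motion.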

The choice of display technology impacts the ability to achieve low persistence:

  • OLED: Organic light-emitting diode displays are ideal, as each pixel is its own light source and can be turned on and off almost instantaneously, allowing for precise control over illumination time.[17]
  • LCD: Liquid-crystal display panels traditionally have slower pixel response times. However, modern "fast LCD" panels have dramatically improved response times and are now widely used in many high-performance VR headsets, offering benefits like higher subpixel density, which reduces the screen-door effect.[48]

Global Shutter Cameras

For optical tracking, global shutter cameras are preferred over rolling shutter cameras. A global shutter captures the entire image at the exact same instant, providing a clear, un-distorted tracking reference. A rolling shutter captures the image line-by-line, which can cause wobble or "jello" artifacts during fast motion, complicating the tracking calculations.[52]

Application and Comparative Analysis

Latency in Commercial VR Headsets

Empirical studies using high-speed camera systems have measured the MTP latency of various consumer VR headsets. These studies consistently validate the importance of distinguishing between initial latency at the onset of a sudden movement and the much lower continuous latency achieved once prediction algorithms are active.

Comparative MTP Latency in Commercial VR Headsets
Headset | Year | Motion-to-Photon Latency | Refresh Rate | Resolution per Eye | Display Type | Notable Features
Historical Systems
Oculus Rift DK1 | 2013 | 50–60 ms | 60 Hz | 640×800 | LCD | First consumer dev kit
Oculus Rift DK2 | 2014 | ~30 ms | 75 Hz | 960×1080 | OLED | Low-persistence breakthrough
Oculus Rift CV1 | 2016 | ~22 ms (2 ms predicted) | 90 Hz | 1080×1200 | OLED | Established 90 Hz standard
HTC Vive | 2016 | 21–42 ms (2–13 ms predicted)[53] | 90 Hz | 1080×1200 | OLED | Lighthouse tracking
Current Generation
Meta Quest 3 | 2023 | 2–3 ms[8] | 90–120 Hz | 2064×2208 | LCD | Best-in-class latency
Meta Quest Pro | 2022 | 2–3 ms[8] | 90 Hz | 1800×1920 | LCD + local dimming | Mixed reality
Valve Index | 2019 | 21–42 ms (2–13 ms predicted)[53] | 80–144 Hz | 1440×1600 | LCD | 130° FOV
Apple Vision Pro | 2024 | ~3 ms angular, 11 ms see-through[8] | 90 Hz | 3660×3200 | micro-OLED | R1 chip
PlayStation VR2 | 2023 | Not published | 90–120 Hz | 2000×2040 | OLED (HDR) | Eye tracking

The Augmented Reality Challenge

As previously noted, AR's requirement for virtual objects to remain perfectly registered with the real world imposes much stricter latency constraints. The specific nature of this challenge depends on the type of AR display technology used.

Optical See-Through (OST) vs. Video See-Through (VST) Systems

  • Optical See-Through (OST) AR: In OST systems (e.g., Microsoft HoloLens), the user views reality directly through transparent displays. In this model, only the virtual content is subject to MTP latency. The primary engineering goal is to reduce the latency of this virtual content to an absolute minimum (ideally < 5 ms) to prevent it from "swimming" or lagging behind the perfectly stable, zero-latency real world.[26][25]
  • Video See-Through (VST) AR: In VST systems (used for the passthrough mode in VR headsets like the Meta Quest 3 and Apple Vision Pro), the user views reality through a set of cameras that stream video to the internal, opaque displays. In this architecture, both the real-world video feed and the virtual overlays are subject to latency.[26] This introduces a different engineering trade-off. While achieving an absolute MTP latency of < 5 ms for both streams is extremely difficult, a more achievable and perceptually effective goal is to match the latency of the virtual objects to the latency of the passthrough video. If both the real-world video and the virtual graphics are delayed by the same amount (e.g., 11 ms on Apple Vision Pro[8]), the virtual objects will appear perfectly registered and stable relative to the video feed, even though the entire combined view lags slightly behind the user's physical head motion. This prioritizes registration quality over absolute temporal accuracy.

Historical Development

The Oculus Rift DK1 in March 2013 brought VR to developers but suffered from 50-60 ms of latency that caused widespread nausea. John Carmack joining Oculus in August 2013 marked a turning point, as he focused the team on a "relentless pursuit of low latency." His October 2013 presentation outlined the comprehensive mitigation strategies that would become industry standard.[52]

The Oculus Rift DK2 (2014) was the first major breakthrough, introducing a low-persistence OLED display and positional tracking and cutting latency to ~30 ms. The March 2016 consumer launches of the Oculus Rift CV1 and HTC Vive established the modern baseline, with 90 Hz OLED displays achieving approximately 20-22 ms of baseline latency (before prediction).

The 2023-2024 generation of headsets, like the Meta Quest 3 and Apple Vision Pro, achieved effectively imperceptible angular latency. The Quest 3's measured 2-3 ms represents a roughly 95% reduction from the DK1's 50-60 ms over ten years, marking one of VR's greatest technical achievements.[8]

Future Challenges

With angular MTP latency largely solved for local rendering, future challenges shift to other parts of the system:

  • Cloud VR: Streaming VR from a remote server reintroduces significant network latency (e.g., 20-40 ms on 5G) that must be hidden from the user, requiring advanced prediction and edge computing.
  • Eye tracking Latency: For foveated rendering to work, the system must detect where the user is looking and render a high-resolution image in that spot before the eye moves again (a saccade). This requires an "eye-motion-to-photon" latency that is also exceptionally low.
  • Varifocal displays: Systems that can dynamically change focus must do so in sync with the user's eye movements, adding another latency-critical pipeline.
  • Higher Resolutions: Pushing to "retinal resolution" (e.g., 8K per eye) while maintaining high frame rates (90-120Hz) creates an immense rendering bottleneck that can reintroduce latency if not managed.

References

  1. "Motion-to-Photon Latency". Unity Technologies. https://unity.com/glossary/motion-to-photon-latency.
  2. Warburton, M.; Mon-Williams, M.; Mushtaq, F.; Morehead, J. R. (2022). "Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems". Behavior Research Methods, 55(7), 3658–3678. doi:10.3758/s13428-022-01983-5. PMID 36217006. https://pmc.ncbi.nlm.nih.gov/articles/PMC10616216/.
  3. Niebling, F. (2020). "Latency and Cybersickness: A Survey". pp. 586–603. doi:10.1109/ISMAR50242.2020.00088. https://doaj.org/article/2dfbbdc3fe704977bc34b594f96fdf8b.
  4. Lawson, B. (2020). "Identifying Causes of and Solutions for Cybersickness in Immersive Technology: A Review". Frontiers in Virtual Reality, 1, 587698. doi:10.3389/frvir.2020.587698. https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2020.587698/full.
  5. DAQRI (2018-08-20). "Motion to Photon Latency in Mobile AR and VR". Medium. https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926.
  6. "John Carmack's battle with 20 millisecond latency". PCGamesN. https://www.pcgamesn.com/virtual-reality-john-carmacks-battle-20-millisecond-latency.
  7. "Valve's Michael Abrash: Latency is getting in the way of VR". GamesIndustry.biz. https://www.gamesindustry.biz/valves-michael-abrash-latency-is-getting-in-the-way-of-vr.
  8. OptoFidelity. "Apple Vision Pro Benchmark Test 2: Angular Motion-to-Photon Latency in VR". https://www.optofidelity.com/insights/blogs/apple-vision-pro-bencmark-test-2.-angular-motion-to-photon-latency-in-vr.
  9. Warburton, M. et al. (2022-06-27). "Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems" (preprint). bioRxiv. https://www.biorxiv.org/content/10.1101/2022.06.24.497509v1.full-text.
  10. Ho, A. (2018). "A Time-Sequential Measurement System for Motion-to-Photon Latency in a Virtual Reality Head-Mounted Display". Electronics, 7(9), 171. doi:10.3390/electronics7090171. https://www.mdpi.com/2079-9292/7/9/171.
  11. "Motion-to-Photon Latency in Mobile AR and VR". DAQRI (Medium). https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926.
  12. DAQRI. "Motion to Photon Latency in Mobile AR and VR". Medium. https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926.
  13. "Understanding Gameplay Latency for Oculus Quest, Oculus Go, and Gear VR". Meta. https://developers.meta.com/horizon/blog/understanding-gameplay-latency-for-oculus-quest-oculus-go-and-gear-vr/.
  14. OLED-Info. "Oculus VR shows a new HMD prototype with a low-persistence OLED". https://www.oled-info.com/oculus-vr-shows-new-hmd-prototype-low-persistance-oled.
  15. "Question: Optimal latency of perceived hand tracking/viewing in VR?". Reddit. https://www.reddit.com/r/oculus/comments/2s0eu3/question_optimal_latency_of_perceived_hand/.
  16. "Asynchronous Timewarp Examined". Meta. https://developers.meta.com/horizon/blog/asynchronous-timewarp-examined/.
  17. "OLEDs are substantially inferior to RGB LCDs in terms of both sharpness... and due to oversaturation". Reddit. https://www.reddit.com/r/oculus/comments/212imk/oleds_are_substantially_inferior_to_rgb_lcds_in/.
  18. Varjo (2025). "Latency in virtual and mixed reality explained". Varjo Support Knowledge Base. support.varjo.com.
  19. Wagner, D. (2018-08-20). "Motion-to-Photon Latency in Mobile AR and VR". DAQRI Blog (Medium).
  20. "Motion-to-Photon Latency". The Spatial Studio. https://thespatialstudio.de/en/xr-glossary/motion-to-photon-latency.
  21. "Motion-to-photon latency". XinReality. https://xinreality.com/wiki/Motion-to-photon_latency.
  22. Carmack, J. (2013-02-22). "Latency Mitigation Strategies". https://danluu.com/latency-mitigation/.
  23. Jerald, J. (2009). "Scene-motion- and latency-perception thresholds for head-mounted displays". University of North Carolina. https://www.semanticscholar.org/paper/Scene-motion-and-latency-perception-thresholds-for-Jerald/42a432706ce172498080b7d879bd6a6059bdcaa5.
  24. "John Carmack Talks Virtual Reality Latency Mitigation Strategies". Road to VR. https://www.roadtovr.com/john-carmack-talks-virtual-reality-latency-mitigation-strategies/.
  25. "The Greatest Engineering Challenge to Improve Mobile Augmented Reality Headsets". Engineering.com. https://www.engineering.com/the-greatest-engineering-challenge-to-improve-mobile-augmented-reality-headsets/.
  26. Adelstein, B. D. et al. (2016). "How Latency and Spatial Matching of Visual and Auditory Information Affects Presence in a Training Simulation". Frontiers in ICT, 3, 34. doi:10.3389/fict.2016.00034. https://www.frontiersin.org/journals/ict/articles/10.3389/fict.2016.00034/full.
  27. Fink, H. "Optimizing Motion to Photon Latency". DAQRI. https://xdc2018.x.org/slides/Heinrick_Fink_daqri_optimizing_motion_to_photon_latency.pdf.
  28. Mania, K.; Adelstein, B. D.; Ellis, S. R.; Hill, M. I. (2004). "Perceptual sensitivity to head tracking latency in virtual environments with varying degrees of scene complexity". ACM Applied Perception. https://dl.acm.org/doi/10.1145/1012551.1012559.
  29. "Michael Abrash: Latency is the sine qua non of AR and VR". Reddit. https://www.reddit.com/r/programming/comments/15rj38/michael_abrash_latency_is_the_sine_qua_non_of_ar/.
  30. Tsai, Y.-J.; Wang, Y.-X.; Ouhyoung, M. (2017). "Affordable System for Measuring Motion-to-Photon Latency of Virtual Reality in Mobile Device". SIGGRAPH '17: ACM SIGGRAPH 2017 Posters. https://liagm.github.io/pdf/affordable.pdf.
  31. Pape, S.; Strothmann, T.; Wienrich, C.; Latoschik, M. E. (2017). "Concept of a Low-Cost Device for the Automated Measurement of Motion-to-Photon Latency in Virtual Reality Environments". 2017 IEEE Symposium on 3D User Interfaces (3DUI). https://vr.rwth-aachen.de/media/papers/185/Pape_SEARIS_Calibratio.pdf.
  32. "Measuring Head-Mounted Displays (HMD) Motion-to-Photon (MTP) Latency". OptoFidelity. https://www.optofidelity.com/insights/blogs/measuring-head-mounted-displays-hmd-motion-to-photon-mtp-latency.
  33. OptoFidelity. "Comparing VR headsets' tracking performance". https://www.optofidelity.com/blog/comparing-vr-headsets-tracking-performance.
  34. Steed, A.; Frisoli, A.; Giachritsis, C.; Krog, S.; Padrao, D.; Pan, Y. (2020). "On the Plausibility of a Pass-Through Video See-Through Display". 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). https://www.microsoft.com/en-us/research/wp-content/uploads/2020/01/ieee_vr_2020___latency.pdf.
  35. IEEE Standards Association. "IEEE 3079.1 - Motion to Photon (MTP) Latency in Virtual Environments". https://standards.ieee.org/ieee/3079.1/10383/.
  36. "Predictive tracking". XinReality. https://xinreality.com/wiki/Predictive_tracking.
  37. "Understanding Predictive Tracking and Why It's Important for AR/VR Headsets". Road to VR. https://www.roadtovr.com/understanding-predictive-tracking-important-arvr-headsets/.
  38. Road to VR. "Understanding Predictive Tracking and Why It's Important for AR/VR Headsets". https://www.roadtovr.com/understanding-predictive-tracking-important-arvr-headsets/.
  39. "Asynchronous Time Warp in layman's terms?". Reddit. https://www.reddit.com/r/oculus/comments/4bycfx/asynchronous_time_warp_in_laymans_terms/.
  40. "Asynchronous Spacewarp". Meta. https://developers.meta.com/horizon/blog/asynchronous-spacewarp/.
  41. "Asynchronous Spacewarp". XinReality. https://xinreality.com/wiki/Asynchronous_Spacewarp.
  42. Oculus. "Asynchronous Spacewarp". https://developer.oculus.com/blog/asynchronous-spacewarp/.
  43. "Asynchronous Reprojection Explained (AKA Ghosting, Stuttering, etc.)". DCS World Forums. https://forum.dcs.world/topic/244962-asynchronous-reprojection-explained-aka-ghosting-stuttering-etc/.
  44. "Asynchronous reprojection". Wikipedia. https://en.wikipedia.org/wiki/Asynchronous_reprojection.
  45. arXiv preprint (citation details not provided in the source).
  46. "Optimizing VR Graphics With Late Latching". Meta. https://developers.meta.com/horizon/blog/optimizing-vr-graphics-with-late-latching/.
  47. UploadVR. "Latest Steam VR Update Brings Direct Mode To Vive Pre". https://www.uploadvr.com/latest-steam-vr-update-brings-direct-mode-to-vive-pre/.
  48. "How Does a VR LCD Display Improve Immersion?". Panox Display. https://blog.panoxdisplay.com/how-does-a-vr-lcd-display-improve-immersion/.
  49. "Is 90hz a fast enough refresh rate that low persistence has been scrapped?". Reddit. https://www.reddit.com/r/oculus/comments/2xoscc/is_90hz_a_fast_enough_refresh_rate_that_low/.
  50. "Virtual Reality: Blatant latency and how to avoid it". Arm Community Blog. https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/virtual-reality-blatant-latency-and-how-to-avoid-it.
  51. Blur Busters. "Why Do Some OLEDs Have Motion Blur?". https://blurbusters.com/faq/oled-motion-blur/.
  52. Carmack, J. "Latency mitigation strategies". https://danluu.com/latency-mitigation/.
  53. Warburton, M.; Mon-Williams, M.; Mushtaq, F.; Morehead, J. R. (2022). "Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems". Behavior Research Methods. https://link.springer.com/article/10.3758/s13428-022-01983-5.
