{{about|the delay between user movement and the corresponding display update in virtual and augmented reality|other uses|Latency}}
{{infobox
| title = Motion-to-Photon Latency
| image = Motion-to-photon_latency_pipeline.svg
| caption = Simplified diagram of the motion-to-photon pipeline. Each stage adds delay, and mitigation techniques like [[predictive tracking]] and [[reprojection]] are used to reduce the final perceived latency.
| label1 = Also known as
| data1 = End-to-end latency, MTP latency
| label2 = Field
| data2 = [[Virtual reality]], [[Augmented reality]], [[Human-computer interaction]]
| label3 = Key metric for
| data3 = [[Presence (virtual reality)|Presence]], [[Cybersickness]]
| label4 = Target (VR)
| data4 = < 20 [[millisecond]]s
| label5 = Target (AR)
| data5 = < 7 [[millisecond]]s
| label6 = Key figures
| data6 = [[John Carmack]], [[Michael Abrash]]
}}


'''Motion-to-photon latency''' ('''MTP latency'''), also known as '''end-to-end latency''', is a critical performance metric in [[virtual reality]] (VR) and [[augmented reality]] (AR) systems. It measures the total time delay from the moment a user initiates a physical movement (e.g., turning their head or moving a controller) to the moment the first [[photon]]s of light reflecting that movement are emitted from the system's display and reach the user's eyes.<ref name="UnityGlossary" /><ref name="Warburton2022" />


Minimizing this latency is paramount for creating a convincing and comfortable immersive experience. High latency creates a perceptible lag between a user's actions and the system's visual feedback. In VR, where users are fully immersed, this breaks the sense of [[presence (virtual reality)|presence]] and is a primary cause of [[cybersickness]], a form of motion sickness with symptoms including nausea, disorientation, and eye strain.<ref name="Stauffert2020" /><ref name="Stanney2020" /> In AR and [[mixed reality]] (MR), where virtual elements are overlaid onto the real world, the challenge is even greater, as even small delays can cause noticeable misregistration, making virtual objects appear to "swim" or float disconnectedly from their real-world anchors.<ref name="daqri" />

The industry-accepted target for high-quality VR is an MTP latency of under 20 [[millisecond]]s (ms), a threshold below which the delay is generally imperceptible.<ref name="CarmackLatency" /> For AR, requirements are far stricter, often demanding latencies below 7 ms.<ref name="Abrash_GI" /> Thanks to advances in hardware and software, MTP latency has dropped from 50–60 ms in early VR [[head-mounted display|headsets]] to as low as 2–3 ms in modern systems, transforming the technology's comfort and viability.<ref name="optofidelity2024" />
== The Motion-to-Photon Pipeline ==
MTP latency is not a single delay but the sum of many small delays that occur sequentially in a complex hardware and software pipeline. This pipeline begins with physical motion and ends with light being emitted from the [[head-mounted display]] (HMD).<ref name="Warburton2022_bioRxiv" /><ref name="So2021" /> Understanding each stage is crucial for identifying sources of latency and developing effective mitigation strategies.
=== Pipeline Stages ===
# '''Motion Detection & Sensor Sampling''': Physical motion is captured by sensors, primarily an [[Inertial Measurement Unit]] (IMU) containing [[accelerometer]]s and [[gyroscope]]s, often supplemented by optical tracking systems (e.g., cameras). The rate at which these sensors are sampled introduces the first delay. IMUs operate at very high frequencies (e.g., 500-1000 Hz), making their individual sample latency very low (~1 ms).<ref name="DAQRI_Pipeline" /><ref name="medium2017" /> However, camera-based tracking systems are much slower (typically 30-90 Hz), which can introduce a significant initial delay of 15-33 ms.<ref name="DAQRI_Pipeline" />
# '''Data Transmission & Processing''': The raw sensor data is transmitted from the HMD or controller to the host computer (e.g., via [[USB-C]] or wirelessly). This data is then processed by a tracking system, which fuses inputs from multiple sensors (e.g., using [[visual-inertial odometry]]) to calculate the device's precise position and orientation in [[six degrees of freedom]] (6DoF).<ref name="Warburton2022" />
# '''Application & Game Engine Processing''': The host [[Central processing unit|CPU]] receives the pose data. The application or game engine then executes its logic for the upcoming frame based on this new pose. This includes tasks like physics simulations, [[artificial intelligence|AI]] calculations, and processing user inputs.<ref name="UnityGlossary" /><ref name="Meta_Latency" />
# '''Render Pipeline Execution''': The CPU sends drawing commands to the [[Graphics Processing Unit|GPU]]. The GPU then renders the 3D scene from the perspective of the new pose into an off-screen image buffer. This is often the most time-consuming part of the pipeline. To improve performance on traditional benchmarks, graphics drivers may aggressively buffer these commands, which can add one or more full frames of latency.<ref name="CarmackLatency" />
# '''Compositing and Post-Processing''': The rendered image undergoes final processing steps. This includes applying lens [[distortion correction]] to compensate for the HMD's optics, [[chromatic aberration]] correction, and executing latency-mitigation techniques like [[reprojection]].<ref name="UnityGlossary" />
# '''Display Refresh & Scan-out''': The final image is sent to the HMD's display. Further delay is introduced by the display's [[refresh rate]] (e.g., a 90 Hz display introduces a potential wait of up to 11.1 ms) and the physical time it takes for the display to "scan out" the image, typically from top to bottom. On many displays, the bottom of the screen is updated several milliseconds after the top.<ref name="CarmackLatency" /><ref name="DAQRI_Pipeline" /> Finally, the display's pixel response time—the time it takes for a pixel to change from one color to another—adds a final small delay. [[OLED]] panels, for example, can switch pixels in under 1 ms.<ref name="oledinfo2014" />
=== Latency Budget Allocation ===
To achieve the sub-20 ms target, every millisecond in the pipeline is critical. The following table provides an example breakdown of a typical latency budget, illustrating how quickly delays can accumulate.
{| class="wikitable"
|+ Example Motion-to-Photon Latency Budget Breakdown
! Stage
! Description
! Estimated Latency Contribution (ms)
|-
| 1. Sensor Sampling
| Time for the IMU or camera to capture physical motion. IMUs are fast (~1000 Hz), while cameras are slower (30-90 Hz).
| 1–2 ms (IMU); 15–33 ms (Camera)<ref name="DAQRI_Pipeline"/>
|-
| 2. Data Transmission & Processing
| Delay in sending sensor data to the host CPU and for the tracking system to compute the 6DoF pose.
| 1–4 ms<ref name="Leap_Tracking_Reddit"/>
|-
| 3. Application / CPU Processing
| Time for the game engine/application to process the new pose and prepare the scene for the next frame.
| Variable (typically aims for < 5–10 ms)<ref name="Meta_Latency"/>
|-
| 4. GPU Render Queue
| Time the frame waits in the GPU driver's command buffer before rendering begins. Aggressive buffering can increase this delay.
| 1–2 frames (can be >16 ms)<ref name="CarmackLatency"/>
|-
| 5. GPU Rendering
| Time for the GPU to render the scene. This is dependent on scene complexity and GPU power. For a 90 Hz target, this must be under 11.1 ms.
| < 11.1 ms (at 90 Hz)<ref name="Meta_Latency"/>
|-
| 6. Compositor & Timewarp
| Time for the system compositor to apply distortion correction and run latency-hiding techniques like Asynchronous Timewarp.
| 1–2 ms<ref name="Meta_ATW_Examined"/>
|-
| 7. Display Scan-out & Response Time
| Time for the display to physically draw the frame (scan-out) and for pixels to change color (response time). OLEDs are much faster than LCDs.
| 1–16.7 ms (scan-out at 60 Hz) + 1–8 ms (pixel response)<ref name="CarmackLatency"/><ref name="OLED_vs_LCD_Reddit"/>
|-
! Total (Example)
| '''Sum of all stages. Without mitigation, this can easily exceed 50–100 ms.'''
| '''> 50 ms'''
|}
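The arithmetic behind such a budget is simple addition. The following sketch (illustrative Python; the figures are the table's worst-case values for a camera-tracked system, not measurements of any particular headset) shows how quickly the unmitigated total accumulates:

<syntaxhighlight lang="python">
# Sum of the example budget above, taking the table's worst-case values
# for a camera-tracked system without mitigation. Illustrative only.
budget_ms = {
    "sensor sampling (30 Hz camera)": 33.0,
    "transmission + pose computation": 4.0,
    "application / CPU processing": 10.0,
    "GPU render queue (~1 frame)": 16.0,
    "GPU rendering (90 Hz budget)": 11.1,
    "compositor / timewarp": 2.0,
    "display scan-out (60 Hz)": 16.7,
    "pixel response (slow LCD)": 8.0,
}

total = sum(budget_ms.values())
print(f"Unmitigated worst case: {total:.1f} ms")  # ~100.8 ms
</syntaxhighlight>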
== Latency Thresholds and Human Perception ==
The acceptable limit for MTP latency is not arbitrary; it is dictated by the thresholds of human perception. Extensive research and development have established clear targets that differ significantly between virtual and augmented reality.
{| class="wikitable" style="text-align:center; margin: 1em auto 1em auto;"
|+ Approximate Latency Thresholds for User Comfort
! Display Mode !! Approximate Latency Threshold (for comfort)
|-
| Virtual Reality (VR) || ~20 ms or less<ref name="Varjo2025" />
|-
| Augmented Reality (AR, optical see-through) || ~5–10 ms<ref name="Wagner2018" />
|-
| Mixed Reality (MR, video see-through) || ~20 ms<ref name="Varjo2025" />
|}
=== The "20 Millisecond Rule" for Virtual Reality ===
For VR, a consensus has formed around the '''20 millisecond rule''': to be generally imperceptible and provide a comfortable, high-quality experience, the end-to-end MTP latency should remain below 20 ms.<ref name="UnityGlossary" /><ref name="SpatialStudio" /><ref name="XinReality_MTP" /><ref name="CarmackPaper" /> This figure was largely established and popularized through the foundational work of John Carmack at Oculus.<ref name="CarmackLatency" /> Studies and user feedback indicate that latencies below this threshold are not consciously detected by most people, allowing for a convincing sense of presence.<ref name="SpatialStudio" /> Research by Jason Jerald in 2009 found that trained observers could detect latency differences as small as 3.2 ms.<ref name="jerald2009" /> As latency creeps above 20 ms, the lag becomes noticeable, and once it exceeds 50-60 ms, the risk of discomfort and cybersickness increases dramatically.<ref name="SpatialStudio" /><ref name="CarmackRoadToVR" />
=== Stricter Requirements for Augmented Reality ===
Augmented Reality (AR) systems, particularly [[optical see-through]] (OST) devices like the [[Microsoft HoloLens]], face a far more demanding latency challenge.<ref name="DAQRI_Mobile_AR" /> In an OST system, the user views the real world directly through transparent optics, which provides an instantaneous, zero-latency visual reference. Any delay in rendering the virtual overlay causes it to lag behind the stable real world during head movements, an effect often described as "swimming" or misregistration.<ref name="Ellis2016" />
Because the human visual system is extremely sensitive to this relative motion, the acceptable MTP latency for AR is much lower than for VR. Industry experts and researchers, including work by Ellis and Adelstein at NASA, cite a target of '''under 7 ms''', and ideally '''under 5 ms''', for AR overlays to appear convincingly locked to the real world.<ref name="Abrash_GI" /><ref name="DAQRI_Slides" /><ref name="DAQRI_Pipeline" /><ref name="mania2004" /> This makes latency a significantly greater engineering hurdle for AR than for VR.<ref name="DAQRI_Mobile_AR" />
=== Pioneering Perspectives: John Carmack and Michael Abrash ===
The modern focus on minimizing MTP latency was championed by key industry figures.
*'''John Carmack''' identified latency as "one of the most critical factors" for VR while at Oculus. His extensive writings and talks educated the developer community on the sources of latency throughout the PC graphics pipeline—from sensor sampling to driver buffering and display scan-out—and he pioneered techniques like [[reprojection]] to mitigate it.<ref name="CarmackLatency" /><ref name="CarmackPaper" />
*'''[[Michael Abrash]]''', a leading researcher at [[Valve]] and later [[Meta Platforms|Meta]], famously described latency as the "sine qua non"—the essential, indispensable ingredient—of VR and AR.<ref name="Abrash_Reddit" /> He argued that while 20 ms was a viable target for VR, the true requirement for believable AR could be as low as 15 ms or even 7 ms.<ref name="Abrash_GI" /> In 2013, he noted that typical systems were achieving around 36 ms, a figure he described as "a long way from 20 ms, and light-years away from 7 ms," highlighting the scale of the challenge ahead.<ref name="Abrash_GI" />
== Measurement Techniques ==
Accurately quantifying MTP latency is a complex task. Simply instrumenting the software to report its own timing is often unreliable, as the measurement process itself can introduce additional latency and alter the system's behavior.<ref name="CarmackLatency" /> Consequently, the standard for precise measurement relies on external hardware-based systems that can observe both the physical motion and the resulting photonic output without interfering with the device under test.
=== Hardware-Based Measurement Systems ===
Several methods have been developed to provide objective, high-precision latency measurements.
==== High-Speed Camera and Motion Rig Analysis ====
A widely used and robust method involves a high-speed camera to simultaneously film a physical motion and the corresponding update on the HMD's display.<ref name="CarmackLatency" /><ref name="Warburton2022" /><ref name="Tsai2017" /> The process typically involves:
# '''Controlled Motion''': The HMD or controller is mounted on a precisely controlled robotic motion rig, such as a rotary platform, a linear rail system, or a pendulum.<ref name="So2021" /><ref name="Pape2017" />
# '''High-Speed Recording''': A camera recording at a high frame rate (e.g., 240 fps, 960 fps, or higher) captures both the physical object and the HMD screen in the same shot.<ref name="Tsai2017" /><ref name="OptoFidelity" /> Commercial tools like OptoFidelity's BUDDY system represent the state-of-the-art for this method.<ref name="optofidelity2023" />
# '''Analysis''': The recorded video is analyzed frame-by-frame. The latency is calculated by counting the number of frames between the initiation of physical motion and the first visible change on the display.<ref name="CarmackLatency" /><ref name="Tsai2017" />
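The frame-counting arithmetic is straightforward, and its resolution is bounded by the camera's frame interval. A minimal sketch (the function and example values are illustrative, not drawn from any cited toolchain):

<syntaxhighlight lang="python">
def mtp_latency_ms(motion_frame: int, photon_frame: int, camera_fps: float) -> float:
    """Latency from frame indices in a high-speed recording.

    motion_frame: first frame where physical motion is visible.
    photon_frame: first frame where the display reflects that motion.
    """
    return (photon_frame - motion_frame) * 1000.0 / camera_fps

# Example: motion begins at frame 120, the display responds at frame 138,
# recorded at 960 fps -> 18 frames * ~1.04 ms/frame = ~18.8 ms.
print(f"{mtp_latency_ms(120, 138, 960):.1f} ms")
</syntaxhighlight>

Note that a 240 fps recording can only resolve latency in steps of roughly 4.2 ms, which is why higher camera frame rates are preferred.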
==== Photodiode and Oscilloscope Methods ====
For even higher temporal precision, a method using a [[photodiode]] (a high-speed light sensor) and an [[oscilloscope]] can be employed.<ref name="Pape2017" /><ref name="Steed2020" />
# A photodiode is physically attached to a specific area of the HMD screen.
# A physical event (e.g., a pendulum swinging past a sensor) triggers the start of a timer on an oscilloscope.<ref name="Pape2017" /><ref name="Steed2020" />
# The VR application is programmed to change the color of the screen area under the photodiode (e.g., from black to white) as soon as it virtually registers that same event.
# The photodiode detects the change in light and sends a signal to the oscilloscope, stopping the timer. The elapsed time displayed on the oscilloscope is a direct measurement of the MTP latency.<ref name="Pape2017" />
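In post-processing, the latency is simply the time between the trigger edge and the photodiode's light-change edge. The sketch below illustrates that computation on two digitized channels (the signals here are synthetic stand-ins; a real setup would export the oscilloscope captures):

<syntaxhighlight lang="python">
import numpy as np

SAMPLE_RATE = 1_000_000  # 1 MS/s capture, assumed

def first_edge_s(samples: np.ndarray, threshold: float = 0.5) -> float:
    """Time (s) of the first sample crossing the threshold (rising edge)."""
    return int(np.argmax(samples > threshold)) / SAMPLE_RATE

# Synthetic stand-ins for two oscilloscope channels: the motion trigger
# fires at t = 1.0 ms and the photodiode sees the screen change at t = 19.5 ms.
t = np.arange(int(0.05 * SAMPLE_RATE))               # 50 ms of samples
trigger = (t >= 0.0010 * SAMPLE_RATE).astype(float)
photodiode = (t >= 0.0195 * SAMPLE_RATE).astype(float)

latency_ms = 1000.0 * (first_edge_s(photodiode) - first_edge_s(trigger))
print(f"MTP latency: {latency_ms:.2f} ms")           # 18.50 ms
</syntaxhighlight>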
=== Differentiating Initial vs. Continuous Latency ===
A crucial finding from modern latency measurement studies is that MTP latency is not a static value; it changes dynamically depending on the nature of the user's movement.<ref name="Warburton2022" /><ref name="Stauffert2020" />
*'''Initial Latency''' (or Unpredicted Latency): This is the latency measured at the very onset of a sudden, abrupt movement from a standstill. In this phase, latency is typically at its highest because latency-mitigation algorithms like [[predictive tracking]] have not yet collected enough motion data to build an accurate model of the movement's trajectory.<ref name="Warburton2022" /><ref name="Warburton2022_bioRxiv" /><ref name="OptoFidelity" />
*'''Continuous Latency''' (or Predicted Latency): This is the latency measured during a smooth, ongoing, and predictable movement (e.g., the middle of a steady head turn). During such movements, prediction algorithms have sufficient data to accurately extrapolate the future pose of the device, functionally reducing the perceived latency to a much lower value, in some cases to just a few milliseconds.<ref name="Warburton2022" /><ref name="Warburton2022_bioRxiv" />
=== Industry Standards ===
The importance of latency is reflected in formal standards. '''[[IEEE]] Standard 3079.1''' was published to specify requirements and measurement procedures for "Motion to Photon (MTP) Latency in Virtual Environments."<ref name="ieee2021" /> Platform manufacturers also maintain strict internal targets (e.g., Sony: <18 ms for PSVR; Meta: 13 ms at 90 Hz).<ref name="ieee2021" />
== Latency Mitigation Strategies ==
Given the physical and computational constraints that make zero latency impossible, VR and AR system designers have developed a sophisticated suite of software and hardware techniques to reduce or perceptually hide MTP latency.
=== Software and Algorithmic Solutions ===
==== Predictive Tracking ====
[[Predictive tracking]] is a foundational and universally employed technique to combat latency.<ref name="XinReality_Predictive" /> Instead of rendering the virtual scene based on where the user's head ''is'', the system predicts where it ''will be'' at the future time when the frame is actually displayed.<ref name="RoadToVR_Predictive" />
*'''Mechanism''': The tracking system analyzes the current motion's velocity and acceleration to extrapolate a future pose. The [[game engine]] then uses this predicted pose to render the scene.
*'''Algorithms''': These range from simple [[dead reckoning]] to more advanced models like [[Kalman filter]]s, which are considered the gold standard for fusing noisy sensor data and predicting future states.<ref name="RoadToVR_Predictive" /><ref name="roadtovr2017" />
*'''Limitations''': Prediction is most effective for smooth movements and less accurate for sudden, unpredictable changes in direction, which is why initial latency is higher than continuous latency.<ref name="Warburton2022" />
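To make the extrapolation step concrete, here is a minimal dead-reckoning sketch (illustrative only; production trackers predict full 3-DoF orientation with quaternions and Kalman-style filters rather than a single yaw angle):

<syntaxhighlight lang="python">
def predict_yaw_deg(yaw: float, yaw_rate: float, yaw_accel: float, horizon_s: float) -> float:
    """Constant-acceleration extrapolation of head yaw over the prediction horizon."""
    return yaw + yaw_rate * horizon_s + 0.5 * yaw_accel * horizon_s**2

# Head at 30 deg, turning at 200 deg/s and accelerating at 50 deg/s^2;
# the frame being rendered will reach the display ~15 ms from now.
print(f"{predict_yaw_deg(30.0, 200.0, 50.0, 0.015):.2f} deg")  # ~33.01 deg
</syntaxhighlight>

The prediction horizon is set to the expected remaining pipeline latency, which is why accurate latency measurement and effective prediction are tightly coupled.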
==== Asynchronous Timewarp (ATW) ====
[[Timewarp|Asynchronous Timewarp]] (ATW) is a powerful [[reprojection]] technique that decouples the final presentation of an image from the application's main render loop.<ref name="Meta_ATW_Examined" /><ref name="ATW_Reddit" />
*'''Mechanism''': ATW runs on a separate, high-priority thread. Just before the display refreshes (an event known as [[vsync]]), ATW takes the most recently completed frame and "warps" it based on the very latest head-tracking data. This warped image is what gets sent to the display.<ref name="Meta_ATW_Examined" /> This process is computationally cheap because basic ATW only corrects for '''rotational''' head movement.
*'''Benefit''': If the main application is slow and fails to render a new frame in time, ATW can still take the ''previous'' frame and re-project it with the new head rotation. This prevents the jarring "world judder" that would otherwise occur, making the experience perceptually more comfortable.<ref name="Meta_Latency" /><ref name="Meta_ATW_Examined" />
*'''Limitations''': Because standard ATW does not account for '''translational''' (positional) movement, it can introduce an artifact known as '''positional judder''', where near-field objects appear to stutter when the user moves their head side-to-side.<ref name="Meta_ATW_Examined" />
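The core idea can be conveyed with a deliberately crude sketch: for small yaw deltas, rotational reprojection behaves roughly like a horizontal image shift of ''delta/FOV × width'' pixels. This toy model is only illustrative; real ATW performs a full per-pixel 3D rotational reprojection on the GPU:

<syntaxhighlight lang="python">
import numpy as np

def toy_timewarp(frame: np.ndarray, yaw_delta_deg: float, fov_deg: float = 90.0) -> np.ndarray:
    """Approximate rotational reprojection as a horizontal shift (toy model)."""
    height, width = frame.shape[:2]
    shift_px = int(round(yaw_delta_deg / fov_deg * width))
    return np.roll(frame, -shift_px, axis=1)  # scene shifts opposite the head turn

# The head rotated a further 1.5 degrees between render start and vsync:
rendered = np.zeros((1200, 1080, 3), dtype=np.uint8)  # one eye's buffer
warped = toy_timewarp(rendered, yaw_delta_deg=1.5)
</syntaxhighlight>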
==== Asynchronous Spacewarp (ASW) and Reprojection ====
[[Asynchronous Spacewarp]] (ASW) is an evolution of ATW designed to address the limitations of rotational-only reprojection and to handle cases where the application cannot consistently meet the target frame rate.<ref name="Meta_ASW" />
*'''Mechanism''': When an application's performance drops, the system can intentionally lock its render rate to half (e.g., 45 Hz). ASW then generates a synthetic, extrapolated frame between each real frame, bringing the frame rate back up to the display's native rate (e.g., 90 Hz).<ref name="XinReality_ASW" />
*'''Difference from ATW''': ASW analyzes the motion between the last two rendered frames to create motion vectors for objects in the scene. This allows it to handle not just head rotation, but also user translation, controller movement, and animations. Modern versions like ASW 2.0 incorporate the [[depth buffer]] to reduce visual artifacts.<ref name="Meta_ASW" /><ref name="oculus2016asw" />
*'''Limitations''': ASW is a form of motion estimation and can produce visual artifacts like object warping, ghosting, or shimmering, especially around the edges of moving objects or in areas of disocclusion.<ref name="DCS_Reprojection" />
*'''Related Techniques''': Other platforms have similar technologies, such as Valve's [[Motion smoothing|Motion Smoothing]].<ref name="DCS_Reprojection" /><ref name="Wiki_ASW" /> More advanced research includes ML-based techniques like PredATW.<ref name="arxiv" />
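A rough sketch of the frame-extrapolation idea follows (an assumed simplification: a backward warp along per-pixel motion vectors; real ASW also uses depth, handles disocclusions, and composites controller and UI layers separately):

<syntaxhighlight lang="python">
import numpy as np

def extrapolate_frame(curr: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """Synthesize a frame half a step ahead of `curr`.

    curr: (H, W, 3) last real frame.
    flow: (H, W, 2) per-pixel motion (x, y) between the last two real frames.
    """
    h, w = curr.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Backward warp: each synthetic pixel samples half a motion step back.
    src_x = np.clip(xs - 0.5 * flow[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - 0.5 * flow[..., 1], 0, h - 1).astype(int)
    return curr[src_y, src_x]

# Uniform rightward motion of 4 px/frame continues by 2 px in the synthetic frame.
frame = np.random.randint(0, 255, (8, 8, 3), dtype=np.uint8)
flow = np.zeros((8, 8, 2)); flow[..., 0] = 4.0
synthetic = extrapolate_frame(frame, flow)
</syntaxhighlight>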
==== System-Level Optimizations ====
*'''[[Late Latching]]''': A strategy that delays the sampling of the head and controller pose to the last possible moment before the GPU begins rendering.<ref name="Meta_Late_Latching" /> This ensures the pose data is as fresh as possible.
*'''Direct Mode Rendering''': A driver-level feature that allows the VR runtime to write directly to the headset display, bypassing the operating system's desktop compositor, which could otherwise add one or more frames of latency.<ref name="uploadvr2016" />
=== Hardware and Display Technologies ===
==== High Refresh Rates ====
A fundamental way to reduce latency is to use a display with a high refresh rate. Increasing the refresh rate shortens the time interval between updates (e.g., 60 Hz = 16.7 ms/frame, 90 Hz = 11.1 ms/frame, 120 Hz = 8.3 ms/frame), directly reducing the maximum "display wait" portion of the latency budget.<ref name="SpatialStudio" /><ref name="Panox" />
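The relationship is simple arithmetic:

<syntaxhighlight lang="python">
# Frame interval (the maximum "display wait") as a function of refresh rate.
for hz in (60, 72, 90, 120, 144):
    print(f"{hz:3d} Hz -> {1000 / hz:5.1f} ms per frame")
# 60 Hz -> 16.7 ms, 72 Hz -> 13.9 ms, 90 Hz -> 11.1 ms,
# 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
</syntaxhighlight>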
==== Low-Persistence Displays: OLED vs. LCD ====
A critical factor in reducing motion blur, which is perceptually linked to latency, is display '''persistence'''. Persistence is the length of time a pixel remains illuminated during a single frame.<ref name="LowPersistence_Reddit" />
*'''Full Persistence''': On a traditional display, a frame is shown for the entire refresh duration (e.g., 16.7 ms at 60 Hz). As the user's eye moves, the static image is smeared across the retina, creating significant motion blur.<ref name="ARM_VR" />
*'''Low Persistence''': Modern VR displays use a low-persistence technique. They illuminate the pixels for only a very brief portion of the frame time (e.g., 1-3 ms, or ~20% of the cycle) and keep the screen dark for the remainder. This strobing is too fast to be perceived as flicker but ensures the eye sees a sharp, distinct image, similar to the behavior of a [[CRT]] display. This dramatically reduces motion blur.<ref name="LowPersistence_Reddit" /><ref name="blurbusters2018" />
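The illumination window implied by a given duty cycle follows directly from the frame time; the 18% duty cycle below is an assumed value within the ~20% range cited above:

<syntaxhighlight lang="python">
refresh_hz = 90
frame_ms = 1000 / refresh_hz   # 11.1 ms per frame at 90 Hz
duty_cycle = 0.18              # assumed low-persistence illumination fraction
lit_ms = frame_ms * duty_cycle
print(f"Pixels lit for {lit_ms:.1f} ms of each {frame_ms:.1f} ms frame")
# -> ~2.0 ms lit, dark for the remaining ~9.1 ms
</syntaxhighlight>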
The choice of display technology impacts the ability to achieve low persistence:
*'''OLED''': [[Organic light-emitting diode]] displays are ideal, as each pixel is its own light source and can be turned on and off almost instantaneously, allowing for precise control over illumination time.<ref name="OLED_vs_LCD_Reddit" />
*'''LCD''': [[Liquid-crystal display]] panels traditionally have slower pixel response times. However, modern "fast LCD" panels have dramatically improved response times and are now widely used in many high-performance VR headsets, offering benefits like higher subpixel density, which reduces the [[screen-door effect]].<ref name="Panox" />
==== Global Shutter Cameras ====
For optical tracking, [[global shutter]] cameras are preferred over [[rolling shutter]] cameras. A global shutter captures the entire image at the exact same instant, providing a clear, un-distorted tracking reference. A rolling shutter captures the image line-by-line, which can cause wobble or "jello" artifacts during fast motion, complicating the tracking calculations.<ref name="carmack2013" />
== Application and Comparative Analysis ==
=== Latency in Commercial VR Headsets ===
Empirical studies using high-speed camera systems have measured the MTP latency of various consumer VR headsets. These studies consistently validate the importance of distinguishing between initial latency at the onset of a sudden movement and the much lower continuous latency achieved once prediction algorithms are active.
{| class="wikitable"
|+ Comparative MTP Latency in Commercial VR Headsets
! Headset !! Year !! Motion-to-Photon !! Refresh Rate !! Resolution per Eye !! Display Type !! Notable Features
|-
! colspan="7" | Historical Systems
|-
| [[Oculus Rift DK1]] || 2013 || 50-60ms || 60Hz || 640×800 || LCD || First consumer dev kit
|-
| [[Oculus Rift DK2]] || 2014 || ~30ms || 75Hz || 960×1080 || OLED || Low-persistence breakthrough
|-
| [[Oculus Rift CV1]] || 2016 || ~22ms (2ms predicted) || 90Hz || 1080×1200 || OLED || Established 90Hz standard
|-
| [[HTC Vive]] || 2016 || 21-42ms (2-13ms predicted)<ref name="warburton2022" /> || 90Hz || 1080×1200 || OLED || [[Lighthouse (tracking system)|Lighthouse tracking]]
|-
! colspan="7" | Current Generation
|-
| [[Meta Quest 3]] || 2023 || '''2-3ms'''<ref name="optofidelity2024"/> || 90-120Hz || 2064×2208 || LCD || Best-in-class latency
|-
| [[Meta Quest Pro]] || 2022 || '''2-3ms'''<ref name="optofidelity2024"/> || 90Hz || 1800×1920 || LCD + local dimming || [[Mixed reality]]
|-
| [[Valve Index]] || 2019 || 21-42ms (2-13ms predicted)<ref name="warburton2022" /> || 80-144Hz || 1440×1600 || LCD || 130° [[Field of view|FOV]]
|-
| [[Apple Vision Pro]] || 2024 || '''~3ms angular, 11ms see-through'''<ref name="optofidelity2024"/> || 90Hz || 3660×3200 || [[micro-OLED]] || [[Apple R1|R1 chip]]
|-
| [[PlayStation VR2]] || 2023 || Not published || 90-120Hz || 2000×2040 || OLED HDR || [[Eye tracking]]
|}
=== The Augmented Reality Challenge ===
As previously noted, AR's requirement for virtual objects to remain perfectly registered with the real world imposes much stricter latency constraints. The specific nature of this challenge depends on the type of AR display technology used.
==== Optical See-Through (OST) vs. Video See-Through (VST) Systems ====
*'''Optical See-Through (OST) AR''': In OST systems (e.g., [[Microsoft HoloLens]]), the user views reality directly through transparent displays. In this model, only the virtual content is subject to MTP latency. The primary engineering goal is to reduce the latency of this virtual content to an absolute minimum (ideally < 5 ms) to prevent it from "swimming" or lagging behind the perfectly stable, zero-latency real world.<ref name="Ellis2016" /><ref name="DAQRI_Mobile_AR" />
*'''Video See-Through (VST) AR''': In VST systems (used for the [[passthrough]] mode in VR headsets like the [[Meta Quest 3]] and [[Apple Vision Pro]]), the user views reality through a set of cameras that stream video to the internal, opaque displays. In this architecture, both the real-world video feed and the virtual overlays are subject to latency.<ref name="Ellis2016" /> This introduces a different engineering trade-off. While achieving an absolute MTP latency of < 5 ms for both streams is extremely difficult, a more achievable and perceptually effective goal is to '''match''' the latency of the virtual objects to the latency of the passthrough video. If both the real-world video and the virtual graphics are delayed by the same amount (e.g., 11 ms on Apple Vision Pro<ref name="optofidelity2024" />), the virtual objects will appear perfectly registered and stable relative to the video feed, even though the entire combined view lags slightly behind the user's physical head motion. This prioritizes registration quality over absolute temporal accuracy.
== Historical Development ==
The [[Oculus Rift DK1]] in March 2013 brought VR to developers but suffered from 50–60 ms of latency, which caused widespread nausea. John Carmack's arrival at Oculus in August 2013 marked a turning point, as he focused the team on a "relentless pursuit of low latency." His October 2013 presentation outlined the comprehensive mitigation strategies that would become industry standard.<ref name="carmack2013" />

The [[Oculus Rift DK2]] (2014) was the first major breakthrough, introducing a low-persistence OLED display and positional tracking, cutting latency to ~30 ms. The March 2016 consumer launches of the [[Oculus Rift CV1]] and [[HTC Vive]] established the modern baseline, with 90 Hz OLED displays achieving approximately 20–22 ms of baseline latency (before prediction).

The 2023–2024 generation of headsets, like the [[Meta Quest 3]] and [[Apple Vision Pro]], achieved effectively imperceptible angular latency. The Quest 3's measured 2–3 ms represents a roughly 95% reduction from the DK1's 50–60 ms in a single decade, one of VR's greatest technical achievements.<ref name="optofidelity2024" />
== Future Challenges ==
With angular MTP latency largely solved for local rendering, future challenges shift to other parts of the system:
* '''[[Cloud VR]]''': Streaming VR from a remote server reintroduces significant network latency (e.g., 20-40ms on [[5G]]) that must be hidden from the user, requiring advanced prediction and [[edge computing]].
* '''[[Eye tracking]] Latency''': For [[foveated rendering]] to work, the system must detect where the user is looking and render a high-resolution image in that spot before the eye moves again (a [[saccade]]). This requires an "eye-motion-to-photon" latency that is also exceptionally low.
* '''[[Varifocal displays]]''': Systems that can dynamically change focus must do so in sync with the user's eye movements, adding another latency-critical pipeline.
* '''Higher Resolutions''': Pushing to "retinal resolution" (e.g., 8K per eye) while maintaining high frame rates (90-120Hz) creates an immense rendering bottleneck that can reintroduce latency if not managed.
== See Also ==
* [[Asynchronous Spacewarp]]
* [[Asynchronous time warp]]
* [[Cybersickness]]
* [[Eye tracking]]
* [[Foveated rendering]]
* [[Head-mounted display]]
* [[Inertial measurement unit]]
* [[John Carmack]]
* [[Latency (engineering)]]
* [[Low-persistence displays]]
* [[Michael Abrash]]
* [[Persistence of vision]]
* [[Predictive tracking]]
* [[Presence (virtual reality)]]
* [[Timewarp]]
== References ==
<references>
<ref name="UnityGlossary">{{cite web |url=https://unity.com/glossary/motion-to-photon-latency |title=Motion-to-Photon Latency |publisher=Unity Technologies}}</ref>
<ref name="Warburton2022">{{cite journal |last1=Warburton |first1=Mark |last2=Raw |first2=Michael |last3=Read |first3=Jenny C. A. |last4=Wilkie |first4=Richard M. |title=Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems |journal=Behavior Research Methods |date=2022-10-10 |volume=55 |issue=7 |pages=3658–3678 |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10616216/ |pmid=36217006 |pmc=10616216 |doi=10.3758/s13428-022-01983-5}}</ref>
<ref name="Stauffert2020">{{cite journal |last1=Stauffert |first1=Jan-Philipp |last2=Niebling |first2=Florian |last3=Latoschik |first3=Marc Erich |title=Latency and Cybersickness: A Survey |journal=2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) |date=2020 |pages=586-603 |doi=10.1109/ISMAR50242.2020.00088 |url=https://doaj.org/article/2dfbbdc3fe704977bc34b594f96fdf8b}}</ref>
<ref name="CarmackLatency">{{cite web |url=https://www.pcgamesn.com/virtual-reality-john-carmacks-battle-20-millisecond-latency |title=John Carmack's battle with 20 millisecond latency |publisher=PCGamesN}}</ref>
<ref name="CarmackPaper">{{cite web |url=https://danluu.com/latency-mitigation/ |title=Latency Mitigation Strategies |author=John Carmack |date=2013-02-22}}</ref>
<ref name="CarmackRoadToVR">{{cite web |url=https://www.roadtovr.com/john-carmack-talks-virtual-reality-latency-mitigation-strategies/ |title=John Carmack Talks Virtual Reality Latency Mitigation Strategies |publisher=Road to VR}}</ref>
<ref name="Stanney2020">{{cite journal |last1=Stanney |first1=Kay |last2=Lawson |first2=Ben |last3=Rokers |first3=Bas |last4=Dennison |first4=Mark |last5=Blackwell |first5=Katherine |last6=Stoffregen |first6=Thomas |last7=Fuchs |first7=Henry |last8=Welch |first8=Gregory |title=Identifying Causes of and Solutions for Cybersickness in Immersive Technology: A Review |journal=Frontiers in Virtual Reality |date=2020-11-20 |volume=1 |pages=587698 |doi=10.3389/frvir.2020.587698 |url=https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2020.587698/full}}</ref>
<ref name="SpatialStudio">{{cite web |url=https://thespatialstudio.de/en/xr-glossary/motion-to-photon-latency |title=Motion-to-Photon Latency |publisher=The Spatial Studio}}</ref>
<ref name="Warburton2022_bioRxiv">{{cite web |url=https://www.biorxiv.org/content/10.1101/2022.06.24.497509v1.full-text |title=Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems |author=Warburton et al. |date=2022-06-27}}</ref>
<ref name="So2021">{{cite journal |last1=So |first1=Richard H. Y. |last2=Ho |first2=Adrian |last3=Lo |first3=W. T. |title=A Time-Sequential Measurement System for Motion-to-Photon Latency in a Virtual Reality Head-Mounted Display |journal=Electronics |date=2018-09-01 |volume=7 |issue=9 |pages=171 |doi=10.3390/electronics7090171 |url=https://www.mdpi.com/2079-9292/7/9/171}}</ref>
<ref name="DAQRI_Pipeline">{{cite web |url=https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926 |title=Motion-to-Photon Latency in Mobile AR and VR |publisher=DAQRI}}</ref>
<ref name="Meta_Latency">{{cite web |url=https://developers.meta.com/horizon/blog/understanding-gameplay-latency-for-oculus-quest-oculus-go-and-gear-vr/ |title=Understanding Gameplay Latency for Oculus Quest, Oculus Go, and Gear VR |publisher=Meta}}</ref>
<ref name="OLED_vs_LCD_Reddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/212imk/oleds_are_substantially_inferior_to_rgb_lcds_in/ |title=OLEDs are substantially inferior to RGB LCDs in terms of both sharpness... and due to oversaturation}}</ref>
<ref name="Leap_Tracking_Reddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/2s0eu3/question_optimal_latency_of_perceived_hand/ |title=Question: Optimal latency of perceived hand tracking/viewing in VR?}}</ref>
<ref name="Meta_ATW_Examined">{{cite web |url=https://developers.meta.com/horizon/blog/asynchronous-timewarp-examined/ |title=Asynchronous Timewarp Examined |publisher=Meta}}</ref>
<ref name="XinReality_MTP">{{cite web |url=https://xinreality.com/wiki/Motion-to-photon_latency |title=Motion-to-photon latency |publisher=XinReality}}</ref>
<ref name="DAQRI_Mobile_AR">{{cite web |url=https://www.engineering.com/the-greatest-engineering-challenge-to-improve-mobile-augmented-reality-headsets/ |title=The Greatest Engineering Challenge to Improve Mobile Augmented Reality Headsets |publisher=Engineering.com}}</ref>
<ref name="Ellis2016">{{cite journal |last1=Ellis |first1=Stephen R. |last2=Adelstein |first2=Bernard D. |last3=Ye |first3=Yong |last4=Kaiser |first4=Mary K. |last5=Begault |first5=Durand R. |title=How Latency and Spatial Matching of Visual and Auditory Information Affects Presence in a Training Simulation |journal=Frontiers in ICT |date=2016-12-23 |volume=3 |pages=34 |doi=10.3389/fict.2016.00034 |url=https://www.frontiersin.org/journals/ict/articles/10.3389/fict.2016.00034/full}}</ref>
<ref name="Abrash_GI">{{cite web |url=https://www.gamesindustry.biz/valves-michael-abrash-latency-is-getting-in-the-way-of-vr |title=Valve's Michael Abrash: Latency is getting in the way of VR |publisher=GamesIndustry.biz}}</ref>
<ref name="DAQRI_Slides">{{cite web |url=https://xdc2018.x.org/slides/Heinrick_Fink_daqri_optimizing_motion_to_photon_latency.pdf |title=Optimizing Motion to Photon Latency |author=Heinrich Fink |publisher=DAQRI}}</ref>
<ref name="Abrash_Reddit">{{cite web |url=https://www.reddit.com/r/programming/comments/15rj38/michael_abrash_latency_is_the_sine_qua_non_of_ar/ |title=Michael Abrash: Latency is the sine qua non of AR and VR}}</ref>
<ref name="Tsai2017">{{cite conference |last1=Tsai |first1=Yu-Ju |last2=Wang |first2=Yu-Xiang |last3=Ouhyoung |first3=Ming |title=Affordable System for Measuring Motion-to-Photon Latency of Virtual Reality in Mobile Device |booktitle=SIGGRAPH '17: ACM SIGGRAPH 2017 Posters |date=2017-07-30 |doi=10.1145/3102163.3102188 |url=https://liagm.github.io/pdf/affordable.pdf}}</ref>
<ref name="Pape2017">{{cite conference |last1=Pape |first1=Sascha |last2=Strothmann |first2=Tobias |last3=Wienrich |first3=Carolin |last4=Latochek |first4=Marc Erich |title=Concept of a Low-Cost Device for the Automated Measurement of Motion-to-Photon Latency in Virtual Reality Environments |booktitle=2017 IEEE Symposium on 3D User Interfaces (3DUI) |date=2017-03-18 |doi=10.1109/3DUI.2017.7893344 |url=https://vr.rwth-aachen.de/media/papers/185/Pape_SEARIS_Calibratio.pdf}}</ref>
<ref name="Steed2020">{{cite conference |last1=Steed |first1=Anthony |last2=Frisoli |first2=Antonio |last3=Giachritsis |first3=Christos |last4=Krog |first4=Sue |last5=Padrao |first5=Daniela |last6=Pan |first6=Yuan |title=On the Plausibility of a Pass-Through Video See-Through Display |booktitle=2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) |date=2020-03-22 |doi=10.1109/VR46266.2020.00017 |url=https://www.microsoft.com/en-us/research/wp-content/uploads/2020/01/ieee_vr_2020___latency.pdf}}</ref>
<ref name="OptoFidelity">{{cite web |url=https://www.optofidelity.com/insights/blogs/measuring-head-mounted-displays-hmd-motion-to-photon-mtp-latency |title=Measuring Head-Mounted Displays (HMD) Motion-to-Photon (MTP) Latency |publisher=OptoFidelity}}</ref>
<ref name="M2P_Github">{{cite web |url=https://github.com/immersivecognition/motion-to-photon-measurement |title=motion-to-photon-measurement |publisher=GitHub}}</ref>
<ref name="XinReality_Predictive">{{cite web |url=https://xinreality.com/wiki/Predictive_tracking |title=Predictive tracking |publisher=XinReality}}</ref>
<ref name="RoadToVR_Predictive">{{cite web |url=https://www.roadtovr.com/understanding-predictive-tracking-important-arvr-headsets/ |title=Understanding Predictive Tracking and Why It's Important for AR/VR Headsets |publisher=Road to VR}}</ref>
<ref name="ATW_Reddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/4bycfx/asynchronous_time_warp_in_laymans_terms/ |title=Asynchronous Time Warp in layman's terms?}}</ref>
<ref name="Meta_ASW">{{cite web |url=https://developers.meta.com/horizon/blog/asynchronous-spacewarp/ |title=Asynchronous Spacewarp |publisher=Meta}}</ref>
<ref name="XinReality_ASW">{{cite web |url=https://xinreality.com/wiki/Asynchronous_Spacewarp |title=Asynchronous Spacewarp |publisher=XinReality}}</ref>
<ref name="DCS_Reprojection">{{cite web |url=https://forum.dcs.world/topic/244962-asynchronous-reprojection-explained-aka-ghosting-stuttering-etc/ |title=Asynchronous Reprojection Explained (AKA Ghosting, Stuttering, etc.)}}</ref>
<ref name="Wiki_ASW">{{cite web |url=https://en.wikipedia.org/wiki/Asynchronous_reprojection |title=Asynchronous reprojection |publisher=Wikipedia}}</ref>
<ref name="Meta_Late_Latching">{{cite web |url=https://developers.meta.com/horizon/blog/optimizing-vr-graphics-with-late-latching/ |title=Optimizing VR Graphics With Late Latching |publisher=Meta}}</ref>
<ref name="Panox">{{cite web |url=https://blog.panoxdisplay.com/how-does-a-vr-lcd-display-improve-immersion/ |title=How Does a VR LCD Display Improve Immersion? |publisher=Panox Display}}</ref>
<ref name="LowPersistence_Reddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/2xoscc/is_90hz_a_fast_enough_refresh_rate_that_low/ |title=Is 90hz a fast enough refresh rate that low persistence has been scrapped?}}</ref>
<ref name="ARM_VR">{{cite web |url=https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/virtual-reality-blatant-latency-and-how-to-avoid-it |title=Virtual Reality: Blatant latency and how to avoid it |publisher=ARM}}</ref>
<ref name="BlurBusters">{{cite web |url=https://blurbusters.com/faq/oled-motion-blur/ |title=Why Some OLEDs Have Motion Blur}}</ref>
<ref name="SPIE_AR">{{cite conference |last1=Räty |first1=Kalle |last2=Kovalev |first2=Dmitry |last3=Gotchev |first3=Atanas |title=Reducing motion-to-photon latency in multi-focal augmented reality display |booktitle=Proc. SPIE 11765, Digital Optics for Immersive Displays III |date=2021-04-01 |doi=10.1117/12.2578144 |url=https://www.spiedigitallibrary.org/conference-proceedings-of-spie/11765/117650W/Reducing-motion-to-photon-latency-in-multi-focal-augmented-reality/10.1117/12.2578144.full}}</ref>
<ref name="optofidelity2024">OptoFidelity. "Apple Vision Pro Benchmark Test 2: Angular Motion-to-Photon Latency in VR". https://www.optofidelity.com/insights/blogs/apple-vision-pro-bencmark-test-2.-angular-motion-to-photon-latency-in-vr</ref>
<ref name="Varjo2025">Varjo (2025). "Latency in virtual and mixed reality explained." Varjo Support Knowledge Base. [https://support.varjo.com/hc/en-us/articles/4403442941201-Latency-in-virtual-and-mixed-reality-explained support.varjo.com]</ref>
<ref name="Wagner2018">Wagner, D. (2018). "Motion-to-Photon Latency in Mobile AR and VR." *DAQRI Blog* (Medium), Aug. 20, 2018. [https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926 Medium]</ref>
<ref name="6GShine2024">Sampath, H. et al. (2024). "Preliminary results on the management of traffic for XR services" (6G-SHINE Deliverable D4.2). (Citing that >20ms latency is too high for VR/AR, with ~15ms or even 7ms as a possible threshold.) [https://6gshine.eu/wp-content/uploads/2024/11/D4.2_Preliminary-results-on-the-management-of-traffic-v1.0.pdf 6G-SHINE]</ref>
<ref name="XRglossary2023">The Spatial Studio (2023). "Motion-to-Photon Latency – XR Glossary." (Noting typical latencies 10–50 ms, comfort thresholds, and importance of high frame rates and technologies like Time Warp.) [https://thespatialstudio.de/en/xr-glossary/motion-to-photon-latency thespatialstudio.de]</ref>
<ref name="Nguyen2019">Nguyen, T. (2019). "Low-latency Mixed Reality Headset." Project report, UC Berkeley CS262A. (Noting that by the time an image is observed, it is "tens of milliseconds" out-of-date due to pipeline delays.) [https://people.eecs.berkeley.edu/~kubitron/courses/cs262a-F19/projects/reports/project14_report.pdf PDF]</ref>
<ref name="OculusBP2014">Oculus VR (2014). ''Oculus Rift Best Practices Guide''. Oculus VR, Jan 2014. (See Section: "Minimizing Latency" – recommends "<20ms motion-to-photon latency"). [https://s3.amazonaws.com/arena-attachments/238441/2330603062c2e502c5c2ca40443c2fa4.pdf PDF]</ref>
<ref name="daqri">{{Cite web |url=https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926 |title=Motion to Photon Latency in Mobile AR and VR |author=DAQRI |date=2018-08-20 |access-date=2025-10-28}}</ref>
<ref name="optofidelity">{{Cite web |url=https://www.optofidelity.com/insights/blogs/measuring-head-mounted-displays-hmd-motion-to-photon-mtp-latency |title=Measuring Head-Mounted Display's (HMD) Motion-To-Photon (MTP) Latency |publisher=OptoFidelity |date=2021-05-20 |access-date=2025-10-28}}</ref>
<ref name="frontiers">{{Cite journal |url=https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2020.582204/full |title=Latency and Cybersickness: Impact, Causes, and Measures. A Review |journal=Frontiers in Virtual Reality |volume=1 |pages=582204 |date=2020-11-25 |doi=10.3389/frvir.2020.582204 |access-date=2025-10-28}}</ref>
<ref name="varjo">{{Cite web |url=https://support.varjo.com/hc/en-us/latency |title=Latency in virtual and mixed reality explained |publisher=Varjo |access-date=2025-10-28}}</ref>
<ref name="spatial">{{Cite web |url=https://thespatialstudio.de/en/xr-glossary/motion-to-photon-latency |title=Motion-to-Photon Latency definition and description |publisher=The Spatial Studio |access-date=2025-10-28}}</ref>
<ref name="arxiv">{{Cite arXiv |url=https://arxiv.org/pdf/2301.10408 |title=Minimizing the Motion-to-Photon-delay (MPD) in Virtual Reality using GPU-accelerated Asynchronous Time Warp |date=2023-01-25 |access-date=2025-10-28}}</ref>
<ref name="meta">{{Cite web |url=https://developers.meta.com/horizon/blog/optimizing-vr-graphics-with-late-latching/ |title=Optimizing VR Graphics with Late Latching |publisher=Meta for Developers |access-date=2025-10-28}}</ref>
<ref name="matrise">{{Cite web |url=https://www.matrise.no/2018/07/the-history-of-virtual-reality/ |title=The History of Virtual Reality - How Did The Technology Develop? |publisher=Matrise |date=2018-07-01 |access-date=2025-10-28}}</ref>
<ref name="frontiers2020">Stauffert, J., Niebling, F., & Latoschik, M. E. (2020). "Latency and Cybersickness: Impact, Causes, and Measures. A Review". Frontiers in Virtual Reality. https://www.frontiersin.org/articles/10.3389/frvir.2020.582204/full</ref>
<ref name="chioka2013">Chioka. "What is Motion-To-Photon Latency?". http://www.chioka.in/what-is-motion-to-photon-latency/</ref>
<ref name="medium2017">DAQRI. "Motion to Photon Latency in Mobile AR and VR". Medium. https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926</ref>
<ref name="carmack2013">Carmack, John. "Latency mitigation strategies". https://danluu.com/latency-mitigation/</ref>
<ref name="oledinfo2014">OLED-Info. "Oculus VR shows a new HMD prototype with a low-persistence OLED". https://www.oled-info.com/oculus-vr-shows-new-hmd-prototype-low-persistance-oled</ref>
<ref name="jerald2009">Jerald, J. (2009). "Scene-motion- and latency-perception thresholds for head-mounted displays". University of North Carolina. https://www.semanticscholar.org/paper/Scene-motion-and-latency-perception-thresholds-for-Jerald/42a432706ce172498080b7d879bd6a6059bdcaa5</ref>
<ref name="mania2004">Mania, K., Adelstein, B. D., Ellis, S. R., & Hill, M. I. (2004). "Perceptual sensitivity to head tracking latency in virtual environments with varying degrees of scene complexity". ACM Applied Perception. https://dl.acm.org/doi/10.1145/1012551.1012559</ref>
<ref name="ieee2021">IEEE Standards Association. "IEEE 3079.1 - Motion to Photon (MTP) Latency in Virtual Environments". https://standards.ieee.org/ieee/3079.1/10383/</ref>
<ref name="warburton2022">Warburton, M., Mon-Williams, M., Mushtaq, F., & Morehead, J. R. (2022). "Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems". Behavior Research Methods. https://link.springer.com/article/10.3758/s13428-022-01983-5</ref>
<ref name="optofidelity2023">OptoFidelity. "Comparing VR headsets' tracking performance". https://www.optofidelity.com/blog/comparing-vr-headsets-tracking-performance</ref>
<ref name="meta2019">Meta. "Understanding Gameplay Latency for Oculus Quest, Oculus Go and Gear VR". https://developers.meta.com/horizon/blog/understanding-gameplay-latency-for-oculus-quest-oculus-go-and-gear-vr/</ref>
<ref name="blurbusters2018">Blur Busters. "Why Do Some OLEDs Have Motion Blur?". https://blurbusters.com/faq/oled-motion-blur/</ref>
<ref name="oculus2016">Oculus. "Asynchronous Timewarp on Oculus Rift". https://developer.oculus.com/blog/asynchronous-timewarp-on-oculus-rift/</ref>
<ref name="oculus2016asw">Oculus. "Asynchronous Spacewarp". https://developer.oculus.com/blog/asynchronous-spacewarp/</ref>
<ref name="roadtovr2017">Road to VR. "Understanding Predictive Tracking and Why It's Important for AR/VR Headsets". https://www.roadtovr.com/understanding-predictive-tracking-important-arvr-headsets/</ref>
<ref name="uploadvr2016">UploadVR. "Latest Steam VR Update Brings Direct Mode To Vive Pre". https://www.uploadvr.com/latest-steam-vr-update-brings-direct-mode-to-vive-pre/</ref>
</references>
[[Category:Virtual reality]]
[[Category:Augmented reality]]
[[Category:Human-computer interaction]]
[[Category:Computer graphics]]
[[Category:Terms]]
[[Category:VR Technology]]
[[Category:Display Technology]]
[[Category:Human Factors]]