{{see also|Terms|Technical Terms}}
'''Vergence-accommodation conflict''' ('''VAC'''), also called '''accommodation-vergence conflict''' or '''accommodation–vergence mismatch''', is a visual-perceptual problem that arises when the cues that drive the eyes' rotation ([[Vergence|vergence]]) and the cues that drive the crystalline lens's focus ([[Accommodation (eye)|accommodation]]) specify ''different'' viewing distances. Under natural viewing the two responses are neurally coupled through the [[Accommodation reflex|accommodation–vergence reflex]], so they agree; most stereoscopic displays break that coupling, which is now recognised as a primary source of visual discomfort, blurred vision and reduced task performance in [[Virtual reality|virtual reality]] (VR), [[Augmented reality|augmented reality]] (AR) and other [[Stereoscopy|stereoscopic]] 3-D systems, including nearly all mainstream [[Head-Mounted Display|head-mounted displays]] (HMDs).<ref name="Hoffman2008" /><ref name="Shibata2011" />


==Physiological basis==
When fixating a real-world target, the human [[Visual system|visual system]] executes two tightly linked responses:
* '''[[Vergence]]''': the two eyes rotate inwards ([[Convergence (eye)|convergence]]) or outwards ([[Divergence (eye)|divergence]]) via the [[Extraocular muscles|extraocular muscles]] so that their lines of sight intersect at the target, enabling single [[Binocular vision|binocular vision]]. This response is driven mainly by [[Binocular disparity|retinal disparity]].
* '''[[Accommodation (eye)|Accommodation]]''': the [[Ciliary muscle|ciliary muscle]] changes the [[Optical power|optical power]] of the [[Crystalline lens|crystalline lens]] so that the image of the target falls in sharp focus on the [[Retina|retina]]. This response is driven mainly by retinal blur.

In natural vision the two responses are linked by fast, reciprocal neural signals known as the [[Accommodation reflex|accommodation–vergence reflex]], so they normally operate at the ''same'' distance and vision remains clear, single and comfortable.<ref name="Kreylos2014" /><ref name="Kramida2016" /> Stereoscopic displays decouple them: disparity cues drive vergence to the ''simulated'' depth of a virtual object, while blur cues still drive accommodation to the ''physical'' display surface, which is usually fixed at one optical distance.<ref name="Kramida2016" />
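Both demands can be written as simple functions of viewing distance: accommodation demand is the reciprocal of distance in metres (in dioptres), while the required vergence angle follows from the interpupillary distance. The snippet below is an illustrative sketch only, not drawn from the cited sources; the 63 mm interpupillary distance is an assumed typical value.

<syntaxhighlight lang="python">
import math

def accommodation_demand_d(distance_m: float) -> float:
    """Accommodation demand in dioptres: the reciprocal of distance in metres."""
    return 1.0 / distance_m

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.063) -> float:
    """Vergence angle (degrees) needed to fixate a target at distance_m,
    assuming symmetric convergence and a typical 63 mm interpupillary distance."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# In natural viewing both demands refer to the same distance:
for d in (0.5, 1.0, 2.0, 6.0):
    print(f"{d:4.1f} m -> accommodation {accommodation_demand_d(d):.2f} D, "
          f"vergence {vergence_angle_deg(d):.2f} deg")
</syntaxhighlight>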


==Causes and occurrence in display technologies==
VAC is inherent to display technologies in which the perceived depth of content differs from the physical or optical distance of the display surface:
* '''Fixed-focus HMDs''': nearly all consumer VR and many AR headsets use lenses to place a virtual image of the screens at a single focal distance, typically between about 1 m and 2 m. Viewers must accommodate to that fixed plane to see a sharp image, while stereoscopic rendering demands vergence to varying simulated depths. Objects that appear nearer than the focal plane create a ''positive VAC'' (the eyes converge more than they accommodate), whereas objects that appear farther create a ''negative VAC'' (the eyes converge less than they accommodate).<ref name="Shibata2011" />
* '''[[3D television|3-D cinema and television]]''': the screen is farther away and subtends a smaller [[Field of view|field of view]], so conflicts are weaker, but they still constrain the comfortable stereoscopic depth budget available to content creators.<ref name="ISO2015" />
* '''[[Optical see-through display|Optical see-through (OST) AR]]''': virtual imagery is often rendered at a fixed focus (commonly around 2 m) and overlaid on real objects at other depths, so users may be unable to focus sharply on a real object and its virtual annotation at the same time, degrading registration accuracy and comfort.<ref name="Zhou2021" />


==Effects and symptoms==
Prolonged VAC drives the extraocular and ciliary muscles toward conflicting distances and has been empirically linked to:<ref name="Hoffman2008" /><ref name="Lin2022" />
* visual fatigue (eyestrain) and headaches;
* transient or persistent blurred vision;
* [[Diplopia|diplopia]] (double vision) when binocular fusion fails;
* difficulty refocusing between virtual objects at different apparent depths, sometimes with lingering after-effects once the headset is removed;
* reduced reading speed, poorer depth-judgement accuracy, slower visuomotor reaction times and longer binocular fusion times;
* focal rivalry in AR, where a real object and a virtual overlay at different focal distances cannot both be seen sharply at once;
* contributions to [[Virtual Reality Sickness|VR sickness]] symptoms such as nausea and dizziness.


Susceptibility varies considerably between individuals and depends on the magnitude of the conflict, the duration of exposure and the nature of the content. Large-sample studies indicate that conflicts smaller than roughly 0.5 dioptres (D) in front of the focal plane and roughly 1 D behind it fall within a broad "zone of comfort" for most viewers; larger conflicts, particularly for near-field objects within arm's reach, become increasingly problematic.<ref name="Shibata2011" /><ref name="ISO2015" />


==Measurement==
VAC is quantified by comparing the optical power required for accommodation with the optical power corresponding to the vergence distance, both expressed in [[Diopter|dioptres]] (D), the reciprocal of distance in metres:<ref name="Kramida2016" />

<code>VAC (D) = | 1 / accommodation distance (m) - 1 / vergence distance (m) |</code>

For example, if an HMD has a fixed focus at 2 m (1/2.0 = 0.5 D of accommodation) and displays a virtual object that appears 0.5 m away (1/0.5 = 2.0 D of vergence), the magnitude of the conflict is |0.5 D - 2.0 D| = 1.5 D.
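This calculation can be expressed directly in code. The sketch below is illustrative only; the sign convention (positive when the virtual object is nearer than the focal plane) matches the positive/negative VAC terminology used earlier in this article.

<syntaxhighlight lang="python">
def vac_dioptres(accommodation_distance_m: float, vergence_distance_m: float) -> float:
    """Signed VAC in dioptres: positive when the virtual object (vergence
    distance) is nearer than the display's focal plane (accommodation distance)."""
    return (1.0 / vergence_distance_m) - (1.0 / accommodation_distance_m)

# Worked example from the text: 2 m focal plane, object rendered at 0.5 m
conflict = vac_dioptres(accommodation_distance_m=2.0, vergence_distance_m=0.5)
print(f"VAC = {conflict:+.1f} D (magnitude {abs(conflict):.1f} D)")  # prints +1.5 D
</syntaxhighlight>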
 
==Mitigation strategies==
Addressing VAC is a major focus of VR and AR research and development. Strategies fall into two broad categories: content design and display hardware.

===Content and interaction guidelines===
Careful content design can reduce VAC-induced discomfort on fixed-focus displays (a minimal programmatic check of the budgets below follows this list):
# '''Limit the depth range''': keep critical interactive content and prolonged visual targets within the zone of comfort, typically less than about 0.5 D of conflict in front of the focal plane and about 1 D behind it for sustained viewing.<ref name="Shibata2011" /><ref name="ISO2015" />
# '''Avoid rapid depth changes''': avoid sudden disparity jumps of more than about 1 D within 0.5 s for prominent objects, and give the visual system time to adjust after large depth changes.<ref name="Kramida2016" />
# '''Place UI carefully''': present user-interface elements, text and critical information at, or slightly behind, the display's native focal plane, where the conflict is zero or negative and generally better tolerated.<ref name="Shibata2011" />
# '''Simulate blur''': when the hardware cannot provide correct focus cues, use [[Gaze-contingent display|gaze-contingent]] [[Depth of field|depth-of-field]] rendering or foveated rendering to supply [[Monocular cues|monocular]] depth cues that agree better with vergence.<ref name="Koulieris2017" />
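As referenced above, the depth and disparity-change budgets can be checked programmatically in a content pipeline. The helper functions below are a hypothetical sketch rather than part of any engine or standard API; the 0.5 D and 1 D budgets and the 1 D-per-0.5 s change limit are simply the figures quoted in the list.

<syntaxhighlight lang="python">
def within_comfort_zone(object_distance_m: float, focal_plane_m: float,
                        near_budget_d: float = 0.5, far_budget_d: float = 1.0) -> bool:
    """True if the rendered object stays inside the comfort budget:
    up to ~0.5 D nearer than the focal plane and ~1.0 D farther from it."""
    conflict_d = (1.0 / object_distance_m) - (1.0 / focal_plane_m)
    return -far_budget_d <= conflict_d <= near_budget_d

def depth_jump_ok(prev_distance_m: float, new_distance_m: float,
                  dt_s: float, max_d_per_half_second: float = 1.0) -> bool:
    """True if the dioptric change between two frames stays under ~1 D per 0.5 s."""
    delta_d = abs(1.0 / new_distance_m - 1.0 / prev_distance_m)
    return delta_d / max(dt_s, 1e-6) <= max_d_per_half_second / 0.5

print(within_comfort_zone(0.5, focal_plane_m=1.5))  # False: ~+1.33 D, too near
print(within_comfort_zone(1.0, focal_plane_m=1.5))  # True: ~+0.33 D
print(depth_jump_ok(2.0, 0.7, dt_s=0.1))            # False: ~0.93 D change in 0.1 s
</syntaxhighlight>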


===Technological solutions===
These hardware approaches aim to make the accommodation distance match, or at least approximate, the vergence distance demanded by the virtual content:
{| class="wikitable plainrowheaders"
! Approach !! Principle !! Representative prototypes / research !! Strengths / limitations
|-
! [[Varifocal display|Varifocal]]
| [[Eye tracking|Eye tracking]] estimates the user's gaze depth and the display shifts a single focal plane to match it, using [[Tunable lens|tunable lenses]] (liquid-crystal, liquid or Alvarez lenses) or mechanically moving optics (a simplified gaze-depth sketch follows the table).
| Meta Reality Labs "Butterscotch" varifocal (2023);<ref name="MetaButterscotch2023" /> UNC wide-FOV see-through deformable-membrane-mirror near-eye display (2017).<ref name="Dunn2017" />
| Delivers a correct focus cue at the depth of fixation. Challenges include eye-tracking latency and accuracy, focal-switching speed, limited depth range and incorrect blur for objects away from fixation.
|-
! [[Multifocal display|Multifocal / multiplane]]
| Presents imagery on two or more fixed focal planes, simultaneously (for example with stacked panels or beam splitters) or time-sequentially, rendering content on the plane closest to its virtual depth.
| Magic Leap 1 (two planes); Stanford multi-plane and light-field HMD research.<ref name="Wired2015" />
| Provides approximate focus cues across several depths without necessarily requiring eye tracking. Challenges include optical complexity, cost, reduced brightness and contrast, visible transitions between planes and the limited number of planes.
|-
! [[Light field display|Light field]]
| Reconstructs the 4-D light field of the scene (rays with both position and direction) so the eye can naturally refocus within the reproduced volume.
| Research prototypes using lenslet arrays, parallax barriers, holographic optical elements and super-multi-view displays.
| Potentially provides continuous focus cues without eye tracking. Challenges include very high resolution and bandwidth demands, computational cost, limited field of view and the trade-off between spatial and angular resolution.
|-
! [[Holography|Holographic]]
| Reconstructs the full wavefront of light from the virtual scene using diffraction patterns generated by [[Spatial light modulator|spatial light modulators]].
| Research prototypes from Microsoft Research, VividQ and Light Field Lab.
| Theoretically the most complete solution, providing all depth cues including accommodation. Challenges include computational cost, speckle noise, limited field of view and hardware complexity for real-time, high-quality HMDs.
|-
! [[Retinal projection|Retinal projection / scanning]]
| Scans modulated light directly onto the retina (Maxwellian view), producing an image that stays in focus largely independently of the eye's accommodation state.
| Research scanners; early commercial attempts such as North Focals (acquired by Google).
| Sidesteps VAC by removing the accommodation demand. Challenges include a small [[Eyebox|eyebox]], visual artefacts (for example more visible [[Floater|floaters]]), safety considerations and achieving high resolution and field of view.
|-
! Emerging optics
| Novel components such as Alvarez freeform lenses, tunable fluidic lenses and MEMS deformable membranes for compact, low-power varifocal or multifocal modules.
| Large-aperture Alvarez varifocal AR head-up display demonstrator (2024);<ref name="Liu2024" /> other research-stage prototypes.
| Aim at smaller form factors; manufacturing tolerances, response time, optical quality and power consumption remain active research topics.
|}
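To make the varifocal row above more concrete, the sketch below shows one simplified way such a system might convert a tracked vergence angle into a gaze depth and then into a dioptric setting for a tunable lens. All parameter values (the 63 mm interpupillary distance, the 0 to 3 D tuning range and the example gaze angle) are assumptions for illustration and do not describe any specific product.

<syntaxhighlight lang="python">
import math

def gaze_depth_from_vergence(vergence_angle_deg: float, ipd_m: float = 0.063) -> float:
    """Estimate fixation distance (m) from the tracked vergence angle,
    assuming symmetric convergence and a 63 mm interpupillary distance."""
    half_angle = math.radians(vergence_angle_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

def varifocal_lens_setting_d(gaze_depth_m: float,
                             min_d: float = 0.0, max_d: float = 3.0) -> float:
    """Dioptric focal-plane setting, clamped to the module's assumed 0-3 D range."""
    return min(max(1.0 / gaze_depth_m, min_d), max_d)

angle = 7.2  # degrees, as reported by a hypothetical eye tracker
depth = gaze_depth_from_vergence(angle)
print(f"gaze depth ~{depth:.2f} m -> lens set to {varifocal_lens_setting_d(depth):.2f} D")
</syntaxhighlight>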


==Temporary workarounds and adaptation==
* '''Monocular viewing''': closing one eye removes binocular disparity and therefore the vergence cue, eliminating the conflict at the cost of stereoscopic depth perception; this can make very near virtual objects easier to focus on.<ref name="Kreylos2014" />
* '''Correct optical prescription''': wearing up-to-date [[Eyeglass prescription|prescription glasses]] or [[Contact lens|contact lenses]] avoids additional accommodative effort from uncorrected [[Refractive error|refractive errors]].
* '''Adaptation''': some users partially adapt to VAC over tens of minutes as the vergence-accommodation coupling temporarily weakens, but adaptation is typically incomplete and after-effects (for example a shift in resting focus) can persist after the headset is removed.<ref name="Hoffman2008" /><ref name="Kreylos2014" />


==Current research frontiers==
* '''High-resolution varifocal displays''': prototypes such as Meta's Butterscotch combine near-retinal resolution (about 60 pixels per degree) with reasonably fast focal switching, suggesting a path toward commercial viability.<ref name="MetaButterscotch2023" />
* '''Focus-correct passthrough AR''': integrating varifocal or multifocal optics into [[Video passthrough|video-passthrough]] AR so that both real-world and virtual imagery are presented at appropriate focal depths.<ref name="Dunn2017" />
* '''Standards and health guidance''': ongoing work in standards bodies (for example ISO TC159 and IEC TC100) on guidelines for extended VR/AR use, particularly for children and workplace applications.
* '''Individualised comfort models''': large-sample eye-tracking and psychophysical studies of the accommodation-vergence relationship, aimed at personalised comfort settings or adaptive depth budgets.<ref name="Lin2022" />


==See also==
* [[Accommodation (eye)]]
* [[Depth perception]]
* [[Eye tracking]]
* [[Head-Mounted Display]]
* [[Light field display]]
* [[Stereoscopy]]
* [[Varifocal display]]
* [[Vergence]]
* [[Virtual Reality Sickness]]
* [[Visual fatigue]]


==References==
<references>
<ref name="Hoffman2008">{{cite journal |last1=Hoffman |first1=David M. |last2=Girshick |first2=Ahna R. |last3=Akeley |first3=Kurt |last4=Banks |first4=Martin S. |title=Vergence–accommodation conflicts hinder visual performance and cause visual fatigue |journal=Journal of Vision |volume=8 |issue=3 |pages=33 |year=2008 |doi=10.1167/8.3.33 |access-date=2025-04-27}}</ref>
<ref name="Shibata2011">{{cite journal |last1=Shibata |first1=Takayuki |last2=Kim |first2=Joohwan |last3=Hoffman |first3=David M. |last4=Banks |first4=Martin S. |title=The zone of comfort: predicting visual discomfort with stereo displays |journal=Journal of Vision |volume=11 |issue=8 |pages=11 |year=2011 |doi=10.1167/11.8.11 |access-date=2025-04-27}}</ref>
<ref name="Kramida2016">{{cite journal |last=Kramida |first=Gregory |title=Resolving the vergence–accommodation conflict in head‑mounted displays |journal=IEEE Transactions on Visualization and Computer Graphics |volume=22 |issue=7 |pages=1912–1931 |year=2016 |doi=10.1109/TVCG.2016.2535300 |access-date=2025-04-27}}</ref>
<ref name="Koulieris2017">{{cite journal |last1=Koulieris |first1=George‑Alex |last2=Bui |first2=Bee |last3=Banks |first3=Martin S. |last4=Drettakis |first4=George |title=Accommodation and comfort in head‑mounted displays |journal=ACM Transactions on Graphics |volume=36 |issue=4 |pages=87 |year=2017 |doi=10.1145/3072959.3073622 |access-date=2025-04-27}}</ref>
<ref name="Zhou2021">{{cite journal |last1=Zhou |first1=Yujia |last2=Li |first2=Xuan |last3=Yuan |first3=Chang |title=Vergence–accommodation conflict in optical see‑through display: review and prospect |journal=Results in Optics |volume=5 |pages=100160 |year=2021 |doi=10.1016/j.rio.2021.100160 |access-date=2025-04-27}}</ref>
<ref name="Lin2022">{{cite journal |last1=Lin |first1=Chia‑Jung |last2=Chi |first2=Chien‑Fu |last3=Lin |first3=Chih‑Kang |last4=Chang |first4=En‑Chuan |title=Effects of virtual target size, position and parallax on vergence–accommodation conflict as estimated by actual gaze |journal=Scientific Reports |volume=12 |pages=20100 |year=2022 |doi=10.1038/s41598-022-24450-9 |access-date=2025-04-27}}</ref>
<ref name="ISO2015">{{cite report |title=ISO 9241‑392:2015 Ergonomics of human‑system interaction — Part 392: Reduction of visual fatigue from stereoscopic images |publisher=International Organization for Standardization |year=2015 |access-date=2025-04-27}}</ref>
<ref name="Dunn2017">{{cite journal |last1=Dunn |first1=Damon |last2=Tippets |first2=Caleb |last3=Torell |first3=Kevin |last4=Kellnhofer |first4=Philipp |last5=Akşit |first5=Kaan |last6=Didyk |first6=Piotr |last7=Myszkowski |first7=Karol |last8=Luebke |first8=David |last9=Fuchs |first9=Henry |title=Wide field‑of‑view varifocal near‑eye display using see‑through deformable membrane mirrors |journal=IEEE Transactions on Visualization and Computer Graphics |volume=23 |issue=4 |pages=1322–1331 |year=2017 |doi=10.1109/TVCG.2017.2657138 |access-date=2025-04-27}}</ref>
<ref name="MetaButterscotch2023">{{cite web |title=Demo or die: How Reality Labs’ display systems research team is tackling the biggest challenges in VR |url=https://www.meta.com/blog/reality-labs-research-display-systems-siggraph-2023-butterscotch-varifocal-flamera/ |website=Reality Labs Research Blog |publisher=Meta Platforms |date=2023-08-04 |access-date=2025-04-27}}</ref>
<ref name="Liu2024">{{cite journal |last1=Liu |first1=Yujin |last2=Cheng |first2=Dongli |last3=Wang |first3=Yuan |last4=Hua |first4=Hong |title=A varifocal augmented‑reality head‑up display using Alvarez freeform lenses |journal=Journal of the Society for Information Display |volume=32 |issue=4 |pages=231–240 |year=2024 |doi=10.1002/jsid.1286 |access-date=2025-04-27}}</ref>
<ref name="Wired2015">{{cite news |last=Zhang |first=Sarah |title=The obscure neuroscience problem that’s plaguing VR |work=Wired |date=2015-08-11 |url=https://www.wired.com/2015/08/obscure-neuroscience-problem-thats-plaguing-vr |access-date=2025-04-27}}</ref>
<ref name="Kreylos2014">{{cite web |last=Kreylos |first=Oliver |title=Accommodation and vergence in head‑mounted displays |website=Doc‑Ok.org |date=2014-04-13 |url=http://doc-ok.org/?p=1602 |access-date=2025-04-27}}</ref>
</references>


[[Category:Terms]]
