= Flat Focus =
{{Stub}} {{VR}} {{AR}} {{Optics}}


'''Flat focus''' (also known as uniform focus) refers to an [[optical system]] design, common in [[virtual reality]] (VR) and [[augmented reality]] (AR) [[head-mounted display|headsets]] (HMDs), in which the [[lens|lenses]] are optimized to bring light originating from the [[microdisplay]] (the screen) to a sharp [[focus]] at a single, fixed [[focal plane]]. Regardless of the simulated depth of virtual objects depicted on the screen, the light reaching the user's eye always appears to emanate from this specific, unchanging distance. This creates a perceptual conflict with natural human vision, in which objects at different distances require different focal adjustments by the eye.<ref name="Kramida2016"/>


== Technical Background ==
This approach contrasts sharply with how the [[human visual system]] naturally perceives the real world. In reality, the [[eye]] employs a process called [[accommodation (optics)|accommodation]], where the [[crystalline lens]] dynamically changes its shape (and thus its focal length) to sharply focus on objects at varying distances. Concurrently, the eyes use [[vergence (optics)|vergence]], rotating inward ([[convergence]]) or outward ([[divergence]]) to align their gaze on the object of interest, providing crucial [[depth perception]] cues through [[binocular disparity]].


In natural human vision, the eyes perform two key operations when looking at objects at different distances:
* [[Vergence]] - the rotation of the eyes toward or away from each other to converge on a target
* [[Accommodation]] - the adjustment of the eye's lens to focus on objects at different distances

== Explanation ==
In a typical flat focus VR/AR headset, one or more lenses are placed between the user's eye and a small, high-[[resolution]] microdisplay (such as [[OLED]] or [[LCD]]). The display itself is physically very close to the eye, often only a few centimeters away. The purpose of the lens system is twofold:
# To magnify the small display image, making it fill a significant portion of the user's [[field of view]] (FOV).
# To collimate or refocus the light from the display so that it appears to originate from a greater distance than its physical location.


The "flat focus" characteristic arises because the optical design targets a ''single'' virtual distance for optimal sharpness. This fixed focal distance is a design choice, often set somewhere between 1.5 meters (approx. 5 feet) and [[optical infinity]] (in practice, distances beyond ~6 meters / 20 feet, where further accommodation change becomes negligible)<ref name="Kramida2016">Kramida, G. (2016). Resolving the Vergence-Accommodation Conflict in Head-Mounted Displays. ''IEEE Transactions on Visualization and Computer Graphics, 22''(7), 1912-1931. doi:10.1109/TVCG.2015.2473855</ref>. A common target distance for many consumer VR headsets is around 2 meters<ref name="RoadToVRFocus">Lang, B. (2019, May 21). Oculus Rift S Doesn't Have IPD Adjustment, But is Tuned for Optimal Focus from 61.5mm to 65.5mm. ''Road to VR''. Retrieved from https://www.roadtovr.com/oculus-rift-s-ipd-adjustment-optimal-focus-range/</ref>.


Light rays from every pixel on the flat microdisplay pass through the lens system. Ideally, the lenses bend these rays into parallel or slightly diverging bundles, mimicking how light arrives from an object located at the chosen fixed focal distance. The eye's [[crystalline lens]] therefore only needs to accommodate to ''that specific distance'' to perceive a sharp image across the entire display, regardless of whether the content shown is a virtual object meant to be centimeters away or kilometers away.
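In thin-lens terms, this arrangement places the display just inside the lens's focal length, producing a magnified virtual image at the design distance. An illustrative calculation (the focal length and panel distance here are made-up values, not taken from any specific headset):

```python
def virtual_image_distance_mm(focal_length_mm: float, display_mm: float) -> float:
    """Thin-lens image distance (Gaussian form 1/f = 1/d_o + 1/d_i) for a
    display placed display_mm in front of the lens. A negative result means
    a virtual image on the display's side of the lens, which is the flat
    focus case: the display sits inside the focal length."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / display_mm)

# A 40 mm lens with the panel 38.5 mm away puts the virtual image
# roughly one meter in front of the eye.
d_i = virtual_image_distance_mm(40.0, 38.5)
print(d_i)  # ≈ -1026.7 mm, i.e. a virtual image about 1 m away
```

Small changes in the lens-to-panel spacing shift the fixed focal plane, which is why this distance is a deliberate design choice.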


== Relevance in VR and AR ==
The flat focus design is prevalent in VR/AR primarily due to its relative simplicity, cost-effectiveness, and ability to deliver wide fields of view with manageable [[aberration (optics)|optical aberrations]] using lens technologies such as [[aspheric lens|aspheric]] or [[Fresnel lens|Fresnel lenses]] and, more recently, [[pancake lens|pancake lenses]] (which achieve a thinner profile but typically still maintain a fixed focus)<ref name="PancakeOptics">Maimone, A., Georgiou, A., & Kollin, J. S. (2017). Holographic near-eye displays for virtual and augmented reality. ''ACM Transactions on Graphics (TOG) - Proceedings of ACM SIGGRAPH 2017, 36''(4), Article 85, 1–16. doi:10.1145/3072959.3073624 (Discusses various HMD optics including pancake lenses)</ref>.


However, the primary consequence of flat focus is the [[Vergence-Accommodation Conflict]] (VAC)<ref name="VACReview">Hoffman, D. M., Girshick, A. R., Akeley, K., & Banks, M. S. (2008). Vergence-accommodation conflicts hinder visual performance and cause visual fatigue. ''Journal of Vision, 8''(3), 33, 1–30. doi:10.1167/8.3.33</ref>. The conflict arises because vergence and accommodation, normally coupled through the [[vergence-accommodation reflex]], receive inconsistent depth cues:
* '''Vergence cues:''' Based on stereoscopic rendering ([[binocular disparity]]), the user's eyes converge or diverge naturally to fuse the images of virtual objects presented at different depths. For a nearby virtual object, the eyes converge significantly.
* '''Accommodation cues:''' Regardless of where the eyes converge, the light always arrives from the fixed focal plane set by the headset optics. The eye's accommodation reflex therefore receives cues (primarily from [[retinal blur]]) indicating that the object is always at that fixed distance, prompting the crystalline lens to remain focused there.
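The vergence side of this mismatch can be made concrete with simple geometry: the convergence angle between the two eyes' lines of sight depends on the [[interpupillary distance]] (IPD) and the fixation distance. A small sketch using an illustrative 63 mm IPD:

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Total convergence angle (degrees) between the two eyes' lines of
    sight when fixating an on-axis point at distance_m, for a given
    interpupillary distance (symmetric fixation assumed)."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# Fixating a virtual object at 0.5 m demands a much larger convergence
# angle than the 2 m focal plane, yet accommodation must stay at 2 m.
print(vergence_angle_deg(0.063, 0.5))  # ≈ 7.2 degrees
print(vergence_angle_deg(0.063, 2.0))  # ≈ 1.8 degrees
```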


This mismatch between where the eyes are pointing (vergence) and where they are focusing (accommodation) is unnatural; the human visual system is accustomed to these two mechanisms working in tandem. The conflict can lead to several negative effects:
* [[Visual fatigue]] and [[eye strain]]<ref name="VACReview"/>.
* [[Headache|Headaches]]<ref name="Shibata2011">Shibata, T., Kim, J., Hoffman, D. M., & Banks, M. S. (2011). The zone of comfort: Predicting visual discomfort with stereo displays. ''Journal of Vision, 11''(8), 11, 1–29. doi:10.1167/11.8.11</ref>.
* Difficulty focusing on real-world objects after prolonged VR/AR use.
* Inaccurate perception of [[depth]] and scale<ref name="Kramida2016"/>.
* In some individuals, visually induced [[motion sickness]] or [[nausea]]<ref name="VACReview"/>.
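A rough rule of thumb sometimes drawn from this line of research is that the vergence-accommodation mismatch should stay within a fraction of a diopter. Treating ±0.5 D as an illustrative tolerance (the actual "zone of comfort" in Shibata et al. varies with viewing distance and the direction of the conflict), the comfortable depth range around a fixed focal plane can be estimated:

```python
import math

def comfortable_depth_range_m(focal_plane_m: float, tolerance_d: float = 0.5):
    """Near and far distances (meters) whose accommodation demand stays
    within tolerance_d diopters of a fixed focal plane. The 0.5 D default
    is an illustrative value, not the exact Shibata et al. model."""
    demand = 1.0 / focal_plane_m
    near = 1.0 / (demand + tolerance_d)
    far = math.inf if demand <= tolerance_d else 1.0 / (demand - tolerance_d)
    return near, far

# For a 2 m focal plane, ±0.5 D spans roughly 1 m to optical infinity,
# one reason a ~2 m fixed focus is a popular design choice.
near, far = comfortable_depth_range_m(2.0)
print(near, far)  # 1.0 inf
```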


== Technical Considerations ==
Achieving a high-quality flat focus image across a wide field of view presents optical engineering challenges:
* '''[[Aberration (optics)|Aberrations]]:''' Lenses, especially simple or wide-FOV ones, suffer from various optical aberrations that can degrade image quality. These include [[chromatic aberration]] (color fringing), [[spherical aberration]] (blur), [[astigmatism (optics)|astigmatism]], [[coma (optics)|coma]], and [[geometric distortion]] (like [[pincushion distortion]] or [[barrel distortion]]). While flat focus simplifies the ''focal depth'' aspect, complex lens shapes (aspheric, Fresnel patterns) or multiple lens elements are needed to correct these other aberrations across the field of view. Pancake lenses often use polarization and reflective surfaces, introducing their own complexities<ref name="PancakeOptics"/>.
* '''[[Field Curvature]]:''' Ideally, a lens projects a sharp image onto a flat plane (the focal plane). However, many simple lenses naturally focus onto a curved surface (Petzval field curvature). Designers must correct for this to ensure the image is reasonably sharp not just at the center but also towards the periphery of the flat microdisplay's projection onto the fixed focal plane.
* '''[[Exit Pupil]]:''' The size and position of the [[exit pupil]] influence how tolerant the headset is to misalignment with the user's eye. A small exit pupil requires precise positioning, while a larger one offers more leeway but can be harder to achieve with high image quality.
* '''[[Distortion Correction]]:''' Geometric distortions are often corrected computationally. The image rendered to the microdisplay is pre-distorted ([[image warping]]) in the opposite direction of the optical distortion, so that after passing through the lens, it appears geometrically correct to the user. This requires precise calibration of the optical system.
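As a concrete sketch of the pre-distortion step, radial distortion is commonly modeled as a polynomial in the squared image radius, and the rendered image is warped by the inverse of the lens's distortion before display. The coefficients below are placeholders for illustration; real values come from calibrating a specific lens:

```python
def radial_warp(x: float, y: float, k1: float, k2: float):
    """Apply a polynomial radial distortion model to normalized image
    coordinates (origin at the lens center):
        r' = r * (1 + k1*r^2 + k2*r^4)
    With positive k1, k2 this stretches points outward (pincushion-style),
    which is how a frame is pre-distorted to cancel barrel distortion."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Placeholder coefficients, for illustration only:
print(radial_warp(0.5, 0.0, k1=0.22, k2=0.24))  # → (0.535, 0.0)
```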


=== Conventional Approaches ===
Most commercial VR and AR systems utilize flat focus displays with the following characteristics:
* Fixed focal planes (typically LCD, OLED, or microLED displays)
* [[Stereoscopic rendering]] to create the perception of depth
* [[Binocular disparity]] cues that stimulate vergence eye movements
* Visual depth cues such as [[motion parallax]], [[perspective]], and [[occlusion]]

These systems prioritize convincing stereoscopic depth perception while accepting the limitations of flat focus.<ref>Koulieris, G. A., Bui, B., Banks, M. S., & Drettakis, G. (2017). "Accommodation and comfort in head-mounted displays". ''ACM Transactions on Graphics, 36''(4), 1-11.</ref>

== Limitations and Challenges ==
Besides the Vergence-Accommodation Conflict, the flat focus approach has other limitations:
* '''Lack of natural depth cues:''' In the real world, objects significantly closer or farther than the point of focus appear blurred ([[Depth of Field|depth of field]]). This blur is a subtle but important depth cue. In flat focus systems, everything is presented at the same focal distance, so virtual objects lack this naturalistic blur, making scenes appear somewhat flat or artificial. [[Rendering]] techniques can simulate [[bokeh]] or depth of field effects, but these are computed from gaze estimates or assumptions, not produced by the eye's natural focusing.
* '''Accessibility issues:''' Users with certain visual conditions, particularly [[presbyopia]] (age-related difficulty focusing up close), may find the fixed focal distance uncomfortable or impossible to focus on clearly without appropriate [[corrective lens|corrective lenses]]. While the fixed distance is often chosen to be relatively comfortable for the average user, individual needs vary.
 
Flat focus also raises several practical challenges:
* '''Depth perception accuracy''': Users often misjudge distances in virtual environments
* '''Focus switching''': Transitioning between real and virtual content in AR is particularly difficult, because real objects lie at many focal depths while virtual content stays at one
* '''Individual differences''': [[Interpupillary distance]] (IPD) and other physiological factors affect how users experience flat focus displays
* '''Content limitations''': Detailed work requiring precise focus is difficult to support<ref>Kim, J., Kane, D., & Banks, M. S. (2014). "The rate of change of vergence-accommodation conflict affects visual discomfort". ''Vision Research, 105'', 159-165.</ref>

=== Content Design Considerations ===
Content designers for flat focus displays often employ techniques to minimize discomfort:
* Maintaining important interactive elements within a comfortable depth range
* Avoiding rapid depth transitions that require quick vergence changes
* Using [[depth of field]] blur effects to mimic natural focus cues
* Implementing [[foveated rendering]] to match the eye's natural acuity distribution<ref>Konrad, R., Angelopoulos, A., & Wetzstein, G. (2020). "Gaze-contingent Ocular Parallax Rendering for Virtual Reality". ''ACM Transactions on Graphics, 39''(2), 10:1-10:12.</ref>
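Synthetic depth-of-field blur of the kind designers use can be driven by the standard thin-lens circle-of-confusion formula, scaling per-pixel blur by how far a surface lies from the simulated focus distance. The virtual-camera parameters below are illustrative:

```python
def circle_of_confusion_mm(f_mm: float, f_number: float,
                           focus_m: float, subject_m: float) -> float:
    """Diameter (mm) of the thin-lens circle of confusion for a subject at
    subject_m when the (virtual) camera is focused at focus_m. Renderers
    can use this value to scale per-pixel blur for synthetic bokeh."""
    s1 = focus_m * 1000.0    # focus distance in mm
    s2 = subject_m * 1000.0  # subject distance in mm
    aperture = f_mm / f_number
    return aperture * abs(s2 - s1) / s2 * f_mm / (s1 - f_mm)

# A 35 mm f/2 virtual camera focused at 2 m: a subject at 0.5 m gets a
# large blur circle, while one at the focus distance stays perfectly sharp.
print(circle_of_confusion_mm(35.0, 2.0, 2.0, 0.5))  # ≈ 0.94 mm
print(circle_of_confusion_mm(35.0, 2.0, 2.0, 2.0))  # 0.0
```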
 
== Potential Solutions and Future Directions ==
Research and development efforts are actively exploring ways to overcome the limitations of flat focus, primarily by resolving the VAC:

=== Varifocal Displays ===
[[Varifocal display|Varifocal displays]] dynamically adjust the focal plane of the headset to match the depth of the virtual object the user is looking at, typically relying on [[eye tracking]] to determine the gaze point. Approaches include:
* Mechanically moving the lenses or display panels
* [[Liquid lens|Liquid lenses]], liquid crystal lenses, or other electronically tunable optical elements<ref name="MembraneMirrorVarifocal">Rathnayake, A. U., Nguyen, T., & Zhan, T. (2021). Varifocal near-eye display using a focus-tunable Alvarez lens. ''Optics Express, 29''(19), 30935-30947. doi:10.1364/OE.436385</ref>
* See-through deformable membrane mirrors<ref>Dunn, D., Tippets, C., Torell, K., Kellnhofer, P., Akşit, K., Didyk, P., Myszkowski, K., Luebke, D., & Fuchs, H. (2017). "Wide Field Of View Varifocal Near-Eye Display Using See-Through Deformable Membrane Mirrors". ''IEEE Transactions on Visualization and Computer Graphics, 23''(4), 1322-1331.</ref>

=== Multifocal Displays ===
[[Multifocal display|Multifocal displays]] present images on multiple distinct focal planes simultaneously or in rapid succession, allowing the eye to focus on the plane closest to the target object's depth<ref name="MultifocalDisplays">Mercier, T., Ito, Y., & Kawahito, S. (2017). Multi-focal augmented reality display using time-multiplexed focal planes. ''Optics Express, 25''(23), 28633-28645. doi:10.1364/OE.25.028633</ref>. Implementations include:
* Stacked transparent displays at different physical distances
* Time-multiplexed focal planes using high-speed display switching
* Focus-tunable optical elements that create multiple focal planes<ref>Mercier, O., Sulai, Y., Mackenzie, K., Zannoli, M., Hillis, J., Nowrouzezahrai, D., & Lanman, D. (2017). "Fast Gaze-contingent Optimal Decompositions for Multifocal Displays". ''ACM Transactions on Graphics, 36''(6), 237:1-237:15.</ref>

=== Light Field Displays ===
[[Light field display|Light field displays]] aim to reproduce the 4D light field of a scene, presenting slightly different information depending on the viewing angle and position of the pupil so that the eye can focus naturally at different depths<ref name="LightfieldVR">Lanman, D., & Luebke, D. (2013). Near-eye light field displays. ''ACM Transactions on Graphics (TOG) - Proceedings of ACM SIGGRAPH Asia 2013, 32''(6), Article 220, 1–10. doi:10.1145/2508363.2508364</ref>. Approaches include:
* [[Microlens arrays]] that project different images in different directions
* [[Near-eye light field displays]] designed specifically for head-mounted applications<ref>Huang, F. C., Chen, K., & Wetzstein, G. (2015). "The Light Field Stereoscope: Immersive Computer Graphics via Factored Near-Eye Light Field Displays with Focus Cues". ''ACM Transactions on Graphics, 34''(4), 60:1-60:12.</ref>

=== Holographic Displays ===
True [[holographic display|holographic displays]] reconstruct the [[wavefront]] of light from the virtual scene, which inherently contains all necessary focus cues and could eliminate the VAC entirely. This remains a significant technical challenge for near-eye displays<ref name="PancakeOptics"/>.

=== Focal Surface Displays ===
[[Focal surface displays]] create continuous focal surfaces that match the 3D geometry of virtual content, using:
* Spatially varying optical elements
* Deformable membrane mirrors
* Phase-only spatial light modulators<ref>Aksit, K., Lopes, W., Kim, J., Shirley, P., & Luebke, D. (2017). "Near-Eye Varifocal Augmented Reality Display Using See-Through Screens". ''ACM Transactions on Graphics, 36''(6), 189:1-189:13.</ref>
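A varifocal pipeline can be sketched as a small control loop: estimate the gaze depth from eye tracking, convert it to a target focal power, and step the tunable optic toward it with smoothing to avoid visible focus hunting. Everything here (the class, parameter values, and update scheme) is a hypothetical sketch, not any vendor's API:

```python
class VarifocalController:
    """Toy varifocal loop: turns an eye-tracked gaze-depth estimate into a
    smoothed focal-power command for a tunable lens (hypothetical sketch)."""

    def __init__(self, smoothing: float = 0.2,
                 min_depth_m: float = 0.25, max_depth_m: float = 10.0):
        self.smoothing = smoothing      # exponential smoothing factor per tick
        self.min_depth_m = min_depth_m  # clamp range for noisy gaze depth
        self.max_depth_m = max_depth_m
        self.power_d = 0.0              # current commanded power (diopters)

    def update(self, gaze_depth_m: float) -> float:
        """One tick: clamp the gaze-depth estimate, convert it to a target
        focal power in diopters, and step the command toward that target."""
        depth = min(max(gaze_depth_m, self.min_depth_m), self.max_depth_m)
        target_d = 1.0 / depth
        self.power_d += self.smoothing * (target_d - self.power_d)
        return self.power_d

ctl = VarifocalController()
for _ in range(30):          # user fixates an object rendered at 0.5 m (2 D)
    power = ctl.update(0.5)
print(round(power, 2))       # converges toward 2.0 diopters
```

The smoothing step stands in for the rate limiting a real system needs so the optic does not oscillate on noisy gaze estimates.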
 
== Applications ==
 
Flat focus displays remain the standard in most current VR/AR applications:
 
=== Gaming and Entertainment ===
 
* Video games with moderate depth complexity
* 360° video experiences
* Virtual cinematic experiences
 
=== Training and Simulation ===
 
* Professional training scenarios with limited depth interaction
* Virtual walkthroughs of architectural spaces
* Medical visualization for educational purposes
 
=== Productivity ===
 
* Virtual desktop environments
* 3D modeling with depth constraints
* Collaborative virtual workspaces<ref>Guttentag, D. A. (2010). "Virtual reality: Applications and implications for tourism". Tourism Management, 31(5), 637-651.</ref>
 
== Future Developments ==
The future of flat focus and its alternatives in VR/AR is evolving in several directions:

* '''Hybrid solutions''': Combining multiple technologies to address different aspects of the vergence-accommodation conflict
* '''Computational displays''': Using advanced algorithms to optimize visual perception on existing hardware
* '''Neural rendering''': Adapting content based on perceptual models of human vision
* '''Personalized calibration''': Systems that adapt to individual visual characteristics
* '''Biological considerations''': Designs that better account for how the human visual system processes artificial depth cues<ref>Chang, J., Kim, Y., Stengel, M., Padmanaban, N., Lange, R., & Wetzstein, G. (2023). "Towards Perceptually Optimized Varifocal Near-Eye Displays". ''IEEE Transactions on Visualization and Computer Graphics, 29''(5), 2315-2325.</ref>

While flat focus remains the dominant approach in current consumer VR/AR due to its practicality, ongoing advancements in these alternative display and optical technologies promise future HMDs with more natural and comfortable visual experiences.


== See Also ==
* [[Vergence-accommodation conflict]] (VAC)
* [[Accommodation (optics)]]
* [[Vergence (optics)]]
* [[Stereoscopic display]]
* [[Microdisplay]]
* [[Lens]]
* [[Fresnel lens]]
* [[Pancake lens]]
* [[Varifocal display]]
* [[Multifocal display]]
* [[Light field display]]
* [[Holographic display]]
* [[Eye tracking]]
* [[Foveated rendering]]
* [[Head-mounted display]] (HMD)
* [[Field of View]] (FOV)
* [[Aberration (optics)]]
* [[Depth perception]]

== References ==
<references/>
