Flat focus

Flat focus refers to an optical system design, common in virtual reality (VR) and augmented reality (AR) head-mounted displays (HMDs), in which the lenses are optimized to present light originating from the microdisplay (the screen) as if it came to focus from a single, fixed focal plane. Regardless of the apparent depth of the virtual objects depicted on the screen, the light reaching the user's eye therefore always appears to emanate from this one unchanging distance.
This approach contrasts sharply with how the human visual system naturally perceives the real world. In reality, the eye employs a process called accommodation, where the crystalline lens dynamically changes its shape (and thus its focal length) to sharply focus on objects at varying distances. Concurrently, the eyes use vergence, rotating inward (convergence) or outward (divergence) to align their gaze on the object of interest, providing crucial depth perception cues through binocular disparity.
Explanation
In a typical flat focus VR/AR headset, one or more lenses are placed between the user's eye and a small, high-resolution microdisplay (such as OLED or LCD). The display itself is physically very close to the eye, often only a few centimeters away. The primary purpose of the lens system is twofold:
- To magnify the small display image, making it fill a significant portion of the user's field of view (FOV).
- To collimate or refocus the light from the display so that it appears to originate from a farther distance than its physical location.
The "flat focus" characteristic arises because the optical design targets a *single* virtual distance for optimal sharpness. This fixed focal distance is a design choice, often set somewhere between 1.5 meters (approx. 5 feet) and optical infinity (practically, distances beyond ~6 meters / 20 feet where accommodation change becomes negligible)[1]. A common target distance for many consumer VR headsets is around 2 meters[2].
Light rays from every pixel on the flat microdisplay pass through the lens system. Ideally, the lenses bend these rays such that they appear to the eye as parallel or slightly diverging bundles, mimicking how light arrives from an object located at the chosen fixed focal distance. The eye's crystalline lens, therefore, only needs to accommodate to *that specific distance* to perceive a sharp image across the entire display, regardless of whether the content shown is a virtual object meant to be centimeters away or kilometers away.
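For intuition, the apparent focal distance of such a magnifier can be estimated with the thin-lens equation. The sketch below is illustrative only, with an assumed 40 mm eyepiece focal length and a placeholder display placement; real HMD eyepieces are multi-element or pancake designs, but the underlying relationship is the same:

```python
# Minimal thin-lens sketch of a flat-focus HMD eyepiece (illustrative numbers only).
# Thin-lens equation: 1/f = 1/d_o + 1/d_i. With the display inside the focal length,
# d_i is negative (a virtual image) and |d_i| is the apparent focal distance.

def virtual_image_distance(focal_length_m: float, display_distance_m: float) -> float:
    """Return the apparent (virtual image) distance in meters for a simple magnifier.

    focal_length_m:     focal length of the eyepiece lens
    display_distance_m: physical lens-to-microdisplay distance (must be less than f)
    """
    # 1/d_i = 1/f - 1/d_o  ->  negative for a virtual image; return its magnitude.
    inv_di = 1.0 / focal_length_m - 1.0 / display_distance_m
    return abs(1.0 / inv_di)

f = 0.040            # 40 mm focal length (assumed)
d_display = 0.03923  # display placed just inside the focal length (assumed)

d_image = virtual_image_distance(f, d_display)
accommodation_demand = 1.0 / d_image  # diopters the eye must accommodate

print(f"Apparent focal distance: {d_image:.2f} m")    # ~2 m for these values
print(f"Accommodation demand:    {accommodation_demand:.2f} D")
```

Placing the display exactly at the focal plane would collimate the output (image at optical infinity); moving it slightly closer to the lens pulls the virtual image in to a finite distance such as the ~2 m target mentioned above.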
Relevance in VR and AR
The flat focus design is prevalent in VR/AR primarily due to its relative simplicity, cost-effectiveness, and ability to deliver wide fields of view with manageable optical aberrations using lens technologies like aspheric or Fresnel lenses, and more recently, pancake lenses (which achieve a thinner profile but typically still maintain a fixed focus)[3].
However, the primary consequence of flat focus is the introduction of the Vergence-Accommodation Conflict (VAC)[4]. This conflict arises because the cues the brain receives for depth perception become inconsistent:
- Vergence Cues: Based on stereoscopic rendering (binocular disparity), the user's eyes converge or diverge naturally to fuse the image of virtual objects presented at different depths. For a nearby virtual object, the eyes converge significantly.
- Accommodation Cues: Regardless of where the eyes are converged, the light is always coming from the fixed focal plane set by the headset optics. Therefore, the eye's accommodation reflex receives cues (primarily from retinal blur) indicating that the object is always at that fixed distance, prompting the crystalline lens to remain focused there.
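The magnitude of the mismatch can be expressed numerically: vergence tracks the rendered object depth while the accommodation stimulus stays at the fixed focal plane. A rough sketch, assuming a 63 mm interpupillary distance and a 2 m focal plane (both placeholder values):

```python
import math

# Rough sketch of the vergence-accommodation mismatch in a flat-focus HMD.
IPD_M = 0.063              # interpupillary distance (assumed)
FIXED_FOCAL_PLANE_M = 2.0  # fixed focal plane of the headset optics (assumed)

def vergence_angle_deg(object_distance_m: float) -> float:
    """Angle between the two lines of sight when fixating an object at this distance."""
    return math.degrees(2.0 * math.atan((IPD_M / 2.0) / object_distance_m))

def conflict_diopters(object_distance_m: float) -> float:
    """Difference between the vergence-specified and accommodation-specified
    distances, expressed in diopters (1/m)."""
    return abs(1.0 / object_distance_m - 1.0 / FIXED_FOCAL_PLANE_M)

for depth in (0.3, 0.5, 1.0, 2.0, 10.0):  # rendered virtual object depths in meters
    print(f"object at {depth:4.1f} m: vergence {vergence_angle_deg(depth):5.2f} deg, "
          f"conflict {conflict_diopters(depth):4.2f} D")
```

The conflict vanishes only when the rendered depth coincides with the fixed focal plane and grows rapidly for near content, which is why close-up interactions tend to be the least comfortable.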
This mismatch between where the eyes are pointing (vergence) and where they are focusing (accommodation) is unnatural. The human visual system is accustomed to these two mechanisms working in tandem. The conflict can lead to several negative effects:
- Visual fatigue and eye strain[4].
- Headaches[5].
- Difficulty focusing on real-world objects after prolonged VR/AR use.
- Inaccurate perception of depth and scale[1].
- In some individuals, it may contribute to visually induced motion sickness or nausea[4].
Technical Considerations
Achieving a high-quality flat focus image across a wide field of view presents optical engineering challenges:
- Aberrations: Lenses, especially simple or wide-FOV ones, suffer from various optical aberrations that can degrade image quality. These include chromatic aberration (color fringing), spherical aberration (blur), astigmatism, coma, and geometric distortion (like pincushion distortion or barrel distortion). While flat focus simplifies the *focal depth* aspect, complex lens shapes (aspheric, Fresnel patterns) or multiple lens elements are needed to correct these other aberrations across the field of view. Pancake lenses often use polarization and reflective surfaces, introducing their own complexities[3].
- Field Curvature: Ideally, a lens projects a sharp image onto a flat plane (the focal plane). However, many simple lenses naturally focus onto a curved surface (Petzval field curvature). Designers must correct for this to ensure the image is reasonably sharp not just at the center but also towards the periphery of the flat microdisplay's projection onto the fixed focal plane.
- Exit Pupil: The size and position of the exit pupil influence how tolerant the headset is to misalignment with the user's eye. A small exit pupil requires precise positioning, while a larger one offers more leeway but can be harder to achieve with high image quality.
- Distortion Correction: Geometric distortions are often corrected computationally. The image rendered to the microdisplay is pre-distorted (image warping) in the opposite direction of the optical distortion, so that after passing through the lens, it appears geometrically correct to the user. This requires precise calibration of the optical system.
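As a simplified illustration of this pre-warping step, the sketch below applies a radial polynomial model to normalized, lens-centered coordinates. The coefficients are placeholders; production headsets use per-device calibration data and typically correct each color channel separately to handle chromatic aberration:

```python
import numpy as np

# Sketch of radial pre-distortion (a barrel warp) applied to normalized,
# lens-centered coordinates so that the lens's pincushion distortion cancels it.
# K1, K2 are placeholder coefficients; real values come from optical calibration.
K1, K2 = -0.22, -0.05

def predistort(uv: np.ndarray) -> np.ndarray:
    """Map normalized coordinates (centered on the lens axis, roughly [-1, 1])
    to the pre-distorted coordinates used to sample the rendered image."""
    r2 = np.sum(uv * uv, axis=-1, keepdims=True)   # squared radius per point
    scale = 1.0 + K1 * r2 + K2 * r2 * r2           # radial polynomial (< 1 off-axis)
    return uv * scale

# Example: pre-distort a few sample points of the render target.
points = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.8], [0.7, 0.7]])
print(predistort(points))
```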
Limitations and Challenges
Besides the Vergence-Accommodation Conflict, the flat focus approach has other limitations:
- Lack of Natural Depth Cues: In the real world, objects significantly closer or farther than the point of focus appear blurred (depth of field), and this blur is a subtle but important depth cue. In flat focus systems, everything is presented at the same focal distance, so virtual objects lack this naturalistic blur difference, making scenes appear somewhat flat or artificial. Rendering techniques can simulate bokeh or depth-of-field effects (a minimal example follows this list), but these are computationally generated from gaze estimates or assumptions, not a result of the eye's natural focusing.
- Accessibility Issues: Users with certain visual conditions, particularly presbyopia (age-related difficulty focusing up close), might find the fixed focal distance uncomfortable or impossible to focus on clearly without appropriate corrective lenses. While the fixed distance is often chosen to be relatively comfortable for the average user, individual needs vary.
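The depth-of-field simulation mentioned above typically computes a per-pixel circle of confusion from an estimated focus depth. A minimal sketch using a thin-lens camera model, with placeholder aperture and focal-length values and a focus depth that would in practice come from eye tracking or content heuristics:

```python
# Sketch: circle-of-confusion diameter for synthetic depth of field, using a
# thin-lens camera model. All optical parameters here are assumed values.

def circle_of_confusion_mm(scene_depth_m: float,
                           focus_depth_m: float,
                           focal_length_mm: float = 17.0,  # rough eye-like focal length (assumed)
                           aperture_mm: float = 4.0) -> float:
    """Blur-circle diameter on the image plane for a point at scene_depth_m
    when the (virtual) camera is focused at focus_depth_m."""
    f = focal_length_mm / 1000.0
    a = aperture_mm / 1000.0
    coc = a * f * abs(scene_depth_m - focus_depth_m) / (scene_depth_m * (focus_depth_m - f))
    return coc * 1000.0  # convert back to millimeters

# A renderer would map this diameter to a per-pixel blur-kernel radius.
for depth in (0.5, 1.0, 2.0, 5.0):
    print(f"depth {depth:4.1f} m -> CoC {circle_of_confusion_mm(depth, focus_depth_m=1.0):.3f} mm")
```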
Potential Solutions and Future Directions
Research and development efforts are actively exploring solutions to overcome the limitations of flat focus, primarily aiming to resolve the VAC:
- Varifocal Displays: These systems dynamically adjust the focal plane of the headset to match the depth of the virtual object the user is looking at (a sketch of the underlying power calculation follows this list). This can be achieved through various methods:
* Mechanically moving the lenses or displays.
* Using liquid crystal lenses or other electronically tunable optical elements.
* Employing deformable membrane mirrors[6].
- Multifocal Displays: These designs present images on multiple distinct focal planes simultaneously or in rapid succession, allowing the eye to focus more naturally on the plane closest to the target object's depth[7].
- Light Field Displays: These advanced displays aim to replicate the way light rays travel in the real world, providing correct focus cues by presenting slightly different information depending on the viewing angle and position of the pupil. The eye can then potentially focus naturally at different depths within the captured light field[8].
- Holographic Displays: True holographic displays reconstruct the wavefront of light from the virtual scene, which would inherently contain all necessary focus cues, potentially eliminating the VAC entirely. This remains a significant technical challenge for near-eye displays[3].
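For the varifocal approach in particular, the core control step is choosing the optical power that places the virtual image at the estimated gaze depth. A minimal sketch, assuming a tunable element co-located with a 40 mm base eyepiece and the display sitting at its focal plane (all values hypothetical):

```python
# Sketch of the core varifocal control step: pick the tunable element's power so
# the virtual image of the display lands at the gaze depth reported by eye tracking.
# Base eyepiece power and display distance are assumed, illustrative values.

BASE_POWER_D = 25.0          # fixed eyepiece power in diopters (f = 40 mm, assumed)
DISPLAY_DISTANCE_M = 0.040   # display at the base focal plane -> collimated output

def tunable_power_for_depth(gaze_depth_m: float) -> float:
    """Additional power (diopters) the tunable element must add to the base
    eyepiece so the virtual image appears at gaze_depth_m."""
    # Vergence leaving the optics must be -1/gaze_depth (a bundle that appears to
    # diverge from the gaze depth); vergence arriving from the display is
    # -1/display_distance, so the required total power is their difference.
    required_total_power = 1.0 / DISPLAY_DISTANCE_M - 1.0 / gaze_depth_m
    return required_total_power - BASE_POWER_D

for depth in (0.5, 1.0, 2.0, 10.0):
    print(f"gaze at {depth:4.1f} m -> tunable element {tunable_power_for_depth(depth):+.2f} D")
```

Only fractions of a diopter of adjustment are needed to span typical viewing distances, which is one reason liquid-crystal and membrane elements are attractive for this role.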
While flat focus remains the dominant approach in current consumer VR/AR due to its practicality, ongoing advancements in these alternative display and optical technologies promise future HMDs with more natural and comfortable visual experiences.
See Also
- Vergence-Accommodation Conflict (VAC)
- Accommodation (optics)
- Vergence (optics)
- Microdisplay
- Lens
- Fresnel lens
- Pancake lens
- Varifocal display
- Multifocal display
- Light field display
- Head-mounted display (HMD)
- Field of View (FOV)
- Aberration (optics)
- Depth perception
References
1. Kramida, G. (2016). Resolving the Vergence-Accommodation Conflict in Head-Mounted Displays. IEEE Transactions on Visualization and Computer Graphics, 22(7), 1912–1931. doi:10.1109/TVCG.2015.2473855
2. Lang, B. (2019, May 21). Oculus Rift S Doesn’t Have IPD Adjustment, But is Tuned for Optimal Focus from 61.5mm to 65.5mm. Road to VR. Retrieved from https://www.roadtovr.com/oculus-rift-s-ipd-adjustment-optimal-focus-range/
3. Maimone, A., Georgiou, A., & Kollin, J. S. (2017). Holographic near-eye displays for virtual and augmented reality. ACM Transactions on Graphics, 36(4), Article 85, 1–16. doi:10.1145/3072959.3073624 (Discusses various HMD optics, including pancake lenses.)
4. Hoffman, D. M., Girshick, A. R., Akeley, K., & Banks, M. S. (2008). Vergence-accommodation conflicts hinder visual performance and cause visual fatigue. Journal of Vision, 8(3), 33, 1–30. doi:10.1167/8.3.33
5. Shibata, T., Kim, J., Hoffman, D. M., & Banks, M. S. (2011). The zone of comfort: Predicting visual discomfort with stereo displays. Journal of Vision, 11(8), 11, 1–29. doi:10.1167/11.8.11
6. Rathnayake, A. U., Nguyen, T., & Zhan, T. (2021). Varifocal near-eye display using a focus-tunable Alvarez lens. Optics Express, 29(19), 30935–30947. doi:10.1364/OE.436385
7. Mercier, T., Ito, Y., & Kawahito, S. (2017). Multi-focal augmented reality display using time-multiplexed focal planes. Optics Express, 25(23), 28633–28645. doi:10.1364/OE.25.028633
8. Lanman, D., & Luebke, D. (2013). Near-eye light field displays. ACM Transactions on Graphics, 32(6), Article 220, 1–10. doi:10.1145/2508363.2508364