{{see also|Terms|Technical Terms}}
'''Light Field Display''' ('''LFD''') is an advanced visualization technology designed to reproduce a [[light field]], the distribution of light rays in [[3D space]], including their intensity and direction.<ref name="WetzsteinPlenoptic">Wetzstein, G. (2020). Computational Displays: Achieving the Full Plenoptic Function. ACM SIGGRAPH 2020 Courses. doi:10.1145/3386569.3409414. Available: https://dl.acm.org/doi/10.1145/3386569.3409414 (accessed 3 May 2025).</ref> Unlike conventional 2D displays or [[stereoscopic display|stereoscopic 3D]] systems that present flat images or fixed viewpoints requiring glasses, light field displays aim to recreate how light naturally propagates from a real scene.<ref name="WetzsteinTensor">Wetzstein, G., Lanman, D., Hirsch, M., & Raskar, R. (2012). Tensor displays: Compressive light field synthesis using multilayer displays with directional backlighting. ACM Transactions on Graphics, 31(4), Article 80. doi:10.1145/2185520.2185576</ref> This allows viewers to perceive genuine [[depth]], [[parallax]] (both horizontal and vertical), and perspective changes, in many implementations without special eyewear.<ref name="LookingGlass27">Looking Glass Factory. Looking Glass 27″ Light Field Display. Retrieved from https://lookingglassfactory.com/looking-glass-27</ref><ref name="LeiaVerge">Hollister, S. (2024, January 19). Leia is building a 3D empire on the back of the worst phone we've ever reviewed. The Verge. Retrieved from https://www.theverge.com/24036574/leia-glasses-free-3d-ces-2024</ref>
This technology is considered crucial for the future of [[Virtual Reality]] (VR) and [[Augmented Reality]] (AR) because it can directly address the [[Vergence-accommodation conflict]] (VAC).<ref name="WiredVAC">Zhang, S. (2015, August 11). The Obscure Neuroscience Problem That's Plaguing VR. WIRED. Retrieved from https://www.wired.com/2015/08/obscure-neuroscience-problem-thats-plaguing-vr</ref><ref name="VACReview">Y. Zhou, J. Zhang, F. Fang, “Vergence-accommodation conflict in optical see-through display: Review and prospect,” *Results in Optics*, vol. 5, p. 100160, 2021, doi:10.1016/j.rio.2021.100160.</ref> By providing correct [[focal cues]] that match the [[vergence]] information, LFDs promise more immersive, realistic, and visually comfortable experiences, reducing eye strain and [[Virtual Reality Sickness|simulator sickness]] often associated with current HMDs.<ref name="CrealWebsite">CREAL. Light-field: Seeing Virtual Worlds Naturally. Retrieved from https://creal.com/technology/</ref>
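
The vergence-accommodation conflict can be made concrete with a small calculation. In the Python sketch below (illustrative only; the 2 m focal plane and 0.5 m object distance are assumed example values, not figures from any cited device), both demands are expressed in diopters, the reciprocal of distance in meters, so a conventional fixed-focus headset shows a mismatch whenever the virtual object sits away from the optics' focal plane:

```python
# Illustrative sketch: quantifying the vergence-accommodation mismatch
# for a conventional stereoscopic HMD with a fixed focal plane.
# Demands are expressed in diopters (1 / distance in meters).

def diopters(distance_m: float) -> float:
    """Optical demand of converging or focusing at a given distance."""
    return 1.0 / distance_m

def vac_mismatch(virtual_object_m: float, focal_plane_m: float) -> float:
    """Difference between vergence demand (driven by the virtual object)
    and accommodation demand (fixed by the headset optics)."""
    return abs(diopters(virtual_object_m) - diopters(focal_plane_m))

# Hypothetical headset with a 2 m focal plane showing an object at 0.5 m:
mismatch = vac_mismatch(0.5, 2.0)   # 2.0 D - 0.5 D = 1.5 D
print(f"{mismatch:.2f} D")          # prints 1.50 D
```

A light field display, by contrast, aims to drive both demands from the same rendered depth, making this mismatch zero by construction.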

== Definition and Principles ==
A light field display aims to replicate the [[Plenoptic Function]], a theoretical function describing the complete set of light rays passing through every point in space, in every direction, potentially across time and wavelength.<ref name="WetzsteinPlenoptic"/> In practice, light field displays generate a discretized (sampled) approximation of the relevant 4D subset of this function (typically spatial position and angular direction).<ref name="WetzsteinPlenoptic"/><ref name="Huang2014EyeglassesFree">Huang, F. C., Wetzstein, G., Barsky, B. A., & Raskar, R. (2014). Eyeglasses-free display: Towards correcting visual aberrations with computational light field displays. ACM Transactions on Graphics, 33(4), Article 59. doi:10.1145/2601097.2601122</ref>
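
The 4D sampling described above can be illustrated with a toy discretization. In this hypothetical sketch the axis names follow the common two-plane parameterization L(s, t, u, v), but the sample counts (512×512 spatial, 8×8 angular) are arbitrary assumptions chosen only to show the structure:

```python
import numpy as np

# Hypothetical discretization of the 4D light field L(s, t, u, v):
# (s, t) index spatial sample positions on the display plane,
# (u, v) index angular samples (discrete ray directions / views).
S, T = 512, 512      # spatial samples
U, V = 8, 8          # angular samples per spatial position
light_field = np.zeros((S, T, U, V, 3), dtype=np.uint8)  # RGB radiance

# Fixing (u, v) selects one perspective view of the scene:
view = light_field[:, :, 3, 4]          # shape (512, 512, 3)
print(view.shape, light_field.size)     # (512, 512, 3) 50331648
```

Even this modest toy configuration holds 64 times the data of a single 512×512 image, which foreshadows the bandwidth and resolution trade-offs discussed later.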
By controlling the direction as well as the color and intensity of emitted light rays, these displays allow the viewer's eyes to naturally focus ([[accommodation]]) at different depths within the displayed scene, matching the depth cues provided by binocular vision ([[vergence]]).<ref name="CrealWebsite"/> This recreation allows users to experience:
* Full motion [[parallax]] (horizontal and vertical look-around).<ref name="LeiaVerge"/>
* Accurate [[occlusion]] cues.
* Natural [[focal cues]], mitigating the [[Vergence-accommodation conflict]].<ref name="WiredVAC"/>
* [[Specular highlights]] and realistic reflections that change with viewpoint.
* Often, viewing without specialized eyewear (especially in non-headset formats).<ref name="LookingGlass27"/>
* '''Glasses-Free 3D:''' Many LFD formats (especially desktop and larger) offer autostereoscopic viewing for multiple users simultaneously, each seeing the correct perspective.<ref name="LookingGlass27"/><ref name="LeiaVerge"/>
* '''Full Parallax:''' True LFDs provide both horizontal and vertical parallax, unlike earlier [[autostereoscopic display|autostereoscopic]] technologies that often limited parallax to side-to-side movement.<ref name="LeiaVerge"/>
* '''Accommodation-Convergence Conflict Resolution:''' A primary driver for VR/AR, LFDs can render virtual objects at appropriate focal distances, aligning accommodation and vergence to significantly improve visual comfort and realism.<ref name="CrealWebsite"/><ref name="Lanman2020NearEyeCourse">Lanman, D., & Luebke, D. (2013). Near-Eye Light Field Displays. ACM Transactions on Graphics, 32(6), 220:1–220:10. doi:10.1145/2508363.2508366. Project page: https://research.nvidia.com/publication/near-eye-light-field-displays (accessed 3 May 2025).</ref>
* '''Computational Requirements:''' Generating and processing the massive amount of data (multiple views or directional light information) needed for LFDs requires significant [[Graphics processing unit|GPU]] power and bandwidth.<ref name="LeiaVerge"/><ref name="Huang2014EyeglassesFree"/>
* '''Resolution Trade-offs:''' A fundamental challenge involves balancing spatial resolution (image sharpness), angular resolution (smoothness of parallax/number of views), [[Field of view|field of view (FoV)]], and depth of field.<ref name="Huang2014EyeglassesFree"/><ref name="Lanman2020NearEyeCourse"/> This is often referred to as the spatio-angular resolution trade-off.
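
The spatio-angular trade-off can be sketched numerically. The Python snippet below assumes a hypothetical 4K panel whose pixel budget is divided evenly among views, a deliberate simplification of real lenslet geometry:

```python
# Sketch of the spatio-angular trade-off for a lenslet-based display:
# a fixed panel pixel budget is split between spatial samples (one per
# lenslet) and angular samples (pixels behind each lenslet).
panel_w, panel_h = 3840, 2160          # hypothetical 4K panel

def effective_resolution(views_x: int, views_y: int):
    """Per-view spatial resolution once the panel is partitioned
    into views_x * views_y angular samples."""
    return panel_w // views_x, panel_h // views_y

print(effective_resolution(1, 1))   # (3840, 2160) -- ordinary 2D image
print(effective_resolution(8, 8))   # (480, 270)   -- 64 views, much softer
```

Every doubling of angular sampling in both axes costs a factor of four in per-view sharpness, which is why high pixel density and compressive/multiplexed designs are active research areas.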
* '''Directional Backlighting:''' A standard display panel (e.g., LCD) is combined with a specialized backlight that emits light in controlled directions. The backlight might use another LCD panel coupled with optics like lenticular sheets to achieve directionality.<ref name="Maimone2013Focus3D">Maimone, A., Wetzstein, G., Hirsch, M., Lanman, D., Raskar, R., & Fuchs, H. (2013). Focus 3D: compressive accommodation display. ACM Transactions on Graphics, 32(5), Article 152. doi:10.1145/2516971.2516983</ref>
* '''Projector Arrays:''' Multiple projectors illuminate a screen (often lenticular or diffusive). Each projector provides a different perspective view, and their combined output forms the light field.<ref name="LeiaVerge"/>
* '''[[Parallax Barrier]]s:''' An opaque layer with precisely positioned slits or apertures is placed in front of or between display panels. The barrier blocks light selectively, allowing different pixels to be seen from different angles.<ref name="JDI_Parallax">Japan Display Inc. (2016, December 5). Ultra-High Resolution Display with Integrated Parallax Barrier for Glasses-Free 3D [Press release]. Archived copy: https://web.archive.org/web/20161221045330/https://www.j-display.com/english/news/2016/20161205.html (accessed 3 May 2025).</ref> Often less light-efficient than MLAs.
* '''[[Waveguide]] Optics:''' Light is injected into thin optical waveguides (similar to those in some AR glasses) and then coupled out at specific points with controlled directionality, often using diffractive optical elements (DOEs) or gratings.<ref name="LightFieldLabTech">Light Field Lab. SolidLight™ Platform Overview. https://www.lightfieldlab.com/ (accessed 3 May 2025).</ref><ref name="Maimone2017HolographicNED">Maimone, A., Georgiou, A., & Kollin, J. S. (2017). Holographic near-eye displays for virtual and augmented reality. ACM Transactions on Graphics, 36(4), Article 85. doi:10.1145/3072959.3073624</ref> This is explored for compact AR/VR systems.
* '''Time-Multiplexed Displays:''' Different views or directional illumination patterns are presented rapidly in sequence. If cycled faster than human perception, this creates the illusion of a continuous light field. Can be combined with other techniques like directional backlighting.<ref name="Liu2014OSTHMD">Liu, S., Cheng, D., & Hua, H. (2014). An optical see-through head mounted display with addressable focal planes. 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 33-42. doi:10.1109/ISMAR.2014.6948403</ref>
* '''Holographic and Diffractive Approaches:''' While [[Holographic display|holographic displays]] reconstruct wavefronts through diffraction, some LFDs utilize holographic optical elements (HOEs) or related diffractive principles to achieve high angular resolution and potentially overcome MLA limitations.<ref name="SpringerReview2021">Martínez-Corral, M., Guan, Z., Li, Y., Xiong, Z., & Javidi, B. (2021). Review of light field technologies. Visual Computing for Industry, Biomedicine, and Art, 4(1), 29. doi:10.1186/s42492-021-00096-8</ref> Some companies use "holographic" terminology for their high-density LFDs.<ref name="ForbesLightField">Fink, C. (2023, February 8). Light Field Lab Raises $50 Million to Bring SolidLight Holograms Into the Real World. Forbes. Retrieved from https://www.forbes.com/sites/charliefink/2023/02/08/light-field-lab-raises-50m-to-bring-solidlight-holograms-into-the-real-world/ (accessed 30 Apr 2025).</ref>
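
As a toy illustration of the parallax-barrier and lenticular idea above, the sketch below interleaves N source views column by column, so each slit or lenslet exposes a different view to each eye position. The geometry is idealized (no slant, one pixel per view per lenslet) and is not specific to any product:

```python
import numpy as np

# Idealized view interleaving for a barrier/lenticular display:
# pixel column x shows view (x mod N), so adjacent columns belong to
# different perspective views behind the same optical element.
def interleave_views(views: np.ndarray) -> np.ndarray:
    """views: (N, H, W, 3) stack of perspective images.
    Returns an (H, W, 3) panel image whose columns cycle through views."""
    n, h, w, _ = views.shape
    panel = np.empty((h, w, 3), dtype=views.dtype)
    for x in range(w):
        panel[:, x] = views[x % n][:, x]
    return panel

# Four flat test "views" (constant color 0..3) on a tiny 4x8 panel:
views = np.stack([np.full((4, 8, 3), v, dtype=np.uint8) for v in range(4)])
panel = interleave_views(views)
print(panel[0, :, 0])   # [0 1 2 3 0 1 2 3]
```

The cyclic pattern in the output is exactly why per-view horizontal resolution drops by a factor of N in this simple scheme.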

== Types of Light Field Displays ==
* '''Near-Eye Light Field Displays:''' Integrated into VR/AR [[Head-mounted display|HMDs]]. Primarily focused on solving the VAC for comfortable, realistic close-up interactions.<ref name="CrealWebsite"/><ref name="Lanman2020NearEyeCourse"/> Examples include research prototypes from NVIDIA<ref name="NvidiaNELD"/> and academic groups,<ref name="Huang2015Stereoscope">Huang, F. C., Chen, K., & Wetzstein, G. (2015). The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues. ACM Transactions on Graphics, 34(4), Article 60. doi:10.1145/2766943</ref> and commercial modules from companies like [[CREAL]].<ref name="CrealRoadToVR"/> Often utilize MLAs, stacked LCDs, or waveguide/diffractive approaches.<ref name="Lanman2020NearEyeCourse"/><ref name="CrealRoadToVR"/>
* '''Tabletop/Desktop Displays:''' Provide glasses-free 3D for individual or small group viewing. Used for professional visualization, gaming, communication, and content creation.<ref name="LookingGlass27"/><ref name="LeiaVerge"/> [[Looking Glass Factory]] is a key player here, offering various sizes like the Looking Glass Portrait and the larger Looking Glass 27".<ref name="LookingGlass27"/><ref name="LookingGlassSoftware">Looking Glass Factory. Software Overview. Retrieved from https://lookingglassfactory.com/software</ref> [[Leia Inc.]] also targets this market with monitor and mobile displays.<ref name="LeiaVerge"/> Typically use MLA or barrier technology.
* '''Large Format / Tiled Displays:''' Aimed at creating large-scale, immersive "holographic" experiences without glasses for public venues, command centers, or collaborative environments.<ref name="ForbesLightField"/><ref name="LightFieldLabSolidLightPR">Light Field Lab Press Release (2021, October 7). Light Field Lab Unveils SolidLight™ – The Highest Resolution Holographic Display Platform Ever Designed. https://www.lightfieldlab.com/press-release-oct-2021 (accessed 3 May 2025).</ref> [[Light Field Lab]]'s SolidLight™ platform uses modular panels designed to be tiled into large video walls.<ref name="ForbesLightField"/><ref name="LightFieldLabSolidLightPR"/> Sony's ELF-SR series (Spatial Reality Display) uses high-speed vision sensors and a micro-optical lens for a single user but demonstrates high-fidelity desktop light field effects.<ref name="SonyELFSR2">Sony Professional. ELF-SR2 Spatial Reality Display. https://pro.sony/ue_US/products/spatial-reality-displays/elf-sr2 (accessed 3 May 2025).</ref>

== Comparison with Other 3D Display Technologies ==
{| class="wikitable"
Creating content compatible with LFDs requires capturing or generating directional view information:
* '''[[Light Field Camera|Light Field Cameras]] / [[Plenoptic Camera|Plenoptic Cameras]]:''' Capture both intensity and direction of incoming light using specialized sensors (often with MLAs).<ref name="WetzsteinPlenoptic"/> The captured data can be processed for LFD playback.
* '''[[Computer Graphics]] Rendering:''' Standard 3D scenes built in engines like [[Unity (game engine)|Unity]] or [[Unreal Engine]] can be rendered from multiple viewpoints to generate the necessary data.<ref name="ForbesLightField"/><ref name="LookingGlassSoftware"/> Specialized light field rendering techniques, potentially using [[Ray tracing (graphics)|ray tracing]] or neural methods like [[Neural Radiance Fields]] (NeRF), are employed.<ref name="ForbesLightField"/><ref name="Mildenhall2020NeRF">Mildenhall, B., Srinivasan, P. P., Tancik, M., Barron, J. T., Ramamoorthi, R., & Ng, R. (2020). NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis. European Conference on Computer Vision (ECCV), 405-421. doi:10.1007/978-3-030-58452-8_24</ref>
* '''[[Photogrammetry]] and 3D Scanning:''' Real-world objects/scenes captured as 3D models can serve as input for rendering light field views.
* '''Existing 3D Content Conversion:''' Plugins and software tools (e.g., provided by Looking Glass Factory) allow conversion of existing 3D models, animations, or even stereoscopic content for LFD viewing.<ref name="LookingGlassSoftware"/>
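
The multi-viewpoint rendering step above can be sketched as computing a row of camera positions spanning the display's viewing cone; each position would then be rendered with an off-axis (sheared) projection aimed at a fixed convergence plane. The helper below is hypothetical (its name, parameters, and the 40° cone are illustrative assumptions, not an engine API):

```python
import math

# Hypothetical helper: horizontal camera offsets for a multi-view render
# pass, spread evenly across the display's viewing cone.
def camera_offsets(num_views: int, viewing_angle_deg: float, distance: float):
    """Offsets (in scene units) for cameras at `distance` from the
    convergence plane, spanning `viewing_angle_deg` symmetrically."""
    half_span = distance * math.tan(math.radians(viewing_angle_deg) / 2)
    if num_views == 1:
        return [0.0]
    step = 2 * half_span / (num_views - 1)
    return [-half_span + i * step for i in range(num_views)]

# Five views across an assumed 40-degree cone at 1 m:
offsets = camera_offsets(num_views=5, viewing_angle_deg=40, distance=1.0)
print([round(o, 3) for o in offsets])   # symmetric about 0
```

Real pipelines (e.g., the multi-view "quilt" renders used by desktop LFD software) add a sheared projection per camera so all views converge on the display plane, but the even angular spacing shown here is the core idea.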
==Applications==
===Applications in VR and AR===
* '''Enhanced Realism and Immersion:''' Correct depth cues make virtual objects appear more solid and stable, improving the sense of presence, especially for near-field interactions.<ref name="CrealRoadToVR"/>
* '''Improved Visual Comfort:''' Mitigating the VAC reduces eye strain, fatigue, and nausea, enabling longer and more comfortable VR/AR sessions.<ref name="WiredVAC"/><ref name="CrealWebsite"/>
* '''Natural Interaction:''' Accurate depth perception facilitates intuitive hand-eye coordination for manipulating virtual objects.<ref name="CrealRoadToVR"/>
* '''[[Digital Signage]] and Advertising:''' Eye-catching glasses-free 3D displays for retail and public spaces.<ref name="LookingGlass27"/>
* '''Product Design and Engineering (CAD/CAE):''' Collaborative visualization and review of 3D models.<ref name="Nam2019Medical"/>
* '''Entertainment and Gaming:''' Immersive experiences in arcades, museums, theme parks, and potentially future home entertainment.<ref name="ForbesLightField"/>
* '''Automotive Displays:''' [[Head-up display|Heads-up displays]] (HUDs) or dashboards presenting information at appropriate depths.<ref name="JDI_Parallax"/>
* '''Telepresence and Communication:''' Creating realistic, life-sized 3D representations of remote collaborators, like Google's [[Project Starline]] concept.<ref name="Starline">Google Blog (2023, May 10). A first look at Project Starline's new, simpler prototype. Retrieved from https://blog.google/technology/research/project-starline-prototype/</ref>
== Challenges and Limitations ==
* '''Spatio-Angular Resolution Trade-off:''' Increasing the number of views (angular resolution) often decreases the perceived sharpness (spatial resolution) for a fixed display pixel count.<ref name="Huang2014EyeglassesFree"/><ref name="Lanman2020NearEyeCourse"/>
* '''Computational Complexity & Bandwidth:''' Rendering, compressing, and transmitting the massive datasets for real-time LFDs is extremely demanding on GPUs and data infrastructure.<ref name="LeiaVerge"/><ref name="ForbesLightField"/>
* '''Manufacturing Complexity and Cost:''' Producing precise optical components like high-density MLAs, perfectly aligned multi-layer stacks, or large-area waveguide structures is challenging and costly.<ref name="ForbesLightField"/>
* '''Form Factor and Miniaturization:''' Integrating complex optics and electronics into thin, lightweight, and power-efficient near-eye devices remains difficult.<ref name="Lanman2020NearEyeCourse"/><ref name="CrealRoadToVR"/>
* '''Limited Field of View (FoV):''' Achieving wide FoV comparable to traditional VR headsets while maintaining high angular resolution is challenging.<ref name="Lanman2020NearEyeCourse"/>
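
A back-of-envelope estimate shows why bandwidth is a bottleneck. The view count, resolution, and frame rate below are illustrative assumptions, not specs of any cited display:

```python
# Back-of-envelope sketch of raw (uncompressed) light field bandwidth:
# data rate = views * width * height * bytes per pixel * fps.
def raw_gbps(views: int, w: int, h: int, fps: int,
             bytes_per_px: int = 3) -> float:
    """Uncompressed data rate in gigabits per second."""
    return views * w * h * bytes_per_px * fps * 8 / 1e9

# Assumed example: 45 views at 1080p, 60 fps, 24-bit color.
print(f"{raw_gbps(45, 1920, 1080, 60):.1f} Gbit/s uncompressed")
```

The result, well over 100 Gbit/s, dwarfs a single 1080p60 stream (about 3 Gbit/s uncompressed), which is why aggressive compression and view synthesis are essential for practical systems.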
* '''Miniaturization for Wearables:''' Developing ultra-thin, efficient components using [[Metasurface|metasurfaces]], [[Holographic optical element|holographic optical elements (HOEs)]], advanced waveguides, and [[MicroLED]] displays for integration into consumer AR/VR glasses.<ref name="CrealRoadToVR"/><ref name="SpringerReview2021"/>
* '''Improved Content Capture and Creation Tools:''' Advancements in [[Plenoptic camera|plenoptic cameras]], AI-driven view synthesis, and streamlined software workflows.<ref name="Mildenhall2020NeRF"/>
* '''Higher Resolution and Efficiency:''' Addressing the spatio-angular trade-off and improving light efficiency through new materials, optical designs (e.g., polarization multiplexing<ref name="Tan2019Polarization">Tan, G., Zhan, T., Lee, Y.-H., Xiong, J., & Wu, S.-T. (2019). Near-eye light-field display with polarization multiplexing. Proceedings of SPIE 10942, Advances in Display Technologies IX, 1094206. doi:10.1117/12.2509121</ref>), and display technologies.

== See Also ==