Light field display
* '''[[Waveguide]] Optics:''' Light is injected into thin optical waveguides (similar to those in some AR glasses) and then coupled out at specific points with controlled directionality, often using diffractive optical elements (DOEs) or gratings.<ref name="LightFieldLabTech">Light Field Lab. SolidLight Platform. Retrieved from https://www.lightfieldlab.com/solidlight</ref><ref name="Maimone2017HolographicNED">Maimone, A., Georgiou, A., & Kollin, J. S. (2017). Holographic near-eye displays for virtual and augmented reality. ACM Transactions on Graphics, 36(4), Article 85. doi:10.1145/3072959.3073624</ref> This approach is being explored for compact AR/VR systems.
* '''Time-Multiplexed Displays:''' Different views or directional illumination patterns are presented rapidly in sequence. If cycled faster than human perception, this creates the illusion of a continuous light field. This technique can be combined with others, such as directional backlighting.<ref name="Liu2014OSTHMD">Liu, S., Cheng, D., & Hua, H. (2014). An optical see-through head mounted display with addressable focal planes. 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 33-42. doi:10.1109/ISMAR.2014.6948403</ref>
* '''Holographic and Diffractive Approaches:''' While [[Holographic display|holographic displays]] reconstruct wavefronts through diffraction, some LFDs utilize holographic optical elements (HOEs) or related diffractive principles to achieve high angular resolution and potentially overcome MLA limitations.<ref name="SpringerReview2021">Martínez-Corral, M., Guan, Z., Li, Y., Xiong, Z., & Javidi, B. (2021). Review of light field technologies. Visual Computing for Industry, Biomedicine and Art, 4(1), 29. doi:10.1186/s42492-021-00096-8</ref> Some companies use "holographic" terminology for their high-density LFDs.<ref name="ForbesLightField">Fink, C. (2023, February 8). Light Field Lab Raises $50 Million to Bring SolidLight Holograms Into the Real World. Forbes. Retrieved from https://www.forbes.com/sites/charliefink/2023/02/08/light-field-lab-raises-50m-to-bring-solidlight-holograms-into-the-real-world/</ref>
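As a rough illustration of how an MLA-based approach steers light, the paraxial sketch below maps a display pixel to its parent lenslet and an approximate emission angle. The function name and all parameter values are hypothetical, for illustration only, and are not drawn from any specific device described here.

```python
import math

def pixel_to_ray_angle(pixel_index, pixel_pitch_mm, lenslet_pitch_mm, lenslet_focal_mm):
    """Paraxial sketch: each lenslet covers lenslet_pitch / pixel_pitch pixels
    (its 'elemental image'); a pixel displaced by dx from the lenslet's optical
    axis emits an approximately collimated beam at angle atan(dx / f).
    Returns (lenslet_index, emission_angle_degrees)."""
    pixels_per_lenslet = lenslet_pitch_mm / pixel_pitch_mm
    lenslet_index = int(pixel_index // pixels_per_lenslet)
    # Position of this lenslet's optical axis and of the pixel centre
    axis = (lenslet_index + 0.5) * lenslet_pitch_mm
    x = (pixel_index + 0.5) * pixel_pitch_mm
    dx = x - axis
    return lenslet_index, math.degrees(math.atan2(dx, lenslet_focal_mm))
```

With a (hypothetical) 0.01 mm pixel pitch and 0.1 mm lenslet pitch, ten pixels share each lenslet, so this panel trades ten columns of spatial resolution for ten horizontal view directions per lenslet.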
== Types of Light Field Displays ==
* '''Near-Eye Light Field Displays:''' Integrated into VR/AR [[Head-mounted display|HMDs]]. Primarily focused on solving the VAC for comfortable, realistic close-up interactions.<ref name="XinRealityWiki"/><ref name="CrealWebsite"/><ref name="Lanman2020NearEyeCourse"/> Examples include research prototypes from NVIDIA<ref name="NvidiaNELD"/> and academic groups,<ref name="Huang2015Stereoscope">Huang, F. C., Chen, K., & Wetzstein, G. (2015). The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues. ACM Transactions on Graphics, 34(4), Article 60. doi:10.1145/2766943</ref> and commercial modules from companies like [[CREAL]].<ref name="CrealRoadToVR"/> These often utilize MLAs, stacked LCDs, or waveguide/diffractive approaches.<ref name="Lanman2020NearEyeCourse"/><ref name="CrealRoadToVR"/>
* '''Tabletop/Desktop Displays:''' Provide glasses-free 3D for individual or small group viewing. Used for professional visualization, gaming, communication, and content creation.<ref name="LookingGlass27"/><ref name="LeiaVerge"/> [[Looking Glass Factory]] is a key player here, offering various sizes like the Looking Glass Portrait and the larger Looking Glass 27".<ref name="LookingGlass27"/><ref name="LookingGlassSoftware">Looking Glass Factory. Software Overview. Retrieved from https://lookingglassfactory.com/software</ref> [[Leia Inc.]] also targets this market with monitor and mobile displays.<ref name="LeiaVerge"/> These displays typically use MLA or parallax barrier technology.
* '''Large Format / Tiled Displays:''' Aimed at creating large-scale, immersive "holographic" experiences without glasses for public venues, command centers, or collaborative environments.<ref name="ForbesLightField"/><ref name="LightFieldLabSolidLightPR">Light Field Lab Press Release (2021, October 7). Light Field Lab Demonstrates SolidLight™, the Highest Resolution Holographic Display Platform Ever Designed. Retrieved from https://www.lightfieldlab.com/press/light-field-lab-demonstrates-solidlight</ref> [[Light Field Lab]]'s SolidLight™ platform uses modular panels designed to be tiled into large video walls.<ref name="ForbesLightField"/><ref name="LightFieldLabSolidLightPR"/> Sony's ELF-SR series (Spatial Reality Display) uses high-speed vision sensors and a micro-optical lens for a single user but demonstrates high-fidelity desktop light field effects.<ref name="SonyELFSR2">Sony Professional. Sony's Spatial Reality Display. Retrieved from https://pro.sony/ue_US/products/professional-displays/elf-sr2</ref>
== Comparison with Other 3D Display Technologies ==
Creating content compatible with LFDs requires capturing or generating directional view information:
* '''[[Light Field Camera|Light Field Cameras]] / [[Plenoptic Camera|Plenoptic Cameras]]:''' Capture both intensity and direction of incoming light using specialized sensors (often with MLAs).<ref name="WetzsteinPlenoptic"/> The captured data can be processed for LFD playback.
* '''[[Computer Graphics]] Rendering:''' Standard 3D scenes built in engines like [[Unity (game engine)|Unity]] or [[Unreal Engine]] can be rendered from multiple viewpoints to generate the necessary data.<ref name="ForbesLightField"/><ref name="LookingGlassSoftware"/> Specialized light field rendering techniques, potentially using [[Ray tracing (graphics)|ray tracing]] or neural methods like [[Neural Radiance Fields]] (NeRF), are employed.<ref name="ForbesLightField"/><ref name="Mildenhall2020NeRF">Mildenhall, B., Srinivasan, P. P., Tancik, M., Barron, J. T., Ramamoorthi, R., & Ng, R. (2020). NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis. European Conference on Computer Vision (ECCV), 405-421. doi:10.1007/978-3-030-58452-8_24</ref>
* '''[[Photogrammetry]] and 3D Scanning:''' Real-world objects/scenes captured as 3D models can serve as input for rendering light field views.
* '''Existing 3D Content Conversion:''' Plugins and software tools (e.g., provided by Looking Glass Factory) allow conversion of existing 3D models, animations, or even stereoscopic content for LFD viewing.<ref name="LookingGlassSoftware"/>
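To make the multi-viewpoint rendering step concrete, here is a minimal sketch of laying out camera positions along a horizontal baseline, as is done when rendering the views of a horizontal-parallax light field from a standard 3D scene. The function and its parameters are illustrative assumptions; real toolchains (such as Looking Glass Factory's software) define their own view counts, cone angles, and packing formats.

```python
import math

def view_camera_offsets(num_views, view_cone_deg, convergence_dist):
    """Horizontal camera offsets for rendering num_views views spanning a
    given view cone. Each camera slides along the baseline but keeps the
    same convergence plane in frame (an off-axis/sheared frustum in
    practice). Returns offsets from the central position, leftmost first."""
    half_cone = math.radians(view_cone_deg) / 2.0
    offsets = []
    for i in range(num_views):
        # Fraction from -1 (leftmost view) to +1 (rightmost view)
        frac = 2.0 * i / (num_views - 1) - 1.0 if num_views > 1 else 0.0
        offsets.append(convergence_dist * math.tan(frac * half_cone))
    return offsets
```

Each offset would drive one render pass; the resulting images are then packed into whatever layout the target display's software expects.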
* '''[[Digital Signage]] and Advertising:''' Eye-catching glasses-free 3D displays for retail and public spaces.<ref name="LookingGlass27"/>
* '''Product Design and Engineering (CAD/CAE):''' Collaborative visualization and review of 3D models.<ref name="Nam2019Medical"/>
* '''Entertainment and Gaming:''' Immersive experiences in arcades, museums, theme parks, and potentially future home entertainment.<ref name="ForbesLightField"/>
* '''Automotive Displays:''' [[Head-up display|Head-up displays]] (HUDs) or dashboards presenting information at appropriate depths.<ref name="JDI_Parallax"/>
* '''Telepresence and Communication:''' Creating realistic, life-sized 3D representations of remote collaborators, like Google's [[Project Starline]] concept.<ref name="Starline">Google Blog (2023, May 10). A first look at Project Starline's new, simpler prototype. Retrieved from https://blog.google/technology/research/project-starline-prototype/</ref>
== Challenges and Limitations ==
* '''Spatio-Angular Resolution Trade-off:''' Increasing the number of views (angular resolution) often decreases the perceived sharpness (spatial resolution) for a fixed display pixel count.<ref name="Huang2014EyeglassesFree"/><ref name="Lanman2020NearEyeCourse"/>
* '''Computational Complexity & Bandwidth:''' Rendering, compressing, and transmitting the massive datasets for real-time LFDs is extremely demanding on GPUs and data infrastructure.<ref name="LeiaVerge"/><ref name="ForbesLightField"/>
* '''Manufacturing Complexity and Cost:''' Producing precise optical components like high-density MLAs, perfectly aligned multi-layer stacks, or large-area waveguide structures is challenging and costly.<ref name="ForbesLightField"/>
* '''Form Factor and Miniaturization:''' Integrating complex optics and electronics into thin, lightweight, and power-efficient near-eye devices remains difficult.<ref name="Lanman2020NearEyeCourse"/><ref name="CrealRoadToVR"/>
* '''Limited Field of View (FoV):''' Achieving wide FoV comparable to traditional VR headsets while maintaining high angular resolution is challenging.<ref name="Lanman2020NearEyeCourse"/>
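The spatio-angular trade-off and the bandwidth problem both lend themselves to a quick back-of-envelope calculation. The sketch below divides a fixed panel among a grid of views and computes the raw (uncompressed) data rate; the function name and all numbers are illustrative assumptions, not figures from any product above.

```python
def light_field_budget(panel_w, panel_h, views_x, views_y, bits_per_px=24, fps=60):
    """Back-of-envelope light field budget for a fixed panel.
    - Spatial resolution per view shrinks as the view grid grows.
    - Raw bandwidth is what compression/rendering pipelines must tame.
    Returns (per-view width, per-view height, raw data rate in Gbit/s)."""
    spatial_w = panel_w // views_x
    spatial_h = panel_h // views_y
    raw_gbps = panel_w * panel_h * bits_per_px * fps / 1e9
    return spatial_w, spatial_h, raw_gbps
```

For example, spreading an 8K panel (7680x4320) across an 8x8 view grid leaves only 960x540 pixels per view, while the uncompressed feed runs to tens of gigabits per second.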
* '''Miniaturization for Wearables:''' Developing ultra-thin, efficient components using [[Metasurface|metasurfaces]], [[Holographic optical element|holographic optical elements (HOEs)]], advanced waveguides, and [[MicroLED]] displays for integration into consumer AR/VR glasses.<ref name="CrealRoadToVR"/><ref name="SpringerReview2021"/>
* '''Improved Content Capture and Creation Tools:''' Advancements in [[Plenoptic camera|plenoptic cameras]], AI-driven view synthesis, and streamlined software workflows.<ref name="Mildenhall2020NeRF"/>
* '''Higher Resolution and Efficiency:''' Addressing the spatio-angular trade-off and improving light efficiency through new materials, optical designs (e.g., polarization multiplexing<ref name="Tan2019Polarization">Tan, G., Zhan, T., Lee, Y.-H., Xiong, J., & Wu, S.-T. (2019). Near-eye light-field display with polarization multiplexing. Proceedings of SPIE 10942, Advances in Display Technologies IX, 1094206. doi:10.1117/12.2509121</ref>), and display technologies.
== See Also ==