{{see also|Terms|Technical Terms}}
[[Lens array]]s are two-dimensional arrangements of many small lenses (often [[Microlens arrays|microlens arrays]], MLAs) that manipulate [[light fields]] for imaging or display. In [[Virtual reality]] (VR) and [[Augmented reality]] (AR) systems, lens arrays serve two broad roles: as '''display optics''' that create 3D or light-field images, and as '''sensor optics''' that capture directional light for depth and eye tracking. In displays, lens arrays enable multi-view and focal-plane rendering (e.g. light-field displays or integral imaging) by splitting the image into many sub-images corresponding to different angles or depths.<ref name="Li2019" /><ref name="Ng2005" /> In sensing, microlens-based ''plenoptic'' or light-field cameras capture the full 4D light field, allowing computational refocusing and depth estimation.<ref name="Ng2005" /><ref name="Yang2018" /> Modern VR/AR prototypes leverage microlens arrays, [[Light field display]] techniques, [[Integral imaging]], holographic waveguide couplers, and specialized lens-array modules for eye tracking and depth sensing. These components appear in devices such as wide-FOV near-eye displays and optical see-through [[Head-mounted display]]s.
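To illustrate the sensing role, the sketch below shows the basic shift-and-sum refocusing operation on a captured 4D light field, in the spirit of hand-held plenoptic cameras.<ref name="Ng2005" /> The array shape, the <code>alpha</code> refocus parameter, and the random test data are illustrative assumptions rather than the pipeline of any particular device.

<syntaxhighlight lang="python">
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-sum refocusing of a 4D light field.

    light_field: array of shape (U, V, S, T) -- sub-aperture images indexed
                 by angular position (u, v) and spatial position (s, t).
    alpha:       ratio of the virtual focal plane depth to the captured one;
                 alpha = 1 reproduces the original focus.
    """
    U, V, S, T = light_field.shape
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            # Centre the angular coordinates so shifts are symmetric about the axis.
            du = (u - (U - 1) / 2) * (1 - 1 / alpha)
            dv = (v - (V - 1) / 2) * (1 - 1 / alpha)
            # Integer-pixel shifts keep the sketch simple; a real pipeline
            # would interpolate sub-pixel shifts.
            shifted = np.roll(light_field[u, v], (round(du), round(dv)), axis=(0, 1))
            out += shifted
    return out / (U * V)

# Illustrative use on random data standing in for captured sub-aperture images.
lf = np.random.rand(5, 5, 64, 64)
refocused = refocus(lf, alpha=1.2)
</syntaxhighlight>

Setting <code>alpha</code> above or below 1 moves the virtual focal plane behind or in front of the originally focused depth; depth estimation can be built on the same data by comparing sharpness across a stack of such refocused images.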
==History==
Lens-array technology traces back over a century. Gabriel Lippmann first proposed "integral photography" in 1908, capturing 3D scenes via a lens grid.<ref name="Li2019" /> Early implementations used pinhole arrays (circa 1911) and later simple microlens plates (around 1948) to record and replay light fields.<ref name="Li2019" /> In the mid-20th century, lenticular (cylindrical lens) sheets became popular for autostereoscopic prints and displays (e.g. 3D postcards and packaging), providing separate views for each eye. By the 2000s, advances in digital displays and microfabrication revived lens-array research for head-worn displays. For example, smartphone-scale integral imaging was demonstrated by pairing a display with a matching MLA.<ref name="Li2019" /> In recent years, VR/AR research has produced thin, wide-FOV near-eye displays using sophisticated lens arrays (e.g. polarization optics or metasurfaces),<ref name="Shin2023" /> as well as compact eye-tracking and depth cameras using microlens arrays.<ref name="Yang2018" /><ref name="Microsoft2020" />
== Types of lens arrays ==
Lens arrays in VR/AR come in several varieties:
'''Spherical microlens arrays:''' Regular arrays of small convex (often spherical or aspheric) lenses. These planar MLAs are common for light-field displays and cameras. Pitch (spacing) can range from tens of micrometers (in cameras) up to a few millimeters (in HMD displays).<ref name="Wei2023" /><ref name="Jang2021" /> Each lenslet has a focal length chosen to suit the application (e.g. to collimate a display or focus on a sensor); a worked geometry example follows the list of types below.
'''Lenticular arrays:''' Arrays of cylindrical microlenses (lenticules) arranged one-dimensionally or two-dimensionally. These produce multiple horizontal viewing zones in glasses-free 3D displays. For example, a lenticular lens array can restrict the exit pupil to certain angles, enabling light-field panels that show different images to each eye.<ref name="Balogh2023" /> Such arrays are widely used in glasses-free 3D signage and have been adapted to VR/AR light-field display prototypes.
'''Holographic optical element (HOE) arrays:''' These use diffractive hologram patterns that act like an array of lenses. In AR waveguide combiners, ''lens-array holographic optical elements'' have been used to form 2D/3D transparent display screens.<ref name="Liu2012" /> A HOE can replace a physical lens array by encoding the lens behavior into a recorded interference pattern; one prototype used such a ''lens-array HOE'' to build a see-through AR screen.<ref name="Liu2012" /> Other works use holographic micromirror arrays in conjunction with MLAs to couple images into waveguides.<ref name="Jang2021" />
'''Liquid crystal / tunable lens arrays:''' Some arrays use liquid crystal (LC) or fluidic lenses whose optical power can be electronically changed. For example, a chiral (polarization-sensitive) LC lens array was demonstrated in an AR system to steer light and break conventional FOV limits.<ref name="Wei2023" /> Variable-focus MLAs can allow dynamic focus adjustment or multi-focal displays.
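As a rough illustration of how the parameters above interact, the sketch below computes the per-lenslet viewing angle, the number of directional views, and the angular pixel size for an integral-imaging-style panel. The pitch, gap, and pixel size are placeholder values chosen for the example, not the specifications of any product.

<syntaxhighlight lang="python">
import math

# Placeholder parameters for an integral-imaging near-eye panel (assumed values).
lens_pitch_mm  = 1.0    # centre-to-centre lenslet spacing
gap_mm         = 3.0    # display-to-MLA distance (close to the focal length when collimating)
pixel_pitch_mm = 0.02   # display pixel pitch (20 micrometres)

# Viewing angle covered by one lenslet when its elemental image spans one pitch.
view_angle_deg = 2 * math.degrees(math.atan(lens_pitch_mm / (2 * gap_mm)))

# Number of distinct directional views each lenslet can emit.
views_per_lenslet = lens_pitch_mm / pixel_pitch_mm

# Angular size of one display pixel seen through the lenslet,
# which bounds the perceived angular resolution.
pixel_angle_deg = math.degrees(pixel_pitch_mm / gap_mm)

print(f"viewing angle per lenslet ~ {view_angle_deg:.1f} deg")
print(f"views per lenslet         ~ {views_per_lenslet:.0f}")
print(f"angular pixel size        ~ {pixel_angle_deg:.3f} deg")
</syntaxhighlight>

The numbers expose the classic spatio-angular trade-off of integral imaging: a larger pitch widens the per-lenslet viewing angle and adds directional views, but it also coarsens the spatial resolution of the reconstructed image, since each lenslet then covers a larger patch of the scene.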
Research on lens-array technology is advancing rapidly. '''Adaptive optics''' will likely play a growing role. Arrays of liquid-crystal or shape-changing lenses could allow dynamic focus control and multi-focal displays (reducing vergence-accommodation conflict). Similarly, '''dynamic wavelength control''' (e.g. polarization or tunable filters in each lenslet) could enable spatiotemporal multiplexing for color and focus.
'''Metasurfaces and flat optics''' are a major trend. Recent work has demonstrated ''achromatic metasurface waveguides'' for AR: for example, a 2025 ''Light: Science & Applications'' paper introduced inverse-designed metasurface couplers that eliminate chromatic aberration across the full visible spectrum and achieve ~45° FOV.<ref name="Achromatic2025" /> These metasurface lens arrays are ultrathin and could replace bulky refractive MLAs in future headsets. Cholesteric liquid-crystal metasurface (chiral) lens arrays have already been used to break the field-of-view limit in a scanning AR display.<ref name="Wei2023" />
'''Integration and compute-optics co-design''' will improve performance. Headsets may co-optimize lens arrays with on-sensor processing; for instance, a microlens-array camera could perform onboard refocusing or eye-pose estimation in hardware. Conversely, on the display side, rendering algorithms could pre-distort images to compensate for residual lens aberrations, as sketched below.
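A minimal sketch of such display-side pre-distortion is given below, assuming a simple radial polynomial model of the residual lenslet distortion. The coefficients <code>k1</code> and <code>k2</code>, the nearest-neighbour sampling, and the test pattern are illustrative assumptions only.

<syntaxhighlight lang="python">
import numpy as np

def predistort(image, k1=-0.15, k2=0.02):
    """Pre-warp an image with a radial model so that, after an assumed
    pincushion distortion in the optics, straight lines appear straight.
    k1 and k2 are illustrative coefficients, not measured values."""
    h, w = image.shape[:2]
    ys, xs = np.indices((h, w), dtype=np.float64)
    # Normalised coordinates centred on the optical axis.
    x = (xs - w / 2) / (w / 2)
    y = (ys - h / 2) / (h / 2)
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2   # radial scaling of the sample positions
    # Sample the source image at the scaled positions (nearest neighbour keeps
    # the sketch short; production code would interpolate).
    src_x = np.clip(x * scale * (w / 2) + w / 2, 0, w - 1).astype(int)
    src_y = np.clip(y * scale * (h / 2) + h / 2, 0, h - 1).astype(int)
    return image[src_y, src_x]

# Example: pre-distort a synthetic stripe pattern before sending it to the display.
test = (np.indices((256, 256)).sum(axis=0) % 32 < 16).astype(np.uint8) * 255
warped = predistort(test)
</syntaxhighlight>

In practice the polynomial (or a per-lenslet lookup table) would be fitted from a calibration of the assembled optics rather than chosen by hand.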
==References==
<references>
<ref name="Li2019">Li, X.; Chen, L.; Li, Y.; <i>et al.</i> “A Broadband Achromatic Metalens Array for Integral Imaging in the Visible.” ''Light: Science & Applications'' <b>8</b>, 99 (2019). https://doi.org/10.1038/s41377‑019‑0197‑4</ref> | |||
<ref name="Ng2005">Ng, R.; Levoy, M.; Brédif, M.; <i>et al.</i> “Light Field Photography with a Hand‑Held Plenoptic Camera.” Stanford CSTR 2005‑02 (2005). http://graphics.stanford.edu/papers/lfcamera/</ref> | |||
<ref name="Yang2018">Yang, L.; Guo, Y. “Eye Tracking Using a Light Field Camera on a Head‑Mounted Display.” US Patent Application <b>US 2018/0173303 A1</b>, 21 June 2018.</ref> | |||
<ref name="Shin2023">Shin, K‑S.; Hong, J.; Han, W.; Park, J‑H. “Field of View and Angular‑Resolution Enhancement in Microlens‑Array‑Type VR Near‑Eye Display Using Polarization Grating.” ''Optics Express'' <b>33</b>(1): 263‑278 (2025). https://doi.org/10.1364/OE.546812</ref> | |||
<ref name="Microsoft2020">Microsoft Technology Licensing LLC. “Camera Comprising Lens Array.” US Patent Application <b>US 2023/0319428 A1</b>, 5 October 2023.</ref> | |||
<ref name="Wei2023">Weng, Y.; Zhang, Y.; Wang, W.; <i>et al.</i> “High‑Efficiency and Compact Two‑Dimensional Exit Pupil Expansion Design for Diffractive Waveguide Based on Polarization Volume Grating.” ''Optics Express'' <b>31</b>(4): 6601‑6614 (2023). https://doi.org/10.1364/OE.482447</ref> | |||
<ref name="Jang2021">Darkhanbaatar, N.; Erdenebat, M‑U.; Shin, C‑W.; <i>et al.</i> “Three‑Dimensional See‑Through Augmented‑Reality Display System Using a Holographic Micromirror Array.” ''Applied Optics'' <b>60</b>(25): 7545‑7551 (2021). https://doi.org/10.1364/AO.428364</ref> | |||
<ref name="Balogh2023">Zhang, G.; He, Y.; Liang, H.; <i>et al.</i> “Directional and Eye‑Tracking Light Field Display with Efficient Rendering and Illumination.” ''Micromachines'' <b>14</b>(7): 1465 (2023). https://doi.org/10.3390/mi14071465</ref> | |||
<ref name="Liu2012">Hong, K.; Hong, J.; Yeom, J.; Lee, B. “Two‑Dimensional and Three‑Dimensional See‑Through Screen Using Holographic Optical Elements.” In <i>Digital Holography and Three‑Dimensional Imaging 2012</i>, paper DM2C.6. Optical Society of America (2012). https://doi.org/10.1364/DH.2012.DM2C.6</ref> | |||
<ref name="Achromatic2025">Tian, Z.; Zhu, X.; Surman, P.; <i>et al.</i> “An Achromatic Metasurface Waveguide for Augmented Reality Displays.” ''Light: Science & Applications'' <b>14</b>, 94 (2025). https://doi.org/10.1038/s41377‑025‑01761‑w</ref> | |||
</references> | |||
[[Category:Terms]]