
Lens array

{{see also|Terms|Technical Terms}}
'''Lens arrays''' are two-dimensional arrangements of many small lenses (often [[Microlens array]]s) that manipulate [[light field]]s for imaging or display. In [[Virtual reality]] (VR) and [[Augmented reality]] (AR) systems, lens arrays serve two broad roles: as '''display optics''' that create 3D or light-field images, and as '''sensor optics''' that capture directional light for depth and eye tracking. In displays, lens arrays enable multi-view and focal-plane rendering (e.g. light-field displays or integral imaging) by splitting the image into many sub-images corresponding to different angles or depths.<ref name="Li2019">Li X, Chen L, Li Y, et al. A broadband achromatic metalens array for integral imaging in the visible. Light Sci Appl. 2019;8:99.</ref><ref name="Ng2005">Ng R, Levoy M, Brédif M, et al. Light field photography with a hand-held plenoptic camera. Computer Science Technical Report. 2005;2(11):1-11.</ref> In sensing, microlens-based ''plenoptic'' or light-field cameras capture the full 4D light field, allowing computational refocusing and depth estimation.<ref name="Ng2005" /><ref name="Yang2018">Yang L, Guo Y. Eye tracking using a light field camera on a head-mounted display. US Patent Application US20180173303A1. 2018 Jun 21.</ref> Modern VR/AR prototypes leverage microlens arrays, [[Light field display]] techniques, [[Integral imaging]], holographic waveguide couplers, and specialized lens-array modules for eye tracking and depth sensing. These components appear in devices such as wide-FOV near-eye displays and optical see-through [[Head-mounted display]]s.
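The refocusing step can be illustrated with the shift-and-sum approach described by Ng et al.:<ref name="Ng2005" /> each sub-aperture view recorded behind the lens array is translated in proportion to its angular offset, and all views are averaged, which synthetically moves the focal plane. Below is a minimal NumPy/SciPy sketch of this idea; the function name <code>refocus</code>, the <code>(U, V, H, W)</code> array layout, and the parameter <code>alpha</code> (the ratio of the synthetic focal-plane depth to the original one) are illustrative assumptions rather than the API of any particular light-field camera or library.

<syntaxhighlight lang="python">
import numpy as np
from scipy.ndimage import shift as nd_shift

def refocus(light_field, alpha):
    """Illustrative shift-and-sum synthetic refocusing of a 4D light field.

    light_field : ndarray of shape (U, V, H, W) -- sub-aperture images
                  indexed by angular coordinates (u, v) and spatial
                  coordinates (s, t). This layout is an assumption.
    alpha       : ratio of the synthetic focal-plane depth to the original
                  focal-plane depth (alpha = 1 leaves the focus unchanged).
    """
    U, V, H, W = light_field.shape
    # Angular coordinates centred on the middle of the lens array.
    u_coords = np.arange(U) - (U - 1) / 2.0
    v_coords = np.arange(V) - (V - 1) / 2.0

    refocused = np.zeros((H, W), dtype=np.float64)
    for i, u in enumerate(u_coords):
        for j, v in enumerate(v_coords):
            # Translate each sub-aperture view in proportion to its
            # angular offset, then average all views (shift-and-sum).
            dy = u * (1.0 - 1.0 / alpha)
            dx = v * (1.0 - 1.0 / alpha)
            refocused += nd_shift(light_field[i, j], (dy, dx),
                                  order=1, mode='nearest')
    return refocused / (U * V)
</syntaxhighlight>

With <code>alpha = 1</code> the averaged image reproduces the original focal plane; other values move the synthetic focal plane to a different depth, which is the basis of post-capture refocusing in plenoptic cameras.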


==History==