# Lens array in virtual and augmented reality
Lens arrays are two-dimensional arrangements of many small lenses, often realized as Microlens arrays (MLAs), that manipulate light fields for imaging or display. In Virtual reality (VR) and Augmented reality (AR) systems, lens arrays serve two broad roles: as **display optics** that create 3D or light-field images, and as **sensor optics** that capture directional light for depth and eye tracking. In displays, lens arrays enable multi-view and focal-plane rendering (e.g. light-field displays or integral imaging) by splitting the image into many sub-images corresponding to different angles or depths ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%2C%20first%20proposed%20in,inherent%20chromatic%20aberration%20still%20reduces)) ([Light Field Photography with a Hand-held Plenoptic Camera](https://graphics.stanford.edu/papers/lfcamera/lfcamera-150dpi.pdf#:~:text=This%20paper%20presents%20a%20camera,in%20the%20resolution%20of%20images)). In sensing, microlens-based *plenoptic* or light-field cameras capture the full 4D light field, allowing computational refocusing and depth estimation ([Light Field Photography with a Hand-held Plenoptic Camera](https://graphics.stanford.edu/papers/lfcamera/lfcamera-150dpi.pdf#:~:text=This%20paper%20presents%20a%20camera,in%20the%20resolution%20of%20images)) ([US20180173303A1 - Eye tracking using a light field camera on a head-mounted display - Google Patents](https://patents.google.com/patent/US20180173303A1/en#:~:text=,up%20close%20in%20the%20foreground)). Modern VR/AR prototypes leverage microlens arrays, Light field display techniques, Integral imaging, holographic waveguide couplers, and specialized lens-array modules for eye tracking and depth sensing. These components appear in devices such as wide-FOV near-eye displays and optical see-through Head-mounted displays.
## History
Lens-array technology traces back over a century. Gabriel Lippmann first proposed “integral photography” in 1908, capturing 3D scenes via a lens grid ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%2C%20first%20proposed%20in,inherent%20chromatic%20aberration%20still%20reduces)). Early implementations used pinhole arrays (circa 1911) and later simple microlens plates (around 1948) to record and replay light fields ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%2C%20first%20proposed%20in,inherent%20chromatic%20aberration%20still%20reduces)). In the mid-20th century, lenticular (cylindrical lens) sheets became popular for autostereoscopic prints and displays (e.g. 3D postcards and packaging), providing separate views for each eye. By the 2000s, advances in digital displays and microfabrication revived lens-array research for head-worn displays. For example, smartphone-scale integral imaging was demonstrated by pairing a display with a matching MLA ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%2C%20first%20proposed%20in,inherent%20chromatic%20aberration%20still%20reduces)). In recent years, VR/AR research has produced thin, wide-FOV near-eye displays using sophisticated lens arrays (e.g. polarization optics or metasurfaces) ([Field of view and angular-resolution enhancement in microlens array type virtual reality near-eye display using polarization grating - PubMed](https://pubmed.ncbi.nlm.nih.gov/39876217/#:~:text=In%20this%20paper%2C%20we%20propose,handed%20and)), as well as compact eye-tracking and depth cameras using microlens arrays ([US20180173303A1 - Eye tracking using a light field camera on a head-mounted display - Google Patents](https://patents.google.com/patent/US20180173303A1/en#:~:text=,up%20close%20in%20the%20foreground)) ([Microsoft Patent | Camera comprising lens array - Nweon Patent](https://patent.nweon.com/30768#:~:text=Examples%20are%20disclosed%20that%20relate,of%20the%20array%20of%20lenses)).
## Types of lens arrays
Lens arrays in VR/AR come in several varieties:
- **Spherical microlens arrays:** Regular arrays of small convex (often spherical or aspheric) lenses. These planar MLAs are common for light-field displays and cameras. Pitch (spacing) can range from tens of micrometers (in cameras) up to a few millimeters (in HMD displays) ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=forms%20linearly%20polarized%20light%20through,term%20in%20the%20permittivity%20tensor)) ([Three-dimensional see-through augmented-reality display system using a holographic micromirror array](https://opg.optica.org/ao/abstract.cfm?uri=ao-60-25-7545#:~:text=Laser%20source%20Cobolt%20Samba%20100,0078%C2%A0mm)). Each lenslet has a focal length chosen to suit the application (e.g. to collimate a display or focus on a sensor); a simple focal-length sketch appears at the end of this section.
- **Lenticular arrays:** Arrays of cylindrical microlenses (lenticules) arranged one-dimensionally or two-dimensionally. These produce multiple horizontal viewing zones in glasses-free 3D displays. For example, a lenticular lens array can restrict the exit pupil to certain angles, enabling light-field panels that show different images to each eye ([Directional and Eye-Tracking Light Field Display with Efficient Rendering and Illumination - PMC](https://pmc.ncbi.nlm.nih.gov/articles/PMC10385613/#:~:text=information%20in%20the%20viewing%20zone,of%20the%20illumination%20power)). Such arrays are widely used in glasses-free 3D signage and have been adapted to VR/AR light-field display prototypes.
- **Holographic optical element (HOE) arrays:** These use diffractive hologram patterns that act like an array of lenses. In AR waveguide combiners, *lens-array holographic optical elements* have been used to form 2D/3D transparent display screens ([Two-dimensional and three-dimensional see-through screen using holographic optical elements](https://opg.optica.org/abstract.cfm?uri=DH-2012-DM2C.6#:~:text=In%20this%20paper%2C%20we%20propose,thorough%20screen%20is%20experimentally%20verified)). A HOE can replace a physical lens array by encoding lens behavior into a recorded interference pattern. In one prototype, a *lens-array HOE* was created to build a see-through AR screen ([Two-dimensional and three-dimensional see-through screen using holographic optical elements](https://opg.optica.org/abstract.cfm?uri=DH-2012-DM2C.6#:~:text=In%20this%20paper%2C%20we%20propose,thorough%20screen%20is%20experimentally%20verified)), and other works use holographic micromirror arrays in conjunction with MLAs to couple images into waveguides ([Three-dimensional see-through augmented-reality display system using a holographic micromirror array](https://opg.optica.org/ao/abstract.cfm?uri=ao-60-25-7545#:~:text=It%20is%20difficult%20to%20find,prism.%20The)).
- **Liquid crystal / tunable lens arrays:** Some arrays use liquid crystal (LC) or fluidic lenses whose optical power can be electronically changed. For example, a chiral (polarization-sensitive) LC lens array was demonstrated in an AR system to steer light and break conventional FOV limits ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=FOV%20of%20a%20retinal%20scanning,and%20high%20eye%20box%20size)). Variable-focus MLAs can allow dynamic focus adjustment or multi-focal displays.
- **Metamaterial / metalens arrays:** Flat optical metasurfaces can form ultra-thin lenslets. Metalens arrays (nanoscale-post structures) have been fabricated for light-field imaging ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Furthermore%2C%20since%20a%20metalens%20is,in%20applications%20of%20integral%20imaging)). Such metasurface MLAs can be made extremely thin (sub-micron) and can correct chromatic aberration ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%20is%20a%20promising,limited%20focusing)) ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Furthermore%2C%20since%20a%20metalens%20is,in%20applications%20of%20integral%20imaging)). Patents also describe *metamaterial lens arrays* for eye-tracking cameras ([Microsoft Patent | Camera comprising lens array - Nweon Patent](https://patent.nweon.com/30768#:~:text=FIGS,lens%20array%20comprising%20metamaterial%20lenses)). As fabrication matures, metasurface MLAs promise lighter, compact optics.
- **Fresnel / diffractive lens arrays:** Arrays of Fresnel lenses or diffractive patterned surfaces can also be used. A Fresnel lens array trades off sharpness for thinness and can be cheaper to mold. Diffractive MLAs (e.g. polymer gratings) are used in some projection optics.
Each type has pros and cons: refractive spherical lenses are simple but chromatic, HOEs and metasurfaces can be flat and achromatic but are harder to fabricate at scale, and tunable arrays add complexity. In practice, VR/AR designs often combine MLAs with polarization gratings, waveguides, and computational processing to achieve the desired effects.
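For intuition about lenslet focal lengths mentioned for the spherical arrays above, the following is a minimal sketch relating a plano-convex lenslet's radius of curvature, refractive index, and pitch to its focal length and surface sag via the thin-lens lensmaker's equation. The material and dimensions are illustrative assumptions, not values from any cited prototype.

```python
import math

def plano_convex_focal_length(radius_mm: float, n: float) -> float:
    """Thin-lens focal length of a plano-convex lenslet: f = R / (n - 1)."""
    return radius_mm / (n - 1.0)

def lenslet_sag_mm(radius_mm: float, pitch_mm: float) -> float:
    """Surface height (sag) of a spherical lenslet spanning the given pitch."""
    half_aperture = pitch_mm / 2.0
    return radius_mm - math.sqrt(radius_mm**2 - half_aperture**2)

# Illustrative: 1 mm pitch lenslet in PMMA (n ~ 1.49) with R = 2 mm
f_mm = plano_convex_focal_length(2.0, 1.49)   # ~4.1 mm focal length
sag_um = 1000 * lenslet_sag_mm(2.0, 1.0)      # ~64 um surface height
print(f"f = {f_mm:.2f} mm, sag = {sag_um:.0f} um")
```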
## Applications in displays
- **Light-field and multi-view VR displays:** Lens arrays enable near-eye systems that present multiple image perspectives simultaneously. By placing an MLA in front of the display panel (typically with the panel at or near the lenslets' focal plane), a VR headset can create a plenoptic or light-field effect where each eye perceives slightly different images at different angles; a minimal mapping sketch appears after this list. For example, a thin MLA-based VR near-eye display was demonstrated using a polarization grating to expand the field of view (FoV) from ~59° to ~95°, while preserving image resolution ([Field of view and angular-resolution enhancement in microlens array type virtual reality near-eye display using polarization grating - PubMed](https://pubmed.ncbi.nlm.nih.gov/39876217/#:~:text=In%20this%20paper%2C%20we%20propose,handed%20and)). Such designs address the FoV–resolution trade-off inherent in lens-array displays: more lenslets (or a larger aperture per lenslet) can increase FoV but reduce angular detail. Integral imaging is a related concept: an array of lenslets reconstructs a 3D scene from a grid of “elemental” images ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%2C%20first%20proposed%20in,inherent%20chromatic%20aberration%20still%20reduces)). In tabletop or handheld 3D displays, integral imaging systems (with pixelated panels and matching MLAs) produce glasses-free 3D images with true focal cues.
- **Autostereoscopic (glasses-free) 3D displays:** Both lenticular arrays and MLAs have been used in commercial 3D displays (e.g. digital signage). In VR contexts, lenticular arrays can form static multi-view panels. One example combined a lenticular array with eye tracking: as the viewer moves or the gaze changes, the system updates which view is shown. In a light-field display prototype, a lenticular array restricted the viewing zone and an eye-tracking system expanded the effective viewing area, using only ~17% of the data normally required ([Directional and Eye-Tracking Light Field Display with Efficient Rendering and Illumination - PMC](https://pmc.ncbi.nlm.nih.gov/articles/PMC10385613/#:~:text=information%20in%20the%20viewing%20zone,of%20the%20illumination%20power)). This demonstrates how lens arrays (like lenticules) coupled with tracking can reduce bandwidth and still provide smooth motion parallax.
- **Holographic and waveguide displays:** In AR near-eye displays, lens arrays appear in the form of couplers or combiners. Diffractive or holographic elements on waveguides often act like lens arrays. For example, a recent see-through AR prototype used a *holographic micromirror array* together with a fixed microlens array as a lenslet-based in-coupler. This system overlaid a reconstructed 3D image on the real world without additional bulky optics ([Three-dimensional see-through augmented-reality display system using a holographic micromirror array](https://opg.optica.org/ao/abstract.cfm?uri=ao-60-25-7545#:~:text=It%20is%20difficult%20to%20find,prism.%20The)). Similarly, research has fabricated *lens-array HOEs* on planar substrates to create see-through 2D/3D screens ([Two-dimensional and three-dimensional see-through screen using holographic optical elements](https://opg.optica.org/abstract.cfm?uri=DH-2012-DM2C.6#:~:text=In%20this%20paper%2C%20we%20propose,thorough%20screen%20is%20experimentally%20verified)). In these AR designs, each microlens or holographic lenslet directs light into the waveguide or to the eye, enabling floating virtual images. The advantage is that such arrays can maintain a clear real-world view while presenting virtual content at various depths (an Optical see-through display property).
- **Focal cueing and depth enhancement:** Some VR/AR displays incorporate multiple lens arrays for depth/focus manipulation. For instance, a light-field HMD may use two stacked MLAs (a so-called dual-focal arrangement) to enlarge the depth range so that virtual objects at different distances can appear simultaneously in focus. Polarization or liquid-crystal arrays have been used to switch between focus planes. These advanced architectures aim to overcome the vergence–accommodation mismatch by aligning virtual image focus with convergence.
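As referenced in the first item above, here is a minimal sketch of the display-side mapping for an integral-imaging/light-field panel: with the panel one focal length behind the MLA, a pixel displaced by d from a lenslet's axis emits a collimated beam at angle θ = atan(d/f), so each view direction maps to one pixel offset under every lenslet. All numbers (pitch, focal length, pixel size) are illustrative assumptions.

```python
import math

def pixel_offset_for_view(view_angle_deg: float, focal_mm: float, pixel_mm: float) -> int:
    """Signed pixel offset from a lenslet's axis that emits a collimated
    beam at the requested angle (panel one focal length behind the MLA)."""
    offset_mm = focal_mm * math.tan(math.radians(view_angle_deg))
    return round(offset_mm / pixel_mm)

# Illustrative: 1 mm pitch lenslets, f = 3 mm, 10 um display pixels
pitch_mm, focal_mm, pixel_mm = 1.0, 3.0, 0.01
pixels_per_lenslet = int(pitch_mm / pixel_mm)                       # 100
half_fov_deg = math.degrees(math.atan((pitch_mm / 2) / focal_mm))   # ~9.5 deg
for angle_deg in (-8, 0, 8):
    print(angle_deg, pixel_offset_for_view(angle_deg, focal_mm, pixel_mm))
```

The half-angle shows the FoV–resolution tension directly: widening the cone per lenslet (shorter focal length or larger pitch) spreads the same pixels over more angle.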
## Applications in sensing
- **Light-field (plenoptic) cameras for depth and eye tracking:** Lens arrays are fundamental to plenoptic imaging. Placing an MLA one lenslet focal length in front of an image sensor allows each micro-image to capture rays arriving from different angles ([Light Field Photography with a Hand-held Plenoptic Camera](https://graphics.stanford.edu/papers/lfcamera/lfcamera-150dpi.pdf#:~:text=This%20paper%20presents%20a%20camera,in%20the%20resolution%20of%20images)). This effectively samples the full 4D light field of the scene. With computational processing, one can refocus the image after capture or compute depth maps from parallax between the micro-images ([Light Field Photography with a Hand-held Plenoptic Camera](https://graphics.stanford.edu/papers/lfcamera/lfcamera-150dpi.pdf#:~:text=This%20paper%20presents%20a%20camera,in%20the%20resolution%20of%20images)); a minimal refocusing sketch appears after this list. In VR/AR, this is useful both for external depth sensing (scene reconstruction) and internal eye imaging. For example, patents describe using a light-field camera (with MLA) inside an HMD to capture the user’s eye. The captured plenoptic data lets the system digitally refocus on various eye regions and compute gaze direction without needing precise IR glints ([US20180173303A1 - Eye tracking using a light field camera on a head-mounted display - Google Patents](https://patents.google.com/patent/US20180173303A1/en#:~:text=,up%20close%20in%20the%20foreground)). This relaxes the geometric constraints on eye-tracker placement. Thus, microlens-based light-field cameras can support both environmental mapping and fine eye tracking in headsets.
- **Multi-view eye-tracking cameras:** Even simpler lens arrays are used in gaze trackers. An array of small lenses can cover a wide field of view of the eye by imaging it onto different sensor regions. For example, a patented near-eye system used four tiny metamaterial lenses arranged in an array, each imaging a portion of the eye onto separate areas of the detector ([Microsoft Patent | Camera comprising lens array - Nweon Patent](https://patent.nweon.com/30768#:~:text=Examples%20are%20disclosed%20that%20relate,of%20the%20array%20of%20lenses)) ([Microsoft Patent | Camera comprising lens array - Nweon Patent](https://patent.nweon.com/30768#:~:text=FIGS,lens%20array%20comprising%20metamaterial%20lenses)). By analyzing these sub-images, the headset can determine the pupil center and gaze direction for each eye. Such lens-array trackers can be more compact than single large lenses and may capture more robust features (like iris patterns) for tracking.
- **Depth sensors:** Outside of full light-field cameras, some depth-sensing concepts also use microlenses. One approach is a multi-aperture structured-light projector: an array of tiny beams (formed by a lens array) projects a coded IR pattern for depth triangulation. Another is embedding microlenses over depth-sensing pixels to increase fill factor or directivity. In practice, however, most time-of-flight and stereo cameras in VR/AR do not use discrete lens arrays (they use single large lenses or laser projectors). The main use of lens arrays in sensing is thus in light-field capture (including gaze capture) rather than typical ToF or stereo modules.
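As noted in the first item above, the plenoptic refocusing principle is compact enough to sketch in code. The following implements whole-pixel shift-and-add refocusing over a stack of sub-aperture views, in the spirit of the light-field camera work cited above; the `(U, V, H, W)` layout and the `slope` parameterization are simplifying assumptions, and a real pipeline would use subpixel interpolation rather than `np.roll`.

```python
import numpy as np

def refocus(views: np.ndarray, slope: float) -> np.ndarray:
    """Shift-and-add refocus over a (U, V, H, W) stack of sub-aperture views.

    Each (u, v) slice is the image seen through one region of the aperture;
    `slope` (pixel shift per unit aperture offset) selects the focal plane.
    """
    U, V, H, W = views.shape
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            du = int(round(slope * (u - U // 2)))
            dv = int(round(slope * (v - V // 2)))
            out += np.roll(views[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Toy usage: 5x5 views of a 64x64 scene, refocused at two synthetic depths
views = np.random.rand(5, 5, 64, 64)
near_plane = refocus(views, slope=1.0)
far_plane = refocus(views, slope=-0.5)
```

Depth estimation works the same way in reverse: the `slope` that maximizes local sharpness at a pixel indicates that pixel's depth.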
## Technical specifications
Lens-array designs involve several key parameters:
- **Lens pitch (size):** The center-to-center spacing of the lenslets. Near-eye display lens pitches are often on the order of 0.5–3 mm. For example, a wide-FOV scanning AR prototype used a “chiral” LC lens array of 8×15 lenses with 2 mm pitch ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=forms%20linearly%20polarized%20light%20through,term%20in%20the%20permittivity%20tensor)). An AR waveguide coupler in another system used spherical lenslets with 1 mm pitch ([Three-dimensional see-through augmented-reality display system using a holographic micromirror array](https://opg.optica.org/ao/abstract.cfm?uri=ao-60-25-7545#:~:text=Laser%20source%20Cobolt%20Samba%20100,0078%C2%A0mm)). Plenoptic camera MLAs, by contrast, have much finer pitch (tens to hundreds of µm) to densely sample the image plane. Pitch sets the trade-off between spatial and angular resolution: smaller pitch gives finer spatial sampling (more micro-images), but leaves fewer sensor pixels under each lenslet, reducing the number of angular (sub-aperture) views and the light collected per lens (see the parameter sketch after this list).
- **Focal length and f-number:** Each lenslet’s focal length sets the viewing frustum of that micro-aperture. Low f-number (wide aperture) means a large view angle per lens, which broadens the overall FOV of the system. In the scanning waveguide example, the 2 mm lenslets had an f-number of about 0.41 at 639 nm ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=forms%20linearly%20polarized%20light%20through,term%20in%20the%20permittivity%20tensor)). In designs, the focal length is often chosen to collimate or focus light from the display panel to the eye (in displays) or from the scene to the sensor (in cameras). Mismatches in focal length across the array can create blurring or depth errors.
- **Aperture shape and fill factor:** Lenslets may be round or hexagonal. Hexagonal or honeycomb layouts can achieve near-100% fill factor (no dead zones) which maximizes brightness. Fill-factor and uniformity are critical: any gap between lenses can cause vignetting or loss of resolution. In fabrication, arrays are usually molded or imprinted in photoresist, and then replicated in glass or plastic.
- **Resolution and eye-box:** The number of lenses across an HMD display determines how many views can be presented. Each lens typically covers a few hundred display pixels. Alignment is crucial: each sub-image must align to the user’s eye position. Systems often include pupil steering (moving images to follow the eye) to maintain the eye-box. In the aforementioned scanning AR system ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=Note%20%E2%80%93%20the%20authors%20did,the%20projection%20device%20they%20used)), the wide FOV was achieved by a large lens array, but the resulting resolution per view was low because the 2 mm pitch limited how many sub-images could be rendered.
- **Chromatic and optical aberrations:** Simple refractive lenslets suffer from chromatic dispersion (different focal lengths per wavelength). As noted for integral imaging, chromatic aberration in MLAs “reduces viewing quality” ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%2C%20first%20proposed%20in,inherent%20chromatic%20aberration%20still%20reduces)). This is especially problematic for full-color displays. Achromatic doublet designs or advanced metalenses can correct this, but add complexity (a numerical sketch of the focal shift follows the summary below). Spherical aberration and field curvature within each lenslet also degrade sharpness if not carefully managed.
- **Materials and manufacturing:** Lens arrays are typically made in glass, plastic or polymer (e.g. PMMA, silicone) for refractive types. Holographic HOEs are recorded in photopolymers (e.g. Bayfol HX). Metasurface MLAs use high-index nanostructures (e.g. TiO₂) on a substrate ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Furthermore%2C%20since%20a%20metalens%20is,in%20applications%20of%20integral%20imaging)). Manufacturing tolerances (surface roughness, lens height accuracy) critically affect performance. For example, a 1 µm error in a microlens height could shift focus by hundreds of micrometers.
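To make the interplay of these parameters concrete, here is a minimal sketch under the simplifying assumption that each lenslet's pitch acts as its aperture; the example values are illustrative, not specifications of any cited system.

```python
import math

def lenslet_f_number(focal_mm: float, pitch_mm: float) -> float:
    """f-number N = f / D, taking the lenslet pitch as the aperture D."""
    return focal_mm / pitch_mm

def per_lenslet_half_angle_deg(focal_mm: float, pitch_mm: float) -> float:
    """Half-angle of the cone one lenslet spans over the panel or sensor."""
    return math.degrees(math.atan((pitch_mm / 2) / focal_mm))

def views_per_lenslet(pitch_mm: float, pixel_mm: float) -> int:
    """Angular samples available under one lenslet of a plenoptic sensor."""
    return int(pitch_mm / pixel_mm)

# Illustrative HMD-scale lenslet: 2 mm pitch, 1.5 mm focal length
print(lenslet_f_number(1.5, 2.0))             # f/0.75 (fast, wide cone)
print(per_lenslet_half_angle_deg(1.5, 2.0))   # ~33.7 deg half-angle
# Illustrative plenoptic-camera lenslet: 100 um pitch over 5 um pixels
print(views_per_lenslet(0.1, 0.005))          # 20 angular samples per axis
```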
Overall, the technical design of a lens array involves a trade-off between FOV, resolution, brightness, and physical thickness. Emerging approaches like metalens arrays promise thinner optics with engineered dispersion ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%20is%20a%20promising,limited%20focusing)) ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Furthermore%2C%20since%20a%20metalens%20is,in%20applications%20of%20integral%20imaging)), which may shift these trade-offs in future systems.
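As a rough numerical estimate of the chromatic focal shift noted above: for a thin refractive lenslet, focal length scales as 1/(n − 1), so a material's dispersion translates directly into per-color focus error. The sketch below uses approximate BK7-like indices as an assumption; real MLA polymers will differ.

```python
def chromatic_focal_shift_mm(f_design_mm: float, n_design: float, n_other: float) -> float:
    """Thin-lens focal shift at another wavelength: f scales as 1/(n - 1)."""
    return f_design_mm * (n_design - 1.0) / (n_other - 1.0) - f_design_mm

# Approximate BK7 indices: 1.522 (486 nm), 1.517 (587 nm), 1.514 (656 nm)
f_design = 3.0  # mm, designed at 587 nm
print(chromatic_focal_shift_mm(f_design, 1.517, 1.522))  # blue: ~ -0.029 mm
print(chromatic_focal_shift_mm(f_design, 1.517, 1.514))  # red:  ~ +0.018 mm
```

Tens of micrometers of focus error per color on a millimeter-scale focal length is why uncorrected MLAs visibly blur full-color imagery.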
## Challenges
Lens-array components face several challenges in VR/AR:
- **FoV–resolution trade-off:** Expanding the user’s field of view typically requires more lenslets or a larger aperture per lenslet, but this reduces the angular (and thus spatial) resolution per view. Shin *et al.* showed that using a polarization grating could enlarge an MLA display’s FoV from ~59° to ~95°, but at the expense of needing sophisticated polarization control ([Field of view and angular-resolution enhancement in microlens array type virtual reality near-eye display using polarization grating - PubMed](https://pubmed.ncbi.nlm.nih.gov/39876217/#:~:text=In%20this%20paper%2C%20we%20propose,handed%20and)). In many designs, improving one parameter (like FoV or brightness) degrades another (a back-of-envelope example follows this list).
- **Chromatic aberration and color mixing:** As noted earlier, MLAs inherently blur different colors unless achromatized. Achieving full-color images through a simple lens array is difficult ([A broadband achromatic metalens array for integral imaging in the visible | Light: Science & Applications](https://www.nature.com/articles/s41377-019-0178-2#:~:text=Integral%20imaging%2C%20first%20proposed%20in,inherent%20chromatic%20aberration%20still%20reduces)). Some systems use color-filter arrays or sequential-color illumination to mitigate this, which adds complexity and can reduce brightness.
- **Crosstalk and ghosting:** In multi-view displays, the images for adjacent views must not overlap. Small misalignments or imperfections cause crosstalk, where one eye sees part of the image intended for the other, degrading the 3D effect. In holographic see-through designs, incomplete isolation can cause ghost images of virtual content. Accurate fabrication and calibration are needed to minimize these artifacts.
- **Eye-box and alignment:** For near-eye applications, the exit pupil (eye-box) must accommodate user movement. Simple lens arrays can produce small, fixed eye-boxes. Techniques like eye-tracking (to move the image) or pupil duplication (multiple layered arrays) are required to ensure a reasonable viewing region. The scanning waveguide example noted that despite a wide FoV, the eye-box remained limited, and they attributed low resolution partly to the relatively large 2 mm lens pitch ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=Note%20%E2%80%93%20the%20authors%20did,the%20projection%20device%20they%20used)) (larger pitch reduced how finely the eye-box could be sampled).
- **Optical efficiency:** Each optical surface, grating, or holographic element can introduce loss. Adding an array of lenslets means more surfaces and potential Fresnel reflections. Diffractive elements (gratings, HOEs) often have limited efficiency bandwidth. Ensuring enough brightness in the final image is a common design hurdle, especially for battery-powered displays.
- **Manufacturing scale and cost:** Large, high-quality MLAs (especially with small lenslets) are challenging to produce over large areas. Holographic and metasurface arrays often require cleanroom fabrication. For consumer VR/AR, cost-effective replication (e.g. using nanoimprint or injection molding) is crucial but may not yet match the performance of lab prototypes.
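As flagged in the first item of this list, the FoV–resolution trade-off can be made concrete with a back-of-envelope budget: a panel's pixels are divided among the lenslets (one elemental image each), so the perceived image has roughly one pixel per lenslet, while the pixels under each lenslet set the number of distinct views. All numbers below are illustrative assumptions.

```python
def per_view_budget(panel_px: int, lenslets: int, fov_deg: float):
    """One-axis budget of a lens-array display: distinct views vs. sharpness."""
    views = panel_px // lenslets    # view directions per lenslet
    ppd = lenslets / fov_deg        # perceived pixels per degree
    return views, ppd

# Illustrative: 3840 px panel, 96 lenslets, 95 deg FoV
views, ppd = per_view_budget(3840, 96, 95.0)
print(views, round(ppd, 2))  # 40 views, ~1.01 pixels/degree

# Halving the lenslet count doubles views but halves angular sharpness
print(per_view_budget(3840, 48, 95.0))  # 80 views, ~0.51 pixels/degree
```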
## Future developments
Research on lens-array technology is advancing rapidly. **Adaptive optics** will likely play a growing role. Arrays of liquid-crystal or shape-changing lenses could allow dynamic focus control and multi-focal displays (reducing vergence-accommodation conflict). Similarly, **dynamic wavelength control** (e.g. polarization or tunable filters in each lenslet) could enable spatiotemporal multiplexing for color and focus.
- **Metasurfaces and flat optics** are a major trend. Recent work has demonstrated *achromatic metasurface waveguides* for AR: for example, a 2025 *Light: Science & Applications* paper introduced inverse-designed metasurface couplers that eliminate chromatic aberration across the full visible spectrum and achieve ~45° FOV ([An achromatic metasurface waveguide for augmented reality displays | Light: Science & Applications](https://www.nature.com/articles/s41377-025-01761-w#:~:text=Augmented%20reality%20,system%20that%20overcomes%20this%20long)). These metasurface lens arrays are ultrathin and could replace bulky refractive MLAs in future headsets. Cholesteric liquid-crystal metasurface (chiral) lens arrays have already been used to break the field-of-view limit in a scanning AR display ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=FOV%20of%20a%20retinal%20scanning,and%20high%20eye%20box%20size)).
- **Integration and compute-optics co-design** will improve performance. Headsets may co-optimize lens arrays with on-sensor processing. For instance, a microlens-array camera could perform onboard refocusing or eye-pose estimation in hardware. Conversely, on the display side, algorithms could pre-distort images to compensate for residual lens aberrations.
- **Higher-density arrays** and **monolithic fabrication** may emerge. Advances in 3D printing and nanoimprint lithography could yield integrated “optical wafers” combining display and MLA. Also, developments in holographic printing may allow recording complex lens-array HOEs on demand.
In sensing, *light-field cameras* in miniaturized form will likely become standard in AR glasses for robust gaze and hand tracking, thanks to the flexibility demonstrated in patents and prototypes ([US20180173303A1 - Eye tracking using a light field camera on a head-mounted display - Google Patents](https://patents.google.com/patent/US20180173303A1/en#:~:text=,up%20close%20in%20the%20foreground)) ([Microsoft Patent | Camera comprising lens array - Nweon Patent](https://patent.nweon.com/30768#:~:text=Examples%20are%20disclosed%20that%20relate,of%20the%20array%20of%20lenses)).
As VR/AR systems aim for wider FOV, thinner form factors, and better realism, custom lens-array designs will continue to evolve. Each new generation of headsets (for example, employing pancake optics, multi-zone optics, or holographic waveguides) tends to reinvigorate lens-array innovation. In sum, lens arrays remain a key enabling technology for immersive displays and interactive sensing, with ongoing research focusing on mitigating their limitations and leveraging novel materials and computation ([An achromatic metasurface waveguide for augmented reality displays | Light: Science & Applications](https://www.nature.com/articles/s41377-025-01761-w#:~:text=Augmented%20reality%20,system%20that%20overcomes%20this%20long)) ([eScholarship](https://escholarship.org/content/qt5549g1ch/qt5549g1ch.pdf?t=r8puoj#:~:text=FOV%20of%20a%20retinal%20scanning,and%20high%20eye%20box%20size)).