Microlens Arrays

A microlens array (MLA) is a planar grid of microscopic refractive elements—typically tens to hundreds of micrometres in diameter—fabricated on a common transparent substrate.[1] Each lenslet focuses or collimates incoming light, enabling compact optical functions that would otherwise require bulky macroscopic optics. MLAs commonly appear in VR and AR displays, image sensors, illumination systems, and wavefront sensors.

Structure and optical parameters

A typical array arranges circular or hexagonal lenslets in square or hexagonal close packing. Key parameters include (a short numerical sketch follows this list):

  • Pitch (centre-to-centre spacing, usually 50–300 µm),
  • Diameter (often equal to pitch for circular lenses),
  • Focal length (f≈100 µm–5 mm, depending on design),
  • Numerical aperture (dictated by diameter and f),
  • Fill factor (total lens area ÷ array area; ≈90 % for hexagonal packing).[1]
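
A minimal numerical sketch (in Python) of how these quantities relate. The pitch and focal length below are assumed example values, not figures from any particular product, and the lenslet diameter is taken equal to the pitch:

  import math

  def numerical_aperture(diameter_um, focal_length_um):
      """Paraxial NA of a lenslet in air: NA = sin(arctan(D / (2 f)))."""
      return math.sin(math.atan(diameter_um / (2.0 * focal_length_um)))

  def fill_factor(packing="hexagonal"):
      """Ideal fill factor for circular lenslets whose diameter equals the pitch."""
      return math.pi / (2.0 * math.sqrt(3.0)) if packing == "hexagonal" else math.pi / 4.0

  pitch_um, f_um = 150.0, 500.0                                    # assumed example values
  print(f"NA ≈ {numerical_aperture(pitch_um, f_um):.2f}")          # ≈ 0.15
  print(f"hexagonal fill factor ≈ {fill_factor('hexagonal'):.1%}") # ≈ 90.7 %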

Because individual lenslets have small apertures, geometric aberrations are generally modest, but diffraction and interference effects become significant when feature sizes approach the optical wavelength.[2]
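
As an illustration, the diffraction-limited (Airy) spot of a single lenslet can be estimated as follows; the wavelength and lenslet geometry are assumed example values:

  wavelength_um = 0.55                            # green light, assumed
  diameter_um, focal_length_um = 100.0, 400.0     # assumed lenslet geometry
  # Airy spot diameter ≈ 2.44 * λ * f / D for a circular aperture
  airy_um = 2.44 * wavelength_um * focal_length_um / diameter_um
  print(f"diffraction-limited spot ≈ {airy_um:.1f} µm")   # ≈ 5.4 µm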

Fabrication and materials

Most MLAs are produced with semiconductor-style micro-fabrication:

  • Photoresist reflow – patterning cylindrical posts in photoresist, then melting them so that surface tension pulls each post into a hemispherical lens (see the volume-conservation sketch after this list).[3]
  • Grayscale lithography – exposing a resist with spatially varying dose to directly sculpt the lens surface.
  • Wafer-level glass moulding – precision pressing of glass into a mould to form hundreds of thousands of lenslets on 150- or 200-mm wafers.
  • Two-photon polymerisation – direct laser writing of free-form lens shapes with sub-micron accuracy.
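
For the reflow process in the first bullet, a common first-order model assumes the molten resist conserves its volume while surface tension pulls it into a spherical cap; solving for the cap height gives the lens radius of curvature and, for a plano-convex lenslet, its focal length. The post radius, resist thickness, and refractive index below are assumed values, and real processes deviate from this ideal:

  import math

  def reflow_lens(post_radius_um, resist_thickness_um, n_lens=1.56):
      """Estimate sag, radius of curvature, and focal length after resist reflow,
      assuming the cylinder's volume ends up in a spherical cap on the same base."""
      a, t = post_radius_um, resist_thickness_um
      v_cylinder = math.pi * a**2 * t
      lo, hi = 0.0, 4.0 * t + a                  # bracket the cap height h
      for _ in range(100):                       # bisection on the cap-volume equation
          h = 0.5 * (lo + hi)
          if math.pi * h / 6.0 * (3.0 * a**2 + h**2) < v_cylinder:
              lo = h
          else:
              hi = h
      radius = (a**2 + h**2) / (2.0 * h)         # spherical-cap radius of curvature
      focal = radius / (n_lens - 1.0)            # thin plano-convex lens in air
      return h, radius, focal

  sag, roc, f = reflow_lens(post_radius_um=50.0, resist_thickness_um=10.0)
  print(f"sag ≈ {sag:.1f} µm, R ≈ {roc:.0f} µm, f ≈ {f:.0f} µm")   # ≈ 19.1 µm, 75 µm, 134 µm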

Common lens materials include polymers such as PMMA and UV-curable hybrids (e.g., Ormocer), fused silica, and optical-grade polycarbonate. Anti-reflection coatings are often applied to both the air-side and substrate-side surfaces.[3]

General optical applications

  • Image sensors – A one-to-one microlens–pixel arrangement funnels light into each photodiode, raising quantum efficiency and mitigating “dead-space” between pixels.[4]
  • Beam homogenisers – Paired MLAs (a “fly’s-eye” integrator) transform non-uniform laser or LED profiles into flat-top illumination for projectors and lithography.[1]
  • Shack–Hartmann wavefront sensing – Each lenslet focuses a spot onto a camera; spot displacements yield the local wavefront slope, and hence the phase error (a minimal slope calculation follows this list).[3]
  • Light-field capture – An MLA in front of the sensor encodes both spatial and angular radiance, enabling computational refocus and depth extraction.[5]
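
For the Shack–Hartmann case above, the local wavefront slope follows directly from the measured spot displacement divided by the lenslet focal length. The focal length, pixel pitch, and spot shift below are assumed example values:

  f_um = 5000.0                # lenslet focal length (5 mm), assumed
  pixel_um = 5.0               # camera pixel pitch, assumed
  shift_px = 1.6               # measured spot displacement in pixels, assumed
  slope_rad = shift_px * pixel_um / f_um                       # local slope ≈ Δx / f
  print(f"local wavefront slope ≈ {slope_rad * 1e3:.2f} mrad") # ≈ 1.60 mrad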

Applications in VR and AR

Display brightness and fill factor

OLED and LCD micro-displays used in near-eye optics leave dark inter-pixel gaps. Depositing an MLA directly on the colour-filter/top-glass increases pixel aperture ratio, boosting brightness and reducing the “screen-door” effect.[4]
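
A crude first-order picture of the brightness benefit, assuming the gain scales with the ratio of the optical fill factor achieved with the MLA to the panel's native pixel aperture ratio; both numbers below are assumptions rather than measured values for any specific panel:

  native_aperture_ratio = 0.55   # emitting area / pixel area without an MLA, assumed
  mla_fill_factor = 0.90         # optical fill factor achieved with the MLA, assumed
  brightness_gain = mla_fill_factor / native_aperture_ratio
  print(f"approximate brightness gain ≈ {brightness_gain:.2f}x")   # ≈ 1.64x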

Field of view (FoV) enhancement

Research prototypes place a thin MLA between the micro-display and the main eye lens so that each display pixel subtends a larger visual angle, trading spatial sampling for FoV and blur-free magnification.[5]
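
A simple angular-sampling sketch of that trade-off, modelling the MLA's effect as a shortening of the effective focal length of the viewing optics; the display size, resolution, and focal lengths are assumed illustrative values:

  import math

  display_width_mm = 25.0      # micro-display active width, assumed
  horizontal_pixels = 1920     # display resolution, assumed

  def fov_and_ppd(effective_f_mm):
      """Horizontal FoV and pixels per degree for a display at the optic's focal plane."""
      fov_deg = 2.0 * math.degrees(math.atan(display_width_mm / (2.0 * effective_f_mm)))
      return fov_deg, horizontal_pixels / fov_deg

  for f_mm in (40.0, 30.0):    # shorter effective focal length -> wider FoV, fewer px/deg
      fov, ppd = fov_and_ppd(f_mm)
      print(f"f = {f_mm:.0f} mm: FoV ≈ {fov:.0f}°, ≈ {ppd:.0f} px/deg")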

Near-eye light-field and focus-cue displays

NVIDIA demonstrated a 5-mm-thick 1280 × 720 OLED bonded to a 1.0-mm-pitch MLA (f ≈ 3.3 mm). The array projects a four-plane light field that lets the eye accommodate naturally, mitigating the vergence–accommodation conflict inherent in stereoscopic HMDs.[5]
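
A hedged back-of-the-envelope check on those numbers: in an integral-imaging-style near-eye display, the angular range each lenslet covers is roughly the width of its elemental image divided by the lenslet focal length. Assuming each elemental image spans one lenslet pitch:

  import math

  lenslet_pitch_mm = 1.0       # from the prototype description above
  lenslet_f_mm = 3.3           # from the prototype description above
  # Angular range covered by one lenslet if its elemental image spans one pitch
  # (a simplifying assumption; real designs calibrate and overlap elemental images).
  coverage_deg = 2.0 * math.degrees(math.atan(lenslet_pitch_mm / (2.0 * lenslet_f_mm)))
  print(f"per-lenslet angular coverage ≈ {coverage_deg:.0f}°")   # ≈ 17°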

Liquid-crystal varifocal microlens arrays

Patents filed by Chinese and Korean manufacturers describe electrically tunable liquid-crystal MLAs inside a head-mounted display (HMD). By varying the lenslet focal length from one video frame to the next, the system optically blurs user-defined regions, simulating depth of field and reducing eye strain.[6]
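
To see why per-frame focal modulation can simulate depth of field, note that the retinal blur of an out-of-focus point grows roughly linearly with the dioptric defocus and the pupil diameter. The sketch below uses that small-angle approximation with assumed viewing distances and pupil size:

  import math

  pupil_mm = 4.0               # assumed pupil diameter
  focus_m = 2.0                # distance the simulated focus is set to, assumed
  object_m = 0.5               # distance of a scene point to be blurred, assumed

  defocus_d = abs(1.0 / object_m - 1.0 / focus_m)      # defocus in diopters (1.5 D)
  blur_rad = (pupil_mm / 1000.0) * defocus_d           # small-angle blur approximation
  blur_arcmin = math.degrees(blur_rad) * 60.0
  print(f"defocus ≈ {defocus_d:.1f} D, retinal blur ≈ {blur_arcmin:.0f} arcmin")  # ≈ 21 arcmin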

Eye-tracking and sensing

Compact eye-tracking cameras exploit per-pixel MLAs to raise sensitivity and widen acceptance angle, improving gaze estimation latency and robustness.[3]

Waveguide out-coupling

Future AR waveguides may integrate micro-optics—either diffractive “meta-lenses” or classical MLAs—to collimate or angularly multiplex light, raising efficiency over current grating-based out-couplers.[2]

Industry adoption

  • Sony uses on-glass MLAs on its 0.7-inch OLED micro-display line for AR EVF modules.[4]
  • Apple patent US 2014/0168783 describes combining multiple MLAs with a parallax barrier for a switchable 3-D near-eye display.[7]
  • NVIDIA and Stanford University pioneered near-eye light-field HMD prototypes integrating MLAs directly on the display.[5]


References

  1. R. Paschotta, RP Photonics Encyclopedia, article “Microlens Arrays”, accessed 26 April 2025.
  2. H. Chang et al., “Waveguide-based augmented-reality displays: perspectives and challenges,” eLight 3 (3), 2023.
  3. Avantier Inc., “Introducing Microlens Arrays,” Knowledge Center, 2020.
  4. DisplayDaily, “Emerging Display Technologies for the AR/VR Market,” Nov 2021.
  5. D. Lanman & D. Luebke, “Near-Eye Light-Field Displays,” NVIDIA Research, 2013.
  6. CN107942517B, “Head-mounted display device based on liquid-crystal microlens array,” State Intellectual Property Office of China, granted 2022.
  7. US20140168783A1, “Electronic device with microlens arrays for providing perspective-corrected imagery,” 2014.