{{see also|Terms|Technical Terms}}
[[Microlens Arrays]] ('''MLAs'''), sometimes called '''micro-lens arrays''' or '''lenslet arrays''', are [[optical component]]s consisting of multiple small [[lens]]es (often called '''lenslets''') arranged in a one-dimensional or two-dimensional pattern on a supporting substrate<ref name="RPPhotonics">Microlens arrays - fabrication, parameters, applications - RP Photonics</ref><ref name="ShanghaiOptics">Microlens Array - Shanghai Optics</ref><ref name="TEMICON">Microlens Arrays, MLA - temicon</ref><ref name="PhotonicsDict">Microlens array - Photonics Dictionary</ref>. Each lenslet typically has a diameter well below 10 millimeters, often ranging from tens or hundreds of micrometers down to just a few micrometers, or even sub-micrometer in specialized cases<ref name="RPPhotonics"/><ref name="ShanghaiOptics"/><ref name="Syntec">Microlens Arrays | Single Point Diamond Turning - Syntec Optics</ref><ref name="StandardMLA">Standard Microlens Array - Bön Optics</ref>. The array pattern is commonly periodic, such as a square or hexagonal grid, but can also be linear, rectangular, circular, or even random/stochastic for specific applications<ref name="RPPhotonics"/><ref name="ShanghaiOptics"/><ref name="TEMICON"/><ref name="StandardMLA"/>. An array can contain thousands to millions of individual lenslets, or even more<ref name="RPPhotonics"/><ref name="AvantierIntro">Introducing Microlens Arrays - Avantier Inc.</ref><ref name="OpticalComponents">Detailed Insights in Microlens Array Products - OPTICAL COMPONENTS</ref>.
MLAs are characterized by their potential for miniaturization, integration into complex systems, and considerable design flexibility<ref name="StandardMLA"/><ref name="ApolloOptics">Injection-molded microlens arrays - Apollo Optical Systems</ref>. They have become a critical enabling technology in [[virtual reality]] (VR) and [[augmented reality]] (AR) devices, where they help solve numerous optical challenges related to [[field of view]], display brightness, visual quality, and [[form factor]]<ref name="Bote">Microlens Arrays: Versatile and Efficient Optical Solutions - Bote Optics Singapore</ref><ref name="OpticalComponents"/><ref name="BrightView">AR-VR - Augmented and Virtual Reality - BrightView Technologies, Inc.</ref>. Beyond VR/AR, they are employed across diverse fields, including [[telecommunication]]s (fiber coupling, optical switches), [[medical imaging]] (endoscopy, [[Optical Coherence Tomography|OCT]]), [[solar energy]] (concentrators), automotive [[LiDAR]], [[laser]] beam homogenization and shaping, [[sensor]] technology ([[Shack-Hartmann wavefront sensor]]s, image sensors), and [[consumer electronics]] (projectors, cameras, displays)<ref name="OpticalComponents"/><ref name="ShanghaiOptics"/><ref name="AvantierIntro"/><ref name="StandardMLA"/><ref name="Bote"/><ref name="GDOptics">Efficient and precise production of microlens arrays using precision glass molding - GD Optics (PDF)</ref>. Microlens arrays play an increasingly important role in next-generation display systems, [[waveguide]] technologies, [[eye tracking]] systems, [[light field display]] technologies, environmental [[sensing]], and [[computational imaging]] applications within the immersive technology sector<ref name="PatentlyApple">Apple Invents an optical system with Microlens Array Projectors to advance time-of-flight sensing for Face ID, delivering more realistic AR/VR features+ - Patently Apple (July 21, 2022)</ref><ref name="BrightView"/><ref name="UltraThinMLA">Imaging with high resolution and wide field of view based on an ultrathin microlens array - AIP Publishing (March 19, 2024)</ref><ref name="AvantierMicroOptics">Types of Micro Optics - Avantier Inc.</ref>.
==Characteristics==
*'''[[Focal Length]] and [[Numerical Aperture]] (NA):''' The focal length of the individual lenslets determines their focusing power. Available focal lengths range from sub-millimeter to tens or hundreds of millimeters<ref name="StandardMLA"/><ref name="NewportMALS18">MALS18 Micro Lens Array - Newport</ref><ref name="MeetOptics"/>. The NA describes the range of angles over which the lens can accept or emit light; it follows directly from the lenslet diameter and focal length, as illustrated in the sketch after this list.
*'''[[Optical Coating|Coatings]]:''' [[Anti-reflection coating]]s are frequently applied to the MLA surfaces (often both sides) to minimize reflection losses and maximize light transmission within the desired spectral range<ref name="RPPhotonics"/><ref name="Syntec"/><ref name="BrightView"/>. Other coatings might be applied for filtering or environmental protection.
*'''[[Dimensional tolerance]]s and Quality:''' Key quality parameters include the accuracy of lenslet shape (surface form error, often specified in fractions of a wavelength), [[Surface quality (optics)|surface quality]] (scratch-dig), lenslet positioning accuracy (centration, pitch uniformity), focal length uniformity across the array, and overall array flatness<ref name="AvantierSpecs">Microlens arrays - fabrication, parameters, applications - RP Photonics</ref><ref name="GDOptics"/>. Positional accuracy better than 1 µm can be achieved<ref name="GDOptics"/>.
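To make the relationship between lenslet geometry and focusing behavior concrete, the following is a minimal sketch (the function name and the numeric values are illustrative assumptions, not taken from the cited sources) of how the NA and f-number of a single lenslet follow from its diameter and focal length:

<syntaxhighlight lang="python">
import math

def lenslet_na(diameter_mm: float, focal_length_mm: float, n_medium: float = 1.0) -> float:
    """Estimate the numerical aperture of a single lenslet.

    Uses NA = n * sin(theta), where theta is the marginal-ray half angle
    given by tan(theta) = (D / 2) / f.
    """
    half_angle = math.atan((diameter_mm / 2.0) / focal_length_mm)
    return n_medium * math.sin(half_angle)

# Illustrative values: a 150 um diameter lenslet with a 5 mm focal length.
D, f = 0.150, 5.0
print(f"NA ~ {lenslet_na(D, f):.4f}")   # ~0.015: a low-NA, slow lenslet
print(f"f-number ~ {f / D:.1f}")        # f/# = f / D, ~33.3 here
</syntaxhighlight>

The same arithmetic shows why short-focal-length lenslets, used where a wide acceptance angle is needed, have correspondingly higher NA.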
==Fabrication Methods==
*'''[[Eye Tracking]] and Dynamic Focus:''' Tunable microlens arrays, such as those based on [[electrowetting]] liquid lenses or [[liquid crystal lens]]es, can be integrated into HMDs. Combined with eye-tracking cameras, these systems could dynamically adjust the focus of the displayed image or specific lenslets to match the user's gaze depth in real time<ref name="ElectrowettingLFD"/>. This could enhance the realism of light field displays, provide variable focus capabilities<ref name="Article1Ref_Muenster2019">Muenster, R., Jaeger, G., Hubner, M., Stetter, M., & Stilla, U. (2019). Liquid crystal tunable microlens array for augmented reality displays. In Digital Optical Technologies 2019 (Vol. 11062, p. 110620J). International Society for Optics and Photonics.</ref>, potentially correct for individual user refractive errors, or even simulate depth-of-field effects by selectively blurring parts of the image<ref name="PatentCNBlur">(B) CN107942517B: VR head-mounted display device with function of relieving visual fatigue based on liquid crystal microlens array - Google Patents</ref>. MLAs are also used in some [[eye tracking]] systems to help collect and direct light for imaging the user's pupil, enabling features like [[foveated rendering]]<ref name="Article1Ref_Kim2019">Kim, J., Jeong, Y., Stengel, M., Akşit, K., Albert, R., Boudaoud, B., ... & Luebke, D. (2019). Foveated AR: dynamically-foveated augmented reality display. ACM Transactions on Graphics, 38(4), 1-15.</ref>.
*'''Depth Sensing ([[Time-of-Flight]], [[Structured Light]]):''' MLAs play a role in the projection modules of active depth sensing systems. In [[Time-of-Flight]] (ToF) sensors, MLAs can shape and homogenize the output beam from illumination sources like [[VCSEL]] arrays, projecting a well-defined pattern (for example, a "top-hat" profile) of infrared light onto the scene<ref name="PatentlyApple"/><ref name="BrightView"/> (the underlying depth arithmetic is sketched after this list). In [[Structured Light]] systems (like those used in some versions of Apple's [[Face ID]]), MLAs can project a complex pattern of spots or lines onto the target<ref name="PatentlyApple"/><ref name="TEMICON"/>. The distortion of this pattern as seen by a sensor reveals the 3D shape of the target. These capabilities are essential for environmental mapping, hand tracking, [[gesture recognition]], and object recognition in AR/VR<ref name="PatentlyApple"/><ref name="TEMICON"/>. Some HMD patent designs use multiple MLAs combined with parallax barriers for 3D imaging<ref name="PatentUSMultiMLA">(A1) US20140168783A1: Near-eye microlens array displays - Google Patents</ref>.
*'''[[Wavefront Sensor]]s:''' The [[Shack-Hartmann wavefront sensor]] uses an MLA placed in front of an [[image sensor]] ([[CCD]] or [[CMOS]]). An incoming optical wavefront is divided by the MLA into multiple beamlets, each focused onto the sensor. Deviations of the spot positions from a reference grid reveal the local slope of the wavefront, allowing its overall shape (including aberrations) to be reconstructed<ref name="RPPhotonics"/><ref name="AvantierIntro"/><ref name="StandardMLA"/><ref name="GDOptics"/> (see the slope-reconstruction sketch after this list). While primarily used in optical metrology and [[adaptive optics]], this principle could potentially be adapted for HMD calibration or real-time measurement of the eye's aberrations for personalized display correction.
*'''[[Light Field Camera]]s / Imaging Enhancement:''' Placing an MLA in front of an image sensor enables the capture of light field information (intensity and direction of light rays), creating a [[plenoptic camera]]<ref name="RPPhotonics"/><ref name="ShanghaiOptics"/><ref name="OpticaLFD"/>. This allows computational features like post-capture refocusing (a shift-and-add refocusing sketch appears after this list), depth map extraction, and perspective shifting. Such capabilities could be valuable for outward-facing cameras on AR/VR headsets for improved environmental understanding or [[computational photography]]. Even in conventional cameras, MLAs are often placed directly on CMOS/CCD sensors (one lenslet per pixel) simply to increase [[light collection]] efficiency (the optical fill factor) by funneling more incident light onto the active photosensitive area of each pixel, improving low-light performance and sensitivity<ref name="RPPhotonics"/><ref name="OpticalComponents"/><ref name="AvantierIntro"/><ref name="ApolloOptics"/>.
*'''High-Resolution Wide-FOV Imaging:''' Research demonstrates that combining ultrathin MLAs (potentially with wavelength-scale thickness using [[metasurface]] concepts) with [[computational imaging|computational reconstruction]] algorithms can achieve high-resolution imaging across a wide field of view within an extremely compact system<ref name="UltraThinMLA"/>. This could lead to highly integrated, high-performance cameras for AR glasses or VR headset pass-through modes<ref name="UltraThinMLA"/>.
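The depth arithmetic behind the [[Time-of-Flight]] systems described above is simple enough to state directly. The following is a minimal sketch (function names and numbers are illustrative assumptions, not any particular product's implementation) of pulsed and continuous-wave ToF depth recovery:

<syntaxhighlight lang="python">
import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_pulse_delay(delta_t_s: float) -> float:
    """Pulsed ToF: depth = c * (round-trip time) / 2."""
    return C * delta_t_s / 2.0

def depth_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Continuous-wave ToF: depth from the phase shift of an
    amplitude-modulated signal; unambiguous only up to c / (2 * f_mod).
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

print(depth_from_pulse_delay(6.67e-9))       # ~1.0 m for a 6.67 ns round trip
print(depth_from_phase(math.pi / 2, 100e6))  # ~0.37 m at 100 MHz modulation
</syntaxhighlight>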
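The slope measurement at the heart of the [[Shack-Hartmann wavefront sensor]] can likewise be sketched in a few lines. This assumes the focal-spot centroids have already been extracted from the sensor image; the function names and the tilt example are illustrative assumptions:

<syntaxhighlight lang="python">
import numpy as np

def wavefront_slopes(spot_xy: np.ndarray, ref_xy: np.ndarray, f_m: float) -> np.ndarray:
    """Local wavefront slope behind each lenslet.

    spot_xy, ref_xy: (N, 2) measured and reference focal-spot centroids
    in meters; f_m: lenslet focal length in meters. Small-angle relation:
    slope = spot displacement / focal length.
    """
    return (spot_xy - ref_xy) / f_m

def reconstruct_1d(slopes: np.ndarray, pitch_m: float) -> np.ndarray:
    """Toy zonal reconstruction along one row of lenslets: cumulatively
    integrate slope * pitch to recover the profile (up to a constant)."""
    return np.concatenate([[0.0], np.cumsum(slopes * pitch_m)])

# Illustrative: a pure tilt shifts every spot by 1 um behind
# f = 5 mm lenslets on a 150 um pitch.
slopes = wavefront_slopes(np.full((8, 2), 1e-6), np.zeros((8, 2)), 5e-3)
print(reconstruct_1d(slopes[:, 0], 150e-6))  # linear ramp, as expected for tilt
</syntaxhighlight>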
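Post-capture refocusing with a plenoptic camera, mentioned in the light-field item above, is commonly performed by shifting and averaging the sub-aperture views. Below is a minimal sketch under the assumption that the raw MLA image has already been decoded into a 4D light field (the array layout and function name are illustrative):

<syntaxhighlight lang="python">
import numpy as np

def refocus(light_field: np.ndarray, shift_px: float) -> np.ndarray:
    """Synthetic refocusing by shift-and-add.

    light_field: (U, V, H, W) sub-aperture views, one view per position
    under each lenslet. shift_px selects the synthetic focal plane:
    0 keeps the captured focus; +/- values focus nearer or farther.
    """
    U, V, H, W = light_field.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its offset from the
            # aperture center, then average all the shifted views.
            dy = int(round((u - cu) * shift_px))
            dx = int(round((v - cv) * shift_px))
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)
</syntaxhighlight>

Averaging without shifting (shift_px = 0) reproduces a conventional photograph; increasing the shift sweeps the synthetic focal plane through the scene.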
* [[Time-of-Flight camera]]
* [[Structured Light]]
* [[Shack-Hartmann wavefront sensor]]
* [[Optical Aberration]]
* [[Field of View]]