{{see also|Terms|Technical Terms}}
[[Microlens Arrays]] ('''MLAs'''), sometimes called '''micro-lens arrays''' or '''lenslet arrays''', are [[optical component]]s consisting of multiple small [[lens]]es (often called '''lenslets''') arranged in a one-dimensional or two-dimensional pattern on a supporting substrate<ref name="RPPhotonics">Microlens arrays – fabrication, parameters, applications - RP Photonics</ref><ref name="ShanghaiOptics">Microlens Array - Shanghai Optics</ref><ref name="TEMICON">Microlens Arrays, MLA - temicon</ref><ref name="PhotonicsDict">Microlens array - Photonics Dictionary</ref>. Each lenslet typically has a diameter significantly less than 10 millimeters, often ranging from tens or hundreds of micrometers down to a few micrometers, or even sub-micrometer in specialized cases<ref name="RPPhotonics"/><ref name="ShanghaiOptics"/><ref name="Syntec">Microlens Arrays | Single Point Diamond Turning - Syntec Optics</ref><ref name="StandardMLA">Standard Microlens Array - Bön Optics</ref>. The array pattern is commonly periodic, such as a square or hexagonal grid, but can also be linear, rectangular, circular, or even random/stochastic for specific applications<ref name="RPPhotonics"/><ref name="ShanghaiOptics"/><ref name="TEMICON"/><ref name="StandardMLA"/>. An array can contain thousands, millions, or even more individual lenslets<ref name="RPPhotonics"/><ref name="AvantierIntro">Introducing Microlens Arrays - Avantier Inc.</ref><ref name="OpticalComponents">Detailed Insights in Microlens Array Products - OPTICAL COMPONENTS</ref>.
MLAs are characterized by their potential for miniaturization, integration into complex systems, and considerable design flexibility<ref name="StandardMLA"/><ref name="ApolloOptics">Injection-molded microlens arrays - Apollo Optical Systems</ref>. They have become a critical enabling technology in [[virtual reality]] (VR) and [[augmented reality]] (AR) devices, where they help solve numerous optical challenges related to [[field of view]], display brightness, visual quality, and [[form factor]]<ref name="Bote">Microlens Arrays: Versatile and Efficient Optical Solutions - Bote Optics Singapore</ref><ref name="OpticalComponents"/><ref name="BrightView">AR-VR - Augmented and Virtual Reality - BrightView Technologies, Inc.</ref>. Beyond VR/AR, they are employed across diverse fields, including [[telecommunication]]s (fiber coupling, optical switches), [[medical imaging]] (endoscopy, [[Optical Coherence Tomography|OCT]]), [[solar energy]] (concentrators), automotive [[LiDAR]], [[laser]] beam homogenization and shaping, [[sensor]] technology ([[Shack–Hartmann wavefront sensor]]s, image sensors), and [[consumer electronics]] (projectors, cameras, displays)<ref name="OpticalComponents"/><ref name="ShanghaiOptics"/><ref name="AvantierIntro"/><ref name="StandardMLA"/><ref name="Bote"/><ref name="GDOptics">Efficient and precise production of microlens arrays using precision glass molding - GD Optics (PDF)</ref>. Microlens arrays play an increasingly important role in next-generation display systems, [[waveguide]] technologies, [[eye tracking]] systems, [[light field display]] technologies, environmental [[sensing]], and [[computational imaging]] applications within the immersive technology sector<ref name="PatentlyApple">Apple Invents an optical system with Microlens Array Projectors to advance time-of-flight sensing for Face ID, delivering more realistic AR/VR features+ - Patently Apple (July 21, 2022)</ref><ref name="BrightView"/><ref name="UltraThinMLA">(2024-03-19) Imaging with high resolution and wide field of view based on an ultrathin microlens array - AIP Publishing</ref><ref name="AvantierMicroOptics">Types of Micro Optics - Avantier Inc.</ref>.
==Characteristics==
Microlens arrays possess several key parameters that define their function and application suitability:
*'''Materials:''' MLAs can be fabricated from a wide range of optical materials, including various types of [[glass]] (like BK7), [[UV]]-grade [[fused silica]], [[silicon]], [[quartz]], [[zinc selenide]] (ZnSe), [[calcium fluoride]], and numerous optical [[polymer]]s such as [[PMMA]], [[polycarbonate]], and PET<ref name="Syntec"/><ref name="StandardMLA"/><ref name="OpticalComponents"/><ref name="BrightView"/>. The material choice is critical and depends on the target [[wavelength]] range (from deep UV to far [[infrared]]), required [[durability]], thermal stability, compatibility with [[manufacturing]] processes, and cost<ref name="OpticalComponents"/><ref name="Syntec"/><ref name="EdmundOptics">Microlens Arrays - Edmund Optics</ref>. Fused silica, for example, offers excellent transmission from UV (around 193 nm) to near-infrared (up to 3 µm)<ref name="Syntec"/><ref name="AvantierIntro"/><ref name="EdmundOptics"/>. Silicon is suitable for infrared applications (approximately 1.2 µm to 5 µm) and integrates well with [[MEMS]] fabrication<ref name="OpticalComponents"/><ref name="Syntec"/>.
*'''Lenslet Shape and Profile:''' Individual lenslets can have various shapes, including circular, square, or hexagonal footprints<ref name="ShanghaiOptics"/><ref name="StandardMLA"/>. Their optical surfaces can be spherical, [[aspheric lens|aspherical]], cylindrical, or even freeform<ref name="StandardMLA"/><ref name="OpticalComponents"/><ref name="TEMICON"/><ref name="IsuzuGlass">lens array - Isuzu Glass</ref>. Common profiles include<ref name="Article1Ref_Daly1990">Daly, D., Stevens, R. F., Hutley, M. C., & Davies, N. (1990). The manufacture of microlenses by melting photoresist. Measurement Science and Technology, 1(8), 759-766.</ref>:
** [[Plano-convex lens|Plano-convex]] (flat on one side, convex on the other)
** [[Bi-convex lens|Bi-convex]] (convex on both sides)
** [[Aspheric lens|Aspheric]] (non-spherical curves for reduced optical aberration)
** [[Fresnel lens|Fresnel-type]] (using concentric rings to achieve lens properties with reduced thickness)
Aspherical and freeform profiles are crucial for minimizing aberrations (like spherical aberration and [[chromatic aberration]]) and achieving specific beam shaping or imaging performance, especially off-axis<ref name="RPPhotonics"/><ref name="OpticalComponents"/><ref name="UltraThinMLA"/>. Cylindrical lenses focus light along a line rather than to a point and are used in applications like barcode scanners or laser line generation<ref name="OpticalComponents"/><ref name="GDOptics"/>. Lens profiles can be continuous surface type or stepped (diffractive)<ref name="StandardMLA"/>.
*'''Array Pattern and [[Fill Factor]]:''' Common arrangements include square and hexagonal grids, which offer regular packing<ref name="RPPhotonics"/><ref name="ShanghaiOptics"/>. Hexagonal packing typically allows for a higher [[fill factor]] (the ratio of the optically active area to the total array area) than square packing of circular lenses, potentially exceeding 90%<ref name="WikiFillFactor">Microlens array fill factor - Wikipedia</ref><ref name="TEMICON"/>. High fill factors (e.g., up to 98% or more for square arrays, and over 99% for gapless hexagonal arrays) are often desired to maximize light throughput, ensure uniform illumination, and avoid energy loss or undesirable diffraction effects such as zero-order [[hotspot]]s in homogenization applications<ref name="ShanghaiOptics"/><ref name="AvantierIntro"/><ref name="Newport">Microlens Arrays - Newport</ref><ref name="TEMICON"/>. Some fabrication methods allow for "gapless" arrays where lenslets meet edge-to-edge<ref name="TEMICON"/>. Random or stochastic arrangements are also possible for specific diffusion or security applications<ref name="TEMICON"/>.
*'''Pitch:''' The center-to-center distance between adjacent lenslets is known as the pitch. This can range widely, from millimeters down to a few micrometers or even less, depending on the application and fabrication limits<ref name="RPPhotonics"/><ref name="StandardMLA"/><ref name="MeetOptics">Lens Arrays | Microlens Array - MEETOPTICS</ref>.
*'''[[Focal Length]] and [[Numerical Aperture]] (NA):''' The focal length of the individual lenslets determines their focusing power. Available focal lengths range from sub-millimeter to tens or hundreds of millimeters<ref name="StandardMLA"/><ref name="NewportMALS18">MALS18 Micro Lens Array - Newport</ref><ref name="MeetOptics"/>. The NA describes the range of angles over which the lens can accept or emit light; a short worked example relating fill factor, pitch, focal length, and NA follows this list.
*'''[[Optical Coating|Coatings]]:''' [[Anti-reflection coating]]s are frequently applied to the MLA surfaces (often both sides) to minimize reflection losses and maximize light transmission within the desired spectral range<ref name="RPPhotonics"/><ref name="Syntec"/><ref name="BrightView"/>. Other coatings might be applied for filtering or environmental protection.
*'''[[Dimensional tolerance]]s and Quality:''' Key quality parameters include the accuracy of lenslet shape (surface form error, often specified in fractions of a wavelength), [[Surface quality (optics)|surface quality]] (scratch-dig), lenslet positioning accuracy (centration, pitch uniformity), focal length uniformity across the array, and overall array flatness<ref name="AvantierSpecs">Microlens arrays – fabrication, parameters, applications - RP Photonics</ref><ref name="GDOptics"/>. Positional accuracy better than 1 µm can be achieved<ref name="GDOptics"/>.
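These parameters are linked by simple geometry. As a minimal illustration, the Python sketch below computes the ideal fill factor of circular lenslets on square and hexagonal grids and estimates the focal length and NA of a plano-convex lenslet from the thin-lens (lensmaker's) relation f = R/(n - 1); the pitch, radius of curvature, and refractive index used here are assumed round numbers, not values from any cited product.

<syntaxhighlight lang="python">
import math

# Assumed, illustrative lenslet parameters (not taken from any cited product)
pitch = 150e-6   # lenslet pitch / aperture diameter: 150 um
roc = 350e-6     # radius of curvature of the curved surface: 350 um
n = 1.46         # refractive index, roughly fused silica in the visible

# Ideal fill factor for circular lenslets whose diameter equals the pitch
ff_square = math.pi / 4                  # square grid, ~78.5%
ff_hex = math.pi / (2 * math.sqrt(3))    # hexagonal grid, ~90.7%

# Thin-lens (lensmaker's) approximation for a plano-convex lenslet
f = roc / (n - 1)

# Numerical aperture from the half-aperture and the focal length
na = math.sin(math.atan((pitch / 2) / f))

print(f"fill factor: square {ff_square:.1%}, hexagonal {ff_hex:.1%}")
print(f"focal length: {f * 1e3:.2f} mm, NA: {na:.3f}")
</syntaxhighlight>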
==Fabrication Methods==
Manufacturing microlens arrays requires specialized techniques capable of creating microscale optical structures with high precision and often in large volumes:
*'''[[Photolithography]] and Etching:''' This is a foundational technique borrowed from the [[semiconductor]] industry. A pattern is defined using a [[photomask]] and [[photoresist]] on a substrate. Subsequent etching processes (e.g., [[wet etching]] or [[dry etching]] like [[Reactive-ion etching|reactive ion etching (RIE)]]) transfer the pattern into the substrate material, creating the lens structures<ref name="OpticalComponents"/><ref name="Newport"/><ref name="AvantierIntro"/><ref name="ReviewFab">(2024-12-09) Fabrication of Microlens Array and Its Application: A Review - ResearchGate (PDF)</ref><ref name="Syntec"/>. Multiple etching steps can create multi-level diffractive structures<ref name="StandardMLA"/>. Grayscale photolithography uses masks with varying transparency to directly pattern 3D lens profiles (a simplified numerical sketch of this mapping follows this list)<ref name="WikiFabrication">Microlens Fabrication - Wikipedia</ref>.
*'''Photoresist Reflow:''' In this method, cylindrical pillars of photoresist are first patterned using lithography. The substrate is then heated above the resist's glass transition temperature, causing the pillars to melt and reflow. [[Surface tension]] naturally pulls the resist into a spherical cap shape (lenslet). This resist lens pattern can then be transferred into the underlying substrate (e.g., fused silica) using RIE, or the reflowed resist structure itself can serve as a mold for replication<ref name="TEMICON"/><ref name="ReviewFab"/><ref name="Article1Ref_Daly1990"/>. A first-order volume-conservation estimate of the reflowed lens geometry follows this list.
*'''[[Laser Direct Writing]] (LDW):''' High-precision lasers are used to directly shape the lenslets. This can involve selectively hardening a photosensitive material (like a photopolymer, e.g., via [[two-photon polymerization]]<ref name="Article1Ref_Gissibl2016">Gissibl, T., Thiele, S., Herkommer, A., & Giessen, H. (2016). Two-photon direct laser writing of ultracompact multi-lens objectives. Nature Photonics, 10(8), 554-560.</ref>) or ablating material from the substrate surface<ref name="OpticalComponents"/><ref name="AvantierIntro"/><ref name="ApolloOptics"/>. LDW offers great flexibility in creating complex, aspheric, or freeform profiles but can be slower and more expensive than mask-based methods, though it is used for creating high-quality masters for replication<ref name="OpticalComponents"/><ref name="ApolloOptics"/>.
*'''[[Nanoimprint Lithography]] (NIL) / [[Hot Embossing]]:''' These are replication techniques. A master mold (stamp) containing the negative pattern of the MLA is created (e.g., using LDW or etching). This mold is then pressed into a softened thermoplastic material (hot embossing) or a UV-curable resin (UV-NIL) coated on a substrate. After hardening (cooling or UV curing), the mold is removed, leaving the MLA structure replicated on the substrate<ref name="OpticalComponents"/><ref name="TEMICON"/><ref name="AvantierIntro"/><ref name="ReviewFab"/>. These methods are suitable for cost-effective high-volume production<ref name="TEMICON"/>. Soft lithography uses similar principles with elastomeric stamps.
*'''[[Injection Molding]]:''' Similar to NIL, this is a replication method suitable for mass production. A mold insert containing the negative MLA structure is placed in an injection molding machine. Molten optical-grade plastic (or sometimes specialized glass - Precision Glass Molding) is injected into the mold cavity. After cooling and solidifying, the finished MLA part is ejected<ref name="OpticalComponents"/><ref name="TEMICON"/><ref name="ApolloOptics"/><ref name="IsuzuGlass"/>. Precision Glass Molding (PGM) uses polished glass preforms (blanks) heated to their transition temperature and pressed into shape by molds, offering high precision for glass MLAs<ref name="GDOptics"/><ref name="ReviewFab"/>.
*'''[[Diamond Turning]]:''' Ultra-precision lathes equipped with diamond cutting tools can directly machine the MLA structures onto suitable substrate materials (metals for molds, or some IR materials like silicon or polymers directly)<ref name="Syntec"/>. It is highly accurate but generally used for prototyping or creating master molds due to its serial nature.
*'''Other Methods:''' Techniques like [[inkjet printing]] of optical polymers have also been demonstrated for fabricating microlenses<ref name="Article1Ref_Cox2001">Cox, W. R., Chen, T., & Hayes, D. J. (2001). Micro-optics fabrication by ink-jet printing. Optics and Photonics News, 12(6), 32-35.</ref>.
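For the grayscale photolithography variant mentioned above, the mask design step can be sketched numerically: under the simplifying assumption that developed resist depth is proportional to exposure dose, the target sag profile of a spherical lenslet maps directly onto quantized gray levels. The values below are illustrative assumptions; real processes rely on a measured dose-response calibration curve.

<syntaxhighlight lang="python">
import numpy as np

# Assumed target lenslet geometry and mask quantization (illustrative only)
pitch = 100e-6      # lenslet aperture: 100 um
roc = 250e-6        # target radius of curvature: 250 um
levels = 16         # number of gray levels available on the mask

r = np.linspace(0.0, pitch / 2, 50)        # radial samples across one lenslet
sag = roc - np.sqrt(roc**2 - r**2)         # spherical sag profile z(r)

# Normalize the sag to [0, 1] and quantize it to the available gray levels,
# assuming a linear dose-to-depth response (a deliberate simplification)
mask_levels = np.round(sag / sag.max() * (levels - 1)).astype(int)

print(f"maximum sag: {sag.max() * 1e6:.2f} um")
print("gray level vs. radius (every 10th sample):", mask_levels[::10])
</syntaxhighlight>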
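For the photoresist reflow method above, the resulting lens geometry can be estimated to first order from volume conservation: the patterned resist cylinder and the reflowed spherical cap enclose approximately the same volume, which fixes the cap height and therefore the radius of curvature and focal length. The pillar dimensions and resist index below are assumptions for illustration, and effects such as solvent loss and substrate contact angle are ignored.

<syntaxhighlight lang="python">
import math

# Assumed photoresist pillar dimensions and refractive index (illustrative)
r = 50e-6    # pillar radius: 50 um
t = 10e-6    # pillar height: 10 um
n = 1.6      # refractive index of the reflowed resist

# Volume conservation: pi*r^2*t = (pi*h/6)*(3*r^2 + h^2)
# which rearranges to the monotonic cubic  h^3 + 3*r^2*h - 6*r^2*t = 0
def cap_height(r, t):
    lo, hi = 0.0, 2 * t + r          # the single positive root lies in here
    for _ in range(100):             # simple bisection
        mid = (lo + hi) / 2
        if mid**3 + 3 * r**2 * mid - 6 * r**2 * t > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

h = cap_height(r, t)                 # sag (height) of the spherical cap
R = (r**2 + h**2) / (2 * h)          # radius of curvature of the cap
f = R / (n - 1)                      # thin plano-convex focal length

print(f"cap height: {h * 1e6:.2f} um, ROC: {R * 1e6:.1f} um, f: {f * 1e3:.3f} mm")
</syntaxhighlight>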
==Applications in VR/AR==
Microlens arrays provide enabling capabilities for next-generation VR and AR systems, helping to address critical challenges related to [[form factor]], [[field of view]] (FOV), visual quality ([[resolution]], [[brightness]], depth perception), and [[power consumption]].

===Display Optics===
*'''Compact Magnifiers / Eyepieces:''' One key application is replacing bulky single-element eyepiece lenses (like traditional refractive lenses or even Fresnel lenses) with MLAs positioned between the [[microdisplay]] and the user's eye<ref name="PhotonicsArticle">(2023-05-10) Advanced Study of Optical Imaging Systems for Virtual Reality Head-Mounted Displays - Photonics</ref><ref name="PolGrating">(2025-01-02) Field of view and angular-resolution enhancement in microlens array type virtual reality near-eye display using polarization grating - Optica Publishing Group</ref><ref name="SuperlensMLA">Compact near-eye display system using a superlens-based microlens array magnifier - Optica Publishing Group</ref>. Each lenslet magnifies a portion of the microdisplay image. This architecture holds the potential for significantly thinner and lighter [[Head-Mounted Display|HMDs]]<ref name="ThinVR">(2024-10-22) ThinVR: Heterogeneous microlens arrays for compact, 180 degree FOV VR near-eye displays - ResearchGate</ref><ref name="AzumaThinVR">ThinVR: Heterogeneous microlens arrays for compact, 180 degree FOV VR near-eye displays - Ronald Azuma (PDF of paper)</ref>.
*'''Wide Field of View (FOV) Systems:''' To achieve ultra-wide FOVs (e.g., 180° horizontally, approaching the human visual system's range) while maintaining a compact form factor, researchers are exploring the use of curved MLAs paired with curved displays<ref name="PhotonicsArticle"/><ref name="ThinVR"/><ref name="AzumaThinVR"/>. In such wide-FOV systems, many lenslets are viewed significantly off-axis. To manage image quality (e.g., reduce distortion, maintain [[eye box]] size) across the entire FOV, heterogeneous MLAs are crucial. In these arrays, the properties (e.g., shape, focal length, tilt) of the lenslets are custom-designed and vary systematically across the array<ref name="ThinVR"/><ref name="AzumaThinVR"/>. Optimization algorithms are used to design these complex heterogeneous lenslet profiles<ref name="ThinVR"/><ref name="AzumaThinVR"/>.
*'''[[Light Field Display]]s:''' MLAs are a cornerstone technology for creating near-eye light field displays<ref name="NvidiaLFD">Nvidia Near-Eye Light Field Display - LightField Forum</ref><ref name="NvidiaSupp">Supplementary Material: Near-Eye Light Field Displays - Research at NVIDIA (PDF)</ref><ref name="TAMULFD">Light-field Display Technical Deep Dive - Texas A&M College of Architecture (PDF)</ref><ref name="ResearchGateLFD">Design and simulation of a light field display - ResearchGate</ref>. By placing a precisely aligned MLA over a high-resolution microdisplay, the light rays originating from different pixels under each lenslet can be controlled in direction<ref name="NvidiaLFD"/><ref name="ResearchGateLFD"/>. Each lenslet projects a "micro-image" (sometimes called an elemental image or [[Hogel|hogel]]) composed of the pixels underneath it, and effectively acts as a projector sending different information in different directions<ref name="TAMULFD"/><ref name="OpticaLFD">(2021-09-29) Examining the utility of pinhole-type screens for lightfield display - Optica Publishing Group</ref>. This allows the display to reconstruct a light field that approximates the light rays that would emanate from a real 3D scene. Crucially, this enables the viewer's eye to naturally change focus to different depths within the virtual scene, potentially resolving the [[vergence-accommodation conflict]] (VAC) that plagues conventional stereoscopic displays and causes eye strain<ref name="ElectrowettingLFD">Fabrication of an electrowetting liquid microlens array for a focus tunable integral imaging system - Optica Publishing Group</ref><ref name="Creal">VR IN FOCUS - Creal (PDF)</ref>. This technique is closely related to [[integral imaging]]<ref name="ResearchGateLFD"/><ref name="ElectrowettingLFD"/>. An early NVIDIA research prototype used 1280x720 OLED microdisplays with 1.0 mm pitch microlens arrays (focal length ~3.3 mm) to achieve a 29°x16° light field FOV<ref name="NvidiaSupp"/>. A key challenge remains the trade-off between spatial resolution (image sharpness) and angular resolution (number of views/[[depth cue]]s)<ref name="ResearchGateLFD"/><ref name="TAMULFD"/><ref name="Creal"/>. A numerical sketch of this trade-off follows this list.
*'''Efficiency, Brightness, and [[Screen-door effect]] Reduction:''' MLAs can improve the overall light efficiency and perceived quality of display systems. In projectors or backlit displays like [[LCD]]s, MLAs can be used to focus light specifically onto the active (transmitting) area of each pixel, reducing light absorption by the surrounding pixel structure (e.g., [[thin-film transistor]]s)<ref name="ShanghaiOptics"/><ref name="RPPhotonics"/>. This increases brightness and reduces power consumption<ref name="BrightView"/><ref name="ApolloOptics"/>. Manufacturers like Sony have applied MLAs over [[OLED]] microdisplays to increase peak luminance<ref name="DisplayDailySony">(Nov 2021) Emerging Display Technologies for the AR/VR Market - Display Daily</ref>. By directing light more effectively or magnifying the apparent pixel area, MLAs can also help reduce the visible "screen door effect" (the dark gaps between pixels)<ref name="Article1Ref_Lanman2013">Lanman, D., & Luebke, D. (2013). Near-eye light field displays. ACM Transactions on Graphics, 32(6), 1-10.</ref>. Furthermore, MLA-based eyepiece designs can offer inherently better light efficiency than the polarization-dependent folded optical paths used in [[pancake lens]] designs. Pancake lenses achieve thin form factors by folding the optical path using [[polarizer]]s and [[half-mirror]]s, but this approach typically results in very low light efficiency (often cited as 10-25%)<ref name="PolGrating"/><ref name="ReviewFab"/><ref name="LightTransPancake">(2024-07-05) Catadioptric Imaging System Based on Pancake Lenses - LightTrans</ref><ref name="RedditPancake">What are pancake lenses? - Reddit (Sept 13, 2022)</ref><ref name="LimbakReddit">(2022-05-27) LIMBAK's freeform microlens array is thinner and much more efficient than pancake lenses for VR and MR - Reddit</ref>. Novel freeform MLA designs claim much higher efficiencies (e.g., 80%) while also achieving thin profiles<ref name="LimbakReddit"/>.
*'''[[Waveguide]] Coupling:''' In AR [[waveguide]] displays, microlens arrays can potentially serve as [[in-coupling]] and [[out-coupling]] elements, efficiently directing light from miniature projectors (such as [[microLED]] arrays) into thin waveguides and then out toward the user's eyes. Research suggests pixel-level collimating microlenses could narrow microLED emission angles for better waveguide injection, though at the cost of added fabrication complexity<ref name="WaveguideReview">(2023) Waveguide-based augmented reality displays: perspectives and challenges - eLight (SpringerOpen)</ref><ref name="Article1Ref_Kress2019">Kress, B. C., & Meyrueis, P. (2019). Applied digital optics: from micro-optics to nanophotonics. John Wiley & Sons.</ref>.
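The spatial-versus-angular resolution trade-off in lenslet-based light field displays can be made concrete with round numbers of the same order as the prototype figures cited in the light field display entry above; the display pixel pitch used here is an assumed value for illustration, not a published specification.

<syntaxhighlight lang="python">
# Minimal sketch of the spatial vs. angular resolution trade-off in a
# lenslet-based near-eye light field display. The display resolution, lenslet
# pitch, and focal length echo the prototype figures cited above; the pixel
# pitch is an assumed round number, not a measured specification.
display_px = (1280, 720)   # microdisplay resolution in pixels
pixel_pitch = 0.012        # assumed pixel pitch in mm (~12 um)
lens_pitch = 1.0           # lenslet pitch in mm
focal_len = 3.3            # lenslet focal length in mm

panel_mm = (display_px[0] * pixel_pitch, display_px[1] * pixel_pitch)

# Pixels underneath each lenslet become distinct view directions (angular
# samples); the lenslet grid itself sets the spatial sampling of the image.
views_per_lenslet = lens_pitch / pixel_pitch
spatial_samples = (panel_mm[0] / lens_pitch, panel_mm[1] / lens_pitch)

print(f"panel size: {panel_mm[0]:.1f} x {panel_mm[1]:.1f} mm")
print(f"angular samples per lenslet (per axis): ~{views_per_lenslet:.0f}")
print(f"spatial samples across panel: ~{spatial_samples[0]:.0f} x {spatial_samples[1]:.0f}")
</syntaxhighlight>

The same arithmetic run in reverse shows why high-resolution microdisplays are essential: holding the lenslet pitch fixed, sharper spatial sampling is only possible by shrinking the pixel pitch, which is exactly the trade-off noted in the challenges section below.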
===Sensing and Tracking===
*'''[[Eye Tracking]] and Dynamic Focus:''' Tunable microlens arrays, such as those based on [[electrowetting]] liquid lenses or [[liquid crystal lens]]es, can be integrated into HMDs. Combined with eye-tracking cameras, these systems could dynamically adjust the focus of the displayed image or specific lenslets to match the user's gaze depth in real time<ref name="ElectrowettingLFD"/>. This could enhance the realism of light field displays, provide variable focus capabilities<ref name="Article1Ref_Muenster2019">Muenster, R., Jaeger, G., Hubner, M., Stetter, M., & Stilla, U. (2019). Liquid crystal tunable microlens array for augmented reality displays. In Digital Optical Technologies 2019 (Vol. 11062, p. 110620J). International Society for Optics and Photonics.</ref>, potentially correct for individual user refractive errors, or even simulate depth-of-field effects by selectively blurring parts of the image<ref name="PatentCNBlur">CN107942517B: VR head-mounted display device with function of relieving visual fatigue based on liquid crystal microlens array - Google Patents</ref>. MLAs are also used in some [[eye tracking]] systems to help collect and direct light for imaging the user's pupil, enabling features like [[foveated rendering]]<ref name="Article1Ref_Kim2019">Kim, J., Jeong, Y., Stengel, M., Akşit, K., Albert, R., Boudaoud, B., ... & Luebke, D. (2019). Foveated AR: dynamically-foveated augmented reality display. ACM Transactions on Graphics, 38(4), 1-15.</ref>.
*'''Depth Sensing ([[Time-of-Flight]], [[Structured Light]]):''' MLAs play a role in the projection modules of active depth sensing systems. In [[Time-of-Flight]] (ToF) sensors, MLAs can shape and homogenize the output beam from illumination sources such as [[VCSEL]] arrays, projecting a well-defined pattern (e.g., a "top-hat" profile) of infrared light onto the scene<ref name="PatentlyApple"/><ref name="BrightView"/>. In [[Structured Light]] systems (like those used in some versions of Apple's [[Face ID]]), MLAs can project a complex pattern of spots or lines onto the target<ref name="PatentlyApple"/><ref name="TEMICON"/>. The distortion of this pattern as seen by a sensor reveals the 3D shape of the target. These capabilities are essential for environmental mapping, hand tracking, [[gesture recognition]], and object recognition in AR/VR<ref name="PatentlyApple"/><ref name="TEMICON"/>. Some HMD patent designs use multiple MLAs combined with parallax barriers for 3D imaging<ref name="PatentUSMultiMLA">US20140168783A1: Near-eye microlens array displays - Google Patents</ref>.
*'''[[Wavefront Sensor]]s:''' The [[Shack–Hartmann wavefront sensor]] uses an MLA placed in front of an [[image sensor]] ([[CCD]] or [[CMOS]]). An incoming optical wavefront is divided by the MLA into multiple beamlets, each focused onto the sensor. Deviations of the spot positions from a reference grid reveal the local slope of the wavefront, allowing its overall shape (including aberrations) to be reconstructed<ref name="RPPhotonics"/><ref name="AvantierIntro"/><ref name="StandardMLA"/><ref name="GDOptics"/>. While primarily used in optical metrology and [[adaptive optics]], this principle could potentially be adapted for HMD calibration or real-time measurement of the eye's aberrations for personalized display correction. A numerical illustration of the slope calculation follows this list.
*'''[[Light Field Camera]]s / Imaging Enhancement:''' Placing an MLA in front of an image sensor enables the capture of light field information (intensity and direction of light rays), creating a [[plenoptic camera]]<ref name="RPPhotonics"/><ref name="ShanghaiOptics"/><ref name="OpticaLFD"/>. This allows computational features like post-capture refocusing, depth map extraction, and perspective shifting. Such capabilities could be valuable for outward-facing cameras on AR/VR headsets for improved environmental understanding or [[computational photography]]. Even in conventional cameras, MLAs are often placed directly on CMOS/CCD sensors (one lenslet per pixel) simply to increase [[light collection]] efficiency (the optical fill factor) by funneling more incident light onto the active photosensitive area of each pixel, improving low-light performance and sensitivity<ref name="RPPhotonics"/><ref name="OpticalComponents"/><ref name="AvantierIntro"/><ref name="ApolloOptics"/>.
*'''High-Resolution Wide-FOV Imaging:''' Research demonstrates that combining ultrathin MLAs (potentially with wavelength-scale thickness using [[metasurface]] concepts) with [[computational imaging|computational reconstruction]] algorithms can achieve high-resolution imaging across a wide field of view within an extremely compact system<ref name="UltraThinMLA"/>. This could lead to highly integrated, high-performance cameras for AR glasses or VR headset pass-through modes<ref name="UltraThinMLA"/>.
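The Shack–Hartmann principle described above reduces to simple per-lenslet arithmetic: each focal-spot displacement divided by the lenslet focal length gives the average local wavefront slope over that sub-aperture, and integrating the slope field recovers the wavefront shape. The toy example below uses hypothetical displacement values purely to show the calculation; real sensors use thousands of lenslets and zonal or modal least-squares reconstruction.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical Shack-Hartmann patch: spot displacements -> wavefront slopes
focal_len = 5e-3    # lenslet focal length: 5 mm (assumed)
pitch = 150e-6      # lenslet pitch: 150 um (assumed)

# Measured x/y spot displacements (meters) from the reference grid, 3x3 patch
dx = np.array([[0.0, 1.0, 2.0],
               [0.0, 1.0, 2.0],
               [0.0, 1.0, 2.0]]) * 1e-6
dy = np.zeros_like(dx)

# Small-angle local wavefront slopes (radians): slope = displacement / f
slope_x = dx / focal_len
slope_y = dy / focal_len

# Crude wavefront estimate along x by accumulating slope * pitch
wavefront = np.cumsum(slope_x, axis=1) * pitch

print("slope_x (mrad):\n", slope_x * 1e3)
print("wavefront estimate along x (nm):\n", wavefront * 1e9)
</syntaxhighlight>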
==Implementation in Commercial Devices==
While many advanced MLA applications are still in research, some forms are used in current or recent VR/AR products:
*'''[[Varjo]] XR-3:''' Utilizes advanced optics potentially incorporating microlens structures as part of its "Bionic Display" system to achieve very high resolution ("human-eye resolution") in the central foveal area of the display<ref name="VarjoXR3Spec">Varjo. (2021). Varjo XR-3 Technical Specifications.</ref>.
*'''[[Meta Quest Pro]]:''' Incorporates [[pancake lens]] optics, which, while distinct from pure MLA eyepieces, often involve multiple optical elements and films where micro-structures might play a role in managing light paths or correcting aberrations within the compact, folded design<ref name="MetaQuestProOptics">Meta. (2022). Inside the Optics of Meta Quest Pro. Meta Technology Blog.</ref>. (Note: Specific use of MLAs in Quest Pro pancake optics needs confirmation from detailed teardowns or Meta disclosures.)
*'''[[Microsoft HoloLens 2]]:''' Uses [[waveguide]] display technology. While the primary coupling elements are often diffractive gratings, microlens structures could potentially be used in the light engine projecting the image into the waveguide, or as part of the integrated eye-tracking cameras<ref name="HoloLens2Optics">Kress, B. C. (2020). Optical architectures for augmented-, virtual-, and mixed-reality headsets. SPIE Press.</ref>.
*'''[[Magic Leap]] 2:''' Employs sophisticated waveguide displays. Earlier Magic Leap patents discussed [[photonic lightfield chip]] concepts that could involve micro-optical elements, potentially including MLA-like structures, for generating depth effects or managing light coupling<ref name="MagicLeapPatent">Abovitz, R., Schowengerdt, B. T., & Watson, E. A. (2015). Planar waveguide apparatus with diffraction element(s) and system employing same. U.S. Patent No. 9,244,280.</ref>.
==Advantages in VR/AR==
*'''Reduced [[Form Factor]]:''' MLA-based optics offer a pathway to significantly thinner and lighter HMDs compared to systems relying on single, thick conventional lenses or even Fresnel lenses<ref name="ThinVR"/><ref name="AzumaThinVR"/><ref name="PhotonicsArticle"/><ref name="PolGrating"/><ref name="SuperlensMLA"/>. Curved MLAs combined with curved displays can further enhance compactness, particularly for wide FOV designs<ref name="ThinVR"/><ref name="PhotonicsArticle"/>. MLA systems can potentially be thinner than pancake optics<ref name="LimbakReddit"/>.
*'''[[Wide Field of View]] (FOV):''' Advanced MLA designs (curved, heterogeneous) are a key enabling technology for achieving ultra-wide fields of view (approaching or exceeding 180° horizontally) that better match human peripheral vision, enhancing immersion<ref name="ThinVR"/><ref name="AzumaThinVR"/><ref name="PhotonicsArticle"/>.
*'''Miniaturization and Integration:''' The inherent nature of MLAs facilitates integration into compact modules for sensing and imaging functions within the HMD<ref name="OpticalComponents"/><ref name="StandardMLA"/>.

==Challenges and Considerations==
*'''Manufacturing Complexity and Cost:''' Fabricating MLAs with the required precision (sub-micron tolerances for shape and position), especially for complex designs (aspheric, freeform, heterogeneous, high fill factor, large area), remains challenging and can be expensive, particularly for achieving high yields in mass production<ref name="PhotonicsArticle"/><ref name="ReviewFab"/><ref name="OpticalComponents"/><ref name="Article1Ref_Huang2012">Huang, C. H., & Wang, W. P. (2012). Manufacturing challenges of microlens arrays for optical applications. Journal of Micromechanics and Microengineering, 22(12), 125031.</ref>. Mold fabrication for replication techniques is a critical and costly step<ref name="OpticalComponents"/><ref name="GDOptics"/>.
*'''Resolution Trade-offs (Spatial vs. Angular):''' In light field display applications, there is a fundamental trade-off: increasing the angular resolution (more views, smoother depth) typically requires allocating more display pixels per lenslet, which reduces the overall spatial resolution (perceived sharpness) of the image, and vice versa<ref name="ResearchGateLFD"/><ref name="TAMULFD"/><ref name="Creal"/>. High-resolution microdisplays are essential.
*'''Image Quality Artifacts:''' Depending on the design and quality, MLA-based systems can exhibit artifacts like visible seams between lenslet views, Moiré patterns (if interacting with display pixel structure), or non-uniform brightness/sharpness across the field.

==Future Directions==
The development of microlens arrays for VR/AR is an active area of research and innovation:
*'''Advanced Manufacturing:''' Continued improvements in fabrication techniques (e.g., wafer-level optics, new materials, higher precision molding and lithography) are needed to enable cost-effective mass production of complex, high-performance MLAs<ref name="TEMICON"/><ref name="GDOptics"/><ref name="ReviewFab"/>.
*'''Multi-Layer Microlens Arrays:''' Stacking multiple MLA layers to enable more sophisticated optical functions, potentially improving FOV, reducing aberrations, or enhancing light field display capabilities<ref name="Article1Ref_Wetzstein2012">Wetzstein, G., Lanman, D., Hirsch, M., & Raskar, R. (2012). Tensor displays: compressive light field synthesis using multilayer displays with directional backlighting. ACM Transactions on Graphics, 31(4), 1-11.</ref>.
*'''Computational Co-Design:''' Increasingly sophisticated computational tools that co-optimize the optical design of the MLA with the required image processing and rendering algorithms to achieve system-level performance targets<ref name="ThinVR"/><ref name="AzumaThinVR"/><ref name="UltraThinMLA"/>.
*'''Environmental Robustness:''' Developing MLAs with enhanced durability and resistance to environmental factors like humidity, for example through superhydrophobic surface treatments<ref name="SuperhydrophobicMLA">(2022-11-17) Flexible Superhydrophobic Microlens Arrays for Humid Outdoor Environment Applications - ACS Publications</ref>.
==See Also==
* [[Lens]]
* [[Aspheric lens]]
* [[Waveguide (optics)|Waveguide]]

==References==
<references />
[[Category:Terms]]
[[Category:Technical Terms]]
[[Category:Optical Components]]
[[Category:Micro-optics]]
[[Category:Display Technology]]
[[Category:VR Hardware Components]]
[[Category:AR Hardware Components]]
[[Category:Sensing Technology]]
[[Category:Light Field Technology]]
[[Category:Manufacturing Techniques]]
Latest revision as of 23:44, 25 April 2025
- See also: Terms and Technical Terms
Microlens Arrays (MLAs), sometimes called micro-lens arrays or lenslet arrays, are optical components consisting of multiple small lenses (often called lenslets) arranged in a one-dimensional or two-dimensional pattern on a supporting substrate[1][2][3][4]. Each lenslet typically has a diameter significantly less than 10 millimeters, often ranging from tens or hundreds of micrometers down to just a few micrometers, or even sub-micrometer in specialized cases[1][2][5][6]. The array pattern is commonly periodic, such as a square or hexagonal grid, but can also be linear, rectangular, circular, or even random/stochastic for specific applications[1][2][3][6]. An array can contain thousands, millions, or even more individual lenslets[1][7][8].
MLAs are characterized by their potential for miniaturization, integration into complex systems, and considerable design flexibility[6][9]. They have become a critical enabling technology in virtual reality (VR) and augmented reality (AR) devices, where they help solve numerous optical challenges related to field of view, display brightness, visual quality, and form factor[10][8][11]. Beyond VR/AR, they are employed across diverse fields, including telecommunications (fiber coupling, optical switches), medical imaging (endoscopy, OCT), solar energy (concentrators), automotive LiDAR, laser beam homogenization and shaping, sensor technology (Shack–Hartmann wavefront sensors, image sensors), and consumer electronics (projectors, cameras, displays)[8][2][7][6][10][12]. Microlens arrays play an increasingly important role in next-generation display systems, waveguide technologies, eye tracking systems, light field display technologies, environmental sensing, and computational imaging applications within the immersive technology sector[13][11][14][15].
Characteristics
Microlens arrays possess several key parameters that define their function and application suitability:
- Materials: MLAs can be fabricated from a wide range of optical materials, including various types of glass (like BK7), UV-grade fused silica, silicon, quartz, zinc selenide (ZnSe), calcium fluoride, and numerous optical polymers such as PMMA, polycarbonate, and PET[5][6][8][11]. The material choice is critical and depends on the target wavelength range (from deep UV to far infrared), required durability, thermal stability, compatibility with manufacturing processes, and cost[8][5][16]. Fused silica, for example, offers excellent transmission from UV (around 193nm) to near-infrared (up to 3µm)[5][7][16]. Silicon is suitable for infrared applications (approx. 1.2µm to 5µm) and integrates well with MEMS fabrication[8][5].
- Lenslet Shape and Profile: Individual lenslets can have various shapes, including circular, square, or hexagonal footprints[2][6]. Their optical surfaces can be spherical, aspherical, cylindrical, or even freeform[6][8][3][17]. Common profiles include[18]:
- Plano-convex (flat on one side, convex on the other)
- Bi-convex (convex on both sides)
- Aspheric (non-spherical curves for reduced optical aberration)
- Fresnel-type (using concentric rings to achieve lens properties with reduced thickness)
Aspherical and freeform profiles are crucial for minimizing aberrations (like spherical aberration and chromatic aberration) and achieving specific beam shaping or imaging performance, especially off-axis[1][8][14]. Cylindrical lenses focus light along a line rather than to a point and are used in applications like barcode scanners or laser line generation[8][12]. Lens profiles can be continuous surface type or stepped (diffractive)[6].
- Array Pattern and Fill Factor: Common arrangements include square and hexagonal grids, which offer regular packing[1][2]. Hexagonal packing typically allows for a higher fill factor (the ratio of the optically active area to the total array area) compared to square packing of circular lenses, potentially exceeding 90%[19][3]. High fill factors (e.g., up to 98% or more for square arrays, potentially >99% for gapless hexagonal arrays) are often desired to maximize light throughput, ensure uniform illumination, and avoid energy loss or undesirable diffraction effects like zero-order hotspots in homogenization applications[2][7][20][3]. Some fabrication methods allow for "gapless" arrays where lenslets meet edge-to-edge[3]. Random or stochastic arrangements are also possible for specific diffusion or security applications[3].
- Pitch: The center-to-center distance between adjacent lenslets is known as the pitch. This can range widely, from millimeters down to a few micrometers or even less, depending on the application and fabrication limits[1][6][21].
- Focal Length and Numerical Aperture (NA): The focal length of the individual lenslets determines their focusing power. Available focal lengths range from sub-millimeter to tens or hundreds of millimeters[6][22][21]. The NA describes the range of angles over which the lens can accept or emit light.
- Coatings: Anti-reflection coatings are frequently applied to the MLA surfaces (often both sides) to minimize reflection losses and maximize light transmission within the desired spectral range[1][5][11]. Other coatings might be applied for filtering or environmental protection.
- Dimensional tolerances and Quality: Key quality parameters include the accuracy of lenslet shape (surface form error, often specified in fractions of a wavelength), surface quality (scratch-dig), lenslet positioning accuracy (centration, pitch uniformity), focal length uniformity across the array, and overall array flatness[23][12]. Positional accuracy better than 1 µm can be achieved[12].
Fabrication Methods
Manufacturing microlens arrays requires specialized techniques capable of creating microscale optical structures with high precision and often in large volumes:
- Photolithography and Etching: This is a foundational technique borrowed from the semiconductor industry. A pattern is defined using a photomask and photoresist on a substrate. Subsequent etching processes (e.g., wet etching or dry etching like reactive ion etching (RIE)) transfer the pattern into the substrate material, creating the lens structures[8][20][7][24][5]. Multiple etching steps can create multi-level diffractive structures[6]. Grayscale photolithography uses masks with varying transparency to directly pattern 3D lens profiles[25].
- Photoresist Reflow: In this method, cylindrical pillars of photoresist are first patterned using lithography. The substrate is then heated above the resist's glass transition temperature, causing the pillars to melt and reflow; surface tension naturally pulls the resist into a spherical-cap (lenslet) shape. This resist lens pattern can then be transferred into the underlying substrate (e.g., fused silica) using RIE, or the reflowed resist structure itself can serve as a mold for replication[3][24][18]. The resulting cap geometry can be estimated from volume conservation, as sketched after this list.
- Laser Direct Writing (LDW): High-precision lasers are used to directly shape the lenslets. This can involve selectively hardening a photosensitive material (like a photopolymer, e.g., via two-photon polymerization[26]) or ablating material from the substrate surface[8][7][9]. LDW offers great flexibility in creating complex, aspheric, or freeform profiles but can be slower and more expensive than mask-based methods, though it is used for creating high-quality masters for replication[8][9].
- Nanoimprint Lithography (NIL) / Hot Embossing: These are replication techniques. A master mold (stamp) containing the negative pattern of the MLA is created (e.g., using LDW or etching). This mold is then pressed into a softened thermoplastic material (hot embossing) or a UV-curable resin (UV-NIL) coated on a substrate. After hardening (cooling or UV curing), the mold is removed, leaving the MLA structure replicated on the substrate[8][3][7][24]. These methods are suitable for cost-effective high-volume production[3]. Soft lithography uses similar principles with elastomeric stamps[27].
- Injection Molding: Similar to NIL, this is a replication method suitable for mass production. A mold insert containing the negative MLA structure is placed in an injection molding machine. Molten optical-grade plastic (or sometimes specialized glass - Precision Glass Molding) is injected into the mold cavity. After cooling and solidifying, the finished MLA part is ejected[8][3][9][17]. Precision Glass Molding (PGM) uses polished glass preforms (blanks) heated to their transition temperature and pressed into shape by molds, offering high precision for glass MLAs[12][24].
- Diamond Turning: Ultra-precision lathes equipped with diamond cutting tools can directly machine the MLA structures onto suitable substrate materials (metals for molds, or some IR materials like silicon or polymers directly)[5]. It is highly accurate, but because it is a serial (point-by-point) process it is generally reserved for prototyping or for creating master molds rather than volume production.
- Other Methods: Techniques like inkjet printing of optical polymers have also been demonstrated for fabricating microlenses[28].
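As a rough illustration of the reflow step described above, the sketch below estimates the lenslet geometry that results when a photoresist cylinder melts into a spherical cap, assuming the resist volume is conserved and the base diameter stays fixed; the pillar dimensions and refractive index are hypothetical example values, not data from the cited papers.

```python
import math

def reflow_lenslet(base_diameter, resist_thickness, n_resist=1.56):
    """Estimate the spherical-cap geometry after photoresist reflow, assuming
    the resist volume is conserved and the base diameter does not change."""
    a = base_diameter / 2
    v_cylinder = math.pi * a**2 * resist_thickness

    # Solve (pi*h/6)*(3a^2 + h^2) = v_cylinder for the cap height h by bisection.
    def cap_volume(h):
        return math.pi * h / 6 * (3 * a**2 + h**2)

    lo, hi = 0.0, 2 * a
    for _ in range(100):
        mid = (lo + hi) / 2
        if cap_volume(mid) < v_cylinder:
            lo = mid
        else:
            hi = mid
    h = (lo + hi) / 2

    radius_of_curvature = (a**2 + h**2) / (2 * h)        # sphere through the cap rim and apex
    focal_length = radius_of_curvature / (n_resist - 1)  # plano-convex thin-lens estimate
    return h, radius_of_curvature, focal_length

# Hypothetical resist pillar: 100 um diameter, 10 um thick, n = 1.56 (example values).
h, R, f = reflow_lenslet(100e-6, 10e-6)
print(f"cap height {h*1e6:.1f} um, radius of curvature {R*1e6:.0f} um, focal length {f*1e6:.0f} um")
```

Under these assumptions, a thicker resist layer yields a taller cap, a smaller radius of curvature, and therefore a shorter focal length, which is how the pillar diameter and resist thickness are chosen to target a desired lenslet focal length.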
Applications in VR/AR
Microlens arrays provide enabling capabilities for next-generation VR and AR systems, helping to address critical challenges related to form factor, field of view (FOV), visual quality (resolution, brightness, depth perception), and power consumption.
Display Optics
- Compact Magnifiers / Eyepieces: One key application is replacing bulky single-element eyepiece lenses (like traditional refractive lenses or even Fresnel lenses) with MLAs positioned between the microdisplay and the user's eye[29][30][31]. Each lenslet magnifies a portion of the microdisplay image. This architecture holds the potential for significantly thinner and lighter HMDs[32][33].
- Wide Field of View (FOV) Systems: To achieve ultra-wide FOVs (e.g., 180° horizontally, approaching the human visual system's range) while maintaining a compact form factor, researchers are exploring the use of curved MLAs paired with curved displays[29][32][33]. In such wide-FOV systems, many lenslets are viewed significantly off-axis. To manage image quality (e.g., reduce distortion, maintain eye box size) across the entire FOV, heterogeneous MLAs are crucial. In these arrays, the properties (e.g., shape, focal length, tilt) of the lenslets are custom-designed and vary systematically across the array[32][33]. Optimization algorithms are used to design these complex heterogeneous lenslet profiles[32][33].
- Light Field Displays: MLAs are a cornerstone technology for creating near-eye light field displays[34][35][36][37]. By placing a precisely aligned MLA over a high-resolution microdisplay, the light rays originating from different pixels under each lenslet can be controlled in direction[34][37]. Each lenslet projects a "micro-image" (sometimes called an elemental image or hogel) composed of the pixels underneath it, and effectively acts as a projector sending different information in different directions[36][38]. This allows the display to reconstruct a light field that approximates the light rays that would emanate from a real 3D scene. Crucially, this enables the viewer's eye to naturally change focus to different depths within the virtual scene, potentially resolving the vergence-accommodation conflict (VAC) that plagues conventional stereoscopic displays and causes eye strain[39][40]. This technique is closely related to integral imaging[37][39]. An early NVIDIA research prototype used 1280×720 OLED microdisplays with 1.0 mm pitch microlens arrays (focal length ~3.3 mm) to achieve a 29°×16° light field FOV[35]. A key challenge remains the trade-off between spatial resolution (image sharpness) and angular resolution (number of views/depth cues)[37][36][40]; the underlying view geometry is sketched after this list.
- Efficiency, Brightness, and Screen-door effect Reduction: MLAs can improve the overall light efficiency and perceived quality of display systems. In projectors or backlit displays like LCDs, MLAs can be used to focus light specifically onto the active (transmitting) area of each pixel, reducing light absorption by the surrounding pixel structure (e.g., thin-film transistors)[2][1]. This increases brightness and reduces power consumption[11][9]. Manufacturers like Sony have applied MLAs over OLED microdisplays to increase peak luminance[41]. By directing light more effectively or magnifying the apparent pixel area, MLAs can also help reduce the visible "screen door effect" (the dark gaps between pixels)[42]. Furthermore, MLA-based eyepiece designs can offer inherently better light efficiency compared to polarization-dependent folded optical paths used in pancake lens designs. Pancake lenses achieve thin form factors by folding the optical path using polarizers and half-mirrors, but this process typically results in very low light efficiency (often cited as 10-25%)[30][43][44][45][46]. Novel freeform MLA designs claim much higher efficiencies (e.g., 80%) while also achieving thin profiles[46].
- Waveguide Coupling: In AR waveguide displays, microlens arrays can potentially serve as in-coupling and out-coupling elements, efficiently directing light from miniature projectors (like microLED arrays) into thin waveguides and then out toward the user's eyes. Research suggests pixel-level collimating microlenses could narrow microLED emission angles for better waveguide injection, though adding fabrication complexity[47][48].
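For the light field displays described above, the basic view geometry can be approximated paraxially: with the microdisplay at the lenslet focal plane, each pixel under a lenslet maps to a nearly collimated beam whose direction depends on the pixel's offset from the lenslet axis. The sketch below uses a lenslet pitch and focal length similar to the prototype figures quoted above, while the 12 µm pixel pitch is an assumed value for illustration; it also ignores the overlap of elemental images that occurs in a real near-eye configuration.

```python
import math

def lenslet_view_geometry(pixel_pitch, lenslet_pitch, lenslet_focal):
    """Paraxial integral-imaging relations for a microdisplay at the lenslet
    focal plane: each pixel emits a roughly collimated beam whose angle is
    set by its offset from the lenslet axis."""
    views_per_lenslet = lenslet_pitch / pixel_pitch                           # directions encoded per axis
    angular_step = math.degrees(math.atan(pixel_pitch / lenslet_focal))       # spacing between adjacent views
    view_cone = 2 * math.degrees(math.atan(lenslet_pitch / (2 * lenslet_focal)))  # total steering range
    return views_per_lenslet, angular_step, view_cone

# Hypothetical values in the same regime as the prototype quoted above
# (1.0 mm pitch, ~3.3 mm focal length); the 12 um pixel pitch is an assumption.
views, step, cone = lenslet_view_geometry(12e-6, 1.0e-3, 3.3e-3)
print(f"~{views:.0f} views per lenslet per axis, "
      f"{step:.2f} deg between views, ~{cone:.0f} deg view cone")
```

The spatial/angular trade-off noted above is visible here: for a fixed pixel pitch, a larger lenslet pitch encodes more view directions per lenslet but leaves fewer lenslets (spatial samples) across the panel, and vice versa.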
Sensing and Tracking
- Eye Tracking and Dynamic Focus: Tunable microlens arrays, such as those based on electrowetting liquid lenses or liquid crystal lenses, can be integrated into HMDs. Combined with eye-tracking cameras, these systems could dynamically adjust the focus of the displayed image or specific lenslets to match the user's gaze depth in real-time[39]. This could enhance the realism of light field displays, provide variable focus capabilities[49], potentially correct for individual user refractive errors, or even simulate depth-of-field effects by selectively blurring parts of the image[50]. MLAs are also used in some eye tracking systems to help collect and direct light for imaging the user's pupil, enabling features like foveated rendering[51].
- Depth Sensing (Time-of-Flight, Structured Light): MLAs play a role in the projection modules of active depth sensing systems. In Time-of-Flight (ToF) sensors, MLAs can shape and homogenize the output beam from illumination sources like VCSEL arrays, projecting a well-defined pattern (e.g., a "top-hat" profile) of infrared light onto the scene[13][11]. In Structured Light systems (like those used in some versions of Apple's Face ID), MLAs can project a complex pattern of spots or lines onto the target[13][3]. The distortion of this pattern as seen by a sensor reveals the 3D shape of the target. These capabilities are essential for environmental mapping, hand tracking, gesture recognition, and object recognition in AR/VR[13][3]. Some HMD patent designs use multiple MLAs combined with parallax barriers for 3D imaging[52].
- Wavefront Sensors: The Shack–Hartmann wavefront sensor uses an MLA placed in front of an image sensor (CCD or CMOS). An incoming optical wavefront is divided by the MLA into multiple beamlets, each focused to a spot on the sensor. Deviations of the spot positions from a reference grid reveal the local slope of the wavefront over each lenslet, allowing the overall wavefront shape (including aberrations) to be reconstructed[1][7][6][12]. While primarily used in optical metrology and adaptive optics, this principle could potentially be adapted for HMD calibration or real-time measurement of the eye's aberrations for personalized display correction. A minimal slope-estimation example follows this list.
- Light Field Cameras / Imaging Enhancement: Placing an MLA in front of an image sensor enables the capture of light field information (intensity and direction of light rays), creating a plenoptic camera[1][2][38]. This allows computational features like post-capture refocusing, depth map extraction, and perspective shifting. Such capabilities could be valuable for outward-facing cameras on AR/VR headsets for improved environmental understanding or computational photography. Even in conventional cameras, MLAs are often placed directly on CMOS/CCD sensors (one lenslet per pixel) simply to increase light collection efficiency (the optical fill factor) by funneling more incident light onto the active photosensitive area of each pixel, improving low-light performance and sensitivity[1][8][7][9].
- High-Resolution Wide-FOV Imaging: Research demonstrates that combining ultrathin MLAs (potentially with wavelength-scale thickness using metasurface concepts) with computational reconstruction algorithms can achieve high-resolution imaging across a wide field of view within an extremely compact system[14]. This could lead to highly integrated, high-performance cameras for AR glasses or VR headset pass-through modes[14].
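The Shack–Hartmann measurement reduces to simple geometry: a local wavefront tilt over one lenslet shifts that lenslet's focal spot by roughly the focal length times the tilt angle, so dividing each measured spot displacement by the lenslet focal length gives the local slope. The minimal sketch below uses invented spot displacements and assumed sensor parameters; a full sensor would then integrate these slopes (e.g. by least-squares fitting) to reconstruct the wavefront.

```python
def local_wavefront_slopes(spot_offsets_px, pixel_size, lenslet_focal):
    """Convert focal-spot displacements (in pixels, relative to the reference
    grid) into local wavefront slopes, one (sx, sy) pair per lenslet."""
    slopes = []
    for dx_px, dy_px in spot_offsets_px:
        dx = dx_px * pixel_size
        dy = dy_px * pixel_size
        # Small-angle slope (radians) of the wavefront over this lenslet.
        slopes.append((dx / lenslet_focal, dy / lenslet_focal))
    return slopes

# Hypothetical sensor: 5 um pixels, 5 mm lenslet focal length, and three
# measured spot displacements (values invented for illustration).
offsets = [(0.0, 0.0), (2.0, -1.0), (4.5, 0.5)]
for sx, sy in local_wavefront_slopes(offsets, 5e-6, 5e-3):
    print(f"slope x: {sx*1e3:.3f} mrad, slope y: {sy*1e3:.3f} mrad")
```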
Implementation in Commercial Devices
While many advanced MLA applications are still in research, some forms are used in current or recent VR/AR products:
- Varjo XR-3: Utilizes advanced optics potentially incorporating microlens structures as part of its "Bionic Display" system to achieve very high resolution ("human-eye resolution") in the central foveal area of the display[53].
- Meta Quest Pro: Incorporates pancake lens optics which, while distinct from pure MLA eyepieces, involve multiple optical elements and films in which micro-structures may help manage light paths or correct aberrations within the compact, folded design[54]; whether the Quest Pro's pancake optics specifically use MLAs has not been confirmed by detailed teardowns or Meta disclosures.
- Microsoft HoloLens 2: Uses waveguide display technology. While the primary coupling elements are often diffractive gratings, microlens structures could potentially be used in the light engine projecting the image into the waveguide, or as part of the integrated eye-tracking cameras[55].
- Magic Leap 2: Employs sophisticated waveguide displays. Earlier Magic Leap patents discussed photonic lightfield chip concepts that could involve micro-optical elements, potentially including MLA-like structures, for generating depth effects or managing light coupling[56].
Advantages in VR/AR
- Reduced Form Factor: MLA-based optics offer a pathway to significantly thinner and lighter HMDs compared to systems relying on single, thick conventional lenses or even Fresnel lenses[32][33][29][30][31]. Curved MLAs combined with curved displays can further enhance compactness, particularly for wide FOV designs[32][29]. MLA systems can potentially be thinner than pancake optics[46].
- Wide Field of View (FOV): Advanced MLA designs (curved, heterogeneous) are a key enabling technology for achieving ultra-wide fields of view (approaching or exceeding 180° horizontally) that better match human peripheral vision, enhancing immersion[32][33][29].
- Light Field Rendering / Improved Depth Perception: MLA-based light field displays can generate more natural depth cues, allowing the eye to focus correctly and potentially mitigating the vergence-accommodation conflict, leading to greater visual comfort[34][39][40].
- Higher Optical Efficiency: Compared to polarization-based folded optics like pancake lenses, MLA systems can potentially offer significantly higher light throughput, leading to brighter displays or reduced power consumption for the same brightness[46][30].
- Aberration Correction: The ability to design individual lenslets with aspheric, freeform, or heterogeneous profiles allows for sophisticated, spatially-varying aberration correction across the field of view, potentially leading to sharper images[1][32][14][9].
- Miniaturization and Integration: The inherent nature of MLAs facilitates integration into compact modules for sensing and imaging functions within the HMD[8][6].
Challenges and Considerations
- Manufacturing Complexity and Cost: Fabricating MLAs with the required precision (sub-micron tolerances for shape and position), especially for complex designs (aspheric, freeform, heterogeneous, high fill factor, large area), remains challenging and can be expensive, particularly for achieving high yields in mass production[29][24][8][57]. Mold fabrication for replication techniques is a critical and costly step[8][12].
- Resolution Trade-offs (Spatial vs. Angular): In light field display applications, there is a fundamental trade-off: increasing the angular resolution (more views, smoother depth) typically requires allocating more display pixels per lenslet, which reduces the overall spatial resolution (perceived sharpness) of the image, and vice versa[37][36][40]. High-resolution microdisplays are essential.
- Diffraction Limits: As lenslet apertures shrink, diffraction becomes more pronounced, enlarging the minimum achievable focal spot and thereby limiting resolution and sharpness[6]; the effect is quantified in the sketch after this list.
- Chromatic Aberration: Like single lenses, simple MLA lenslets made from standard materials exhibit chromatic aberration (color fringing), where different wavelengths of light focus at different points[58]. This can be particularly noticeable in light field displays or wide FOV systems[38]. Correction requires achromatic lenslet designs (e.g., using multiple materials or diffractive features) or sophisticated computational correction algorithms[14].
- Stray Light and Ghost Images: The many surfaces in an MLA-based optical system can produce internal reflections, causing stray light (which reduces contrast) or visible ghost images, similar to issues encountered in multi-element lenses or pancake optics[30][9][59]. Anti-reflection coatings and careful optical design are the main mitigations.
- Computational Load: Rendering content for light field displays requires specialized algorithms (calculating the view for each direction from each lenslet) which can be significantly more computationally demanding than standard stereoscopic rendering[35][37][36]. Computational imaging techniques associated with some MLA sensors also require processing power[14].
- Eye Box Size and Alignment Sensitivity: Designing an MLA system that provides a sufficiently large eye box (the volume within which the user's pupil can move without losing the image or experiencing significant degradation) can be challenging, especially for wide FOV designs. Misalignment between the eye, the MLA, and the display can lead to image artifacts[32][33].
- Image Quality Artifacts: Depending on the design and quality, MLA-based systems can exhibit artifacts like visible seams between lenslet views, Moiré patterns (if interacting with display pixel structure), or non-uniform brightness/sharpness across the field.
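The diffraction limit in the list above can be quantified with the Airy-disk relation for an ideal circular aperture: the focal-spot diameter is about 2.44·λ·f/D. The short sketch below, with assumed example numbers, shows how shrinking the lenslet diameter at a fixed focal length inflates the diffraction-limited spot.

```python
def airy_spot_diameter(wavelength, focal_length, aperture_diameter):
    """Diameter of the Airy disk (to the first dark ring) for an ideal,
    aberration-free circular lenslet: 2.44 * lambda * f / D."""
    return 2.44 * wavelength * focal_length / aperture_diameter

# Green light, fixed 1 mm focal length, two hypothetical lenslet diameters.
wavelength, f = 550e-9, 1e-3
for aperture in (100e-6, 10e-6):
    spot = airy_spot_diameter(wavelength, f, aperture)
    print(f"D = {aperture*1e6:.0f} um -> diffraction-limited spot ~ {spot*1e6:.0f} um")
```

In this example the 10 µm lenslet produces a diffraction spot roughly ten times larger than the 100 µm lenslet at the same focal length, which is why very fine-pitch arrays trade optical sharpness for compactness.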
Future Directions
The development of microlens arrays for VR/AR is an active area of research and innovation:
- Advanced Manufacturing: Continued improvements in fabrication techniques (e.g., wafer-level optics, new materials, higher precision molding and lithography) are needed to enable cost-effective mass production of complex, high-performance MLAs[3][12][24].
- Freeform and Heterogeneous Designs: Further exploration of freeform surfaces and heterogeneous lenslet optimization to simultaneously maximize FOV, eye box, resolution, and efficiency while minimizing aberrations and form factor[46][32][33].
- Metalens Arrays: Research into metasurface-based MLAs (metalenses) which control light using subwavelength nanostructures. These offer potential for ultra-thin, flat optics with novel functionalities like reduced chromatic aberration[60][14].
- Light Field Display Enhancement: Overcoming the spatial/angular resolution trade-off, reducing computational requirements, and improving image quality (e.g., reducing chromatic aliasing) for light field displays[40][38].
- Tunable and Active MLAs: Development and integration of dynamic MLAs using technologies like liquid crystals or electrowetting for real-time focus adjustment, aberration correction, or gaze-contingent rendering[39][49].
- Hybrid Optics: Combining MLAs with other advanced optical technologies like metasurfaces, diffractive optics (DOEs), holographic optical elements (HOEs), or polarization gratings to achieve novel functionalities or enhanced performance[14][24][30][61].
- Multi-Layer Microlens Arrays: Stacking multiple MLA layers to enable more sophisticated optical functions, potentially improving FOV, reducing aberrations, or enhancing light field display capabilities[62].
- Computational Co-Design: Increasingly sophisticated computational tools that co-optimize the optical design of the MLA with the required image processing and rendering algorithms to achieve system-level performance targets[32][33][14].
- Environmental Robustness: Developing MLAs with enhanced durability and resistance to environmental factors like humidity, for example through superhydrophobic surface treatments[63].
See Also
- Lens
- Aspheric lens
- Fresnel lens
- Pancake Lens
- Metalens
- Metasurface
- Light Field Display
- Integral Imaging
- Head-Mounted Display
- Virtual Reality
- Augmented Reality
- Microdisplay
- Eye Tracking
- Time-of-Flight camera
- Structured Light
- Shack–Hartmann wavefront sensor
- Optical Aberration
- Field of View
- Eye box
- Form Factor
- Fill Factor
- Photolithography
- Injection Molding
- Nanoimprint Lithography
- Two-photon polymerization
- Vergence-accommodation conflict
- Computational imaging
- Waveguide
References
- [1] Microlens arrays – fabrication, parameters, applications - RP Photonics
- [2] Microlens Array - Shanghai Optics
- [3] Microlens Arrays, MLA - temicon
- [4] Microlens array - Photonics Dictionary
- [5] Microlens Arrays | Single Point Diamond Turning - Syntec Optics
- [6] Standard Microlens Array - Bön Optics (appears to be distributor for brand, original mfg unclear)
- [7] Introducing Microlens Arrays - Avantier Inc.
- [8] Detailed Insights in Microlens Array Products - OPTICAL COMPONENTS
- [9] Injection-molded microlens arrays - Apollo Optical Systems
- [10] Microlens Arrays: Versatile and Efficient Optical Solutions - Bote Optics Singapore
- [11] AR-VR - Augmented and Virtual Reality - BrightView Technologies, Inc.
- [12] Efficient and precise production of microlens arrays using precision glass molding - GD Optics (PDF)
- [13] Apple Invents an optical system with Microlens Array Projectors to advance time-of-flight sensing for Face ID, delivering more realistic AR/VR features+ - Patently Apple (July 21, 2022)
- [14] (2024-03-19) Imaging with high resolution and wide field of view based on an ultrathin microlens array - AIP Publishing
- [15] Types of Micro Optics - Avantier Inc.
- [16] Microlens Arrays - Edmund Optics
- [17] lens array - Isuzu Glass
- [18] Daly, D., Stevens, R. F., Hutley, M. C., & Davies, N. (1990). The manufacture of microlenses by melting photoresist. Measurement Science and Technology, 1(8), 759-766.
- [19] Microlens array fill factor - Wikipedia
- [20] Microlens Arrays - Newport
- [21] Lens Arrays | Microlens Array - MEETOPTICS
- [22] MALS18 Micro Lens Array - Newport
- [23] Microlens arrays – fabrication, parameters, applications - RP Photonics (mentions high accuracy vital)
- [24] (2024-12-09) Fabrication of Microlens Array and Its Application: A Review - ResearchGate (PDF)
- [25] Microlens Fabrication - Wikipedia
- [26] Gissibl, T., Thiele, S., Herkommer, A., & Giessen, H. (2016). Two-photon direct laser writing of ultracompact multi-lens objectives. Nature Photonics, 10(8), 554-560.
- [27] Rogers, J. A., & Nuzzo, R. G. (2005). Recent progress in soft lithography. Materials Today, 8(2), 50-56.
- [28] Cox, W. R., Chen, T., & Hayes, D. J. (2001). Micro-optics fabrication by ink-jet printing. Optics and Photonics News, 12(6), 32-35.
- [29] (2023-05-10) Advanced Study of Optical Imaging Systems for Virtual Reality Head-Mounted Displays - Photonics
- [30] (2025-01-02) Field of view and angular-resolution enhancement in microlens array type virtual reality near-eye display using polarization grating - Optica Publishing Group
- [31] Compact near-eye display system using a superlens-based microlens array magnifier - Optica Publishing Group (mentions MLA as magnifier)
- [32] (2024-10-22) ThinVR: Heterogeneous microlens arrays for compact, 180 degree FOV VR near-eye displays - ResearchGate
- [33] ThinVR: Heterogeneous microlens arrays for compact, 180 degree FOV VR near-eye displays - Ronald Azuma (PDF of paper)
- [34] Nvidia Near-Eye Light Field Display - LightField Forum
- [35] Supplementary Material: Near-Eye Light Field Displays - Research at NVIDIA (PDF)
- [36] Light-field Display Technical Deep Dive - Texas A&M College of Architecture (PDF)
- [37] Design and simulation of a light field display - ResearchGate
- [38] (2021-09-29) Examining the utility of pinhole-type screens for lightfield display - Optica Publishing Group
- [39] Fabrication of an electrowetting liquid microlens array for a focus tunable integral imaging system - Optica Publishing Group
- [40] VR IN FOCUS - Creal (PDF)
- [41] (Nov 2021) Emerging Display Technologies for the AR/VR Market - Display Daily
- [42] Lanman, D., & Luebke, D. (2013). Near-eye light field displays. ACM Transactions on Graphics, 32(6), 1-10.
- [43] (2024-12-09) Fabrication of Microlens Array and Its Application: A Review - ResearchGate (PDF)
- [44] (2024-07-05) Catadioptric Imaging System Based on Pancake Lenses - LightTrans
- [45] What are pancake lenses? - Reddit (Sept 13, 2022)
- [46] (2022-05-27) LIMBAK's freeform microlens array is thinner and much more efficient than pancake lenses for VR and MR - Reddit
- [47] (2023) Waveguide-based augmented reality displays: perspectives and challenges - eLight (SpringerOpen)
- [48] Kress, B. C., & Meyrueis, P. (2019). Applied digital optics: from micro-optics to nanophotonics. John Wiley & Sons.
- [49] Muenster, R., Jaeger, G., Hubner, M., Stetter, M., & Stilla, U. (2019). Liquid crystal tunable microlens array for augmented reality displays. In Digital Optical Technologies 2019 (Vol. 11062, p. 110620J). International Society for Optics and Photonics.
- [50] CN107942517B: VR head-mounted display device with function of relieving visual fatigue based on liquid crystal microlens array - Google Patents
- [51] Kim, J., Jeong, Y., Stengel, M., Akşit, K., Albert, R., Boudaoud, B., ... & Luebke, D. (2019). Foveated AR: dynamically-foveated augmented reality display. ACM Transactions on Graphics, 38(4), 1-15.
- [52] US20140168783A1: Near-eye microlens array displays - Google Patents
- [53] Varjo. (2021). Varjo XR-3 Technical Specifications.
- [54] Meta. (2022). Inside the Optics of Meta Quest Pro. Meta Technology Blog.
- [55] Kress, B. C. (2020). Optical architectures for augmented-, virtual-, and mixed-reality headsets. SPIE Press.
- [56] Abovitz, R., Schowengerdt, B. T., & Watson, E. A. (2015). Planar waveguide apparatus with diffraction element(s) and system employing same. U.S. Patent No. 9,244,280.
- [57] Huang, C. H., & Wang, W. P. (2012). Manufacturing challenges of microlens arrays for optical applications. Journal of Micromechanics and Microengineering, 22(12), 125031.
- [58] Sweeney, M. O., & Hoang, A. (2018). Methods and systems for reducing the effects of chromatic aberration in optical systems. U.S. Patent No. 10,120,194.
- [59] Wang, Y., & Ji, H. (2012). Ghost image removal using visibility modulation method for lens-array-based integral imaging displays. Journal of Display Technology, 8(12), 709-714.
- [60] Capasso, F. (2018). The future of optics: metalenses. American Scientist, 106(1), 20-25.
- [61] Lee, B., Hong, J., Yoo, D., Shin, C., & Lee, S. (2017). Hybrid optical systems for see-through head-mounted displays. In Digital Optical Technologies 2017 (Vol. 10335, p. 103350I). International Society for Optics and Photonics.
- [62] Wetzstein, G., Lanman, D., Hirsch, M., & Raskar, R. (2012). Tensor displays: compressive light field synthesis using multilayer displays with directional backlighting. ACM Transactions on Graphics, 31(4), 1-11.
- [63] (2022-11-17) Flexible Superhydrophobic Microlens Arrays for Humid Outdoor Environment Applications - ACS Publications