= Lens array in virtual and augmented reality =


Lens arrays are two-dimensional arrangements of many small lenses (often [[Microlens array]]s) that manipulate light fields for imaging or display. In [[Virtual reality]] (VR) and [[Augmented reality]] (AR) systems, lens arrays serve two broad roles: as '''display optics''' that create 3D or light-field images, and as '''sensor optics''' that capture directional light for depth and eye tracking. In displays, lens arrays enable multi-view and focal-plane rendering (e.g. light-field displays or integral imaging) by splitting the image into many sub-images corresponding to different angles or depths.<ref name="Li2019">Li X, Chen L, Li Y, et al. A broadband achromatic metalens array for integral imaging in the visible. Light Sci Appl. 2019;8:99.</ref><ref name="Ng2005">Ng R, Levoy M, Brédif M, et al. Light field photography with a hand-held plenoptic camera. Computer Science Technical Report. 2005;2(11):1-11.</ref> In sensing, microlens-based ''plenoptic'' or light-field cameras capture the full 4D light field, allowing computational refocusing and depth estimation.<ref name="Ng2005" /><ref name="Yang2018">Yang L, Guo Y. Eye tracking using a light field camera on a head-mounted display. US Patent Application US20180173303A1. 2018 Jun 21.</ref> Modern VR/AR prototypes leverage microlens arrays, [[Light field display]] techniques, [[Integral imaging]], holographic waveguide couplers, and specialized lens-array modules for eye tracking and depth sensing. These components appear in devices such as wide-FOV near-eye displays and optical see-through [[Head-mounted display]]s.


== History ==


Lens-array technology traces back over a century. Gabriel Lippmann first proposed "integral photography" in 1908, capturing 3D scenes via a lens grid.<ref name="Li2019" /> Early implementations used pinhole arrays (circa 1911) and later simple microlens plates (around 1948) to record and replay light fields.<ref name="Li2019" /> In the mid-20th century, lenticular (cylindrical lens) sheets became popular for autostereoscopic prints and displays (e.g. 3D postcards and packaging), providing separate views for each eye. By the 2000s, advances in digital displays and microfabrication revived lens-array research for head-worn displays. For example, smartphone-scale integral imaging was demonstrated by pairing a display with a matching MLA.<ref name="Li2019" /> In recent years, VR/AR research has produced thin, wide-FOV near-eye displays using sophisticated lens arrays (e.g. polarization optics or metasurfaces),<ref name="Shin2023">Shin G, Lee Y, Kim J, et al. Field of view and angular-resolution enhancement in microlens array type virtual reality near-eye display using polarization grating. PubMed. 2023;39876217.</ref> as well as compact eye-tracking and depth cameras using microlens arrays.<ref name="Yang2018" /><ref name="Microsoft2020">Microsoft. Camera comprising lens array. Patent Nweon. 2020;30768.</ref>


== Types of lens arrays ==

Lens arrays in VR/AR come in several varieties:


'''Spherical microlens arrays:''' Regular arrays of small convex (often spherical or aspheric) lenses. These planar MLAs are common for light-field displays and cameras. Pitch (spacing) can range from tens of micrometers (in cameras) up to a few millimeters (in HMD displays).<ref name="Wei2023">Wei K. Near-eye augmented reality display using wide field-of-view scanning polarization pupil replication. University of California, Berkeley. 2023.</ref><ref name="Jang2021">Jang C, Bang K, Asaduzzaman A, Lee S, Lee B. Three-dimensional see-through augmented-reality display system using a holographic micromirror array. Applied Optics. 2021;60(25):7545-7553.</ref> Each lenslet has a focal length chosen to suit the application (e.g. to collimate a display or focus on a sensor).


'''Lenticular arrays:''' Arrays of cylindrical microlenses (lenticules) arranged one-dimensionally or two-dimensionally. These produce multiple horizontal viewing zones in glasses-free 3D displays. For example, a lenticular lens array can restrict the exit pupil to certain angles, enabling light-field panels that show different images to each eye.<ref name="Balogh2023">Balogh T, Nagy Z, Kerbel I, et al. Directional and Eye-Tracking Light Field Display with Efficient Rendering and Illumination. PMC. 2023;10385613.</ref> Such arrays are widely used in glasses-free 3D signage and have been adapted to VR/AR light-field display prototypes.


'''Holographic optical element (HOE) arrays:''' These use diffractive hologram patterns that act like an array of lenses. In AR waveguide combiners, ''lens-array holographic optical elements'' have been used to form 2D/3D transparent display screens.<ref name="Liu2012">Liu YS, Kuo CY, Hwang CC, et al. Two-dimensional and three-dimensional see-through screen using holographic optical elements. Digital Holography and Three-Dimensional Imaging. 2012;DM2C.6.</ref> An HOE can replace a physical lens array by encoding lens behavior into a recorded interference pattern. In one prototype, a ''lens-array HOE'' was created to build a see-through AR screen.<ref name="Liu2012" /> Other works use holographic micromirror arrays in conjunction with MLAs to couple images into waveguides.<ref name="Jang2021" />


'''Liquid crystal / tunable lens arrays:''' Some arrays use liquid crystal (LC) or fluidic lenses whose optical power can be electronically changed. For example, a chiral (polarization-sensitive) LC lens array was demonstrated in an AR system to steer light and break conventional FOV limits.<ref name="Wei2023" /> Variable-focus MLAs enable dynamic focus adjustment or multi-focal displays.


'''Metamaterial / metalens arrays:''' Flat optical metasurfaces can form ultra-thin lenslets. Metalens arrays (nanoscale-post structures) have been fabricated for light-field imaging.<ref name="Li2019" /> Such metasurface MLAs can be made extremely thin (sub-micron) and can correct chromatic aberration.<ref name="Li2019" /> Patents also describe ''metamaterial lens arrays'' for eye-tracking cameras.<ref name="Microsoft2020" /> As fabrication matures, metasurface MLAs promise lighter, more compact optics.


'''Fresnel / diffractive lens arrays:''' Arrays of Fresnel lenses or diffractive patterned surfaces can also be used. A Fresnel lens array trades off sharpness for thinness and can be cheaper to mold. Diffractive MLAs (e.g. polymer gratings) are used in some projection optics.
== Applications in displays ==


'''Light-field and multi-view VR displays:''' Lens arrays enable near-eye systems that present multiple image perspectives simultaneously. By placing an MLA at an optical conjugate of the display panel, a VR headset can create a plenoptic or light-field effect where each eye perceives slightly different images at different angles. For example, a thin MLA-based VR near-eye display was demonstrated using a polarization grating to expand the field of view (FoV) from ~59° to ~95°, while preserving image resolution.<ref name="Shin2023" /> Such designs address the FoV–resolution tradeoff inherent in lens-array displays: more lenslets (or larger aperture per lenslet) can increase FoV but reduce angular detail. Integral imaging is a related concept: an array of lenslets reconstructs a 3D scene from a grid of "elemental" images.<ref name="Li2019" /> In tabletop or handheld 3D displays, integral imaging systems (with pixelated panels and matching MLAs) produce glasses-free 3D images with true focal cues.
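
The FoV–resolution tradeoff can be made concrete with first-order geometric optics. The sketch below uses hypothetical parameters (not taken from any cited prototype) to estimate the per-lenslet field of view and the angular spacing between views from the pixel pitch, lens pitch, and lenslet focal length:

<syntaxhighlight lang="python">
import math

def mla_display_tradeoff(pixel_pitch_um, lens_pitch_mm, focal_len_mm):
    """First-order FoV/resolution estimate for an MLA near-eye display.

    Illustrative geometric-optics sketch; real designs also depend on
    eye relief, the panel-to-MLA gap, and lens aberrations.
    """
    lens_pitch_um = lens_pitch_mm * 1e3
    # Pixels behind one lenslet -> number of distinct view directions.
    views_per_lenslet = lens_pitch_um / pixel_pitch_um
    # Full angle subtended by one lenslet aperture at its focal length.
    fov_deg = 2 * math.degrees(math.atan(lens_pitch_mm / (2 * focal_len_mm)))
    # Angular step between adjacent views (the angular resolution).
    angular_step_deg = fov_deg / views_per_lenslet
    return views_per_lenslet, fov_deg, angular_step_deg

# Hypothetical numbers: 10 um pixels, 1 mm lenslets, 3 mm focal length.
views, fov, step = mla_display_tradeoff(10.0, 1.0, 3.0)
print(f"{views:.0f} views/lenslet, {fov:.1f} deg FoV, {step:.2f} deg/view")
</syntaxhighlight>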


'''Autostereoscopic (glasses-free) 3D displays:''' Both lenticular arrays and MLAs have been used in commercial 3D displays (e.g. digital signage). In VR contexts, lenticular arrays can form static multi-view panels. One example combined a lenticular array with eye tracking: as the viewer moves or the gaze changes, the system updates which view is shown. In a light-field display prototype, a lenticular array restricted the viewing zone and an eye-tracking system expanded the effective viewing area, using only ~17% of the data normally required.<ref name="Balogh2023" /> This demonstrates how lens arrays (like lenticules) coupled with tracking can reduce bandwidth while still providing smooth motion parallax.


'''Holographic and waveguide displays:''' In AR near-eye displays, lens arrays appear in the form of couplers or combiners. Diffractive or holographic elements on waveguides often act like lens arrays. For example, a recent see-through AR prototype used a ''holographic micromirror array'' together with a fixed microlens array as a lenslet-based in-coupler. This system overlaid a reconstructed 3D image on the real world without additional bulky optics.<ref name="Jang2021" /> Similarly, research has fabricated ''lens-array HOEs'' on planar substrates to create see-through 2D/3D screens.<ref name="Liu2012" /> In these AR designs, each microlens or holographic lenslet directs light into the waveguide or to the eye, enabling floating virtual images. The advantage is that such arrays can maintain a clear real-world view while presenting virtual content at various depths (an [[Optical see-through display]] property).


'''Focal cueing and depth enhancement:''' Some VR/AR displays incorporate multiple lens arrays for depth/focus manipulation. For instance, a light-field HMD may use two stacked MLAs (a so-called dual-focal arrangement) to enlarge the depth range so that virtual objects at different distances can appear simultaneously in focus. Polarization or liquid crystal arrays have been used to switch between focus planes. These advanced architectures aim to overcome the vergence-accommodation mismatch by aligning virtual image focus with convergence.
== Applications in sensing ==


'''Light-field (plenoptic) cameras for depth and eye tracking:''' Lens arrays are fundamental to plenoptic imaging. Placing an MLA one focal length in front of the image sensor allows each micro-image to capture rays arriving from different angles.<ref name="Ng2005" /> This effectively samples the full 4D light field of the scene. With computational processing, one can refocus the image after capture or compute depth maps from parallax between the micro-images.<ref name="Ng2005" /> In VR/AR, this is useful both for external depth sensing (scene reconstruction) and internal eye imaging. For example, patents describe using a light-field camera (with an MLA) inside an HMD to capture the user's eye. The captured plenoptic data lets the system digitally refocus on various eye regions and compute gaze direction without needing precise IR glints.<ref name="Yang2018" /> This relaxes the geometric constraints on eye-tracker placement. Thus, microlens-based light-field cameras can support both environmental mapping and fine eye tracking in headsets.
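
Computational refocusing itself reduces to shifting each sub-aperture view in proportion to its angular offset and averaging them, as in the shift-and-add method of Ng et al. The following is a minimal sketch of that principle (integer-pixel shifts with wrap-around; real pipelines use calibrated ray geometry and sub-pixel interpolation):

<syntaxhighlight lang="python">
import numpy as np

def refocus(subviews, alpha):
    """Shift-and-add synthetic refocusing from sub-aperture views.

    subviews: dict mapping (u, v) angular indices, centered on (0, 0),
              to HxW grayscale images extracted from the micro-images.
    alpha:    relative refocus depth; alpha = 1.0 reproduces the plane
              of best focus at capture time.
    """
    shift = 1.0 - 1.0 / alpha
    acc = None
    for (u, v), img in subviews.items():
        # Translate each view in proportion to its angular offset.
        dy, dx = int(round(v * shift)), int(round(u * shift))
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        acc = shifted.astype(np.float64) if acc is None else acc + shifted
    return acc / len(subviews)
</syntaxhighlight>

Depth can then be estimated by finding, per region, the alpha that maximizes local sharpness, or directly from the parallax between views.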


'''Multi-view eye-tracking cameras:''' Even simpler lens arrays are used in gaze trackers. An array of small lenses can cover a wide field-of-view of the eye by imaging it onto different sensor regions. For example, a patented near-eye system used four tiny metamaterial lenses arranged in an array, each imaging a portion of the eye onto separate areas of the detector.<ref name="Microsoft2020" /> By analyzing these sub-images, the headset can determine the pupil center and gaze direction for each eye. Such lens-array trackers can be more compact than single large lenses and may capture more robust features (like iris patterns) for tracking.
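
As an illustration of the per-sub-image processing involved, the sketch below estimates a pupil center in each lenslet's sub-image. It assumes dark-pupil IR illumination and hypothetical sensor regions; real trackers add blob filtering, glint rejection, and a calibrated eye model:

<syntaxhighlight lang="python">
import numpy as np

def pupil_centers(frame, regions, threshold=40):
    """Estimate a pupil center in each lens-array sub-image.

    frame:   2D IR sensor image covering all lenslet sub-images.
    regions: list of (y0, y1, x0, x1) crops, one per lenslet.
    """
    centers = []
    for y0, y1, x0, x1 in regions:
        sub = frame[y0:y1, x0:x1]
        ys, xs = np.nonzero(sub < threshold)   # dark-pupil segmentation
        if xs.size == 0:
            centers.append(None)               # pupil not seen in this view
            continue
        # Centroid of the dark blob, in full-sensor coordinates.
        centers.append((y0 + ys.mean(), x0 + xs.mean()))
    return centers
</syntaxhighlight>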


'''Depth sensors:''' Outside of full light-field cameras, some depth-sensing concepts also use microlenses. One approach is a multi-aperture structured-light projector: an array of tiny beams (formed by a lens array) projects a coded IR pattern for depth triangulation. Another is embedding microlenses over depth-sensing pixels to increase fill factor or directivity. In practice, however, most time-of-flight and stereo cameras in VR/AR do not use discrete lens arrays (they use single large lenses or laser projectors). The main use of lens arrays in sensing is thus in light-field capture (including gaze capture) rather than typical ToF or stereo modules.
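
Where the structured-light approach is used, depth recovery is ordinary triangulation; the lens array only changes how the coded pattern is generated, not the recovery step. A minimal sketch with hypothetical module parameters:

<syntaxhighlight lang="python">
def structured_light_depth(baseline_mm, focal_px, disparity_px):
    """Pinhole triangulation for a projector-camera pair: z = b * f / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_px / disparity_px

# Hypothetical: 50 mm projector-camera baseline, 600 px focal length,
# 12 px observed pattern disparity -> 2500 mm (2.5 m) depth.
print(structured_light_depth(50.0, 600.0, 12.0))
</syntaxhighlight>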
== Design parameters ==

Lens-array designs involve several key parameters:


'''Lens pitch (size):''' The center-to-center spacing of the lenslets. Near-eye display lens pitches are often on the order of 0.5–3 mm. For example, a wide-FOV scanning AR prototype used a "chiral" LC lens array of 8×15 lenses with 2 mm pitch.<ref name="Wei2023" /> An AR waveguide coupler in another system used spherical lenslets with 1 mm pitch.<ref name="Jang2021" /> Plenoptic camera MLAs, by contrast, have much finer pitch (tens to hundreds of µm) to densely sample the image plane. Pitch determines the tradeoff between image resolution and angular coverage: smaller pitch yields higher angular resolution (more sub-aperture views) but collects less light per lens.
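In a plenoptic camera this pitch tradeoff can be quantified directly: the pixels under one lenslet become angular samples, while the lenslet count sets the rendered spatial resolution. A sketch with hypothetical parameters, assuming the standard layout with the MLA one focal length in front of the sensor:

<syntaxhighlight lang="python">
def plenoptic_sampling(sensor_px, pixel_um, mla_pitch_um):
    """First-order spatial vs. angular sampling of a plenoptic camera."""
    px_per_lens = mla_pitch_um / pixel_um   # pixels under one lenslet
    spatial = sensor_px / px_per_lens       # rendered image width (in lenslets)
    angular = px_per_lens                   # sub-aperture views per axis
    return spatial, angular

# Hypothetical: 4000 px wide sensor, 5 um pixels, 125 um microlens pitch.
print(plenoptic_sampling(4000, 5.0, 125.0))  # -> (160.0, 25.0)
</syntaxhighlight>
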
'''Focal length and f-number:''' Each lenslet's focal length sets the viewing frustum of that micro-aperture. A low f-number (wide aperture) means a large view angle per lens, which broadens the overall FOV of the system. In the scanning waveguide example, the 2 mm lenslets had an f-number of about 0.41 at 639 nm.<ref name="Wei2023" /> The focal length is typically chosen to collimate light from the display panel toward the eye (in displays) or to focus light from the scene onto the sensor (in cameras). Mismatches in focal length across the array can cause blurring or depth errors.
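
As a worked check using the figures quoted above, the f-number relation <math>N = f/D</math> gives

:<math>f = N D = 0.41 \times 2\,\text{mm} \approx 0.82\,\text{mm},</math>

and the corresponding full acceptance angle of one lenslet is roughly

:<math>2\arctan\!\left(\frac{D}{2f}\right) = 2\arctan\!\left(\frac{1\,\text{mm}}{0.82\,\text{mm}}\right) \approx 101^\circ,</math>

a purely geometric estimate that ignores aberrations and vignetting.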


'''Aperture shape and fill factor:''' Lenslets may be round or hexagonal. Hexagonal or honeycomb layouts can achieve near-100% fill factor (no dead zones), which maximizes brightness. Fill factor and uniformity are critical: any gap between lenses can cause vignetting or loss of resolution. In fabrication, arrays are usually molded or imprinted in photoresist and then replicated in glass or plastic.
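
These packing limits follow from simple geometry: circular lenslets cannot tile the plane, while hexagonal lenslets can. A quick check of the standard values:

<syntaxhighlight lang="python">
import math

# Fraction of the array aperture covered by lens surface (fill factor).
square_grid_circles = math.pi / 4                 # ~0.785
hex_grid_circles = math.pi / (2 * math.sqrt(3))   # ~0.907
hexagonal_lenslets = 1.0                          # tiles the plane completely
print(square_grid_circles, hex_grid_circles, hexagonal_lenslets)
</syntaxhighlight>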