{{see also|Terms|Technical Terms}}
{{see also|Smart glasses|AR Glasses}}
[[File:ar glasses1.jpg|350px|right]]
'''[[AR glasses]]''' (also known as '''[[augmented reality]] glasses''' or '''[[smart glasses]]''') are wearable [[head-mounted display|head-mounted devices]] that overlay computer-generated imagery, data, or 3D models onto a user’s direct view of the physical world. Unlike [[virtual reality]] (VR) headsets, which occlude outside vision, AR glasses use transparent or semi-transparent optics ([[waveguide]]s, [[prism]]s, or [[optical combiner|combiners]]) so the wearer simultaneously sees real surroundings and virtual overlays.<ref name="SynopsysAROptics">Synopsys. "How Do Augmented Reality Optics Work?". Retrieved 30 April 2025. https://www.synopsys.com/glossary/what-is-augmented-reality-optics.html</ref><ref name="VarjoExplained">Varjo. "Virtual Reality, Augmented Reality and Mixed Reality Explained". Retrieved 30 April 2025. https://varjo.com/virtual-augmented-and-mixed-reality-explained/</ref> Modern eyewear integrates miniature [[microdisplay|micro-displays]] (often [[OLED]], [[LCD]], or [[LCoS]]), transparent [[waveguide]] optics, and an array of [[sensor]]s ([[RGB camera|RGB]]/[[depth camera|depth cameras]], an [[inertial measurement unit]] (IMU), [[eye tracking|eye-trackers]], and sometimes [[LiDAR]]), all driven by low-power [[system-on-chip|SoCs]]. Real-time [[simultaneous localization and mapping]] (SLAM) locks holograms to the environment, while voice, [[hand tracking|hand-tracking]], or gaze serves as input.<ref name="SLAMBenchmark">Sarlin P. et al. (2022). "LaMAR – Benchmarking Localization and Mapping for Augmented Reality". Proceedings of ECCV 2022. https://link.springer.com/chapter/10.1007/978-3-031-20071-7_40 https://lamar.ethz.ch/</ref> In this way AR glasses provide hands-free, heads-up access to information – for example showing navigation cues, text annotations, or [[3D model]]s superimposed on actual objects – without obscuring the user’s natural vision.
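How an overlay stays "locked" to the world can be illustrated with a short sketch: given the headset pose estimated by SLAM and a world-anchored point, a simple pinhole projection yields the display coordinates at which to draw the hologram. The code below is only a minimal illustration under assumed values; the function name, intrinsic matrix, and pose convention are invented for the example, and real runtimes (for example ARKit, ARCore, or OpenXR) expose poses and projection parameters through their own APIs and additionally apply per-eye offsets and distortion correction.

<syntaxhighlight lang="python">
import numpy as np

def project_anchor(anchor_world, head_pose_world, K):
    """Project a world-anchored point into display pixel coordinates.

    anchor_world    : (3,) anchor position in the world frame (from the SLAM map).
    head_pose_world : 4x4 pose of the headset expressed in the world frame.
    K               : 3x3 intrinsics of the virtual rendering camera (assumed values).
    Returns (u, v) pixel coordinates, or None if the anchor is behind the viewer.
    """
    T_head_from_world = np.linalg.inv(head_pose_world)    # world -> headset frame
    p = T_head_from_world @ np.append(anchor_world, 1.0)  # homogeneous transform
    if p[2] <= 0.0:                                       # anchor is behind the viewer
        return None
    uvw = K @ p[:3]                                       # pinhole projection
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Illustrative numbers: a 1280x720 virtual display with roughly 50 degrees of horizontal FOV.
K = np.array([[1370.0,    0.0, 640.0],
              [   0.0, 1370.0, 360.0],
              [   0.0,    0.0,   1.0]])
pose = np.eye(4)  # headset at the world origin, looking along +Z
print(project_anchor(np.array([0.0, 0.0, 2.0]), pose, K))  # anchor 2 m ahead -> display centre
</syntaxhighlight>

Because SLAM updates the pose every frame, re-running this projection as the head moves keeps the overlay registered to the same physical spot.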


AR glasses come in various [[form factor]]s (from bulky [[headset]]s to slim [[spectacles]]) but typically resemble ordinary eyewear. Some experimental prototypes, such as the AirySense system (shown above), allow a wearer to see and manipulate virtual objects as though they were real. Because the hardware must balance optics, electronics, and power in a compact package, current devices range from single-eye (monocular) displays to full binocular pair-of-glasses designs. In all cases, they employ specialized optics (such as [[holographic waveguide|holographic]] or [[diffractive waveguide|diffractive]] [[waveguide]]s) to focus virtual images at a comfortable viewing distance while still letting the user see the world around them.<ref name="SynopsysAROptics" /><ref name="ARDisplaysReview">Xiong J. et al. (2021). "Augmented reality and virtual reality displays: perspectives and challenges". Light: Science & Applications. 10 (1): 216. doi:10.1038/s41377-021-00658-8</ref>


=== Processing and Power ===
Standalone (untethered) glasses rely on mobile [[system-on-chip|SoCs]] such as [[Qualcomm]]’s [[Snapdragon#XR (Extended Reality)|Snapdragon XR]] series or [[Apple Inc.|Apple]]’s dual-chip [[Apple M2|M2]] + [[Apple R1|R1]] architecture in the [[Apple Vision Pro]].<ref name="VisionProAvailability" /><ref name="QualcommXR2">Qualcomm. "Snapdragon XR2+ Gen 2 Platform". Retrieved 30 April 2025. https://www.qualcomm.com/products/mobile/snapdragon/xr-vr-ar/snapdragon-xr2-plus-gen-2-platform</ref> [[Tethered computing|Tethered]] designs (for example the early [[Magic Leap One]]) offload computation to a [[smartphone]] or belt-worn “compute puck” to reduce head-borne weight and potentially increase performance. [[Battery (electricity)|Battery]] life remains a significant constraint, typically lasting only a few hours under active use.
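The "few hours" figure follows from simple power-budget arithmetic; the numbers below are assumptions chosen for illustration rather than measurements of any particular product.

<syntaxhighlight lang="python">
# Illustrative power budget for a standalone pair of AR glasses (all values assumed).
battery_wh = 4.0                       # assumed on-board battery capacity in watt-hours
average_draw_w = {
    "displays and optics": 0.45,       # micro-displays plus projection engine
    "SoC (SLAM, rendering)": 0.70,     # application processor under moderate load
    "cameras and IMU": 0.20,
    "radios (Wi-Fi / Bluetooth)": 0.15,
}
total_w = sum(average_draw_w.values())                         # about 1.5 W average draw
print(f"Estimated runtime: {battery_wh / total_w:.1f} hours")  # roughly 2.7 hours
</syntaxhighlight>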


== Types of AR glasses ==
*  '''[[Monocular]] vs. [[Binocular]]:''' ''Monocular'' glasses display to one eye, often simpler and lighter. ''Binocular'' glasses display to both eyes for [[stereoscopic 3D|stereoscopic]] vision and wider immersion.
*  '''[[Tethered computing|Tethered]] vs. [[Standalone VR headset|Standalone]]:''' ''Tethered'' glasses require a connection to an external device (PC, phone, compute pack). ''Standalone'' glasses contain all processing and power onboard.
*  '''[[Optical see-through]] vs. [[Video pass-through]]:''' ''Optical see-through'' uses transparent optics to directly view the world with overlays. ''Video pass-through'' uses external cameras to capture the world, digitally mixing it with virtual content before displaying it internally (for example [[Apple Vision Pro]]); a minimal compositing sketch follows this list.
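The core of a video pass-through pipeline is compositing the rendered virtual layer over the camera feed before it reaches the internal displays. The sketch below shows that step as a plain alpha blend under assumed names and image sizes; production systems additionally undistort the camera frames, reproject them to compensate for latency, and match colour and exposure.

<syntaxhighlight lang="python">
import numpy as np

def composite_passthrough(camera_frame, overlay_rgba):
    """Alpha-blend rendered virtual content over a camera frame.

    camera_frame : HxWx3 uint8 image from the outward-facing cameras.
    overlay_rgba : HxWx4 uint8 render of the virtual scene (alpha = coverage).
    Returns the HxWx3 image sent to the internal displays.
    """
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = alpha * overlay_rgba[..., :3] + (1.0 - alpha) * camera_frame
    return blended.astype(np.uint8)

# Dummy frames stand in for real camera capture and GPU output.
camera = np.zeros((720, 1280, 3), dtype=np.uint8)
overlay = np.zeros((720, 1280, 4), dtype=np.uint8)
overlay[300:420, 560:720] = (0, 255, 0, 255)   # a fully opaque virtual rectangle
display_image = composite_passthrough(camera, overlay)
</syntaxhighlight>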


== Key applications ==
| [[Epson Moverio]] BT-45CS / BT-45C || [[Epson]] || 2022 || Si-OLED binocular displays, industrial/remote assistance focus
|-
| [[XREAL|XREAL Air 2]] / Air 2 Pro || [[XREAL]] || 2023 || Binocular [[OLED]], lightweight "AR viewer" tethered to phone/PC, consumer media/productivity
|-
| [[Ray-Ban Stories]] / Meta Smart Glasses || [[Meta Platforms]] / [[Luxottica]] || 2021 / 2023 || Camera/audio glasses, limited display/AR (Gen 2 adds livestreaming), consumer
Key concerns include:
*Collection and use of sensitive data (video, audio, [[spatial mapping|spatial maps]], [[eye tracking]] data).
*Potential for misuse (for example covert recording, [[face recognition]] without consent).
*Digital distraction and safety risks (for example obscured vision, attention diversion).
*[[Social norm]] disruption and the [[digital divide]].
*Aesthetic and [[ergonomics|ergonomic]] issues impacting adoption. Bulky or conspicuous designs can lead to stigma.
*'''[[Optics]]:''' Research focuses on thinner, lighter, and wider-FOV optics like [[metasurface]]-based [[waveguide]]s or advanced [[holographic optical element]]s, potentially achieving eyeglass form factors.<ref name="NatureMetasurface" /><ref name="NVIDIAAI" /> [[Retinal projection]] and [[varifocal display]]s aim to address [[vergence-accommodation conflict]] and reduce [[eye strain]].
*'''Processing and Power:''' Continued improvement in low-power [[processor]]s and specialized [[AI]] chips ([[Apple R1|R1]], dedicated [[NPU]]s). Better battery technology and [[wireless power transfer|wireless charging]] are crucial. Offloading computation to [[edge computing|edge]]/[[cloud computing|cloud]] via [[5G]] or [[Wi-Fi 6|Wi-Fi 6/7]] may enable lighter devices.
*'''AI Integration:''' On-device [[AI]] assistants that understand user context, interpret the environment, and provide proactive information (for example [[Meta Platforms|Meta]]'s [[Project Orion (Meta)|Orion]] prototype concept).<ref name="OrionVerge" />
*'''Sensing and Interaction:''' More robust [[hand tracking]], [[eye tracking]], and development of [[brain-computer interface|brain-computer interfaces]] (BCIs) or [[electromyography|EMG]]-based inputs.
*'''Software and Ecosystem:''' Maturation of [[spatial computing]] platforms, expansion of [[OpenXR]] support, development of persistent, shared AR experiences ([[AR Cloud]]), and richer content creation tools.