AR glasses

From VR & AR Wiki
{{see also|Terms|Technical Terms}}
{{see also|Smart glasses}}
[[File:ar glasses1.jpg|350px|right]]
'''[[AR glasses]]''' (also known as '''[[augmented reality]] glasses''' or '''[[smart glasses]]''') are wearable [[head-mounted display|head-mounted devices]] that overlay computer-generated imagery, data or 3-D models onto a user’s direct view of the physical world. Unlike [[virtual reality]] (VR) headsets, which occlude outside vision, AR glasses use transparent or semi-transparent optics ([[waveguide]]s, [[prism]]s or [[optical combiner|combiners]]) so the wearer simultaneously sees real surroundings and virtual overlays.<ref name="SynopsysAROptics">Synopsys. "How Do Augmented Reality Optics Work?". Retrieved 30 April 2025. https://www.synopsys.com/glossary/what-is-augmented-reality-optics.html</ref><ref name="VarjoExplained">Varjo. "Virtual Reality, Augmented Reality and Mixed Reality Explained". Retrieved 30 April 2025. https://varjo.com/virtual-augmented-and-mixed-reality-explained/</ref> Modern eyewear integrates miniature [[microdisplay|micro-displays]] (often [[OLED]], [[LCD]], or [[LCoS]]), transparent [[waveguide]] optics, and an array of [[sensor]]s—[[RGB camera|RGB]]/[[depth camera|depth cameras]], an [[inertial measurement unit]] (IMU), [[eye tracking|eye-trackers]], and sometimes [[LiDAR]]—all driven by low-power [[system-on-chip|SoCs]]. Real-time [[simultaneous localization and mapping]] (SLAM) locks holograms to the environment while voice, [[hand tracking|hand-tracking]] or gaze serves as input.<ref name="SLAMBenchmark">Sarlin P. et al. (2022). "LaMAR – Benchmarking Localization and Mapping for Augmented Reality". Proceedings of ECCV 2022. https://link.springer.com/chapter/10.1007/978-3-031-20071-7_40  https://lamar.ethz.ch/</ref> In this way AR glasses provide hands-free, heads-up access to information – for example showing navigation cues, text annotations, or [[3D model]]s superimposed on actual objects – without obscuring the user’s natural vision.
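As a toy illustration of the registration problem that SLAM solves (all values hypothetical, not from any real tracker), the sketch below re-expresses a world-anchored point in the device's local frame; doing this every frame with the tracker's latest pose estimate is what keeps a hologram visually locked to one real-world spot as the wearer moves:

```python
import math

def world_to_device(anchor_xy, device_xy, device_yaw_rad):
    """Re-express a world-fixed anchor point in the device's local frame.

    The SLAM tracker supplies the device pose (position + heading);
    rendering the hologram at the returned local coordinates keeps it
    visually locked to the same real-world spot as the wearer moves.
    """
    dx = anchor_xy[0] - device_xy[0]
    dy = anchor_xy[1] - device_xy[1]
    # Rotate the world-frame offset by the inverse of the device rotation.
    c, s = math.cos(-device_yaw_rad), math.sin(-device_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy)

# A hologram anchored 2 m ahead of the starting position (world y-axis):
anchor = (0.0, 2.0)
# The user walks 1 m forward and turns 90 degrees left; the anchor should
# now appear 1 m away, directly to the user's right.
local = world_to_device(anchor, device_xy=(0.0, 1.0),
                        device_yaw_rad=math.pi / 2)
```

Real systems do the same thing with full 6-degree-of-freedom poses (3-D rotation matrices or quaternions), but the frame-change logic is identical.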


AR glasses come in various [[form factor]]s (from bulky [[headset]]s to slim [[spectacles]]) but typically resemble ordinary eyewear. Some experimental prototypes like the AirySense system (shown above) allow a wearer to see and manipulate virtual objects as though they were real. Because the hardware must balance optics, electronics, and power in a compact package, current devices range from one-eye displays to full pair-of-glasses designs. In either case, all employ specialized optics (such as [[holographic waveguide|holographic]] or [[diffractive waveguide|diffractive]] [[waveguide]]s) to focus virtual images at a comfortable viewing distance while still letting the user see the world around them.<ref name="SynopsysAROptics" /><ref name="ARDisplaysReview">Xiong J. et al. (2021). "Augmented reality and virtual reality displays: perspectives and challenges". Light: Science & Applications. 10 (1): 216. doi:10.1038/s41377-021-00658-8</ref>


== History and evolution ==
The concept of see-through [[head-mounted display]]s (HMDs) dates back to the 1960s. [[Ivan Sutherland]]’s 1968 “Sword of Damocles” HMD is often cited as the first prototype, displaying dynamic wire-frame graphics aligned to the real world.<ref>Sutherland I. E. (1968). "A head-mounted three-dimensional display". AFIPS Conf. Proc. 33: 757–764.</ref> In 1990 the term “[[augmented reality]]” was coined by [[Thomas Caudell]] while describing a heads-up wiring guide for [[Boeing]] assembly.<ref>AWE XR. "Thomas Caudell – XR Hall of Fame". Retrieved 30 April 2025. https://www.awexr.com/hall-of-fame/20-thomas-caudell</ref> Early AR research explored wearable optics for [[pilot]]s and [[maintenance]]. However, practical AR glasses remained largely experimental until the 2010s.


The first mass-public AR headset was arguably [[Google Glass]] (Explorer Edition released 2013), a US$1,500 [[monocular]] smartglass project that drew widespread attention and significant privacy debate.<ref name="GoogleGlassVerge">The Verge (May 2, 2013). "Google Glass review". Retrieved 30 April 2025. https://www.theverge.com/2013/2/22/4013406/i-used-google-glass-its-the-future-with-monthly-updates</ref> Around the same time other companies like [[Vuzix]] (with products such as the M100 smart glass) and [[Epson]] ([[Epson Moverio|Moverio]] series) began selling eyewear with AR capabilities. The mid-2010s saw a wave of [[miniaturization]] and new optics: for example, ODG's R-7 and other first-generation smartglasses offered more compact designs.


In 2016 [[Microsoft]] launched the [[Microsoft HoloLens]], the first untethered, [[binocular]] [[mixed reality|MR]] headset for [[enterprise]] use, featuring [[spatial mapping]] cameras and [[gesture control]].<ref name="HoloLensVerge">The Verge (April 1, 2016). "Microsoft HoloLens review: the future, now". Retrieved 30 April 2025. https://www.theverge.com/2016/4/1/11334488/microsoft-hololens-video-augmented-reality-ar-headset-hands-on</ref> HoloLens (and its 2019 successor HoloLens 2) brought advanced [[SLAM]] and interaction (voice, hands) to AR glasses. In 2018 [[Magic Leap]] released the [[Magic Leap One]] “Creator Edition”, an MR headset using [[diffractive waveguide]] optics and a powerful tethered compute pack.<ref name="MagicLeapAxios">Axios (Dec 20, 2017). "Magic Leap finally unveils its first augmented reality headset". Retrieved 30 April 2025. https://www.axios.com/2018/01/05/magic-leap-finally-shows-its-ar-headset-1515110723</ref> Meanwhile [[consumer electronics|consumer]] AR eyewear efforts appeared: [[Snap Inc.]] introduced the original [[Snap Spectacles]] (2016) as camera glasses, and later the 4th-generation Spectacles (2021) with dual [[waveguide]] displays, 6-DoF tracking, and AR effects for creators.<ref name="Spectacles2021">The Verge (May 20, 2021). "Snap unveils AR Spectacles that overlay digital images on the real world". Retrieved 30 April 2025. https://www.theverge.com/2021/5/20/22445481/snap-spectacles-ar-augmented-reality-announced</ref> Other attempts included fashionable AR frames like [[North Focals]] and [[Ray-Ban Stories]] (camera-equipped smartglasses by [[Meta Platforms]] and [[Ray-Ban]]).


By the early 2020s, virtually all major tech players signaled interest in AR glasses. In 2023 [[Apple Inc.|Apple]] unveiled the [[Apple Vision Pro]], a premium [[mixed reality]] headset combining high-resolution [[micro-OLED]] displays (23 million pixels total), [[video pass-through]] AR, an [[Apple M2|M2]] [[system-on-chip|SoC]] and a custom [[Apple R1|R1]] sensor-fusion chip.<ref name="VisionProAvailability">Apple Inc. (January 8, 2024). "Apple Vision Pro available in the U.S. on February 2". Press release. Retrieved 30 April 2025. https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/</ref> [[Meta Platforms]] (Facebook) showcased prototypes ([[Project Aria]]) and in 2024 demonstrated “[[Project Orion (Meta)|Project Orion]]”, a prototype glasses-style AR device featuring transparent silicon-carbide [[microLED]] waveguide lenses, a wide field of view and an on-device [[AI]] assistant.<ref name="OrionVerge">The Verge (Oct 15, 2024). "Meta shows off Orion AR glasses prototype with AI assistant". Retrieved 30 April 2025. https://www.theverge.com/24253908/meta-orion-ar-glasses-demo-mark-zuckerberg-interview</ref> Other recent entries include [[Lenovo]]’s [[Lenovo ThinkReality A3|ThinkReality A3]], [[Pico (VR company)|Pico]]’s AR headsets, and continuing updates from enterprise vendors like [[Vuzix]] ([[Vuzix Blade 2|Blade 2]]) and [[Epson]] ([[Epson Moverio BT-45|Moverio BT-45 series]]). Industry analysts note that the modern wave of AR glasses began around 2012 and accelerated after 2015 with breakthroughs in [[waveguide]] optics and miniaturized components. As of 2025 the technology continues to evolve rapidly.


== Technical components ==
AR glasses integrate several key hardware subsystems:


=== Optics and Displays ===
Most systems employ transparent [[waveguide]] combiners or reflective [[prism]]s to channel light from [[microdisplay]]s into the user’s eyes. A 2021 review summarized state-of-the-art grating, holographic and reflective waveguide architectures.<ref name="ARDisplaysReview" /> Common display engines are [[microdisplay]]s (small [[OLED]], [[LCD]], or [[LCoS]] panels) or [[pico projector]]s. For [[binocular]] systems, dual displays provide [[stereoscopy]]. [[Holographic display]]s or [[spatial light modulator]]s are emerging in research systems.<ref name="ARDisplaysReview" /> The optics collimate and focus the image, often using precision [[waveguide]]s (e.g. [[diffractive grating|diffractive]] or [[holography|holographic]] patterns) embedded in thin glass layers. Key specifications include [[field-of-view]] (FOV), [[resolution]], and the brightness ([[nits]]) needed to compete with ambient light. Research directions now include inverse-designed [[metasurface]] gratings that could enable full-colour holographic AR in eyeglass-scale optics.<ref name="NatureMetasurface">Gopakumar, M.; Lee, G-Y.; Choi, S. ''et al.'' (2024). "Full-colour 3D holographic augmented-reality displays with metasurface waveguides". ''Nature'' 629: 791–797. doi:10.1038/s41586-024-07386-0. Retrieved 30 April 2025. https://www.nature.com/articles/s41586-024-07386-0</ref><ref name="NVIDIAAI">NVIDIA Blog (May 30, 2024). "NVIDIA Research Unveils AI-Powered Holographic Glasses Prototype". Retrieved 30 April 2025. https://developer.nvidia.com/blog/developing-smaller-lighter-extended-reality-glasses-using-ai/</ref>
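The guiding principle behind diffractive waveguides can be illustrated with the standard first-order grating equation; the pitch and refractive-index values below are illustrative assumptions, not figures from any particular product:

```python
import math

# Sketch of why diffractive waveguides work: an input grating bends display
# light steeply enough that it is trapped in the glass by total internal
# reflection (TIR) and guided across the lens toward the eye.

n_glass = 1.5          # refractive index of the waveguide substrate (assumed)
wavelength_nm = 532.0  # green display light
pitch_nm = 380.0       # grating period (assumed value)

# First-order grating equation for normal incidence, diffracting from air
# into glass:  n_glass * sin(theta_d) = wavelength / pitch
sin_theta_d = wavelength_nm / (n_glass * pitch_nm)
theta_d = math.degrees(math.asin(sin_theta_d))          # ~69 degrees

# Light stays trapped if theta_d exceeds the critical angle for glass/air.
theta_critical = math.degrees(math.asin(1.0 / n_glass))  # ~41.8 degrees

guided = theta_d > theta_critical  # True: the diffracted ray is guided by TIR
```

The same calculation run per wavelength also shows why single-layer gratings struggle with full-colour images: red, green and blue diffract at different angles, which is one reason some designs stack one waveguide layer per colour band.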


=== Sensors and Tracking ===
AR glasses require extensive sensing for environmental awareness and interaction. Typical sensors include multiple [[camera]]s ([[RGB camera|RGB]], [[depth sensor]]s or [[Time-of-Flight camera|Time-of-Flight]]/[[LiDAR]] units) and an [[inertial measurement unit]] (IMU). [[Microsoft HoloLens 2|HoloLens 2]], for example, lists four visible-light cameras, a 1-MP time-of-flight depth sensor, and a 9-axis IMU.<ref name="HoloLensHardware">Microsoft Learn. "HoloLens 2 hardware details". Retrieved 30 April 2025. https://learn.microsoft.com/en-us/hololens/hololens2-hardware</ref> These feed [[computer vision]] and [[SLAM]] algorithms for [[spatial mapping]] and [[visual-inertial odometry]]. [[Eye tracking]] cameras detect gaze, while [[hand tracking]] enables gesture input. Sensor fusion keeps virtual content registered to the real world.
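As a minimal, illustrative sketch of the IMU side of this sensor fusion (not the algorithm of any particular device), a complementary filter combines the gyroscope's short-term accuracy with the accelerometer's drift-free gravity reference:

```python
def complementary_filter(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One step of a complementary filter: trust the gyroscope over short
    timescales (low noise, but integrates bias into drift) and the
    accelerometer's gravity reference over long timescales (noisy, but
    drift-free). Angles in degrees, rates in deg/s."""
    gyro_estimate = pitch_prev + gyro_rate * dt   # integrate angular rate
    return alpha * gyro_estimate + (1.0 - alpha) * accel_pitch

# Simulate holding the head still at 10 degrees of pitch with a gyro that
# has a constant 0.5 deg/s bias. Pure gyro integration would drift by
# 5 degrees over 10 s; the filter stays pinned near the true angle.
pitch = 0.0
for _ in range(1000):                             # 10 s of samples at 100 Hz
    pitch = complementary_filter(pitch, gyro_rate=0.5,
                                 accel_pitch=10.0, dt=0.01)
```

Production trackers use full visual-inertial estimators (e.g. Kalman-filter or optimization-based SLAM back-ends) over camera features and all six degrees of freedom, but the drift-versus-noise trade-off they balance is the one shown here.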


=== Processing and Power ===
Standalone (untethered) glasses rely on mobile [[system-on-chip|SoCs]] such as [[Qualcomm]]’s [[Snapdragon#XR (Extended Reality)|Snapdragon XR]] series or [[Apple Inc.|Apple]]’s dual-chip [[Apple M2|M2]] + [[Apple R1|R1]] architecture in the [[Apple Vision Pro]].<ref name="VisionProAvailability" /><ref name="QualcommXR2">Qualcomm. "Snapdragon XR2+ Gen 2 Platform". Retrieved 30 April 2025. https://www.qualcomm.com/products/mobile/snapdragon/xr-vr-ar/snapdragon-xr2-plus-gen-2-platform</ref> [[Tethered computing|Tethered]] designs (e.g., early [[Magic Leap One]]) off-load computation to a [[smartphone]] or belt-worn “compute puck” to reduce head-borne weight and potentially increase performance. [[Battery (electricity)|Battery]] life remains a significant constraint, typically lasting only a few hours under active use.
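A back-of-envelope power budget makes the battery constraint concrete; every component figure below is an illustrative assumption, not a measurement of any product:

```python
# Rough power budget for a hypothetical standalone AR headset.
# All draw figures are illustrative assumptions.
component_draw_mw = {
    "soc_and_memory": 2500,       # SoC under sustained SLAM + rendering load
    "displays": 900,              # two microdisplays plus illumination
    "cameras_and_sensors": 700,   # RGB/depth cameras, IMU, eye tracking
    "wireless": 300,              # Wi-Fi / Bluetooth
}

battery_wh = 18.0                 # roughly a phone-class battery (assumed)

total_w = sum(component_draw_mw.values()) / 1000.0   # 4.4 W
runtime_hours = battery_wh / total_w                 # ~4 h
```

Under these assumptions the headset runs for about four hours, consistent with the "few hours" figure above, and shows why tethered designs that move compute and battery off the head remain attractive.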


== Types of AR glasses ==
AR glasses can be categorized by several criteria:


*   '''[[Monocular]] vs. [[Binocular]]:''' ''Monocular'' glasses display to one eye only, making them simpler, lighter and cheaper while keeping the other eye unobstructed for safety-critical tasks (important in industrial use). ''Binocular'' glasses display to both eyes, enabling [[stereoscopic 3D|stereoscopic]] depth cues and wider immersion, but tend to be bulkier and more expensive.
 
*   '''[[Tethered computing|Tethered]] vs. [[Standalone VR headset|Standalone]]:''' ''Tethered'' glasses require a connection to an external device (PC, phone, or compute pack), allowing lighter head units and more powerful off-board processing; the first-generation [[Magic Leap One]] used a wearable “Lightpack” for this reason. ''Standalone'' glasses (e.g., [[Microsoft HoloLens|HoloLens]]) contain all processing and power onboard, trading extra weight for full mobility.
*   '''[[Optical see-through]] vs. [[Video pass-through]]:''' ''Optical see-through'' designs let the eyes view the real world directly through transparent optics, with virtual imagery superimposed; most enterprise AR glasses work this way. ''Video pass-through'' designs use external cameras to capture the world, digitally mixing it with virtual content before displaying it internally (e.g., [[Apple Vision Pro]]); this gives the system full control over the composited view but adds latency and removes direct natural vision.
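The trade-off between the two combining approaches can be shown with a toy luminance model (simplified linear light, illustrative values): optical combiners can only add light to the scene, whereas video pass-through can fully replace camera pixels at the cost of pipeline latency.

```python
def optical_see_through(real_luminance, display_luminance):
    """Optical combiners ADD display light to the real scene, so a virtual
    pixel can only brighten the view -- true black cannot be rendered,
    which is why display brightness (nits) matters so much outdoors."""
    return real_luminance + display_luminance

def video_pass_through(camera_pixel, virtual_pixel, alpha):
    """Video pass-through composites digitally, so virtual content can
    fully replace the camera image (including dark content), at the
    cost of added latency in the camera-to-display pipeline."""
    return alpha * virtual_pixel + (1.0 - alpha) * camera_pixel

bright_room = 300.0  # ambient luminance in cd/m^2, illustrative value

# Attempting to render a "black" virtual pixel over a bright background:
optical = optical_see_through(bright_room, 0.0)          # stays at 300
video = video_pass_through(bright_room, 0.0, alpha=1.0)  # fully replaced: 0
```

This is why optical see-through devices such as Magic Leap 2 add dynamic dimming layers, and why pass-through systems invest heavily in low-latency camera pipelines.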


== Key applications ==
AR glasses find use in many domains:


*   '''[[Enterprise software|Enterprise]] & Industry:''' Including [[manufacturing]], [[field service]], and [[logistics]]. Applications include [[remote assistance]], step-by-step instructions, [[3D model]] overlays for [[maintenance (technical)|maintenance]] or assembly, and hands-free [[warehouse management system|warehouse]] picking ('pick-by-vision'). Live video, annotations and 3-D holograms can cut maintenance time significantly and improve first-time fix rates.<ref name="SoftwebEricsson">Softweb Solutions. "Augmented Reality in Manufacturing: Use Cases and Benefits" (Citing Ericsson study findings). Retrieved 30 April 2025. https://www.softwebsolutions.com/resources/augmented-reality-in-manufacturing.html</ref>
 
*  '''[[Healthcare|Medical]]:''' Uses include [[surgical navigation]] (overlaying [[medical imaging]] onto patients), medical training with virtual anatomy, and remote proctoring or consultation.
*   '''[[Consumer electronics|Consumer]] & [[Entertainment]]:''' Applications include immersive AR [[video game|gaming]], virtual cinema screens for media consumption, [[navigation]] overlays, and [[social media]] integration.
 
*   '''[[Remote collaboration]]:''' Facilitating shared views with remote annotations for teamwork across distances.
*   '''[[Military]] & [[Aerospace]]:''' Applications include [[heads-up display|HUDs]] for [[pilot]]s, [[situational awareness]] tools for soldiers, and training simulators. [[NASA]] flew [[Microsoft HoloLens|HoloLens]] units to the [[International Space Station]] (ISS) in 2015 under '''Project Sidekick''' to test remote expert guidance for astronauts.<ref name="NASASidekick">NASA (June 25, 2015). "NASA, Microsoft Collaborate to Bring Science Fiction to Science Fact". Retrieved 30 April 2025. https://www.nasa.gov/press-release/nasa-microsoft-collaborate-to-bring-science-fiction-to-science-fact</ref>
 
*   '''Specialized professional use:''' Engineers visualize [[3D model]]s on real construction sites; retailers experiment with in-store navigation and product demos; [[museum]]s and theme parks offer interactive AR experiences; and audio-focused wearables (e.g., Bose Frames) blur the line between AR glasses and smart eyewear by providing augmented audio cues.
 


== Leading products and companies ==
Major technology companies and specialized startups are active in the AR glasses market:
 
{| class="wikitable sortable"
! Device !! Company !! First release !! Key features / target market
|-
| [[Microsoft HoloLens 2]] || [[Microsoft]] || 2019 || Binocular waveguides, hand/eye tracking, [[enterprise software|enterprise]] focus
|-
| [[Magic Leap 2]] || [[Magic Leap]] || 2022 || 70° diagonal FOV, dynamic dimming, enterprise/developer focus
|-
| [[Apple Vision Pro]] || [[Apple Inc.|Apple]] || 2024 || Dual 4K [[micro-OLED]], [[eye tracking]], [[video pass-through]], high-end consumer/prosumer ([[spatial computing]]), US$3,499 launch price
|-
| [[Spectacles (Snap)|Spectacles]] (Gen 4, limited release) || [[Snap Inc.]] || 2021 || Dual 46° FOV waveguides, [[6DoF]] tracking, AR creators
|-
| [[Vuzix Blade 2]] || [[Vuzix]] || 2023 || Monocular waveguide, ANSI Z87.1 safety rated, enterprise/industrial
|-
| [[Epson Moverio]] BT-45CS / BT-45C || [[Epson]] || 2022 || Si-OLED binocular displays, industrial/remote assistance focus
|-
| [[Xreal|Xreal Air 2]] / Air 2 Pro || [[Xreal]] || 2023 || Binocular [[OLED]], lightweight "AR viewer" tethered to phone/PC, consumer media/productivity
|-
| [[Ray-Ban Stories]] / Meta Smart Glasses || [[Meta Platforms]] / [[Luxottica]] || 2021 / 2023 || Camera/audio glasses, limited display/AR (Gen 2 adds livestreaming), consumer
|}

Other notable devices include RealWear's HMT series (voice-controlled industrial wearables with a single near-eye display rather than see-through optics) and the discontinued [[North Focals]], which projected basic notifications via tiny laser displays before North was acquired by [[Google]]. Apple is also rumored to be developing prescription-frame-style AR glasses for future release.
In addition to hardware, AR ecosystems involve software firms.  Meta (formerly Facebook) with Ray-Ban and its Project Aria/Orion, Google (Glass Enterprise Edition, Android), Apple (iOS/visionOS), and Qualcomm (Snapdragon XR chips) play major roles.  Emerging companies include Varjo (XR-4 for enterprise XR), RealSense (Intel), and startups like Nreal (Nebula glasses).
|}
Other notable players include [[Google]] ([[Google Glass|Glass Enterprise Edition]]), [[Lenovo]] ([[Lenovo ThinkReality A3|ThinkReality A3]]), [[Qualcomm]] (chipsets like [[Snapdragon#XR (Extended Reality)|Snapdragon XR]]), [[Varjo]], and [[RealWear]].


==Software platforms and ecosystems==
AR glasses rely on software frameworks and content ecosystems:


*'''Mobile [[AR SDK]]s:''' [[Apple Inc.|Apple]]’s [[ARKit]] (for [[iOS]], [[visionOS]]) and [[Google]]’s [[ARCore]] (for [[Android (operating system)|Android]]) provide foundational tracking, scene understanding, and rendering APIs, primarily for [[smartphone]] AR but also influencing AR glasses development.
*'''[[Mixed Reality|MR]]/[[Spatial Computing]] Platforms:''' Include [[Microsoft]]’s [[Windows Mixed Reality]] platform (for [[Microsoft HoloLens|HoloLens]]), [[Magic Leap]]’s [[Lumin OS]], and [[Apple Inc.|Apple]]’s [[visionOS]] (for [[Apple Vision Pro]]). Development often uses [[Unity (game engine)|Unity]] or [[Unreal Engine]].
*'''Creator Platforms:''' [[Snap Inc.|Snap]]’s [[Lens Studio]] allows creation of AR "Lenses" for [[Snapchat]] and [[Spectacles (Snap)|Spectacles]].
*'''Web Standards:''' The [[WebXR]] Device API enables AR experiences directly within compatible [[web browser]]s.
*'''Cross-Platform Standards:''' [[OpenXR]], an open standard from the [[Khronos Group]], aims to provide cross-vendor runtime compatibility for AR and VR applications and devices.<ref name="OpenXR">The Khronos Group. "OpenXR Overview". Retrieved 30 April 2025. https://www.khronos.org/openxr/</ref>
*'''Enterprise Platforms:''' Solutions like [[PTC Vuforia|Vuforia]], [[TeamViewer Frontline|Frontline (TeamViewer)]], and [[Wikitude]] provide tools specifically for industrial AR applications.
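The portability goal behind OpenXR, letting one application run against interchangeable vendor runtimes, can be illustrated with a deliberately simplified sketch. None of the class or method names below come from the real OpenXR API (which is a C ABI); they are invented purely to show the idea of application code that does not care which runtime is active.

```python
from abc import ABC, abstractmethod

class XRRuntime(ABC):
    """Hypothetical stand-in for the vendor runtime an OpenXR-style loader selects."""
    @abstractmethod
    def begin_session(self) -> str: ...
    @abstractmethod
    def head_pose(self) -> tuple: ...

class VendorARuntime(XRRuntime):
    def begin_session(self) -> str:
        return "vendor-a session"
    def head_pose(self) -> tuple:
        return (0.0, 1.6, 0.0)  # metres; roughly a standing user's head height

class VendorBRuntime(XRRuntime):
    def begin_session(self) -> str:
        return "vendor-b session"
    def head_pose(self) -> tuple:
        return (0.0, 1.6, 0.0)

def run_app(runtime: XRRuntime) -> tuple:
    # Application code is identical regardless of which runtime is active --
    # the kind of portability a cross-vendor standard is meant to provide.
    session = runtime.begin_session()
    return session, runtime.head_pose()
```

Swapping `VendorARuntime` for `VendorBRuntime` changes nothing in `run_app`, which is the point of a cross-vendor runtime standard.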
 
==Privacy, ethics, and social acceptance==
AR glasses raise significant [[privacy]], [[ethics]], and social acceptance challenges. The inclusion of outward-facing [[camera]]s and [[microphone]]s leads to concerns about [[surveillance]] and recording without consent. The launch of [[Google Glass]] notably sparked public backlash, leading to bans in some venues and the pejorative term “Glasshole”.<ref name="GlassholeWired">Wired (Jan 22, 2015). "Google Glass Got Banned. Why Did We Ever Think It Was OK?". Retrieved 30 April 2025. https://www.wired.com/story/google-glass-reasonable-expectation-of-privacy/</ref>


Key concerns include:
*Collection and use of sensitive data (video, audio, [[spatial mapping|spatial maps]], [[eye tracking]] data).
*Potential for misuse (e.g., covert recording, [[face recognition]] without consent).
*Digital distraction and safety risks (e.g., obscured vision, attention diversion).
*[[Social norm]] disruption and the [[digital divide]].
*Aesthetic and [[ergonomics|ergonomic]] issues impacting adoption. Bulky or conspicuous designs can lead to stigma.
*Technical artifacts like "[[eye glow]]" (light leakage from [[waveguide]]s) can be distracting or reveal device usage.<ref name="EyeGlowReview">
Ding, Y.; Yang, Q.; Li, Y. <i>et al.</i> (2023).
“Waveguide-based augmented reality displays: perspectives and challenges”.
<i>eLight</i> 3 (24): 1–39. doi:10.1186/s43593-023-00057-z.
Section 2.1 & 3.2.5 discuss the “eye-glow” artifact.
Retrieved 30 April 2025.
https://elight.springeropen.com/articles/10.1186/s43593-023-00057-z
</ref>


Manufacturers are attempting to address these concerns through measures like visible recording indicators (LEDs), [[privacy by design]] principles, onboard processing to limit data transfer, and focusing on more conventional eyeglass [[form factor]]s. Public acceptance likely depends on demonstrating clear user benefits while mitigating privacy risks and social friction.


==Market trends, forecasts, and adoption barriers==
The AR glasses market is growing, particularly in the [[enterprise software|enterprise]] sector where [[return on investment]] (ROI) through productivity gains can justify current costs. [[Consumer electronics|Consumer]] adoption is slower but anticipated to increase as technology matures. Market research firms like [[IDC]] estimate global AR/[[VR headset|VR]] headset shipments are growing, forecasting significant increases in the coming years after potential consolidation or pauses.<ref name="IDC2025">IDC (March 5, 2024). "AR/VR Headset Shipments Forecast to Rebound in 2024 Followed by Strong Growth in the Outer Years, According to IDC". Retrieved 30 April 2025. https://www.idc.com/getdoc.jsp?containerId=prUS51864224</ref><ref name="Neowin2025">Neowin (March 6, 2024). "IDC revises AR/VR headset shipment prediction for 2024, expects 41% growth in 2026". Retrieved 30 April 2025. https://my.idc.com/getdoc.jsp?containerId=prUS53278025/</ref>


===Key Trends===
*Advances in [[miniaturization|miniaturized]] optics ([[waveguide]]s, [[microdisplay]]s).
*More powerful and efficient mobile [[system-on-chip|SoCs]] with dedicated [[AI]] capabilities.
*Improved [[SLAM]] and [[computer vision]] algorithms.
*The rollout of [[5G]] potentially enabling [[cloud computing|cloud]]/[[edge computing]] rendering and processing.


===Barriers===
*'''Cost:''' High prices ($1000+) for capable devices limit mainstream adoption.
*'''Form Factor & Comfort:''' Devices are often still too bulky, heavy, or unstylish for all-day wear.
*'''[[Battery life|Battery Life]]:''' Often limited to 2-4 hours of active use.
*'''[[Field-of-view|Field of View (FOV)]]:''' Often narrower than human vision, limiting immersion.
*'''[[Display technology|Display]] Quality:''' Issues like brightness, [[sunlight readability]], and resolution need further improvement.
*'''App Ecosystem:''' Lack of compelling, everyday "[[killer application]]s" for consumers.
*'''[[Privacy]] and Social Acceptance:''' As discussed above.


==Future outlook and ongoing research directions==
Future development aims to overcome current limitations and unlock mainstream potential:


*'''[[Optics]]:''' Research focuses on thinner, lighter, and wider-FOV optics like [[metasurface]]-based [[waveguide]]s or advanced [[holographic optical element]]s, potentially achieving eyeglass form factors.<ref name="NatureMetasurface">Gopakumar, M.; Lee, G-Y.; Choi, S. et al. (2024). "Full-colour 3D holographic augmented-reality displays with metasurface waveguides". Nature 629: 791–797. doi:10.1038/s41586-024-07386-0. Retrieved 30 April 2025. https://www.nature.com/articles/s41586-024-07386-0</ref><ref name="NVIDIAAI">NVIDIA Blog (May 30, 2024). "NVIDIA Research Unveils AI-Powered Holographic Glasses Prototype". Retrieved 30 April 2025. https://developer.nvidia.com/blog/developing-smaller-lighter-extended-reality-glasses-using-ai/</ref> [[Retinal projection]] and [[varifocal display]]s aim to address [[vergence-accommodation conflict]] and reduce [[eye strain]].
*'''Processing and Power:''' Continued improvement in low-power [[processor]]s and specialized [[AI]] chips ([[Apple R1|R1]], dedicated [[NPU]]s). Better battery technology and [[wireless power transfer|wireless charging]] are crucial. Offloading computation to [[edge computing|edge]]/[[cloud computing|cloud]] via [[5G]] or [[Wi-Fi 6|Wi-Fi 6/7]] may enable lighter devices.
*'''AI Integration:''' On-device [[AI]] assistants that understand user context, interpret the environment, and provide proactive information (e.g., [[Meta Platforms|Meta]]'s [[Project Orion (Meta)|Orion]] prototype concept).<ref name="OrionVerge">The Verge (Oct 15, 2024). "Meta shows off Orion AR glasses prototype with AI assistant". Retrieved 30 April 2025. https://www.theverge.com/24253908/meta-orion-ar-glasses-demo-mark-zuckerberg-interview</ref>
*'''Sensing and Interaction:''' More robust [[hand tracking]], [[eye tracking]], and development of [[brain-computer interface|brain-computer interfaces]] (BCIs) or [[electromyography|EMG]]-based inputs.
*'''Software and Ecosystem:''' Maturation of [[spatial computing]] platforms, expansion of [[OpenXR]] support, development of persistent, shared AR experiences ([[AR Cloud]]), and richer content creation tools.
*'''New Form Factors:''' Exploration beyond glasses, including [[augmented contact lens|AR contact lenses]] or projection-based systems.
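The vergence-accommodation conflict that varifocal and retinal-projection designs target can be quantified simply: optical focus is measured in diopters (reciprocal metres), so the conflict is the difference between where the eyes converge and where the display forces them to focus. A minimal sketch (the comfort figures in the comment are rough values from the display literature, not hard limits):

```python
def accommodation_conflict_diopters(vergence_distance_m: float,
                                    focal_plane_m: float) -> float:
    """Mismatch between the distance at which the eyes converge (the virtual
    object's apparent depth) and the display's fixed focal plane.
    Optical power is 1/distance, so the conflict is expressed in diopters."""
    return abs(1.0 / vergence_distance_m - 1.0 / focal_plane_m)

# A display with a fixed 2 m focal plane showing an object at arm's length
# (0.5 m) produces a 1.5 D conflict -- generally reported as uncomfortable,
# whereas mismatches well under ~0.5 D tend to be tolerable.
```

A varifocal display drives `focal_plane_m` toward the vergence distance each frame, pushing this value toward zero.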


==References==
<references />


[[Category:AR Device Types]]
[[Category:Technology]]
[[Category:Wearable Technology]]
[[Category:Augmented Reality]]
[[Category:Computing Devices]]
[[Category:Consumer Electronics]]
[[Category:Emerging Technologies]]
[[Category:Mixed Reality]]
[[Category:Display Technology]]
[[Category:Head-mounted Displays]]
[[Category:Virtual Reality]]
[[Category:Mobile Computing]]


AR glasses come in various form factors (from bulky headsets to slim spectacles) but typically resemble ordinary eyewear. Some experimental prototypes, such as the AirySense system, allow a wearer to see and manipulate virtual objects as though they were real. Because the hardware must balance optics, electronics, and power in a compact package, current devices range from single-eye displays to full binocular glasses designs. In all cases, they employ specialized optics (such as holographic or diffractive waveguides) to focus virtual images at a comfortable viewing distance while still letting the user see the world around them.[1][4]

==History and evolution==

The concept of see-through head-mounted displays (HMDs) dates back to the 1960s. Ivan Sutherland’s 1968 “Sword of Damocles” HMD is often cited as the first prototype, displaying dynamic wire-frame graphics aligned to the real world.[5] In 1990 the term “augmented reality” was coined by Thomas Caudell while describing a heads-up wiring guide for Boeing assembly.[6] Early AR research explored wearable optics for pilots and maintenance. However, practical AR glasses remained largely experimental until the 2010s.

The first mass-public AR headset was arguably Google Glass (Explorer Edition released 2013), a US $1,500 monocular smartglass project that drew widespread attention and significant privacy debate.[7] Around the same time other companies like Vuzix (with products such as the M100 smart glass) and Epson (Moverio series) began selling eyewear with AR capabilities. The mid-2010s saw a wave of miniaturization and new optics.

In 2016 Microsoft launched the original Microsoft HoloLens, the first untethered, binocular MR headset for enterprise use, featuring spatial mapping cameras and gesture control.[8] HoloLens (and its 2019 successor HoloLens 2) brought advanced SLAM and interaction (voice, hands) to AR glasses. In 2018 Magic Leap released the Magic Leap One “Creator Edition”, an MR headset using diffractive waveguide optics and a powerful tethered compute pack.[9] Meanwhile consumer AR eyewear efforts appeared: Snap Inc. introduced the original Snap Spectacles (2016) as camera glasses, and later the 4th generation Spectacles (2021) with dual waveguide displays, 6-DoF tracking, and AR effects for creators.[10] Other attempts included fashionable AR frames like North Focals and Ray-Ban Stories (camera-equipped smartglasses by Meta Platforms and Ray-Ban).

By the early 2020s, virtually all major tech players signaled interest in AR glasses. In 2023 Apple unveiled the Apple Vision Pro, a premium mixed reality headset combining high-resolution micro-OLED displays (23 million pixels total), video pass-through AR, an M2 SoC and a custom R1 sensor-fusion chip.[11] Meta Platforms (Facebook) showcased prototypes (Project Aria) and in 2024 discussed “Project Orion” – a prototype glasses-style AR device featuring silicon-carbide microLED waveguides and an on-device AI assistant.[12] Other recent entries include Lenovo’s ThinkReality A3, Pico’s AR headsets, and continuing updates from enterprise vendors like Vuzix (Blade 2) and Epson (Moverio BT-45 series). Industry analysts note that the modern wave of AR glasses began around 2012 and accelerated after 2015 with breakthroughs in waveguide optics and miniaturized components. As of 2025 the technology continues to evolve rapidly.

==Technical components==

AR glasses integrate several key hardware subsystems:

===Optics and Displays===

Most systems employ transparent waveguide combiners or reflective prisms to channel light from microdisplays into the user’s eyes. A 2021 review summarized state-of-the-art grating, holographic and reflective waveguide architectures.[4] Common display engines are microdisplays (small OLED, LCD, or LCoS panels) or pico projectors. For binocular systems, dual displays provide stereoscopy. Holographic displays or spatial light modulators are emerging in research systems.[4] The optics collimate and focus the image, often using precision waveguides (e.g. diffractive or holographic patterns) embedded in thin glass layers. Key specifications include field-of-view (FOV), resolution, and brightness (nits) to compete with ambient light. Research directions now include inverse-designed metasurface gratings that could enable full-colour holographic AR in eyeglass-scale optics.[13][14]
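The trade-off between field-of-view and resolution mentioned above is usually compared via pixels per degree (PPD), the display's angular resolution. A minimal sketch with illustrative numbers (not quoted specifications of any particular device):

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Average angular resolution across the horizontal field of view.
    The human eye resolves roughly 60 pixels per degree (20/20 vision),
    the usual benchmark for a 'retinal-quality' display."""
    return h_pixels / h_fov_deg

# A 1920-pixel-wide microdisplay spread over a 40-degree horizontal FOV
# yields 48 PPD; stretching the same panel over 80 degrees halves that
# to 24 PPD -- the core FOV-versus-sharpness trade-off in AR optics.
```

This is why widening FOV without adding pixels makes imagery visibly softer, and why waveguide designs chase both larger eyeboxes and denser display engines.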

===Sensors and Tracking===

AR glasses require extensive sensing for environmental awareness and interaction. Typical sensors include multiple cameras (RGB, depth sensors or Time-of-Flight/LiDAR units) and an inertial measurement unit (IMU). HoloLens 2, for example, lists four visible-light cameras, a 1-MP time-of-flight depth sensor, and a 9-axis IMU.[15] These feed computer vision and SLAM algorithms for spatial mapping and visual-inertial odometry. Eye tracking cameras detect gaze, while hand tracking enables gesture input. Sensor fusion keeps virtual content registered to the real world.

===Processing and Power===

Standalone (untethered) glasses rely on mobile SoCs such as Qualcomm’s Snapdragon XR series or Apple’s dual-chip M2 + R1 architecture in the Apple Vision Pro.[11][16] Tethered designs (e.g., early Magic Leap One) off-load computation to a smartphone or belt-worn “compute puck” to reduce head-borne weight and potentially increase performance. Battery life remains a significant constraint, typically lasting only a few hours under active use.

==Types of AR glasses==

AR glasses can be categorized by several criteria:

*'''Monocular vs. Binocular:''' Monocular glasses display to one eye and are often simpler and lighter. Binocular glasses display to both eyes for stereoscopic vision and greater immersion.
*'''Tethered vs. Standalone:''' Tethered glasses require a connection to an external device (PC, phone, compute pack). Standalone glasses contain all processing and power onboard.
*'''Optical see-through vs. Video pass-through:''' Optical see-through uses transparent optics to view the world directly with overlays. Video pass-through uses external cameras to capture the world, digitally mixing it with virtual content before displaying it internally (e.g., Apple Vision Pro).

==Key applications==

AR glasses find use in many domains:


References

  1. 1.0 1.1 Synopsys. "How Do Augmented Reality Optics Work?". Retrieved 30 April 2025. https://www.synopsys.com/glossary/what-is-augmented-reality-optics.html
  2. Varjo. "Virtual Reality, Augmented Reality and Mixed Reality Explained". Retrieved 30 April 2025. https://varjo.com/virtual-augmented-and-mixed-reality-explained/
  3. Sarlin P. et al. (2022). "LaMAR – Benchmarking Localization and Mapping for Augmented Reality". Proceedings of ECCV 2022. https://link.springer.com/chapter/10.1007/978-3-031-20071-7_40 https://lamar.ethz.ch/
  4. 4.0 4.1 4.2 Xiong J. et al. (2021). "Augmented reality and virtual reality displays: perspectives and challenges". Light: Science & Applications. 10 (1): 216. doi:10.1038/s41377-021-00658-8
  5. Sutherland I. E. (1968). "A head-mounted three-dimensional display". AFIPS Conf. Proc. 33: 757–764.
  6. AWE XR. "Thomas Caudell – XR Hall of Fame". Retrieved 30 April 2025. https://www.awexr.com/hall-of-fame/20-thomas-caudell
  7. The Verge (May 2, 2013). "Google Glass review". Retrieved 30 April 2025. https://www.theverge.com/2013/2/22/4013406/i-used-google-glass-its-the-future-with-monthly-updates
  8. The Verge (April 1, 2016). "Microsoft HoloLens review: the future, now". Retrieved 30 April 2025. https://www.theverge.com/2016/4/1/11334488/microsoft-hololens-video-augmented-reality-ar-headset-hands-on
  9. Axios (Dec 20, 2017). "Magic Leap finally unveils its first augmented reality headset". Retrieved 30 April 2025. https://www.axios.com/2018/01/05/magic-leap-finally-shows-its-ar-headset-1515110723
  10. The Verge (May 20, 2021). "Snap unveils AR Spectacles that overlay digital images on the real world". Retrieved 30 April 2025. https://www.theverge.com/2021/5/20/22445481/snap-spectacles-ar-augmented-reality-announced
  11. Apple Inc. (January 8, 2024). "Apple Vision Pro available in the U.S. on February 2". Press release. Retrieved 30 April 2025. https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/
  12. The Verge (Oct 15, 2024). "Meta shows off Orion AR glasses prototype with AI assistant". Retrieved 30 April 2025. https://www.theverge.com/24253908/meta-orion-ar-glasses-demo-mark-zuckerberg-interview
  13. Gopakumar, M.; Lee, G-Y.; Choi, S. et al. (2024). "Full-colour 3D holographic augmented-reality displays with metasurface waveguides". Nature 629 (800): 791–797. doi:10.1038/s41586-024-07386-0. Retrieved 30 April 2025. https://www.nature.com/articles/s41586-024-07386-0
  14. NVIDIA Blog (May 30, 2024). "NVIDIA Research Unveils AI-Powered Holographic Glasses Prototype". Retrieved 30 April 2025. https://developer.nvidia.com/blog/developing-smaller-lighter-extended-reality-glasses-using-ai/
  15. Microsoft Learn. "HoloLens 2 hardware details". Retrieved 30 April 2025. https://learn.microsoft.com/en-us/hololens/hololens2-hardware
  16. Qualcomm. "Snapdragon XR2+ Gen 2 Platform". Retrieved 30 April 2025. https://www.qualcomm.com/products/mobile/snapdragon/xr-vr-ar/snapdragon-xr2-plus-gen-2-platform
  17. Softweb Solutions. "Augmented Reality in Manufacturing: Use Cases and Benefits" (Citing Ericsson study findings). Retrieved 30 April 2025. https://www.softwebsolutions.com/resources/augmented-reality-in-manufacturing.html
  18. NASA (June 25, 2015). "NASA, Microsoft Collaborate to Bring Science Fiction to Science Fact". Retrieved 30 April 2025. https://www.nasa.gov/press-release/nasa-microsoft-collaborate-to-bring-science-fiction-to-science-fact
  19. The Khronos Group. "OpenXR Overview". Retrieved 30 April 2025. https://www.khronos.org/openxr/
  20. Wired (Jan 22, 2015). "Google Glass Got Banned. Why Did We Ever Think It Was OK?". Retrieved 30 April 2025. https://www.wired.com/story/google-glass-reasonable-expectation-of-privacy/
  21. Ding, Y.; Yang, Q.; Li, Y. et al. (2023). "Waveguide-based augmented reality displays: perspectives and challenges". eLight 3 (24): 1–39. doi:10.1186/s43593-023-00057-z. Sections 2.1 and 3.2.5 discuss the "eye-glow" artifact. Retrieved 30 April 2025. https://elight.springeropen.com/articles/10.1186/s43593-023-00057-z
  22. IDC (March 5, 2024). "AR/VR Headset Shipments Forecast to Rebound in 2024 Followed by Strong Growth in the Outer Years, According to IDC". Retrieved 30 April 2025. https://www.idc.com/getdoc.jsp?containerId=prUS51864224
  23. Neowin (March 6, 2024). "IDC revises AR/VR headset shipment prediction for 2024, expects 41% growth in 2026". Retrieved 30 April 2025. https://my.idc.com/getdoc.jsp?containerId=prUS53278025