'''AR glasses''' (also known as '''[[smart glasses]]''' or '''[[augmented reality]] (AR) glasses''') are wearable [[head-mounted display|head-mounted]] devices that superimpose computer-generated images, data, or [[3D model]]s onto the user's view of the real world. In contrast to [[virtual reality]] (VR) headsets, which fully occlude the outside world, AR glasses use transparent or semi-transparent optics, such as [[waveguide]]s or prisms, so the wearer simultaneously sees both the physical surroundings and the virtual overlays.<ref name="IEEEDef">{{cite web |title=AR Glasses Spawn a Whole New Social Dynamic |url=https://spectrum.ieee.org/ar-glasses |website=IEEE Spectrum |date=28 July 2023 |access-date=30 April 2025}}</ref>

Modern AR glasses integrate miniature [[microdisplay]]s ([[OLED]], [[LCD]] or LCoS) with [[optical combiner]]s, plus an array of [[sensor]]s (RGB and depth [[camera]]s, an [[inertial measurement unit]], eye-trackers) driven by low-power [[system-on-chip|SoCs]]. Real-time [[simultaneous localization and mapping]] (SLAM) keeps virtual objects locked to the environment, while voice, hand-tracking, or gaze provide input. The result is hands-free, heads-up access to information, for example navigation cues, text annotations, or 3D models superimposed on real objects, without obscuring the user's natural vision.

AR glasses come in various [[form factor]]s, from bulky [[headset]]s to slim spectacles, but typically resemble ordinary eyewear. Some experimental prototypes, such as the AirySense system, allow a wearer to see and manipulate virtual objects as though they were real. Because the hardware must balance optics, electronics, and power in a compact package, current devices range from single-eye displays to full binocular designs. In either case, specialized optics (such as [[holography|holographic]] or [[diffraction|diffractive]] [[waveguide]]s) focus virtual images at a comfortable viewing distance while still letting the user see the world around them.
== History and evolution ==
The concept of see-through [[head-mounted display]]s (HMDs) dates back to the 1960s: [[Ivan Sutherland]]'s 1968 "Sword of Damocles", a ceiling-mounted prototype that displayed dynamic 3D wireframe graphics aligned to the real world, is widely cited as the first optical see-through AR system.<ref name="Sutherland1968">{{cite web |last=Werner |first=John |title=Catchup With Ivan Sutherland — Inventor Of The First AR Headset |url=https://www.forbes.com/sites/johnwerner/2024/02/23/catchup-with-ivan-sutherlandinventor-of-the-first-ar-headset/ |website=Forbes |date=23 February 2024 |access-date=30 April 2025}}</ref> The term "[[augmented reality]]" was coined in 1990 by [[Thomas Caudell]] of [[Boeing]], describing a heads-up wiring-harness guide for aircraft assembly.<ref name="Caudell1990">{{cite web |title=Thomas Caudell – Hall of Fame |url=https://www.awexr.com/hall-of-fame/20-thomas-caudell |website=AWE XR |access-date=30 April 2025}}</ref> Early AR research, often in military labs, explored wearable optics for pilots and maintenance crews, but practical AR glasses remained largely experimental until the 2010s.

The first mass-public AR headset was arguably [[Google Glass]] (2013), a monocular smartglass sold as a US$1,500 "Explorer Edition" to early adopters; it drew widespread attention and sparked privacy debates.<ref name="Glass2013">{{cite news |last=Blagdon |first=Jeff |title=Google expands Glass pre-orders to 'creative individuals' |url=https://www.theverge.com/2013/2/20/4006748/google-project-glass-explorer-edition-pre-order |work=The Verge |date=20 February 2013 |access-date=30 April 2025}}</ref> Around the same time, companies such as [[Vuzix]] (with products like the M100) and [[Epson]] ([[Epson Moverio|Moverio]] series) began selling eyewear with AR capabilities, and mid-2010s advances in miniaturization and optics enabled more compact designs such as ODG's R-7.

In 2016 [[Microsoft]] launched the first [[Microsoft HoloLens]], a [[mixed reality]] headset for enterprise use and the first fully untethered binocular AR headset, featuring transparent [[waveguide]] displays, [[spatial mapping]] cameras, depth sensing, and gesture and voice input.<ref name="HoloLens2016">{{cite news |last=Warren |first=Tom |title=This is what Microsoft HoloLens is really like |url=https://www.theverge.com/2016/4/1/11334488/microsoft-hololens-video-augmented-reality-ar-headset-hands-on |work=The Verge |date=1 April 2016 |access-date=30 April 2025}}</ref> Its 2019 successor, HoloLens 2, refined [[simultaneous localization and mapping|SLAM]] and hand tracking. In 2018 [[Magic Leap]] released the Magic Leap One "Creator Edition", a mixed-reality visor using [[diffractive waveguide]] optics and an external "Lightpack" compute pack.<ref name="MagicLeap2018">{{cite news |title=Magic Leap launches its first product |url=https://www.axios.com/2018/08/08/magic-leap-launches-first-product-one-creator-addition |work=Axios |date=8 August 2018 |access-date=30 April 2025}}</ref> Consumer efforts appeared in parallel: [[Snap Inc.]] introduced the original [[Spectacles (Snap)|Spectacles]] (2016) as camera glasses and in 2021 a developer-only fourth generation with dual waveguide displays and 6-DoF tracking in a 134 g frame.<ref name="Spectacles2021">{{cite news |last=Lunden |first=Ingrid |last2=Matney |first2=Lucas |title=Snap announces a new generation of Spectacles, streamlined glasses to experience the world in AR |url=https://techcrunch.com/2021/05/20/snap-announces-its-latest-generation-of-its-spectacles-a-streamlined-device-for-experience-the-world-in-ar/ |work=TechCrunch |date=20 May 2021 |access-date=30 April 2025}}</ref> Other attempts included fashion-oriented frames such as [[North Focals]] and [[Ray-Ban Stories]] (camera-equipped smartglasses by [[Meta Platforms]] and [[Ray-Ban]]).

NASA has even flown AR glasses: Microsoft HoloLens units reached the [[International Space Station]] in 2015 as part of Project Sidekick to provide astronauts with remote expert guidance.<ref name="NASA2015" />

By the early 2020s virtually every major technology company had signaled interest in AR glasses. In 2023 [[Apple Inc.|Apple]] unveiled the [[Apple Vision Pro]], a premium [[spatial computing]] headset combining video pass-through AR and VR with 23-million-pixel micro-OLED displays, eye tracking, and custom [[Apple R1|R1]] and [[Apple M2|M2]] chips running visionOS.<ref name="VisionPro2023">{{cite press release |title=Introducing Apple Vision Pro: Apple's first spatial computer |url=https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/ |publisher=Apple |date=5 June 2023 |access-date=30 April 2025}}</ref> [[Meta Platforms]] showcased research prototypes ([[Project Aria]]) and in 2024 announced "[[Project Orion (Meta)|Project Orion]]", a glasses-style AR device with transparent lenses and a wide [[field of view]]. Other recent entries include [[Lenovo]]'s ThinkReality A3, [[Pico (VR company)|Pico]]'s headsets, and continuing updates from enterprise vendors such as [[Vuzix]] (Blade 2) and [[Epson]] (Moverio BT-45 series). Industry analysts date the modern wave of AR glasses to around 2012, accelerating after 2015 with breakthroughs in waveguide optics and miniaturized components. As of 2025 the technology continues to evolve rapidly, with new prototypes leveraging AI and novel displays (see below).
== Technical components and specifications ==
AR glasses integrate several key hardware subsystems:

=== Optics and displays ===
Most AR glasses use ''see-through'' optics: transparent elements such as [[waveguide]]s, [[prism]]s, or combiner lenses that optically merge the real scene with projected imagery. Common display engines are [[microdisplay]]s (small [[OLED]] or [[LCD]] panels) or [[pico projector]]s that feed a virtual image into the optics. [[Binocular]] (two-eye) systems typically use dual microdisplays providing separate left and right images for [[stereoscopy]]. [[Holographic display]]s and [[spatial light modulator]]s are also emerging in research systems to create true 3D depth cues, and research prototypes now employ inverse-designed [[metasurface]] gratings to deliver full-color holography in eyeglass-scale optics.<ref name="Nature2024" />

The optics must collimate and focus the image at a distance comfortable for the eye, requiring precision lenses or waveguides (e.g. [[diffraction grating|diffractive grating]]s). Many commercial AR glasses (e.g. [[Microsoft HoloLens|HoloLens 2]], [[Magic Leap One]]) use layered waveguides: transparent glass with embedded [[holography|holographic]] or [[diffraction|diffractive]] patterns that guide light from the display into the eye with minimal thickness. [[Field of view]] (FOV) and [[image resolution|resolution]] are critical specifications; modern high-end AR glasses target several tens of degrees of FOV and high pixel counts (the [[Apple Vision Pro]] delivers more pixels than a [[4K resolution|4K]] display to each eye) to make digital content sharp and immersive. Brightness (measured in [[candela per square metre|nits]]) is also key, since the displays must compete with ambient light when overlaying the real world.
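The trade-off between field of view and resolution can be made concrete with a back-of-the-envelope angular-resolution calculation. The sketch below is illustrative only; the panel and FOV figures are assumptions for the example, not manufacturer specifications.

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Approximate horizontal angular resolution of a headset display.

    Assumes pixels are spread uniformly across the field of view,
    which ignores lens distortion but is fine for a first estimate.
    """
    return h_pixels / h_fov_deg

# Illustrative (assumed) numbers: the same 4K-class panel (3840 px wide)
# spread over a narrow 50-degree AR field of view versus a wide
# 100-degree VR-style field of view.
narrow_fov = pixels_per_degree(3840, 50)
wide_fov = pixels_per_degree(3840, 100)

# 20/20 visual acuity corresponds to roughly 60 pixels per degree, so
# the same panel looks sharp at 50 degrees but visibly coarser at 100.
print(f"50° FOV: {narrow_fov:.1f} px/deg, 100° FOV: {wide_fov:.1f} px/deg")
```

This is why widening the FOV without raising pixel count makes content look grainier: the same pixels are stretched over more of the retina.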
=== Sensors and tracking ===
AR glasses require extensive sensing to align virtual imagery with the environment. Typical sensors include multiple [[camera]]s ([[RGB color model|RGB]], [[depth sensor]]s or [[time-of-flight camera|time-of-flight]]/[[LiDAR]] units, infrared cameras) and an [[inertial measurement unit]] (IMU) combining [[accelerometer]]s, [[gyroscope]]s and [[magnetometer]]s. The cameras scan the surroundings while the IMU tracks rapid head movements; together they feed [[computer vision]] and [[simultaneous localization and mapping|SLAM]] algorithms that continuously map the 3D world and localize the user. Systems perform [[visual-inertial odometry]] to determine the device's position and detect feature points to anchor holograms in space. Some devices add [[eye tracking]] cameras (for gaze input or foveated rendering) and [[hand tracking]] via depth cameras, enabling intuitive interaction. The [[Apple Vision Pro]], for example, includes a LiDAR scanner for coarse 3D mapping and Apple's TrueDepth camera system to fuse a real-time depth map of the scene. As a result, AR glasses can "see" what the user sees and render graphics that remain registered to real objects even as the wearer moves.

=== Processing and power ===
The computational load of AR is high, spanning graphics rendering, vision processing, [[spatial mapping]] and input recognition. Standalone (untethered) devices therefore use powerful low-power [[system-on-chip]] (SoC) platforms similar to smartphone silicon but optimized for extended reality (XR): [[Microsoft HoloLens|HoloLens]] pairs a [[Qualcomm Snapdragon]] XR-series SoC with a dedicated Holographic Processing Unit (HPU) for sensor fusion, while the [[Apple Vision Pro]] uses two [[Apple silicon]] chips (an [[Apple M2]]-class CPU/GPU plus the custom [[Apple R1|R1]] coprocessor) to handle sensor data and rendering. Other designs remain ''tethered'', offloading computation over a cable or wireless link to a smartphone, PC, or "compute puck", which reduces head-borne weight at the cost of mobility. In either case the frame must also pack memory, storage, connectivity ([[Bluetooth]], [[Wi-Fi]]) and a [[battery (electricity)|battery]].
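The sensor-fusion loop described above, fast but drifty IMU integration periodically corrected by slower, drift-free camera observations, can be sketched in miniature. The following is a simplified, hypothetical one-dimensional complementary filter for illustration only; real headsets fuse full 6-DoF poses with Kalman-style estimators, and the gain and timing values here are arbitrary assumptions.

```python
def fuse_pose(gyro_rates, camera_fixes, dt=0.01, alpha=0.8):
    """Tiny 1-D complementary filter illustrating visual-inertial fusion.

    gyro_rates: angular-velocity samples (deg/s) from the IMU, one per tick.
    camera_fixes: absolute angle observations (deg) from vision, or None on
        ticks with no camera frame. The gyro term tracks fast head motion;
        each camera fix pulls the estimate back toward the drift-free
        visual measurement (alpha tuned low because fixes are sparse).
    """
    angle = 0.0
    for rate, fix in zip(gyro_rates, camera_fixes):
        angle += rate * dt              # fast, drift-prone inertial update
        if fix is not None:             # slower, absolute visual correction
            angle = alpha * angle + (1 - alpha) * fix
    return angle

# Head turning at a true 10 deg/s for 1 s: the gyro reads slightly high
# (bias), and the camera reports the true angle on every 10th tick.
rates = [10.5] * 100                                    # biased gyro samples
truth = [10.0 * 0.01 * (i + 1) for i in range(100)]     # true angle per tick
fixes = [t if i % 10 == 9 else None for i, t in enumerate(truth)]

fused = fuse_pose(rates, fixes)
gyro_only = sum(r * 0.01 for r in rates)  # pure dead reckoning, ≈ 10.5 deg
# The fused estimate lands closer to the true 10.0 deg than the gyro alone,
# which is exactly why holograms stay anchored despite IMU drift.
```

The same principle scales up: inertial data keeps latency low between camera frames, while visual observations cancel the accumulated drift.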
== Types of AR glasses ==
AR glasses can be categorized by several criteria:

* '''[[Monocular]] vs. binocular:''' ''Monocular'' AR glasses have a display for one eye only; the other eye sees the world unobstructed. These are typically simpler and lighter, offering a limited [[field of view]] but leaving one eye fully aware of the real surroundings. Monocular designs (e.g. some [[Epson Moverio|Epson]] or [[Vuzix]] models) reduce weight and cost and keep one eye free for safety-critical tasks, which matters in industrial use. ''Binocular'' AR glasses have displays for both eyes, allowing a more immersive and natural experience: the user sees a wide virtual screen as if floating in front of them and gains true [[stereoscopy|stereoscopic]] 3D perception and better depth cues. Binocular systems are generally higher-end ([[Microsoft HoloLens|HoloLens]], [[Magic Leap]], [[Apple Vision Pro]]) and support richer content, but tend to be bulkier and more expensive.

* '''[[Tethered computing|Tethered]] vs. standalone:''' Some AR headsets must be connected (tethered) to an external PC, smartphone, or compute pack that provides processing and graphics, via cable or a short-range link. Tethered devices can draw on more powerful off-board [[GPU]]s and often have lighter head units, since power and heavy compute sit elsewhere. Standalone (untethered) glasses contain all electronics and a [[battery (electricity)|battery]] on board, allowing full mobility at the cost of extra weight. For example, the first-generation [[Magic Leap One]] used a wearable "Lightpack" compute pack, whereas [[Microsoft HoloLens|HoloLens]] and [[Apple Vision Pro]] are standalone headsets running on onboard [[ARM architecture|ARM]] [[system-on-chip|SoCs]].

* '''Optical see-through vs. video pass-through:''' Most AR glasses are ''optical see-through'': the eyes view the real world directly through transparent optics, with virtual images superimposed. A few devices, especially high-end [[mixed reality]] systems, use ''video pass-through'': cameras capture the real world, digital content is mixed into the video feed, and the composite is shown on opaque displays. Pass-through gives the system full control of the view and enables richer 3D effects, but introduces [[latency (engineering)|latency]] and removes direct visual perception. (The [[Apple Vision Pro]] primarily uses video pass-through to achieve high realism, whereas most enterprise AR glasses remain optical.)

== Key applications ==
AR glasses find use in many domains:

* '''Enterprise and industry:''' One of the largest markets today. In [[manufacturing]] and field service, AR glasses enable [[remote assistance]] and maintenance support: a technician can see step-by-step instructions or 3D diagrams overlaid on equipment and share their view with a remote expert for real-time guidance. Analysts cite significant productivity gains; an [[Ericsson]] report noted that AR/VR-enabled remote assistance and training can cut maintenance time by up to 50% and improve first-time fix rates by about 30%.<ref name="Ericsson2025" /> AR glasses are also used in [[logistics]] (hands-free "pick-by-vision" order picking), [[assembly line|assembly]] (visual overlays for wiring or components), and [[quality control]].

* '''Consumer and entertainment:''' Consumer applications are growing but more nascent. Games and experiences can leverage AR glasses for immersive, hands-free play, extending smartphone [[location-based game]]s; wearable AR also enables on-demand media such as watching video on a large virtual screen anywhere. [[Social media]] and communication are further use cases, for example sharing augmented video streams or placing 3D [[avatar (computing)|avatars]] of friends in one's environment. [[Navigation]] is a natural fit: instead of looking down at a map, AR glasses can display directions on the road ahead. Some visionaries foresee "[[metaverse]]" uses where people interact via AR overlays in daily life, but as of 2025 the consumer ecosystem remains limited by hardware and apps.

* '''Remote collaboration and assistance:''' Closely related to enterprise use, AR glasses let a user stream what they see to remote participants, who can draw annotations or highlight objects in the shared view. Field maintenance, [[healthcare]] ([[telemedicine]]), and customer service all benefit: a mechanic wearing AR glasses can have a remote engineer overlay repair instructions on the engine. Hands-free remote assistance cuts travel and speeds up problem solving, and in training, AR can safely simulate failure modes (such as a virtual fire or electrical fault) on real equipment.

* '''Specialized professional use:''' Surgeons use AR glasses to overlay [[medical imaging|medical images]] onto a patient during operations; engineers visualize [[3D model]]s on real sites; retailers experiment with in-store navigation and product demos. Audio-focused wearables (such as [[Bose Frames]] or [[Amazon Echo Frames|Echo Frames]]) blur the line between AR and smart eyewear by providing augmented audio cues, and theme parks and museums have used AR headsets for interactive experiences.

* '''Military and space:''' Defense applications include [[heads-up display]]s for pilots and augmented vision for training and combat, with militaries exploring AR for [[situational awareness]] and field maintenance. [[NASA]] astronauts, including Scott Kelly, have tested [[Microsoft HoloLens]] aboard the [[International Space Station]] under Project Sidekick to support crew training and assembly tasks.<ref name="NASA2015">{{cite web |title=NASA, Microsoft Collaborate to Bring Science Fiction to Science Fact |url=https://www.nasa.gov/news-release/nasa-microsoft-collaborate-to-bring-science-fiction-to-science-fact/ |website=NASA |date=25 June 2015 |access-date=30 April 2025}}</ref>
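The latency penalty of video pass-through noted above can be made concrete with a motion-to-photon budget. The stage names and timings below are rough assumptions for illustration, not measurements of any shipping device; a commonly cited comfort target is keeping total motion-to-photon latency under roughly 20 ms.

```python
# Illustrative motion-to-photon budget for a hypothetical video
# pass-through headset. All stage timings are assumed values.
PIPELINE_MS = {
    "camera_exposure_readout": 5.0,  # capturing the real-world frame
    "image_processing": 4.0,         # ISP, distortion correction
    "render_composite": 5.0,         # mixing virtual content into the feed
    "display_scanout": 4.0,          # pushing pixels to the panels
}

COMFORT_TARGET_MS = 20.0  # commonly cited motion-to-photon comfort target

total = sum(PIPELINE_MS.values())
headroom = COMFORT_TARGET_MS - total

# An optical see-through design skips the camera and compositing stages
# for real-world light, which reaches the eye directly; only the virtual
# overlay suffers pipeline latency.
print(f"total = {total:.1f} ms, headroom = {headroom:.1f} ms")
```

Even with these optimistic numbers the whole budget is nearly spent, which is why pass-through systems invest heavily in dedicated coprocessors for the camera-to-display path.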
== Leading products and companies ==
Major technology companies and startups have developed AR glasses:

* '''[[Microsoft HoloLens]]:''' [[Microsoft]]'s HoloLens (first generation 2016, second generation 2019) is a leading enterprise AR headset. It features binocular [[waveguide]] displays, [[depth camera]]s, and gesture and voice controls, and runs [[Windows Mixed Reality]]. HoloLens devices have been used in engineering, design, and healthcare.
* '''[[Magic Leap]]:''' Magic Leap launched the [[Magic Leap One]] in 2018 and Magic Leap 2 in 2022, AR headsets targeting developers and enterprise. The Magic Leap One paired a lightweight visor with an external compute pack and used [[diffractive waveguide|diffractive]] optics; reviewers noted a more natural field of view and display quality compared to the original HoloLens.
* '''[[Apple Inc.|Apple]]:''' Apple entered the space with the [[Apple Vision Pro]] (announced 2023, shipped 2024), a high-end [[mixed reality]] headset priced at US$3,499. Vision Pro has ultra-high-resolution micro-OLED panels (one per eye), advanced [[eye tracking]], and runs the new visionOS. Apple is also reported to be developing lighter, spectacle-style AR glasses for future release.
* '''[[Snap Inc.]]:''' Snap pioneered consumer AR glasses with its [[Spectacles (Snap)|Spectacles]] line. The original Spectacles (2016) were camera sunglasses; the 2021 fourth generation introduced true AR via dual waveguide displays and depth sensing. Spectacles pair with Snap's [[Lens Studio]] AR platform for content creators.
* '''[[Vuzix]]:''' Vuzix produces a range of AR eyewear (e.g. the [[Vuzix Blade]] and M-Series). The Blade is a sunglasses-style see-through smartglass with [[Amazon Alexa]] built in. Vuzix primarily targets enterprise uses such as warehousing and logistics and was among the first to market consumer-style AR glasses.
* '''[[Epson Moverio]]:''' [[Epson]]'s Moverio series features binocular transparent displays using Si-OLED technology. Moverio headsets (e.g. BT-300, BT-45) have been used for [[unmanned aerial vehicle|drone]] piloting, maintenance, and education.
* '''Other notable devices:''' The ''[[RealWear]] HMT'' is an industrial wearable with a monocular near-eye display and voice control, used for hands-free guided work (though not see-through). ''[[Ray-Ban Stories]]'' (2021), co-developed by [[Meta Platforms]] and [[Luxottica]], look like normal sunglasses with built-in cameras and speakers but do not project AR imagery. ''[[North Focals]]'' (discontinued) were smart glasses by [[North (company)|North]] (acquired by [[Google]]) that projected basic notifications via tiny laser projectors.

{| class="wikitable"
|+ Summary of notable AR devices
! Device !! First release !! Notes
|-
| [[Microsoft HoloLens 2]] || 2019 || Binocular, waveguide optics, hand-tracking
|-
| [[Magic Leap 2]] || 2022 || 70° FOV, dynamic dimming, enterprise focus
|-
| [[Apple Vision Pro]] || 2024 || Dual 4K-class micro-OLED, eye-tracking, video pass-through
|-
| [[Snap Spectacles]] (4th gen) || 2021 || 46° FOV waveguides, creator beta
|-
| [[Vuzix Blade 2]] || 2023 || Sunglass form factor, ANSI-rated for industry
|-
| [[Epson Moverio]] BT-45 || 2022 || Si-OLED binocular smart glasses
|}

In addition to hardware, AR ecosystems involve software firms. [[Meta Platforms]] (with [[Ray-Ban]] and its [[Project Aria]] and Orion research programs), [[Google]] ([[Google Glass|Glass Enterprise Edition]], [[Android (operating system)|Android]]), [[Apple Inc.|Apple]] ([[iOS]]/visionOS), and [[Qualcomm]] ([[Qualcomm Snapdragon|Snapdragon XR]] chips) play major roles. Emerging players include [[Varjo]] (high-end enterprise XR), [[Intel]]'s [[Intel RealSense|RealSense]] depth-sensing technology, and startups such as [[Xreal]] (formerly Nreal).
== Software platforms and ecosystems ==
AR glasses rely on software frameworks and content ecosystems:

* '''[[ARKit]] and [[ARCore]]:''' [[Apple Inc.|Apple]]'s ARKit and [[Google]]'s ARCore (both introduced in 2017) are mobile AR development platforms for [[iOS]] and [[Android (operating system)|Android]] respectively. They provide tracking, scene understanding, and rendering tools for developers. Many smartphone AR apps use them today, and some AR glasses based on mobile operating systems leverage them as well.
* '''[[Mixed reality]] platforms:''' [[Microsoft]]'s HoloLens uses the [[Windows Mixed Reality]] platform and supports development via [[Unity (game engine)|Unity]], [[Unreal Engine]], or the Windows SDK. [[Magic Leap]] provides its Lumin OS and SDK, integrated with the popular engines. Apple's Vision Pro introduced visionOS (based on [[iOS]]) for 3D app development, supported in [[Xcode]] with RealityKit.
* '''Snap AR and [[Lens Studio]]:''' [[Snap Inc.|Snap]] provides the Lens Studio authoring environment for building AR experiences ("Lenses") for both the [[Snapchat]] app and [[Spectacles (Snap)|Spectacles]] hardware; custom Lenses created in Lens Studio can be pushed directly to the headset.
* '''[[WebXR]]:''' The WebXR API enables AR experiences in [[web browser]]s on supported devices. Browsers such as [[Google Chrome|Chrome]] can run lightweight AR without a native app, easing cross-platform deployment.
* '''Other SDKs:''' Many AR glasses build on general development tools (Unity, Unreal Engine) with AR toolkits, proprietary SDKs such as Qualcomm's Snapdragon XR SDK, and enterprise solutions ([[PTC Vuforia|Vuforia]], [[Wikitude]], and others) that support various headsets.
* '''Ecosystem integration:''' In practice, AR glasses often integrate with smartphone or [[cloud computing|cloud]] services: [[Google Glass]] paired with Android apps, HoloLens connects to [[Microsoft Azure|Azure]] services, and Apple's headsets interoperate with [[iPhone]]/[[iPad]] apps. Interoperability standards such as [[OpenXR]] aim to let AR and VR content run across different headsets.
== Privacy, ethics, and social acceptance ==
AR glasses raise notable [[privacy]] and social concerns. Because they often include cameras and microphones recording the wearer's surroundings, bystanders may feel [[surveillance|surveilled]]. The launch of [[Google Glass]] famously sparked public outcry: wearers were sometimes banned from venues and became known derogatorily as "Glassholes". Security analysts warn that AR devices collect rich personal data (video, audio, spatial scans), potentially more intimate than social-media data, creating heightened privacy risk, and questions arise about how [[facial recognition system|face recognition]] or [[eye tracking]] data might be used.

Ethical issues include digital distraction and misinformation: unsolicited or misleading virtual overlays could confuse users. There are also concerns about the [[digital divide]] and about constant connectivity changing social norms (for example, people wearing always-on displays in conversation). Safety is a further issue; poor [[ergonomics]] or inattentiveness caused by overlays could lead to accidents, much like [[distracted driving]]. Advocates argue that AR must therefore be developed responsibly, with opt-out mechanisms, [[privacy by design]], and ethical guidelines.

Social acceptance hinges on [[form factor]] and aesthetics. Early AR glasses were often bulky and visually obvious, which hindered adoption; newer designs focus on making AR glasses look like normal eyewear. Optical engineers also work to eliminate distracting artifacts (such as "eye glow" from stray light, [[ghosting (television)|ghosting]], and rainbow effects) that can annoy others or break immersion. Wearables that obscure the face or appear intrusive are likely to remain stigmatized, and surveys suggest many consumers will not embrace conspicuous AR hardware until it becomes stylish, unobtrusive, and clearly useful.

Overall, the technology is still earning public trust. High-profile missteps, from privacy controversies to bulky gadgets, have made users cautious, but as devices shrink and perform clearly useful tasks (translation, navigation, contextual information), acceptance may grow. Companies are also sensitive to "reasonable expectation of privacy" laws, and these debates around privacy and etiquette continue to shape AR development.
| | |
== Market trends, forecasts, and adoption barriers ==

Market analysts project steady growth but identify several barriers. Industry reports indicate that [[enterprise software|enterprise]] demand currently outpaces [[consumer electronics|consumer]] demand, because businesses can more readily justify the productivity gains. For example, [[AR Insider]] notes that “bulky headgear isn’t as much of an issue for industrial work,” and expects consumer adoption to catch up later. [[Counterpoint Research]] forecasts that AR/[[AI]] smart glasses will see stronger growth in 2025, driven by advances like on-device AI and new platforms (e.g. [[Android (operating system)|Android]]’s XR OS). [[IDC]] data (via [[Neowin]]) shows AR/[[VR headset|VR]] headset shipments modestly up (around +10% in 2024), with [[Meta Platforms]] currently dominating the market. [[Apple Inc.|Apple]]’s [[Apple Vision Pro]] (launched 2024) briefly boosted interest but at a niche price point.

Forecasts vary: one analyst suggests the AR glasses market (hardware alone) could reach a couple of billion dollars by 2030, while others see the broader “[[spatial computing]]” market (including software) scaling to hundreds of billions of dollars. Key trends enabling AR include [[miniaturization|miniaturized]] optics ([[waveguide]]s, [[light engine]]s), [[5G]] connectivity (for [[cloud computing|cloud]] offload), and [[AI]]-driven [[computer vision]] (improving mapping and context).

Despite enthusiasm, barriers remain. Hardware is still expensive ([[Microsoft HoloLens|HoloLens]] and [[Apple Vision Pro|Vision Pro]] cost thousands of dollars), power-hungry ([[battery life]] is often only a few hours), and somewhat cumbersome. Many AR glasses also have a relatively small [[field of view]] (FOV) compared to human vision, limiting immersion. The ecosystem of apps is also immature: unlike [[smartphone]]s, there are few “[[killer application]]s” that compel mainstream users to wear glasses all day. [[Privacy]] and social concerns (as above) may slow adoption in public settings. Finally, technical challenges like [[sunlight readability]] and [[eye strain]] need solving before mass consumer acceptance. As of 2025, experts generally agree that AR glasses are promising for specialized applications, but widespread consumer use will take more years of innovation and content development.
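The FOV limitation can be made concrete with a back-of-the-envelope pixels-per-degree (PPD) calculation; roughly 60 PPD is commonly cited as matching normal human visual acuity. All figures below are illustrative examples, not the specifications of any real product:

```python
# Illustrative pixels-per-degree (PPD) arithmetic for AR displays.
# All numbers are hypothetical examples, not specs of a real product.

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average angular resolution across the horizontal field of view."""
    return horizontal_pixels / horizontal_fov_deg

# A 2048-pixel-wide microdisplay spread over a 40-degree FOV:
narrow = pixels_per_degree(2048, 40)    # ~51 PPD: reasonably sharp text
# The same panel stretched across a 100-degree FOV:
wide = pixels_per_degree(2048, 100)     # ~20 PPD: visibly coarse overlays

print(f"40 deg FOV: {narrow:.1f} PPD; 100 deg FOV: {wide:.1f} PPD")
```

The same pixel budget spread over a wider view lowers sharpness, which is one reason vendors keep FOVs narrow until microdisplay resolution improves.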
== Future outlook and ongoing research directions ==

Looking forward, research and development are focused on overcoming current limitations. In optics, new approaches such as inverse-designed [[metasurface]] [[waveguide]]s are emerging. For example, researchers at [[Nvidia]] recently demonstrated prototype “holographic AR glasses” using [[AI]]-optimized metasurface gratings, enabling full-color 3D [[hologram]]s in a much slimmer [[form factor]] than conventional designs. Such [[nanophotonics|nanophotonic]] [[waveguide]]s could eventually allow true [[mixed reality]] visuals in a normal glasses frame. Other research explores [[retinal projection]] (directly scanning images onto the eye) and focus-free optics ([[multi-focal display|multi-focal]] or [[varifocal display|varifocal]] displays) to reduce [[eye strain]].
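The trapping condition such in-coupling gratings must satisfy follows from the standard grating equation: diffracted light has to exceed the waveguide’s critical angle so that it propagates by total internal reflection. The sketch below uses hypothetical values (index-1.8 glass, 390 nm pitch, 532 nm green light, normal incidence, first diffraction order), not parameters of any published design:

```python
import math

def in_coupled_angle_deg(wavelength_nm: float, pitch_nm: float,
                         n_glass: float) -> float:
    """First-order diffraction angle inside the waveguide at normal incidence:
    n_glass * sin(theta) = wavelength / pitch  (grating equation, m = 1)."""
    s = wavelength_nm / (n_glass * pitch_nm)
    if s > 1:
        raise ValueError("first order is evanescent for these parameters")
    return math.degrees(math.asin(s))

n_glass = 1.8                                    # hypothetical high-index glass
critical = math.degrees(math.asin(1 / n_glass))  # TIR critical angle, ~33.7 deg
theta = in_coupled_angle_deg(532, 390, n_glass)  # ~49.3 deg for these values

# theta > critical, so the diffracted light is trapped and guided along
# the lens toward the out-coupling grating in front of the eye.
print(f"in-coupled {theta:.1f} deg vs critical {critical:.1f} deg")
```

A finer pitch steepens the in-coupled angle but can push some wavelengths past the evanescent limit, which is one reason full-color waveguides are hard to design.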
|
On the electronics side, [[low-power electronics|low-power]] [[processor]]s and dedicated AR chips continue to improve. For instance, [[Apple Inc.|Apple]]’s [[Apple Vision Pro|Vision Pro]] introduced a custom [[Apple R1|R1]] chip to process sensor input with very low [[latency (engineering)|latency]]. Future devices will likely incorporate on-device [[AI]] for tasks like [[object recognition]], [[machine translation|translation]] and [[adaptive interface]]s. [[Meta Platforms|Meta]]’s [[Project Orion (Meta)|Orion]] prototype, as mentioned, aims to integrate a “personal AI assistant” that understands the environment and user intent. Advances in [[battery (electricity)|battery]] technology and [[wireless power transfer|wireless power]] (as well as [[edge computing]]/[[cloud computing|cloud]] compute over [[5G]]) will extend usage time.
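As an illustration of the kind of lightweight fusion such sensor coprocessors run at high rates, the classic complementary filter blends fast gyroscope integration with a slow but drift-free accelerometer reference at negligible compute cost. This is a generic textbook sketch for a single rotation axis, not the actual pipeline of the R1 or any other commercial chip:

```python
def complementary_filter(angle_deg: float, gyro_rate_dps: float,
                         accel_angle_deg: float, dt_s: float,
                         alpha: float = 0.98) -> float:
    """Fuse one orientation axis: integrate the gyro for responsiveness,
    then pull the estimate toward the accelerometer angle to cancel drift."""
    integrated = angle_deg + gyro_rate_dps * dt_s
    return alpha * integrated + (1 - alpha) * accel_angle_deg

# 100 Hz updates: the gyro reports a steady 10 deg/s rotation while the
# accelerometer keeps insisting the head is level (0 deg), so the estimate
# settles well below the naive gyro integral of 10 deg.
angle = 0.0
for _ in range(100):                     # one second of samples
    angle = complementary_filter(angle, 10.0, 0.0, 0.01)
print(f"fused angle after 1 s: {angle:.2f} deg")
```

Production headsets use heavier visual-inertial odometry, but the principle is the same: cheap high-rate prediction corrected by slower absolute references.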
|
| Software advances will come from richer development platforms. Cross-platform standards ([[OpenXR]]) and AR-specific engines will make it easier to build compelling content. Frameworks for [[persistent AR]] (shared maps of the world) and secure AR data handling are active areas. We will also see integration with [[Internet of Things|IoT]]: AR glasses could access environmental sensors or control [[smart device]]s as part of the scene.
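The persistent-AR idea can be sketched as a serializable anchor store: world-locked poses keyed by ID that are saved in one session and resolved in another, possibly on a different device. This is a deliberately minimal conceptual sketch; production systems (such as ARKit’s world maps or ARCore’s Cloud Anchors) attach anchors to re-localizable feature maps rather than to raw coordinates:

```python
import json

class AnchorStore:
    """Toy persistent-AR anchor map: anchor ID -> pose (position + quaternion)."""

    def __init__(self) -> None:
        self._anchors: dict = {}

    def place(self, anchor_id: str, position: list, rotation_xyzw: list) -> None:
        """World-lock a pose under a stable identifier."""
        self._anchors[anchor_id] = {"p": position, "q": rotation_xyzw}

    def resolve(self, anchor_id: str):
        """Look up a previously placed pose, or None if unknown."""
        return self._anchors.get(anchor_id)

    def save(self) -> str:
        """Serialize the map so another session or device can reload it."""
        return json.dumps(self._anchors)

    @classmethod
    def load(cls, blob: str) -> "AnchorStore":
        store = cls()
        store._anchors = json.loads(blob)
        return store

# Session 1 places a note on the kitchen wall and shares the map...
store = AnchorStore()
store.place("kitchen-note", [0.4, 1.2, -2.0], [0, 0, 0, 1])
shared = store.save()                # e.g. uploaded to a cloud service

# ...session 2 (another device) reloads it and resolves the same anchor.
restored = AnchorStore.load(shared)
print(restored.resolve("kitchen-note"))
```

The hard part in practice is not the storage but re-localization: recognizing the same physical place well enough to re-attach the stored poses.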
|
| On the consumer front, upcoming products are anticipated from major tech firms. Rumors in 2025 include a lower-cost [[Apple Inc.|Apple]] AR headset (or glasses), and [[Meta Platforms|Meta]]’s eventual consumer AR glasses following the enterprise Orion. Companies in Asia (like [[Huawei]] or China’s [[Xreal]]) are also developing stylish AR spectacles. Meanwhile, new use cases (e.g. [[augmented contact lens|augmented contact lenses]], surgical AR overlays, AR in [[automotive head-up display|automotive HUDs]]) will expand the definition of “AR glasses”.
|
| In summary, [[AR glasses]] are at an inflection point. Years of research in optics, sensing, and [[AI]] are coming together to create devices that may finally be practical for everyday use. Ongoing work spans hardware (lighter optics, better displays), software (AI-driven AR experiences) and user studies ([[privacy]] norms, health effects). While challenges remain, the convergence of technology and growing interest from industry and consumers suggests that true augmented reality eyewear could transition from niche tool to mainstream platform in the coming decade.
|
== References ==
<references/>