Head-mounted display
{{see also|Terms|Technical Terms}}
[[File:oculus rift dk11.jpg|350px|thumb|right|[[Oculus Rift DK1]] released in 2013]]
A [[head-mounted display]] ('''HMD''') is a [[display]] [[device]], worn on the head or as part of a [[helmet]] (see [[Helmet-mounted display]]), that has a small display optic in front of one ([[monocular]] HMD) or each eye ([[binocular]] HMD). HMDs serve various purposes in gaming, aviation, engineering, and medicine, and are the primary delivery systems for [[Virtual Reality]] (VR), [[Augmented Reality]] (AR), and [[Mixed Reality]] (MR) experiences, particularly when supporting a seamless blend of physical and digital elements.<ref name="Sutherland1968">Sutherland, Ivan E. (1968-12-09). "A head-mounted three dimensional display". ACM Digital Library. Retrieved 2023-10-27. https://dl.acm.org/doi/10.1145/1476589.1476686</ref><ref name="idc2025">IDC (25 March 2025). "Growth Expected to Pause for AR/VR Headsets, according to IDC". Retrieved 2025-05-15. https://www.idc.com/getdoc.jsp?containerId=prUS53278025</ref>
HMDs function by presenting imagery, data, or a combination thereof directly to the wearer's visual field. Many modern HMDs are [[stereoscopic]], featuring separate displays or distinct images rendered for each eye to create a sense of depth through [[binocular disparity]]. Examples include VR headsets like the [[Meta Quest 3]] and [[Valve Index]]. Other HMDs, particularly earlier AR devices or specialized notification displays like the original [[Google Glass]], may be monocular, presenting information over only one eye.<ref name="GoogleGlassPatent">Heinrich, Jerome (assignee: Google Inc.) (2014-07-29). "Wearable display device". Google Patents. Retrieved 2023-10-27. https://patents.google.com/patent/US8791879B1/en</ref>
The vast majority of consumer and enterprise VR and AR systems rely on HMDs. In AR applications, the display system is typically designed to be see-through, allowing digital information to be superimposed onto the user's view of the real world. These are often specifically termed [[Optical head-mounted display]]s (OHMDs), utilizing technologies like [[Waveguide (optics)|waveguides]] or [[beam splitter]]s.<ref name="AROpticsReview">Kress, Bernard C. & Starner, Thad (2018-11-01). "Optical see-through head-mounted displays: a review". ''Applied Optics''. '''57''' (31): 9311-9325. doi:10.1364/AO.57.009311. Retrieved 2023-10-27. https://www.osapublishing.org/ao/abstract.cfm?uri=ao-57-31-9311</ref> In VR applications, the display system is opaque, completely blocking the user's view of the real world and replacing it with a computer-generated virtual environment, aiming for high levels of [[immersion]] and [[presence]].<ref name="VRBookSlater">Slater, Mel & Sanchez-Vives, Maria V. (2016). "Chapter 1: Immersive Virtual Reality". ''Enhancing Our Lives with Immersive Virtual Reality''. Elsevier. ISBN 978-0128046377.</ref> Some modern VR HMDs incorporate external [[camera]]s to provide [[video passthrough]] capabilities, enabling a form of AR or "Mixed Reality" where the real world is viewed digitally on the opaque screens with virtual elements overlaid.
==History==
The concept of a head-mounted display dates back further than often realized. One of the earliest precursors was Morton Heilig's "Telesphere Mask", patented in 1960, a non-computerized, photographic-based stereoscopic viewing device intended for individual use.<ref name="HeiligPatent">Heilig, Morton L. (1960-10-04). "Stereoscopic-television apparatus for individual use". Google Patents. Retrieved 2023-10-27. https://patents.google.com/patent/US2955156A/en</ref><ref name="heilig1960">USPTO (28 June 1960). "US 2,955,156 — Stereoscopic-television apparatus for individual use". Retrieved 2024-05-15. https://patents.google.com/patent/US2955156A</ref>
However, the first true HMD connected to a computer is widely credited to [[Ivan Sutherland]] and his student Bob Sproull at Harvard University and later the University of Utah, around 1968. Dubbed the "[[Sword of Damocles]]" due to its imposing size and the heavy machinery suspended from the ceiling required to support its weight and track head movement, it presented simple wireframe graphics in stereo. This system pioneered many concepts still fundamental to VR and AR today, including head tracking and stereoscopic viewing.<ref name="Sutherland1968"/><ref name="sutherland1968">Wikipedia (20 April 2025). "The Sword of Damocles". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/The_Sword_of_Damocles_(virtual_reality)</ref> | |||
Throughout the 1970s and 1980s, HMD development continued primarily in military (especially for aviator helmet-mounted displays) and academic research labs, driven by organizations like the US Air Force and [[NASA]].<ref name="NASA_HMD">Fisher, S. S.; McGreevy, M.; Humphries, J.; Robinett, W. (1986-01-01). "Virtual Environment Display System". NASA Technical Reports Server. Retrieved 2023-10-27. https://ntrs.nasa.gov/citations/19860018487</ref> The late 1980s and early 1990s saw a "first wave" of commercial VR interest, with companies like VPL Research, founded by [[Jaron Lanier]], popularizing the term "Virtual Reality" and developing HMDs like the "[[EyePhone]]". However, technology limitations (low [[resolution]], high [[latency]], limited processing power, high cost) prevented widespread adoption.<ref name="RheingoldVR">Rheingold, Howard (1991). ''Virtual Reality''. Simon & Schuster. ISBN 978-0671693633.</ref> Nintendo's [[Virtual Boy]] (1995), while technically an HMD, used red LED displays and lacked head tracking, failing commercially but remaining a notable early attempt at consumer VR.<ref name="VirtualBoyHistory">Edwards, Benj (2015-08-21). "Unraveling The Enigma Of Nintendo’s Virtual Boy, 20 Years Later". Fast Company. Retrieved 2023-10-27. https://www.fastcompany.com/3050016/unraveling-the-enigma-of-nintendos-virtual-boy-20-years-later</ref> | |||
The modern era of consumer VR HMDs was effectively kickstarted by [[Palmer Luckey]]'s prototype [[Oculus Rift]] in the early 2010s, which demonstrated that high-quality, low-latency VR was becoming feasible with modern mobile display panels and [[sensor]]s. Its subsequent Kickstarter success and acquisition by [[Facebook]] (now [[Meta Platforms|Meta]]) spurred renewed industry-wide investment.<ref name="OculusKickstarter">Kickstarter. "Oculus Rift: Step Into the Game". Retrieved 2023-10-27. https://www.kickstarter.com/projects/1523379957/oculus-rift-step-into-the-game</ref> This led to the release of numerous consumer HMDs: | |||
*2014 - [[Google Cardboard]] popularised low-cost, smartphone-driven VR viewers.<ref name="cardboard2014">Time Magazine (28 Jan 2016). "Google’s New Head of Virtual Reality on What They’re Planning Next". Retrieved 2024-05-15. https://time.com/4193755/google-cardboard-virtual-reality-clay-bavor-vr/</ref> | |||
*2015 - [[Samsung Gear VR]] improved on the smartphone HMD concept with better optics and integrated controls.<ref name="gearvr2015">Wikipedia (24 Apr 2025). "Samsung Gear VR". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/Samsung_Gear_VR</ref> | |||
*2016 - The consumer [[Oculus Rift]] (CV1) and [[HTC Vive]] established high-end, PC-tethered VR with wide FOV and robust external tracking systems.<ref name="rift2016">Wikipedia (15 Apr 2025). "Oculus Rift". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/Oculus_Rift</ref><ref name="lighthouse">Valve Software (12 Feb 2025). "Valve Index Base Stations". Retrieved 2024-05-15. https://www.valvesoftware.com/index/base-stations</ref> [[PlayStation VR]] brought tethered VR to the console market. | |||
*2019 - [[Oculus Quest]] pioneered high-quality standalone (untethered) 6DoF VR using inside-out camera tracking (Oculus Insight).<ref name="insight2019">Meta Reality Labs (14 Aug 2019). "The Story Behind Oculus Insight Technology". Retrieved 2024-05-15. https://tech.facebook.com/reality-labs/2019/8/the-story-behind-oculus-insight-technology/</ref> [[Valve Index]] pushed fidelity in the PC VR space. | |||
*2023 - [[Meta Quest 3]] adopted the [[Pancake lens|pancake-style optics]] first introduced on the premium [[Meta Quest Pro]] (launched October 2022), and added high-resolution full-colour passthrough mixed reality plus a faster mobile chipset.<ref name="QuestProPancake">{{cite web |url=https://about.fb.com/news/2022/10/meta-quest-pro-social-vr-connect-2022/ |title=Meta Connect 2022: Meta Quest Pro, More Social VR and a Look Into the Future |website=Meta Newsroom |date=11 October 2022 |access-date=2025-04-29}}</ref><ref name="Quest3Features">{{cite web |url=https://www.roadtovr.com/quest-3-features-hands-on-preview/ |title=Quest 3 Features Confirmed in First Hands-on |website=Road to VR |date=12 June 2023 |access-date=2025-04-29}}</ref><ref name="Quest3Review">{{cite web |url=https://www.uploadvr.com/quest-3-review/ |title=Quest 3 Review: Excellent VR With Limited Mixed Reality |website=UploadVR |date=16 October 2023 |access-date=2025-04-29}}</ref> | |||
*2024 - [[Apple Vision Pro]] launched as a premium "spatial computer" featuring high-resolution Micro-OLED displays, advanced eye and hand tracking, and spatial video capabilities.<ref name="visionpro">Apple Newsroom (08 Jan 2024). "Apple Vision Pro available in the U.S. on February 2". Retrieved 2024-05-15. https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/</ref> | |||
==Core Concepts and Principles== | |||
===Visual Pathway=== | |||
The fundamental operation of an HMD involves generating an image and directing it into the user's eyes. This typically follows a path: | |||
#'''[[Central Processing Unit]] (CPU) / [[Graphics Processing Unit]] (GPU)''': Process application logic, handle [[tracking]] data, and render the images (frames) intended for display. In [[Integrated HMD|standalone HMDs]], these processors are inside the headset; in [[Discrete HMD|tethered HMDs]], they are in a connected PC or console. | |||
#'''Display Panel(s)''': Small, high-resolution screens (for example [[LCD]], [[OLED]], [[Micro-OLED]]) receive the rendered images from the GPU. Binocular HMDs typically use either one panel displaying side-by-side images or two separate panels, one for each eye. | |||
#'''Optics ([[Lens|Lenses]])''': Placed between the display panels and the user's eyes, lenses serve multiple crucial functions: | |||
#*Magnification: They enlarge the small display image to fill a significant portion of the user's [[Field of View]] (FOV). | |||
#*Focus: They collimate the light or set the focal plane, typically at a distance of 1.5-2 meters or optical infinity, reducing eye strain compared to focusing on a screen inches away. | |||
#*[[Distortion Correction]]: Simple magnification often introduces optical distortion (like pincushion distortion). The rendered image is typically pre-distorted (barrel distortion) in software to counteract the lens distortion, resulting in a geometrically correct view for the user.<ref name="LensDistortionVR">Oculus Developer Documentation. "Distortion Correction". Retrieved 2023-10-27. https://developer.oculus.com/documentation/native/pc/dg-render-distortion/</ref> | |||
#'''Eyes''': The light (photons) carrying the image information passes through the lenses and enters the user's pupils, forming an image on the retina. | |||
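The pre-distortion step in the optics stage can be sketched as a radial warp over normalized image coordinates. The coefficients below are invented for illustration; real headsets ship per-lens calibration profiles, and runtimes typically warp each colour channel separately to also correct chromatic aberration.

```python
import numpy as np

def predistort(uv, k1=0.22, k2=0.24):
    """Pre-distort normalized image coordinates to cancel lens distortion.

    uv: (N, 2) coordinates centered on the lens axis, roughly in [-1, 1].
    k1, k2: illustrative radial coefficients (real values come from the
    headset's lens calibration data).
    Dividing by the radial polynomial is a simple approximate inverse of
    the lens's pincushion stretch r' = r * (1 + k1*r^2 + k2*r^4).
    """
    uv = np.asarray(uv, dtype=float)
    r2 = np.sum(uv ** 2, axis=-1, keepdims=True)   # squared radius per point
    return uv / (1.0 + k1 * r2 + k2 * r2 ** 2)     # pull samples inward

center = predistort([[0.0, 0.0]])   # the optical axis is unchanged
edge = predistort([[1.0, 0.0]])     # edge samples move toward the center
```

Rendering the frame with this barrel warp applied means the lens's pincushion distortion stretches it back into a geometrically correct view.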
===Stereoscopic Vision=== | |||
Most VR HMDs and many AR HMDs are stereoscopic. They achieve the perception of three-dimensional depth by presenting slightly different images to each eye, mimicking how humans perceive depth in the real world through binocular disparity. The GPU renders the virtual scene from two slightly offset virtual camera positions, corresponding to the user's left and right eyes. When viewed simultaneously, the brain fuses these two images into a single 3D percept.<ref name="HowardRogersBinocular">Howard, Ian P. & Rogers, Brian J. (1995). ''Binocular Vision and Stereopsis''. Oxford University Press. ISBN 978-0195084764.</ref> The distance between these virtual cameras should ideally match the user's [[Interpupillary distance]] (IPD) for accurate scale perception and comfort. | |||
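In practice, stereoscopic rendering reduces to rendering the scene twice from two camera poses separated along the head's local x axis. A minimal sketch (the 63 mm default and function name are illustrative; engines expose this through their VR APIs):

```python
import numpy as np

def eye_view_matrices(head_pose, ipd_m=0.063):
    """Build left/right eye view matrices from a 4x4 head pose.

    head_pose: the head's world transform (from the tracking system).
    ipd_m: interpupillary distance in meters; 63 mm is a common adult
    average, but a real system should use the user's measured IPD.
    """
    views = {}
    for eye, dx in (("left", -ipd_m / 2), ("right", +ipd_m / 2)):
        offset = np.eye(4)
        offset[0, 3] = dx                     # shift along head-local x
        eye_pose = head_pose @ offset         # eye camera in world space
        views[eye] = np.linalg.inv(eye_pose)  # view matrix = inverse pose
    return views

views = eye_view_matrices(np.eye(4))
```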
===Tracking===
Tracking the user's head movement is fundamental to creating immersive and interactive experiences, particularly in VR. As the user moves their head, the system updates the rendered images accordingly, making the virtual world appear stable and allowing the user to look around naturally. Failure to track accurately and with low latency can lead to disorientation and [[Motion sickness]] (often termed "cybersickness" in VR/AR contexts).<ref name="LaViolaMotionSickness">LaViola Jr., Joseph J. (2000). "A discussion of cybersickness in virtual environments". ''ACM SIGCHI Bulletin''. '''32''' (1): 47-56. doi:10.1145/333329.333033. Retrieved 2023-10-27. https://ieeexplore.ieee.org/document/947376</ref> Tracking operates in multiple [[Degrees of Freedom]] (DoF): | |||
*'''[[Rotational Tracking]] (3DoF)''': Tracks orientation changes: pitch (nodding yes), yaw (shaking no), and roll (tilting head side-to-side). This is the minimum required for a basic VR experience where the user can look around from a fixed viewpoint. It is typically achieved using an [[Inertial Measurement Unit]] (IMU) within the HMD, containing sensors like: | |||
**[[Accelerometer]]: Measures linear acceleration (and gravity).
**[[Gyroscope]]: Measures angular velocity.
**[[Magnetometer]]: Measures the local magnetic field (like a compass), used to correct for gyroscope drift, especially in yaw. [[Sensor fusion]] algorithms combine data from these sensors to provide a stable orientation estimate.<ref name="IMU_VR">Pell, Oliver (2017-07-12). "Use of IMU in Virtual Reality Systems". Analog Dialogue, Analog Devices. Retrieved 2023-10-27. https://www.analog.com/en/technical-articles/imu-in-virtual-reality-systems.html</ref><ref name="imu">Wikipedia (20 Apr 2025). "Inertial measurement unit". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/Inertial_measurement_unit</ref>
*'''[[Positional Tracking]] (6DoF)''': Tracks both orientation (3DoF) and translation (movement through space: forward/backward, left/right, up/down). This allows the user to physically walk around, lean, crouch, and dodge within the virtual environment, significantly enhancing immersion and interaction. 6DoF tracking is achieved through various methods: | |||
**'''[[Outside-in tracking]]''': External sensors (cameras or infrared emitters/detectors like [[Lighthouse (tracking system)|Valve's Lighthouse system]]) are placed in the room to track markers (passive reflective or active IR LED) on the HMD and controllers. Examples: Original Oculus Rift (Constellation), HTC Vive/Valve Index (Lighthouse).<ref name="LighthouseExplained">XinReality Wiki. "Lighthouse". Retrieved 2023-10-27. https://xinreality.com/wiki/Lighthouse</ref><ref name="lighthouse" /> | |||
**'''[[Inside-out tracking]]''': Cameras mounted on the HMD itself observe the surrounding environment. Computer vision algorithms, often employing [[Simultaneous Localization and Mapping]] (SLAM) techniques, identify features in the room and track the HMD's movement relative to them. This eliminates the need for external sensors, making setup easier and enabling larger, unrestricted tracking volumes. Most modern standalone and many tethered HMDs use inside-out tracking. Examples: Meta Quest series, HTC Vive Cosmos, Windows Mixed Reality headsets.<ref name="SLAM_VRAR">Yousif, K.; Bab-Hadiashar, A.; Hand, S. (2019-07-30). "A Review on SLAM Techniques for Virtual and Augmented Reality Applications". ''Sensors''. '''19''' (15): 3338. doi:10.3390/s19153338. PMC 6696193. Retrieved 2023-10-27. https://www.mdpi.com/1424-8220/19/15/3338</ref><ref name="insight2019" /> | |||
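The sensor-fusion step behind 3DoF tracking can be sketched with a complementary filter, the simplest fusion scheme: integrate the gyroscope for responsiveness, then nudge the result toward the accelerometer's gravity reference to cancel drift. This is a one-axis illustration; production trackers fuse all sensors with Kalman-style filters.

```python
import math

def fuse_pitch(pitch, gyro_rate, accel, dt, alpha=0.98):
    """One complementary-filter step for the pitch angle (radians).

    gyro_rate: angular velocity about the pitch axis (rad/s); precise in
    the short term but drifts. accel: (ax, ay, az) in m/s^2; gravity gives
    a noisy but absolute pitch reference. alpha weights gyro vs. accel.
    """
    ax, ay, az = accel
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))  # gravity reference
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# With the headset held level and still, an erroneous initial estimate
# decays toward zero instead of drifting away.
pitch = 0.1
for _ in range(200):
    pitch = fuse_pitch(pitch, gyro_rate=0.0, accel=(0.0, 0.0, 9.81), dt=0.01)
```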
===Latency=== | |||
[[Motion-to-photon latency]] is the time delay between a user's physical movement and the corresponding visual update on the display, and it is a critical factor for comfort and immersion. High latency is strongly correlated with cybersickness. Modern VR systems aim for latency below 20 milliseconds (ms), with many achieving closer to 10 ms under optimal conditions.<ref name="AbrashMTP">Abrash, Michael (2014-01-15). "What VR could, should, and almost certainly will be within two years". Steam Dev Days. Retrieved 2024-05-15. https://www.youtube.com/watch?v=G-2dQoeqVVo</ref><ref name="latency2022">MDPI Sensors (10 Aug 2022). "A Study on Sensor System Latency in VR Motion Sickness". Retrieved 2024-05-15. https://www.mdpi.com/2224-2708/10/3/53</ref>
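A back-of-the-envelope latency budget shows why refresh rate matters. The stage timings below are illustrative only, and modern runtimes shrink the effective figure further with late pose sampling and asynchronous reprojection (timewarp).

```python
def motion_to_photon_ms(refresh_hz, sensor_ms, render_ms):
    """Sum a simplified pipeline: sensor read + render + one frame of
    display scanout. All stage timings here are illustrative."""
    scanout_ms = 1000.0 / refresh_hz
    return sensor_ms + render_ms + scanout_ms

at_90hz = motion_to_photon_ms(90, sensor_ms=2.0, render_ms=11.1)    # ~24 ms
at_120hz = motion_to_photon_ms(120, sensor_ms=2.0, render_ms=8.3)   # ~19 ms
```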
==Key Technical Specifications==
The quality and characteristics of an HMD are determined by numerous technical specifications: | |||
===Display Technology===
The type of display panel used significantly impacts image quality. Common types include: | |||
*[[LCD]] (Liquid Crystal Display): Often offers higher pixel density (reducing the [[screen-door effect]]) and potentially lower cost, but may have slower response times and lower contrast compared to OLED. Modern LCDs in VR often use fast-switching technologies and [[Quantum dot display|quantum dots]] for better color.<ref name="LCDvsOLED_VR">AR/VR Tips. "LCD vs OLED VR Headsets: Which Screen is Best?". Retrieved 2023-10-27. https://arvrtips.com/lcd-vs-oled-vr-headsets/</ref> | |||
*[[OLED]] (Organic Light-Emitting Diode): Provides true blacks (infinite contrast ratio), vibrant colors, and very fast pixel response times (reducing motion blur or ghosting). OLED panels can be more susceptible to "[[Screen burn-in|burn-in]]" over long periods and may use [[PenTile matrix family|PenTile]] [[Subpixel rendering|subpixel layouts]] affecting perceived sharpness.
*[[Micro-OLED]] / [[OLEDoS]] (OLED-on-Silicon): Very small, high-resolution OLED displays built directly onto silicon wafers. Offer extremely high pixel densities (PPD) and brightness, often used in high-end or compact HMDs (for example [[Bigscreen Beyond]], [[Apple Vision Pro]]).<ref name="MicroOLED_Intro">OLED-Info. "MicroOLED displays". Retrieved 2023-10-27. https://www.oled-info.com/microoled</ref><ref name="microoled2025">Systems Contractor News (23 Apr 2025). "Dual micro-OLED displays grow within the AR/VR headset market". Retrieved 2024-05-15. https://www.svconline.com/proav-today/dual-micro-oled-displays-grow-within-the-ar-vr-headset-market</ref> | |||
*[[MicroLED]]: An emerging technology promising high brightness, efficiency, contrast, and longevity, potentially surpassing both LCD and OLED for HMDs. | |||
===[[Resolution]]=== | |||
The number of [[pixel]]s on the display(s), usually specified per eye (for example 2064 x 2208 per eye for Meta Quest 3) or sometimes as a total resolution. Higher resolution reduces the [[screen-door effect]] (the visible grid pattern between pixels) and increases image sharpness. [[Pixels Per Degree]] (PPD) is often a more perceptually relevant metric, combining resolution and FOV. Human visual acuity corresponds to roughly 60 PPD; current consumer VR is typically in the 20-35 PPD range, while high-end devices like Vision Pro exceed 40 PPD centrally. | |||
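Pixels per degree can be approximated by dividing per-eye resolution by FOV. A sketch with rough Quest 3-like numbers (the FOV figure is approximate, and real lenses concentrate more pixels near the center than this uniform average suggests):

```python
def pixels_per_degree(pixels, fov_deg):
    """Average angular pixel density along one axis."""
    return pixels / fov_deg

ppd = pixels_per_degree(2064, 104)   # ~20 PPD: roughly a third of the
                                     # ~60 PPD limit of human visual acuity
```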
===[[Refresh Rate]]=== | |||
The number of times per second the display updates the image, measured in Hertz (Hz). Higher refresh rates (for example 90Hz, 120Hz, 144Hz) lead to smoother motion, reduced flicker, and can help mitigate motion sickness. 90Hz is often considered a comfortable minimum for VR. Low persistence displays (where pixels are illuminated only for a fraction of the refresh cycle) are crucial in VR to reduce motion blur during head movements.<ref name="LowPersistence">Abrash, Michael (2014-07-28). "Understanding Low Persistence on the DK2". Oculus Developer Blog. Retrieved 2023-10-27. https://developer.oculus.com/blog/understanding-low-persistence-on-the-dk2/</ref> | |||
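The benefit of low persistence can be quantified: the smear across the retina is roughly angular head velocity, times the pixel illumination time, times pixel density. The numbers below are illustrative:

```python
def persistence_blur_px(head_deg_per_s, persistence_ms, ppd):
    """Image smear in pixels while the display is lit during a head turn."""
    return head_deg_per_s * (persistence_ms / 1000.0) * ppd

# A 100 deg/s head turn on a ~20 PPD display:
full = persistence_blur_px(100, 11.1, 20)  # pixels lit for a whole 90 Hz frame
low = persistence_blur_px(100, 2.0, 20)    # short low-persistence pulse
```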
===[[Field of View]] (FOV)===
The extent of the visual field visible through the HMD, usually measured horizontally, vertically, and/or diagonally in degrees. Human binocular vision covers roughly 200-220° horizontally (with ~120° stereoscopic overlap). VR HMDs aim for a wide FOV (typically 100°-110° horizontally for consumer devices, sometimes wider like [[Pimax]] headsets) to enhance immersion. AR OHMDs often have a much narrower FOV (for example 30°-55°) due to the challenges of see-through optics.<ref name="VR_FOV_Comparison">VR Compare. "Headset Feature: Field of View". Retrieved 2023-10-27. https://vr-compare.com/headsetfeature/fieldofview</ref> | |||
===Optics / [[Lens|Lenses]]=== | |||
The lenses used heavily influence FOV, image sharpness (center-to-edge), [[Chromatic aberration|chromatic aberration]], geometric distortion, and physical characteristics like size and weight. | |||
*[[Aspheric lens|Aspheric Lenses]]: Simple, often used in early or budget HMDs. Can be bulky. | |||
*[[Fresnel lens|Fresnel Lenses]]: Use concentric rings to reduce thickness and weight compared to simple aspheric lenses while maintaining a short focal length. Common in many VR HMDs (for example Rift CV1, Vive, Quest 2), but can introduce visual artifacts like concentric rings and "[[God rays]]" (stray light scattering off the ridges). | |||
*[[Pancake lens|Pancake Lenses]]: A newer, more complex folded optic design using polarization. Allow for significantly shorter distances between the display and lens, enabling much slimmer and lighter HMD designs. Often offer improved edge-to-edge clarity but can be less light-efficient, requiring brighter displays. Used in devices like Meta Quest Pro, Pico 4, Bigscreen Beyond.<ref name="PancakeOptics">Guttag, Karl (2021-12-09). "VR Optics (Part 1) - Brief History and Pancake Lenses". KGOnTech. Retrieved 2023-10-27. https://kguttag.com/2021/12/09/vr-optics-part-1-brief-history-and-pancake-lenses/</ref><ref name="optics2023">Expand Reality (05 Oct 2023). "Pancake vs Fresnel Lenses in VR Headsets". Retrieved 2024-05-15. https://landing.expandreality.io/pancake-vs.-fresnel-lenses-in-vr-headsets-advanced-optics-for-vr</ref> | |||
*[[Waveguide (optics)|Waveguides]] (AR): Used in many see-through OHMDs (for example HoloLens, Magic Leap). Light from a microdisplay is injected into a thin piece of glass or plastic and then directed out towards the eye using [[Diffractive optics|diffractive]] or reflective elements, allowing the user to see the real world through the waveguide. Achieving wide FOV and high efficiency with waveguides is challenging.<ref name="AROpticsReview"/><ref name="waveguide2022">Radiant Vision Systems (11 Jan 2022). "Ride the Wave: AR Devices Rely on Waveguides". Retrieved 2024-05-15. https://www.radiantvisionsystems.com/blog/ride-wave-augmented-reality-devices-rely-waveguides</ref> | |||
*[[Beam splitter|Beam Splitters / Birdbaths]] (AR): A simpler see-through optic where a partially reflective mirror combines light from a display with the view of the real world. Often bulkier and may have a smaller FOV or less uniform transparency than waveguides. Used in devices like Google Glass (using a prism variant) and Nreal/XREAL Air.<ref name="BirdbathOptics">Guttag, Karl (2019-04-01). "HoloLens 2 (HL2) and AR Optics in General (Part 1)". KGOnTech. Retrieved 2023-10-27. https://kguttag.com/2019/04/01/hololens-2-hl2-and-ar-optics-in-general-part-1/</ref> | |||
*[[Holographic Optical Elements]] (HOEs): Thin, lightweight optical components created using holographic recording techniques, capable of performing complex functions like focusing, diffusion, or beam steering. Used in some advanced AR displays. | |||
===[[Interpupillary distance]] (IPD) Adjustment=== | |||
The distance between the centers of the pupils varies between individuals (typically 54-72mm for adults). HMDs need to accommodate this for optimal clarity, comfort, and correct stereo rendering. Adjustment can be: | |||
*Physical/Manual: Lenses can be moved closer together or further apart, often via a slider or dial (for example Valve Index, Quest 3). Continuous adjustment allows finer tuning. | |||
*Stepped: Some HMDs offer discrete IPD steps (for example original Quest, Quest 2). | |||
*Software-based: The rendering viewpoint separation is adjusted in software (less common or effective for major mismatches without physical lens movement).
*Automatic: High-end systems might use eye-tracking to measure IPD and adjust automatically or prompt the user. | |||
===[[Eye tracking]]=== | |||
Sensors (typically small infrared cameras) inside the HMD track the user's gaze direction. This enables: | |||
*[[Foveated rendering]]: Rendering the area where the user is looking at full resolution, and the periphery at lower resolution, saving significant computational power.<ref name="FoveatedRenderingNvidia">NVIDIA Developer. "NVIDIA Variable Rate Shading (VRS) & Foveated Rendering". Retrieved 2023-10-27. https://developer.nvidia.com/vrworks/graphics/foveatedrendering</ref> | |||
*Improved Social Interaction: [[Avatar]]s can mimic the user's eye movements, enhancing realism in social VR. | |||
*Automatic IPD adjustment. | |||
*Gaze-based interaction as an input method. | |||
*Examples: [[Meta Quest Pro]], [[PlayStation VR2]], [[HTC Vive Pro Eye]], [[Apple Vision Pro]].
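Foveated rendering can be sketched as choosing a shading resolution from gaze eccentricity. The thresholds and fractions below are illustrative; real systems tune them per device and blend regions smoothly:

```python
def shading_fraction(eccentricity_deg):
    """Fraction of full per-axis resolution to render at a given angular
    distance from the gaze point (illustrative thresholds)."""
    if eccentricity_deg < 5.0:     # fovea: full detail
        return 1.0
    if eccentricity_deg < 15.0:    # near periphery: half resolution
        return 0.5
    return 0.25                    # far periphery: quarter resolution

fractions = [shading_fraction(e) for e in (0.0, 10.0, 30.0)]
```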
===Connectivity=== | |||
How the HMD connects to the processing unit (if not standalone). | |||
*Wired: Typically [[USB]] (often Type-C for power/data) and [[DisplayPort]] or [[HDMI]] for high bandwidth video. Offers highest fidelity and lowest latency but restricts movement. | |||
*Wireless: Uses [[Wi-Fi]] (often Wi-Fi 6/6E/7) or proprietary radio frequencies (for example older WiGig solutions) to stream video and data. Offers freedom of movement but requires video compression (potentially affecting quality) and can introduce latency. Examples: [[HTC Vive Wireless Adapter]], [[Meta Air Link]], [[Virtual Desktop]].<ref name="WirelessVRComparison">Heaney, David (2022-01-20). "Wireless PC VR Comparison: Air Link vs Virtual Desktop vs Vive Wireless". UploadVR. Retrieved 2023-10-27. https://uploadvr.com/wireless-pc-vr-comparison/</ref> | |||
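The need for compression on wireless links falls out of simple arithmetic. A sketch with Quest 3-like panel numbers (the 50:1 codec ratio is an assumption for illustration):

```python
def raw_video_gbps(width, height, fps, eyes=2, bits_per_pixel=24):
    """Uncompressed stereo video bandwidth in Gbit/s."""
    return width * height * eyes * bits_per_pixel * fps / 1e9

raw = raw_video_gbps(2064, 2208, 90)   # ~19.7 Gbit/s uncompressed
compressed = raw / 50                  # ~0.4 Gbit/s after ~50:1 encoding,
                                       # within reach of fast Wi-Fi links
```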
===Audio=== | |||
Sound is crucial for immersion. HMDs may feature: | |||
*[[Spatial Audio]]: 3D audio rendering techniques that make sounds appear to come from specific locations in the virtual environment. Supported via various output methods. | |||
*Integrated [[Speaker]]s: Often open-ear near-field speakers built into the strap or near the ears, providing spatial audio without blocking external sounds (for example Valve Index, Quest series).
*[[Headphone]] Jack (3.5mm): Allows users to connect their own headphones or earbuds. | |||
*Integrated Headphones: High-fidelity on-ear or over-ear headphones attached to the HMD (for example original Rift CV1, HP Reverb G2). | |||
*[[Microphone Arrays]]: Multiple microphones for clear voice input, communication in multiplayer apps, voice commands, and potentially noise cancellation. | |||
===[[Ergonomics]]=== | |||
Factors affecting comfort during extended use: | |||
*Weight: Lighter is generally better (most consumer HMDs are 400-700g). | |||
*Weight Distribution: Balanced weight (front-to-back) is often more important than total weight. Battery placement in standalone HMDs (for example rear-mounted) can improve balance. | |||
*Strap Design: Different mechanisms (soft elastic straps, rigid "halo" straps, top straps) distribute pressure differently. | |||
*Facial Interface: Foam or fabric padding, material breathability, light blocking. Options/space for glasses wearers or [[prescription lens]] inserts. | |||
==Types of HMDs==
HMDs can be broadly categorized based on their functionality and required hardware:
===Virtual Reality (VR) HMDs===
{{see also|Virtual Reality Devices}}
These devices aim to fully immerse the user in a virtual world, blocking out the real environment. | |||
====[[Discrete HMD]] (Tethered HMD)==== | |||
These HMDs contain displays, optics, sensors, and audio, but rely on an external processing unit, typically a powerful [[Personal Computer|PC]] or a game [[console]], connected via cables (or sometimes a dedicated wireless adapter). They generally offer the highest fidelity graphics and performance due to leveraging powerful external GPUs. | |||
*'''PC VR Examples''': [[Valve Index]], [[HTC Vive Pro 2]], [[HP Reverb G2]], original [[Oculus Rift]], [[Oculus Rift S]], [[Varjo Aero]], [[Pimax]] series. | |||
*'''Console VR Examples''': [[PlayStation VR]], [[PlayStation VR2]] (connects to PlayStation consoles). | |||
====[[Integrated HMD]] (Standalone HMD)==== | |||
Also known as All-in-One (AIO) HMDs, these devices contain all necessary components within the headset itself: [[displays]], [[optics]], [[sensors]], processing ([[CPU]]/[[GPU]], often based on mobile chipsets like the [[Qualcomm Snapdragon XR]] series), [[memory]], [[storage]], [[battery]], and [[tracking]]. They require no external PC or console, offering greater freedom of movement and ease of use. Processing power is typically lower than high-end [[PC VR]] setups. Many [[standalone HMDs]] can optionally connect to a PC via cable (for example [[Meta Link]]) or wirelessly (for example [[Air Link]], [[Virtual Desktop]]) to function as a PC VR headset.
*'''Examples''': [[Meta Quest 2]], [[Meta Quest 3]], [[Meta Quest Pro]], [[Pico 4]], [[Pico Neo 3 Link]], [[HTC Vive Focus 3]], [[HTC Vive XR Elite]]. | |||
====[[Slide-on HMD]] (Smartphone HMD)====
These were an early, low-cost entry point to VR, consisting of a simple enclosure (often plastic or cardboard) with lenses, into which a compatible [[smartphone]] was inserted. The smartphone provided the display, processing, and basic 3[[DoF]] tracking (using its internal [[IMU]]). While popular initially due to accessibility (for example [[Google Cardboard]], [[Samsung Gear VR]], [[Google Daydream View]]), they suffered from limitations like lower display quality, higher latency, potential overheating, limited interaction (often just a single button or touchpad), inconsistent experiences across different phones, and generally only 3DoF tracking. This category is now largely obsolete, superseded by standalone HMDs.
*'''Examples''': [[Google Cardboard]], [[Samsung Gear VR]], [[Google Daydream View]], [[Zeiss VR One]], [[Merge VR/AR Goggles]]. | |||
===Augmented Reality (AR) / Mixed Reality (MR) HMDs=== | |||
These devices overlay digital information onto the user's view of the real world, or blend real and virtual views. | |||
====[[Optical head-mounted display]] (OHMD)====
These HMDs use transparent optical elements (like waveguides or beam splitters) placed in front of the user's eyes. A small projector or microdisplay generates the digital image, which is then directed through the optics and combined with the light from the real world, allowing the user to see both simultaneously. Achieving a wide FOV, high brightness, good opacity for virtual objects, and an unobtrusive form factor are major challenges. OHMDs are often targeted at enterprise, industrial, or specific professional use cases due to cost and complexity.
* '''Examples''': [[Microsoft HoloLens]], [[Microsoft HoloLens 2]], [[Magic Leap 1]], [[Magic Leap 2]], [[Vuzix Blade]], [[RealWear Navigator]], [[Google Glass Enterprise Edition]], [[Nreal Light]] (now [[XREAL Light]]). Simple notification-style displays like original [[Google Glass]] also fall under this category. | |||
====Video Passthrough AR/MR HMDs==== | |||
These utilize opaque displays, essentially functioning like VR HMDs, but incorporate outward-facing cameras. The live video feed from these cameras is processed (often correcting for distortion and perspective) and displayed on the internal screens, with digital elements rendered on top. This allows users to see their surroundings digitally, blended with virtual content. Modern implementations increasingly use high-resolution, low-latency, color cameras, aiming to create a more seamless blend ("Mixed Reality"). While not optically transparent, they can offer wider FOV for the AR/MR content compared to many current OHMDs and better occlusion of real objects by virtual ones.
* '''Examples''': [[Meta Quest Pro]], [[Meta Quest 3]], [[Apple Vision Pro]], [[HTC Vive XR Elite]], [[Varjo XR-3]], [[Lynx R1]].<ref name="PassthroughExplained">Lang, Ben (2023-02-15). "VR Headset Passthrough AR Explained". Road to VR. Retrieved 2023-10-27. https://www.roadtovr.com/vr-headset-passthrough-ar-explained-quest-2-pro-index-vive-pro-2/</ref> | |||
==Components of HMDs== | ==Components of HMDs== | ||
While varying significantly based on type and purpose, most modern HMDs incorporate several key components:
*'''Display Panels & Optics''': Generate the visual output and direct it to the eyes (see Key Technical Specifications above). | |||
*'''[[Sensor]]s''': Detect movement, user input, and sometimes the environment. | |||
**[[Inertial Measurement Unit]] (IMU): Core component for rotational (3DoF) tracking. Essential for low-latency orientation updates. | |||
**[[Camera]]s: Crucial for modern HMDs. Can include: | |||
***Visible light cameras: Used for inside-out positional (6DoF) tracking, [[video passthrough]] (especially color passthrough), environment mapping, [[Hand tracking|hand tracking]]. | |||
***[[Infrared]] (IR) cameras: Often used for inside-out tracking (less susceptible to ambient light changes), controller tracking (detecting IR LEDs on controllers), and [[eye tracking]]. | |||
**[[Depth-sensing Cameras|Depth Sensors]]: (for example [[Time-of-flight camera|Time-of-Flight]], Structured Light, Active Stereo IR). Used in some AR/MR HMDs (like HoloLens, Vision Pro) for accurate [[Spatial analysis|spatial mapping]], environment understanding, hand tracking, and occlusion. | |||
**[[Eye tracking|Eye Tracking Cameras]]: Small internal IR cameras pointed at the user's eyes to monitor gaze direction and pupil characteristics. | |||
*'''Processors''': | |||
**CPU/GPU: Handle rendering, tracking calculations, application logic (essential in standalone HMDs, often a mobile [[System on a chip|SoC]] like [[Qualcomm Snapdragon Spaces|Snapdragon XR series]]). | |||
**Specialized Processors: May include [[Vision Processing Unit]]s (VPUs) or [[Neural Processing Unit]]s (NPUs) to efficiently handle computer vision tasks (SLAM, hand/eye tracking) or [[Artificial intelligence|AI]] workloads, offloading the main CPU/GPU. Microsoft's [[Holographic Processing Unit]] (HPU) in HoloLens is an example.<ref name="HoloLensSensors">Microsoft Learn. "HoloLens 2 hardware". Retrieved 2023-10-27. https://learn.microsoft.com/en-us/hololens/hololens2-hardware</ref> | |||
*'''Memory & Storage''': [[RAM]] for active processing and onboard [[Flash memory|storage]] (in standalone HMDs) for the operating system, applications, and media. | |||
*'''Audio System''': Integrated speakers, microphones, headphone jacks (see Key Technical Specifications above). | |||
*'''Connectivity Hardware''': Wi-Fi, [[Bluetooth]] radios (especially in standalone HMDs), [[USB]] ports, video input ports (in tethered HMDs). | |||
*'''Power System''': [[Battery]] (in standalone or wireless HMDs), power regulation circuitry, charging ports (often USB-C). | |||
*'''Mechanical Structure & Ergonomics''': Housing, straps, facial interface, IPD adjustment mechanisms, thermal management (fans, heat sinks). | |||
*'''Input Mechanisms''': How the user interacts with the system. | |||
**[[Motion Controllers]]: Handheld devices tracked in 3D space (usually 6DoF), typically including buttons, triggers, joysticks/touchpads, and [[Haptic technology|haptic feedback]]. Primary input for most VR systems (for example Meta Touch controllers, Valve Index controllers, PS VR2 Sense controllers). | |||
**[[Hand tracking]]: Camera-based systems that track the user's bare hands and finger movements without requiring controllers. Offers natural interaction but lacks physical buttons and haptic feedback. Increasingly standard on standalone VR/MR headsets (Quest series, Vision Pro). | |||
**[[Eye tracking]]: Used for gaze-based selection, foveated rendering, and social presence. | |||
**[[Voice Commands]]: Using built-in microphones and [[Speech recognition|speech recognition software]] for hands-free control. | |||
**[[Brain-Computer Interface|Brain-Computer Interfaces]] (BCI): Experimental interfaces reading neural signals, potentially via electrodes integrated into the HMD, for direct thought control. Still largely in research phases for consumer HMDs (for example [[NextMind]], [[CTRL-labs]] research). | |||
==Applications== | |||
HMDs enable a wide range of applications across various fields: | |||
* '''Gaming and Entertainment''': Immersive video games ([[Beat Saber]], [[Half-Life: Alyx]]), virtual cinemas, [[Social VR]] platforms ([[VRChat]], [[Rec Room]]), [[Location-based entertainment|location-based VR experiences]], [[Virtual tourism]]. | |||
* '''[[Training]] and [[Simulation]]''': Flight simulation, [[Surgical simulation|surgical training]], military exercises, emergency response training, complex machinery operation training, workplace safety drills.<ref name="VRSimulationTraining">Freina, Laura & Ott, Michela (2015). "Virtual reality for simulation and training". ''E-Learning and Digital Media''. '''12''' (3-4): 368-383. doi:10.1177/2042753015591756. Retrieved 2023-10-27. https://link.springer.com/article/10.1007/s10055-016-0293-5</ref> | |||
* '''Design and Engineering''': [[Computer-Aided Design]] (CAD) review and collaboration, [[Architectural rendering|architectural visualization]] (virtual walkthroughs), virtual prototyping, ergonomic assessments, [[Digital twin]] interaction.<ref name="VR_CAD">Autodesk. "Virtual Reality in Design and Manufacturing". Retrieved 2023-10-27. https://www.autodesk.com/solutions/virtual-reality</ref> | |||
* '''[[Telepresence]] and [[Virtual collaboration|Collaboration]]''': Virtual meetings with embodied avatars, remote assistance (especially using AR overlays for "see-what-I-see" guidance), shared virtual workspaces ([[Spatial (software)|Spatial]], [[Horizon Workrooms]]). | |||
* '''Medical''': Surgical planning and visualization (overlaying [[medical imaging]] onto patients via AR), therapy (for example [[Virtual reality therapy|exposure therapy]] for phobias/[[PTSD]], pain management/distraction), rehabilitation, medical education. <ref name="VR_Medicine">Rizzo, Albert "Skip" & Kim, Giyoung (2019-11-15). "Applications of Virtual Reality for Clinical Neuropsychology: A Review". ''Journal of Medical Internet Research''. '''21''' (11): e14190. doi:10.2196/14190. PMID 31730019. PMC 6880643. Retrieved 2023-10-27. https://jmir.org/2019/11/e14190/</ref> | |||
* '''Education''': Virtual field trips, interactive science experiments (virtual labs), historical reconstructions, immersive language learning. | |||
* '''[[Information visualization|Data Visualization]]''': Exploring complex datasets (for example financial data, scientific simulations) in interactive 3D space. | |||
* '''Military and Aviation''': Helmet-mounted displays providing flight data, targeting information, situational awareness overlays, night vision integration (for example [[Integrated Visual Augmentation System|IVAS]]). | |||
==Challenges and Limitations== | |||
Despite significant progress, HMD technology still faces challenges: | |||
* '''Visual Fidelity''': Achieving resolution and clarity that matches human vision ("[[Retinal projector|retinal resolution]]" ≈ 60 PPD), wider FOV without distortion or edge artifacts, higher brightness and contrast (especially for outdoor AR), better dynamic range, and eliminating artifacts like screen-door effect, god rays, [[Mura defect|mura]], and motion blur remain ongoing goals.<ref name="Kim2019FoveatedAR">Kim, J.; Jeong, Y.; Stengel, M.; et al. (2019). "Foveated AR: dynamically-foveated augmented reality display". ''ACM Transactions on Graphics''. '''38''' (4): 1-15. doi:10.1145/3306346.3322983.</ref>
* '''Comfort and Ergonomics''': Reducing weight, improving balance (counterweights, lighter optics), managing heat dissipation, accommodating prescription glasses comfortably, and finding comfortable, hygienic long-term wear solutions (straps, facial interfaces) are critical for broader adoption.<ref name="TalsmaComfort2020">Talsma, S. W.; Usmani, S. A.; Chen, P. Y. (2020). "Critical factors in comfort, cognitive load, and performance for consumer head-mounted displays". ''Journal of the Society for Information Display''. '''28''' (11): 841-850. doi:10.1002/jsid.943.</ref> | |||
* '''[[Vergence-accommodation conflict]]''': In most current HMDs, the eyes focus (accommodate) at a fixed distance determined by the optics, but converge based on the perceived depth of virtual objects. This mismatch can cause eye strain, fatigue, and inaccurate depth perception.<ref name="VAC_Review">Hoffman, David M.; Girshick, Ahna R.; Akeley, Kurt; Banks, Martin S. (2008-03-18). "The vergence-accommodation conflict: Practical consequences and solutions". ''Journal of Vision''. '''8''' (3): 33. doi:10.1167/8.3.33. Retrieved 2023-10-27. https://jov.arvojournals.org/article.aspx?articleid=2193631</ref> Solutions like [[Varifocal display|varifocal]] and [[Light field|light field]] displays are complex and still largely experimental. | |||
* '''Motion Sickness / Cybersickness''': While greatly reduced compared to early systems due to low latency and high refresh rates, discrepancies between visual motion and [[Vestibular system|vestibular]] input, tracking inaccuracies, or poorly designed software can still induce nausea, dizziness, and discomfort in susceptible individuals.<ref name="Weech2019Cybersickness">Weech, S.; Kenny, S.; Barnett-Cowan, M. (2019). "Presence and cybersickness in virtual reality are negatively related: a review". ''Frontiers in Psychology''. '''10''': 158. doi:10.3389/fpsyg.2019.00158. PMID 30787884. PMC 6374254.</ref><ref name="vrsickness">Frontiers in Virtual Reality (20 Mar 2020). "Factors Associated With Virtual Reality Sickness in Head-Mounted Displays". Retrieved 2024-05-15. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7145389/</ref> | |||
* '''Tracking Robustness''': Inside-out tracking can struggle in poorly lit or overly bright environments, on large featureless surfaces (blank walls), with reflective surfaces (mirrors), or during very fast head/body movements. Outside-in tracking requires external sensor setup and has a limited, fixed tracking volume. | |||
* '''Content Ecosystem''': The availability of high-quality, compelling, and diverse applications and experiences ("killer apps") is crucial for driving HMD adoption beyond early adopters and specific niches. | |||
* '''Cost''': High-end HMDs remain expensive (>$1000), although capable standalone VR headsets have become more affordable (~$300-$500). Advanced AR/MR devices often cost several thousand dollars. | |||
* '''Social Acceptance''': Wearing bulky headsets, especially in public or social settings, remains a significant barrier for AR/MR aiming for all-day use. Privacy concerns related to onboard cameras are also relevant.<ref name="Koelle2020SocialAcceptability">Koelle, M.; Ananthanarayan, S.; Boll, S. (2020). "Social acceptability in HCI: A survey of methods, measures, and design strategies". ''Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems'': 1-19. doi:10.1145/3313831.3376101.</ref> | |||
* '''Health and Safety''': Long-term effects of prolonged use on vision (especially in children) are still being studied. Physical risks include collision with real-world objects while immersed. Eye safety standards (for example [[IEC 60825-1]] for lasers in depth sensors) must be followed.<ref name="Turnbull2017OcularEffects">Turnbull, P. R. & Phillips, J. R. (2017). "Ocular effects of virtual reality headset wear in young adults". ''Scientific Reports''. '''7''' (1): 1-11. doi:10.1038/s41598-017-14811-x.</ref><ref name="laser">Laser Institute of America (02 Dec 2023). "ANSI Z136.1 — Safe Use of Lasers". Retrieved 2024-05-15. https://www.lia.org/resources/laser-safety-standards/ansi-z1361-safe-use-lasers</ref> Psychological effects regarding immersion, dissociation, or addiction potential warrant consideration.<ref name="MadaryMetzingerEthics2016">Madary, M. & Metzinger, T. K. (2016). "Real virtuality: A code of ethical conduct. Recommendations for good scientific practice and the consumers of VR-technology". ''Frontiers in Robotics and AI''. '''3''': 3. doi:10.3389/frobt.2016.00003.</ref> | |||
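The vergence-accommodation conflict listed above can be quantified in diopters (inverse meters): the eyes accommodate at the fixed focal distance set by the optics while converging at the rendered depth of the virtual object. A minimal sketch, with illustrative distances and an assumed ~0.5 D comfort threshold (both numbers are examples, not device specifications):

```python
def vac_mismatch_diopters(focal_plane_m: float, object_depth_m: float) -> float:
    """Difference between vergence demand (set by the virtual object's
    rendered depth) and accommodation demand (fixed by the HMD optics)."""
    accommodation = 1.0 / focal_plane_m   # diopters
    vergence = 1.0 / object_depth_m       # diopters
    return vergence - accommodation

# Optics focused at 1.5 m; virtual object rendered at 0.5 m:
mismatch = vac_mismatch_diopters(1.5, 0.5)
print(f"{mismatch:+.2f} D")  # +1.33 D, well beyond a ~0.5 D comfort zone
```

An object rendered at the focal plane itself produces zero mismatch, which is why many titles keep key UI elements near the headset's fixed focal distance.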
==Future Trends and Developments==
The HMD landscape continues to evolve rapidly, with several promising developments on the horizon: | |||
===Display Advancements===
*'''[[Varifocal display|Varifocal Displays]]''': Systems that dynamically adjust focal depth based on where the user is looking (using eye tracking) or based on scene content, addressing the vergence-accommodation conflict. Technologies include movable lenses/displays, [[Liquid crystal lens|liquid crystal lenses]], [[Alvarez lens|Alvarez lenses]], and multi-focal plane displays.<ref name="Rathinavel2018Varifocal">Rathinavel, K.; et al. (2018). "An extended depth-at-field volumetric near-eye augmented reality display". ''IEEE Transactions on Visualization and Computer Graphics''. '''24''' (11): 2857-2866. doi:10.1109/TVCG.2018.2868565.</ref> | |||
*'''[[Light field|Light Field Displays]]''': Generate a more complete representation of light, allowing the eye to focus naturally at different depths within the virtual scene. Still complex and computationally intensive.<ref name="Huang2015LightFieldStereoscope">Huang, F. C.; Chen, K.; Wetzstein, G. (2015). "The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues". ''ACM Transactions on Graphics''. '''34''' (4): 1-12. doi:10.1145/2766949.</ref> | |||
*'''[[Holographic display|Holographic Displays]]''': Aim to reconstruct the wavefront of light from a 3D scene, potentially offering the most natural 3D viewing experience without conflicts. True holographic HMDs are still highly experimental. | |||
*'''Display maturation''': Higher resolution, brightness, and efficiency as Micro-OLED and [[MicroLED]] panels mature.
===Form Factor Evolution=== | |||
*'''[[Lightweight Designs]]''': Advanced optics (pancake, [[Metalens|metalenses]], HOEs) and display technologies are enabling significantly thinner, lighter headsets (sub-300g or even sub-100g). | |||
*'''[[Smartglasses|AR Glasses]]''': The long-term goal for AR is achieving normal eyeglass form factors with all-day wearability and significant compute/display capabilities. Projects like [[Project Aria]] (Meta research) and rumored [[Apple smart glasses|Apple glasses]] point toward this future.<ref name="Delaney2021ARGlasses">Delaney, K. (2021). "The race toward human-centered AR glasses". ''IEEE Computer Graphics and Applications''. '''41''' (5): 112-115. doi:10.1109/MCG.2021.3097740.</ref> | |||
===Sensory Expansion=== | |||
*'''[[Haptic technology|Advanced Haptic Feedback]]''': Beyond simple controller rumble, providing more nuanced tactile sensations via gloves ([[HaptX]]), bodysuits ([[bHaptics]]), ultrasound ([[Ultraleap]]), or other actuators to simulate touch, texture, and impact. | |||
*'''[[Digital scent technology|Olfactory Displays]]''': Systems that generate scents synchronized with virtual environments to enhance immersion (for example [[OVR Technology]]). | |||
*'''[[Motion capture|Full-body Tracking]]''': Moving beyond head and hands to track limb and torso movements for more complete avatar embodiment, using external trackers ([[Vive Tracker]]), webcam-based AI solutions, or integrated sensors. | |||
===Computational Capabilities=== | |||
*'''[[Edge computing|Edge/Cloud Computing]]''': Offloading demanding processing (rendering, AI) from standalone HMDs to nearby edge servers or the cloud to enable higher fidelity experiences while maintaining mobility (for example [[NVIDIA CloudXR]], [[Plutosphere]]).<ref name="Liu2019EdgeAR">Liu, L.; Li, H.; Gruteser, M. (2019). "Edge assisted real-time object detection for mobile augmented reality". ''Proceedings of the 25th Annual International Conference on Mobile Computing and Networking'': 1-16. doi:10.1145/3300061.3345431.</ref> | |||
*'''[[Artificial intelligence|AI Integration]]''': On-device AI for smarter environment understanding, more robust hand/eye tracking, predictive rendering, personalized experiences, intelligent virtual agents, and natural language interaction. | |||
===Interfaces=== | |||
*'''Improved Hand/Eye/Face Tracking''': Higher fidelity tracking of expressions and micro-movements for more realistic avatars and nuanced control. | |||
*'''[[Brain-Computer Interface|Neural Interfaces]]''': Non-invasive BCIs (for example EMG wristbands, EEG sensors) may offer supplementary input channels in the future. | |||
==Market Outlook==
Market analysis firms like [[International Data Corporation|IDC]] report fluctuating but generally growing shipments of AR/VR headsets. For instance, IDC forecasted global shipments to reach 9.1 million units in 2024, with projected growth to 22.9 million by 2028, driven significantly by the adoption of mixed-reality capable devices and maturing technology in both consumer and enterprise segments.<ref name="idc2025" /> | |||
==See Also== | |||
* [[Augmented Reality]] (AR) | |||
* [[Binocular vision]] | |||
* [[Brain-Computer Interface]] | |||
* [[Degrees of Freedom]] (DoF) | |||
* [[Display technology]] | |||
* [[Eye tracking]] | |||
* [[Foveated rendering]] | |||
* [[Field of View]] (FOV) | |||
* [[Hand tracking]] | |||
* [[Haptic technology]] | |||
* [[Helmet-mounted display]] | |||
* [[Immersion (virtual reality)]] | |||
* [[Inside-out tracking]] | |||
* [[Latency (engineering)]] | |||
* [[Mixed Reality]] (MR) | |||
* [[Motion Controllers]] | |||
* [[Motion Sickness]] | |||
* [[Optical head-mounted display]] (OHMD) | |||
* [[Outside-in tracking]] | |||
* [[Pancake lens]] | |||
* [[Presence (virtual reality)]] | |||
* [[Smartglasses]] | |||
* [[Spatial Audio]] | |||
* [[Stereoscopy]] | |||
* [[Tracking system|Tracking]] (VR/AR context) | |||
* [[Vergence-accommodation conflict]] | |||
* [[Video passthrough]] | |||
* [[Virtual Reality]] (VR) | |||
* [[Virtual Reality Devices]] | |||
* [[Waveguide (optics)]] | |||
==References== | |||
<references /> | |||
[[Category:Terms]]
[[Category:Technical Terms]] | |||
[[Category:Hardware]] | |||
[[Category:Displays]] | |||
[[Category:Virtual Reality]] | |||
[[Category:Augmented Reality]] | |||
[[Category:Input devices]] | |||
[[Category:Head-mounted displays]] |
The vast majority of consumer and enterprise VR and AR systems rely on HMDs. In AR applications, the display system is typically designed to be see-through, allowing digital information to be superimposed onto the user's view of the real world. These are often specifically termed Optical head-mounted displays (OHMDs), utilizing technologies like waveguides or beam splitters.[4] In VR applications, the display system is opaque, completely blocking the user's view of the real world and replacing it with a computer-generated virtual environment, aiming for high levels of immersion and presence.[5] Some modern VR HMDs incorporate external cameras to provide video passthrough capabilities, enabling a form of AR or "Mixed Reality" where the real world is viewed digitally on the opaque screens with virtual elements overlaid.
==History==
The concept of a head-mounted display dates back further than often realized. One of the earliest precursors was Morton Heilig's "Telesphere Mask" patented in 1960, a non-computerized, photographic-based stereoscopic viewing device intended for individual use.[6][7]
However, the first true HMD connected to a computer is widely credited to Ivan Sutherland and his student Bob Sproull at Harvard University and later the University of Utah, around 1968. Dubbed the "Sword of Damocles" due to its imposing size and the heavy machinery suspended from the ceiling required to support its weight and track head movement, it presented simple wireframe graphics in stereo. This system pioneered many concepts still fundamental to VR and AR today, including head tracking and stereoscopic viewing.[1][8]
Throughout the 1970s and 1980s, HMD development continued primarily in military (especially for aviator helmet-mounted displays) and academic research labs, driven by organizations like the US Air Force and NASA.[9] The late 1980s and early 1990s saw a "first wave" of commercial VR interest, with companies like VPL Research, founded by Jaron Lanier, popularizing the term "Virtual Reality" and developing HMDs like the "EyePhone". However, technology limitations (low resolution, high latency, limited processing power, high cost) prevented widespread adoption.[10] Nintendo's Virtual Boy (1995), while technically an HMD, used red LED displays and lacked head tracking, failing commercially but remaining a notable early attempt at consumer VR.[11]
The modern era of consumer VR HMDs was effectively kickstarted by Palmer Luckey's prototype Oculus Rift in the early 2010s, which demonstrated that high-quality, low-latency VR was becoming feasible with modern mobile display panels and sensors. Its subsequent Kickstarter success and acquisition by Facebook (now Meta) spurred renewed industry-wide investment.[12] This led to the release of numerous consumer HMDs:
* 2014 - Google Cardboard popularised low-cost, smartphone-driven VR viewers.[13]
* 2015 - Samsung Gear VR improved on the smartphone HMD concept with better optics and integrated controls.[14]
* 2016 - The consumer Oculus Rift (CV1) and HTC Vive established high-end, PC-tethered VR with wide FOV and robust external tracking systems.[15][16] PlayStation VR brought tethered VR to the console market.
* 2019 - Oculus Quest pioneered high-quality standalone (untethered) 6DoF VR using inside-out camera tracking (Oculus Insight).[17] Valve Index pushed fidelity in the PC VR space.
* 2023 - Meta Quest 3 adopted the pancake-style optics first introduced on the premium Meta Quest Pro (launched October 2022), and added high-resolution full-colour passthrough mixed reality plus a faster mobile chipset.[18][19][20]
* 2024 - Apple Vision Pro launched as a premium "spatial computer" featuring high-resolution Micro-OLED displays, advanced eye and hand tracking, and spatial video capabilities.[21]
==Core Concepts and Principles==
===Visual Pathway===
The fundamental operation of an HMD involves generating an image and directing it into the user's eyes. This typically follows a path:
* '''Central Processing Unit (CPU) / Graphics Processing Unit (GPU)''': Process application logic, handle tracking data, and render the images (frames) intended for display. In standalone HMDs, these processors are inside the headset; in tethered HMDs, they are in a connected PC or console.
* '''Display Panel(s)''': Small, high-resolution screens (for example LCD, OLED, Micro-OLED) receive the rendered images from the GPU. Binocular HMDs typically use either one panel displaying side-by-side images or two separate panels, one for each eye.
* '''Optics (Lenses)''': Placed between the display panels and the user's eyes, lenses serve multiple crucial functions:
** '''Magnification''': They enlarge the small display image to fill a significant portion of the user's Field of View (FOV).
** '''Focus''': They collimate the light or set the focal plane, typically at a distance of 1.5-2 meters or optical infinity, reducing eye strain compared to focusing on a screen inches away.
** '''Distortion Correction''': Simple magnification often introduces optical distortion (like pincushion distortion). The rendered image is typically pre-distorted (barrel distortion) in software to counteract the lens distortion, resulting in a geometrically correct view for the user.[22]
* '''Eyes''': The light (photons) carrying the image information passes through the lenses and enters the user's pupils, forming an image on the retina.
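The distortion-correction step above can be sketched with the common polynomial radial model; the coefficient below is illustrative, not taken from any real lens profile. A negative coefficient pulls edge pixels inward (a barrel pre-warp), which the lens's pincushion distortion then undoes:

```python
def radial_warp(x: float, y: float, k1: float, k2: float = 0.0):
    """Apply radial distortion to a normalized image coordinate
    (origin at the lens center): r' = r * (1 + k1*r^2 + k2*r^4)."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

# Barrel pre-warp with an assumed k1 = -0.25: the center is unchanged,
# while points near the edge of the image are pulled inward the most.
print(radial_warp(0.0, 0.0, k1=-0.25))        # (0.0, 0.0)
bx, _ = radial_warp(0.8, 0.0, k1=-0.25)
print(round(bx, 3))                           # 0.672
```

Real runtimes apply a mesh- or shader-based version of this warp per eye, usually with per-color-channel coefficients to also correct chromatic aberration.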
===Stereoscopic Vision===
Most VR HMDs and many AR HMDs are stereoscopic. They achieve the perception of three-dimensional depth by presenting slightly different images to each eye, mimicking how humans perceive depth in the real world through binocular disparity. The GPU renders the virtual scene from two slightly offset virtual camera positions, corresponding to the user's left and right eyes. When viewed simultaneously, the brain fuses these two images into a single 3D percept.[23] The distance between these virtual cameras should ideally match the user's Interpupillary distance (IPD) for accurate scale perception and comfort.
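The offset-camera setup can be sketched as follows; the 64 mm IPD and the world-aligned head axis are simplifying assumptions (a real renderer offsets along the head's local x-axis using its tracked orientation):

```python
def stereo_eye_positions(head_pos, ipd_m=0.064):
    """Place the left/right virtual cameras half the IPD to either side
    of the tracked head position (head x-axis assumed world-aligned)."""
    x, y, z = head_pos
    half = ipd_m / 2.0
    left = (x - half, y, z)
    right = (x + half, y, z)
    return left, right

# Head at 1.6 m standing height:
left, right = stereo_eye_positions((0.0, 1.6, 0.0))
print(left, right)  # (-0.032, 1.6, 0.0) (0.032, 1.6, 0.0)
```

The scene is then rendered once from each position; matching `ipd_m` to the user's measured IPD keeps virtual scale and depth perception accurate.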
===Tracking===
Tracking the user's head movement is fundamental to creating immersive and interactive experiences, particularly in VR. As the user moves their head, the system updates the rendered images accordingly, making the virtual world appear stable and allowing the user to look around naturally. Failure to track accurately and with low latency can lead to disorientation and Motion sickness (often termed "cybersickness" in VR/AR contexts).[24] Tracking operates in multiple Degrees of Freedom (DoF):
* '''Rotational Tracking (3DoF)''': Tracks orientation changes: pitch (nodding yes), yaw (shaking no), and roll (tilting the head side-to-side). This is the minimum required for a basic VR experience where the user can look around from a fixed viewpoint. It is typically achieved using an Inertial Measurement Unit (IMU) within the HMD, containing sensors such as:
** '''Accelerometer''': Measures linear acceleration (and gravity).
** '''Gyroscope''': Measures angular velocity.
** '''Magnetometer''': Measures the local magnetic field (like a compass), used to correct for gyroscope drift, especially in yaw. Sensor fusion algorithms combine data from these sensors to provide a stable orientation estimate.[25][26]
* '''Positional Tracking (6DoF)''': Tracks both orientation (3DoF) and translation (movement through space: forward/backward, left/right, up/down). This allows the user to physically walk around, lean, crouch, and dodge within the virtual environment, significantly enhancing immersion and interaction. 6DoF tracking is achieved through various methods:
** '''Outside-in tracking''': External sensors (cameras or infrared emitters/detectors, like Valve's Lighthouse system) are placed in the room to track markers (passive reflective or active IR LED) on the HMD and controllers. Examples: original Oculus Rift (Constellation), HTC Vive/Valve Index (Lighthouse).[27][16]
** '''Inside-out tracking''': Cameras mounted on the HMD itself observe the surrounding environment. Computer vision algorithms, often employing Simultaneous Localization and Mapping (SLAM) techniques, identify features in the room and track the HMD's movement relative to them. This eliminates the need for external sensors, making setup easier and enabling larger, unrestricted tracking volumes. Most modern standalone and many tethered HMDs use inside-out tracking. Examples: Meta Quest series, HTC Vive Cosmos, Windows Mixed Reality headsets.[28][17]
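The IMU sensor fusion described above is often illustrated with a complementary filter: integrate the gyroscope for low-latency response, then nudge the estimate toward the accelerometer's gravity-derived tilt to cancel drift. A single-axis (pitch) sketch; the blend factor is illustrative, and production systems typically use Kalman-style filters over all three axes:

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One update step of a 1-axis complementary filter.

    pitch:        current pitch estimate (rad)
    gyro_rate:    angular velocity from the gyroscope (rad/s)
    accel_pitch:  pitch implied by the accelerometer's gravity vector (rad)
    alpha:        trust in the fast-but-drifting gyro vs. the
                  noisy-but-drift-free accelerometer
    """
    gyro_estimate = pitch + gyro_rate * dt                    # fast path
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch  # slow correction

# With the head still, accumulated gyro drift decays toward the accelerometer:
est = 0.1  # rad of accumulated drift
for _ in range(200):
    est = complementary_filter(est, gyro_rate=0.0, accel_pitch=0.0, dt=0.01)
print(round(est, 4))  # 0.0018 - drift largely removed after 2 s
```

Yaw cannot be corrected by gravity alone, which is why the magnetometer (or camera tracking) is needed for that axis.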
===Latency===
Motion-to-photon latency is the time delay between a user's physical movement and the corresponding visual update on the display, and it is a critical factor for comfort and immersion. High latency is strongly correlated with cybersickness. Modern VR systems aim for latency below 20 milliseconds (ms), with many achieving closer to 10 ms under optimal conditions.[29][30]
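Motion-to-photon latency is usefully thought of as a budget summed across pipeline stages. The figures below are illustrative assumptions, not measurements of any specific headset:

```python
# Illustrative motion-to-photon budget (all numbers are assumptions):
stages_ms = {
    "sensor sampling + fusion": 2.0,
    "render (CPU + GPU)": 8.0,
    "compositor / reprojection": 1.5,
    "display scanout + pixel switch": 5.5,
}
total = sum(stages_ms.values())
print(f"{total:.1f} ms motion-to-photon")  # 17.0 ms, under the ~20 ms target
```

Techniques like late-stage reprojection ("timewarp") effectively shorten the perceived budget by re-sampling head pose just before scanout.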
==Key Technical Specifications==
The quality and characteristics of an HMD are determined by numerous technical specifications:
===Display Technology===
The type of display panel used significantly impacts image quality. Common types include:
* '''LCD (Liquid Crystal Display)''': Often offers higher pixel density (reducing the screen-door effect) and potentially lower cost, but may have slower response times and lower contrast compared to OLED. Modern LCDs in VR often use fast-switching technologies and quantum dots for better color.[31]
* '''OLED (Organic Light-Emitting Diode)''': Provides true blacks (infinite contrast ratio), vibrant colors, and very fast pixel response times (reducing motion blur or ghosting). Can be more susceptible to "burn-in" over long periods and may use PenTile subpixel layouts affecting perceived sharpness.
* '''Micro-OLED / OLEDoS (OLED-on-Silicon)''': Very small, high-resolution OLED displays built directly onto silicon wafers. They offer extremely high pixel densities and brightness and are often used in high-end or compact HMDs (for example Bigscreen Beyond, Apple Vision Pro).[32][33]
* '''MicroLED''': An emerging technology promising high brightness, efficiency, contrast, and longevity, potentially surpassing both LCD and OLED for HMDs.
===Resolution===
The number of pixels on the display(s), usually specified per eye (for example 2064 x 2208 per eye for Meta Quest 3) or sometimes as a total resolution. Higher resolution reduces the screen-door effect (the visible grid pattern between pixels) and increases image sharpness. Pixels Per Degree (PPD) is often a more perceptually relevant metric, combining resolution and FOV. Human visual acuity corresponds to roughly 60 PPD; current consumer VR is typically in the 20-35 PPD range, while high-end devices like Vision Pro exceed 40 PPD centrally.
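The PPD figure follows directly from resolution and FOV. A minimal sketch using Quest 3-like numbers (the 110° horizontal FOV is an assumption here); note this is an average, since lens magnification concentrates pixels toward the center:

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average angular pixel density across the horizontal FOV."""
    return horizontal_pixels / horizontal_fov_deg

# 2064 px per eye spread across an assumed 110 degrees:
print(round(pixels_per_degree(2064, 110), 1))  # 18.8 PPD on average
```

Against the ~60 PPD benchmark of human acuity, this shows why even current high-resolution panels remain visibly short of "retinal" sharpness.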
===Refresh Rate===
The number of times per second the display updates the image, measured in Hertz (Hz). Higher refresh rates (for example 90Hz, 120Hz, 144Hz) lead to smoother motion, reduced flicker, and can help mitigate motion sickness. 90Hz is often considered a comfortable minimum for VR. Low persistence displays (where pixels are illuminated only for a fraction of the refresh cycle) are crucial in VR to reduce motion blur during head movements.[34]
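Why low persistence matters can be shown with a small calculation: the blur smear on the retina during a head turn is roughly angular speed × pixel density × illumination time. The head speed and PPD below are illustrative assumptions:

```python
def blur_px(head_speed_deg_s: float, ppd: float, persistence_ms: float) -> float:
    """Approximate smear, in display pixels, while a pixel stays lit
    during a head rotation at the given angular speed."""
    return head_speed_deg_s * ppd * (persistence_ms / 1000.0)

# A moderate 100 deg/s head turn on an assumed 20 PPD display:
print(blur_px(100, 20, 11.1))  # full-persistence 90 Hz frame: ~22 px of smear
print(blur_px(100, 20, 2.0))   # 2 ms low-persistence pulse: 4 px
```

This is why VR displays strobe each frame briefly rather than holding it for the full refresh interval.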
===Field of View (FOV)===
The extent of the visual field visible through the HMD, usually measured horizontally, vertically, and/or diagonally in degrees. Human binocular vision covers roughly 200-220° horizontally (with ~120° stereoscopic overlap). VR HMDs aim for a wide FOV (typically 100°-110° horizontally for consumer devices, sometimes wider like Pimax headsets) to enhance immersion. AR OHMDs often have a much narrower FOV (for example 30°-55°) due to the challenges of see-through optics.[35]
===Optics / Lenses===
The lenses used heavily influence FOV, image sharpness (center-to-edge), chromatic aberration, geometric distortion, and physical characteristics like size and weight.
* '''Aspheric Lenses''': Simple lenses, often used in early or budget HMDs. Can be bulky.
* '''Fresnel Lenses''': Use concentric rings to reduce thickness and weight compared to simple aspheric lenses while maintaining a short focal length. Common in many VR HMDs (for example Rift CV1, Vive, Quest 2), but can introduce visual artifacts like concentric rings and "god rays" (stray light scattering off the ridges).
* '''Pancake Lenses''': A newer, more complex folded-optic design using polarization. They allow significantly shorter distances between the display and lens, enabling much slimmer and lighter HMD designs. Often offer improved edge-to-edge clarity but can be less light-efficient, requiring brighter displays. Used in devices like Meta Quest Pro, Pico 4, Bigscreen Beyond.[36][37]
* '''Waveguides (AR)''': Used in many see-through OHMDs (for example HoloLens, Magic Leap). Light from a microdisplay is injected into a thin piece of glass or plastic and then directed out toward the eye using diffractive or reflective elements, allowing the user to see the real world through the waveguide. Achieving wide FOV and high efficiency with waveguides is challenging.[4][38]
* '''Beam Splitters / Birdbaths (AR)''': A simpler see-through optic in which a partially reflective mirror combines light from a display with the view of the real world. Often bulkier, and may have a smaller FOV or less uniform transparency than waveguides. Used in devices like Google Glass (using a prism variant) and Nreal/XREAL Air.[39]
* '''Holographic Optical Elements (HOEs)''': Thin, lightweight optical components created using holographic recording techniques, capable of performing complex functions like focusing, diffusion, or beam steering. Used in some advanced AR displays.
Interpupillary distance (IPD) Adjustment
The distance between the centers of the pupils varies between individuals (typically 54-72mm for adults). HMDs need to accommodate this for optimal clarity, comfort, and correct stereo rendering. Adjustment can be:
- Physical/Manual: Lenses can be moved closer together or further apart, often via a slider or dial (for example Valve Index, Quest 3). Continuous adjustment allows finer tuning.
- Stepped: Some HMDs offer discrete IPD steps (for example original Quest, Quest 2).
- Software-based: The rendering viewpoint separation is adjusted in software (less common or effective for major mismatches without physical lens movement).
- Automatic: High-end systems might use eye-tracking to measure IPD and adjust automatically or prompt the user.
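In rendering terms, the IPD value maps directly to the horizontal separation of the two virtual cameras. A minimal sketch (the 63 mm figure is simply a commonly cited adult average, not tied to any device):

```python
def eye_offsets_mm(ipd_mm: float) -> tuple[float, float]:
    """Horizontal offsets of the left and right render cameras from
    the head center, given the user's interpupillary distance."""
    half = ipd_mm / 2.0
    return (-half, +half)

left, right = eye_offsets_mm(63.0)  # 63 mm: a common adult average
print(left, right)  # -31.5 31.5
```

If the rendered separation does not match the user's physical IPD (and lens positions), the virtual world appears subtly scaled up or down, which is one reason software-only adjustment is a poor substitute for moving the lenses.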
Eye tracking
Sensors (typically small infrared cameras) inside the HMD track the user's gaze direction. This enables:
- Foveated rendering: Rendering the area where the user is looking at full resolution, and the periphery at lower resolution, saving significant computational power.[40]
- Improved Social Interaction: Avatars can mimic the user's eye movements, enhancing realism in social VR.
- Automatic IPD adjustment.
- Gaze-based interaction as an input method.
- Examples: Meta Quest Pro, PlayStation VR2, HTC Vive Pro Eye, Apple Vision Pro.
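The core decision in foveated rendering is mapping angular distance from the gaze point to a shading rate. The sketch below uses hypothetical thresholds and a small-angle distance approximation; real runtimes (for example those built on variable rate shading) use vendor-specific tiers and falloff curves.

```python
import math

def angular_distance(gaze: tuple, region: tuple) -> float:
    """Angle in degrees between two (azimuth, elevation) directions,
    using a small-angle (flat-screen) approximation."""
    return math.hypot(gaze[0] - region[0], gaze[1] - region[1])

def shading_rate(angle_from_gaze_deg: float) -> str:
    """Pick a shading rate by eccentricity. Thresholds are
    illustrative, not taken from any specific runtime."""
    if angle_from_gaze_deg < 5:
        return "full"     # fovea: native resolution
    elif angle_from_gaze_deg < 15:
        return "half"     # parafovea: 1/2 rate
    else:
        return "quarter"  # periphery: 1/4 rate

gaze = (2.0, -1.0)  # degrees, as reported by a hypothetical eye tracker
print(shading_rate(angular_distance(gaze, (3.0, 0.0))))    # full
print(shading_rate(angular_distance(gaze, (25.0, 10.0))))  # quarter
```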
Connectivity
How the HMD connects to the processing unit (if not standalone).
- Wired: Typically USB (often Type-C for power/data) and DisplayPort or HDMI for high bandwidth video. Offers highest fidelity and lowest latency but restricts movement.
- Wireless: Uses Wi-Fi (often Wi-Fi 6/6E/7) or proprietary radio frequencies (for example older WiGig solutions) to stream video and data. Offers freedom of movement but requires video compression (potentially affecting quality) and can introduce latency. Examples: HTC Vive Wireless Adapter, Meta Air Link, Virtual Desktop.[41]
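The need for compression in wireless streaming follows from simple bandwidth arithmetic. The sketch below assumes 24-bit color and a roughly 100:1 codec ratio; both figures are illustrative, not drawn from any particular product.

```python
def raw_video_gbps(width: int, height: int, fps: int,
                   bits_per_pixel: int = 24, eyes: int = 2) -> float:
    """Uncompressed bandwidth, in Gbps, of a stereo HMD video stream."""
    return width * height * fps * bits_per_pixel * eyes / 1e9

raw = raw_video_gbps(2064, 2208, 90)  # Quest 3-class panels at 90 Hz
print(round(raw, 1))        # 19.7 Gbps uncompressed
print(round(raw / 100, 2))  # ~0.2 Gbps after an assumed ~100:1 codec,
                            # which is plausible over Wi-Fi 6E
```

Tethered DisplayPort links can carry the signal with little or no compression, which is why wired connections retain a fidelity and latency advantage.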
Audio
Sound is crucial for immersion. HMDs may feature:
- Spatial Audio: 3D audio rendering techniques that make sounds appear to come from specific locations in the virtual environment. Supported via various output methods.
- Integrated Speakers: Often open-ear near-field speakers built into the strap or near the ears, providing spatial audio without blocking external sounds (for example Valve Index, Quest series).
- Headphone Jack (3.5mm): Allows users to connect their own headphones or earbuds.
- Integrated Headphones: High-fidelity on-ear or over-ear headphones attached to the HMD (for example original Rift CV1, HP Reverb G2).
- Microphone Arrays: Multiple microphones for clear voice input, communication in multiplayer apps, voice commands, and potentially noise cancellation.
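One of the basic cues spatial audio engines reproduce is the interaural time difference (ITD): sound from the side reaches the far ear slightly later. A sketch using the classic Woodworth head model (the head radius is a textbook average, not an HMD parameter):

```python
import math

def itd_seconds(azimuth_deg: float,
                head_radius_m: float = 0.0875,
                speed_of_sound: float = 343.0) -> float:
    """Interaural time difference via the Woodworth model: the extra
    path length sound travels around the head to the far ear."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source 90 degrees to one side arrives ~0.66 ms earlier at the
# near ear; the brain uses this delay to localize the sound.
print(round(itd_seconds(90) * 1000, 2))  # 0.66
```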
Ergonomics
Factors affecting comfort during extended use:
- Weight: Lighter is generally better (most consumer HMDs are 400-700g).
- Weight Distribution: Balanced weight (front-to-back) is often more important than total weight. Battery placement in standalone HMDs (for example rear-mounted) can improve balance.
- Strap Design: Different mechanisms (soft elastic straps, rigid "halo" straps, top straps) distribute pressure differently.
- Facial Interface: Foam or fabric padding, material breathability, light blocking. Options/space for glasses wearers or prescription lens inserts.
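Why balance can matter more than total weight is visible in a crude lever-arm model: the pitching torque on the neck scales with how far the headset's center of mass sits ahead of the head's pivot. All distances below are illustrative, not measured from any real headset.

```python
def neck_torque_nm(mass_kg: float, com_forward_m: float,
                   g: float = 9.81) -> float:
    """Static pitching torque (N*m) about the neck from a headset mass
    whose center of mass sits com_forward_m ahead of the pivot."""
    return mass_kg * g * com_forward_m

# Same 500 g headset, front-heavy (CoM 8 cm forward) vs rebalanced
# (e.g. rear-mounted battery pulling the CoM to 3 cm forward):
print(round(neck_torque_nm(0.5, 0.08), 2))  # 0.39
print(round(neck_torque_nm(0.5, 0.03), 2))  # 0.15
```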
Types of HMDs
HMDs can be broadly categorized based on their functionality and required hardware:
Virtual Reality (VR) HMDs
- See also: Virtual Reality Devices
These devices aim to fully immerse the user in a virtual world, blocking out the real environment.
Discrete HMD (Tethered HMD)
These HMDs contain displays, optics, sensors, and audio, but rely on an external processing unit, typically a powerful PC or a game console, connected via cables (or sometimes a dedicated wireless adapter). They generally offer the highest-fidelity graphics and performance because they can leverage powerful external GPUs.
- PC VR Examples: Valve Index, HTC Vive Pro 2, HP Reverb G2, original Oculus Rift, Oculus Rift S, Varjo Aero, Pimax series.
- Console VR Examples: PlayStation VR, PlayStation VR2 (connects to PlayStation consoles).
Integrated HMD (Standalone HMD)
Also known as All-in-One (AIO) HMDs, these devices contain all necessary components within the headset itself: displays, optics, sensors, processing (CPU/GPU, often based on mobile chipsets like the Qualcomm Snapdragon XR series), memory, storage, battery, and tracking. They require no external PC or console, offering greater freedom of movement and ease of use. Processing power is typically lower than in high-end PC VR setups. Many standalone HMDs can optionally connect to a PC via cable (for example Meta Link) or wirelessly (for example Air Link, Virtual Desktop) to function as a PC VR headset.
- Examples: Meta Quest 2, Meta Quest 3, Meta Quest Pro, Pico 4, Pico Neo 3 Link, HTC Vive Focus 3, HTC Vive XR Elite.
Slide-on HMD (Smartphone HMD)
These were an early, low-cost entry point to VR, consisting of a simple enclosure (often plastic or cardboard) with lenses, into which a compatible smartphone was inserted. The smartphone provided the display, processing, and basic 3DoF tracking (using its internal IMU). While popular initially due to accessibility (for example Google Cardboard, Samsung Gear VR, Google Daydream View), they suffered from limitations like lower display quality, higher latency, potential overheating, limited interaction (often just a single button or touchpad), inconsistent experiences across different phones, and generally only 3DoF tracking. This category is now largely obsolete, superseded by standalone HMDs.
- Examples: Google Cardboard, Samsung Gear VR, Google Daydream View, Zeiss VR One, Merge VR/AR Goggles.
Augmented Reality (AR) / Mixed Reality (MR) HMDs
These devices overlay digital information onto the user's view of the real world, or blend real and virtual views.
Optical head-mounted display (OHMD)
These HMDs use transparent optical elements (like waveguides or beam splitters) placed in front of the user's eyes. A small projector or microdisplay generates the digital image, which is then directed through the optics and combined with the light from the real world, allowing the user to see both simultaneously. Achieving a wide FOV, high brightness, good opacity for virtual objects, and an unobtrusive form factor remains a major challenge. These devices are often targeted at enterprise, industrial, or specific professional use cases due to cost and complexity.
- Examples: Microsoft HoloLens, Microsoft HoloLens 2, Magic Leap 1, Magic Leap 2, Vuzix Blade, RealWear Navigator, Google Glass Enterprise Edition, Nreal Light (now XREAL Light). Simple notification-style displays like original Google Glass also fall under this category.
Video Passthrough AR/MR HMDs
These utilize opaque displays, essentially functioning like VR HMDs, but incorporate outward-facing cameras. The live video feed from these cameras is processed (often correcting for distortion and perspective) and displayed on the internal screens, with digital elements rendered on top. This allows users to see their surroundings digitally, blended with virtual content. Modern implementations increasingly use high-resolution, low-latency, color cameras, aiming to create a more seamless blend ("Mixed Reality"). While not optically transparent, they can offer wider FOV for the AR/MR content compared to many current OHMDs and better occlusion of real objects by virtual ones.
- Examples: Meta Quest Pro, Meta Quest 3, Apple Vision Pro, HTC Vive XR Elite, Varjo XR-3, Lynx R1.[42]
Components of HMDs
While varying significantly based on type and purpose, most modern HMDs incorporate several key components:
- Display Panels & Optics: Generate the visual output and direct it to the eyes (see Key Technical Specifications above).
- Sensors: Detect movement, user input, and sometimes the environment.
  - Inertial Measurement Unit (IMU): Core component for rotational (3DoF) tracking. Essential for low-latency orientation updates.
  - Cameras: Crucial for modern HMDs. Can include:
    - Visible light cameras: Used for inside-out positional (6DoF) tracking, video passthrough (especially color passthrough), environment mapping, hand tracking.
    - Infrared (IR) cameras: Often used for inside-out tracking (less susceptible to ambient light changes), controller tracking (detecting IR LEDs on controllers), and eye tracking.
    - Depth Sensors (for example Time-of-Flight, Structured Light, Active Stereo IR): Used in some AR/MR HMDs (like HoloLens, Vision Pro) for accurate spatial mapping, environment understanding, hand tracking, and occlusion.
  - Eye Tracking Cameras: Small internal IR cameras pointed at the user's eyes to monitor gaze direction and pupil characteristics.
- Processors:
  - CPU/GPU: Handle rendering, tracking calculations, application logic (essential in standalone HMDs, often a mobile SoC like the Snapdragon XR series).
  - Specialized Processors: May include Vision Processing Units (VPUs) or Neural Processing Units (NPUs) to efficiently handle computer vision tasks (SLAM, hand/eye tracking) or AI workloads, offloading the main CPU/GPU. Microsoft's Holographic Processing Unit (HPU) in HoloLens is an example.[43]
- Memory & Storage: RAM for active processing and onboard storage (in standalone HMDs) for the operating system, applications, and media.
- Audio System: Integrated speakers, microphones, headphone jacks (see Key Technical Specifications above).
- Connectivity Hardware: Wi-Fi, Bluetooth radios (especially in standalone HMDs), USB ports, video input ports (in tethered HMDs).
- Power System: Battery (in standalone or wireless HMDs), power regulation circuitry, charging ports (often USB-C).
- Mechanical Structure & Ergonomics: Housing, straps, facial interface, IPD adjustment mechanisms, thermal management (fans, heat sinks).
- Input Mechanisms: How the user interacts with the system.
  - Motion Controllers: Handheld devices tracked in 3D space (usually 6DoF), typically including buttons, triggers, joysticks/touchpads, and haptic feedback. Primary input for most VR systems (for example Meta Touch controllers, Valve Index controllers, PS VR2 Sense controllers).
  - Hand tracking: Camera-based systems that track the user's bare hands and finger movements without requiring controllers. Offers natural interaction but lacks physical buttons and haptic feedback. Increasingly standard on standalone VR/MR headsets (Quest series, Vision Pro).
  - Eye tracking: Used for gaze-based selection, foveated rendering, and social presence.
  - Voice Commands: Using built-in microphones and speech recognition software for hands-free control.
  - Brain-Computer Interfaces (BCI): Experimental interfaces reading neural signals, potentially via electrodes integrated into the HMD, for direct thought control. Still largely in research phases for consumer HMDs (for example NextMind, CTRL-labs research).
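The IMU's role in low-latency orientation tracking typically involves sensor fusion: gyroscope integration is fast but drifts, while the accelerometer's gravity-derived angle is noisy but stable. A minimal single-axis complementary filter sketch (the blend factor, rates, and 500 Hz update rate are all illustrative):

```python
def complementary_filter(pitch_prev: float, gyro_rate_dps: float,
                         accel_pitch_deg: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One filter step: trust the integrated gyro short-term and the
    gravity-derived accelerometer angle long-term (corrects drift)."""
    gyro_pitch = pitch_prev + gyro_rate_dps * dt  # integrate gyro
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch_deg

# Simulate 1 second of a steady 10 deg/s pitch-up at a 500 Hz IMU rate:
pitch, dt = 0.0, 1 / 500
for step in range(500):
    true_pitch = 10.0 * (step + 1) * dt  # ideal accelerometer reading
    pitch = complementary_filter(pitch, 10.0, true_pitch, dt)
print(round(pitch, 1))  # 10.0
```

Production headsets use full 3D formulations (quaternion-based filters such as Madgwick or Kalman variants) plus camera data for 6DoF, but the drift-correction principle is the same.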
Applications
HMDs enable a wide range of applications across various fields:
- Gaming and Entertainment: Immersive video games (Beat Saber, Half-Life: Alyx), virtual cinemas, Social VR platforms (VRChat, Rec Room), location-based VR experiences, Virtual tourism.
- Training and Simulation: Flight simulation, surgical training, military exercises, emergency response training, complex machinery operation training, workplace safety drills.[44]
- Design and Engineering: Computer-Aided Design (CAD) review and collaboration, architectural visualization (virtual walkthroughs), virtual prototyping, ergonomic assessments, Digital twin interaction.[45]
- Telepresence and Collaboration: Virtual meetings with embodied avatars, remote assistance (especially using AR overlays for "see-what-I-see" guidance), shared virtual workspaces (Spatial, Horizon Workrooms).
- Medical: Surgical planning and visualization (overlaying medical imaging onto patients via AR), therapy (for example exposure therapy for phobias/PTSD, pain management/distraction), rehabilitation, medical education.[46]
- Education: Virtual field trips, interactive science experiments (virtual labs), historical reconstructions, immersive language learning.
- Data Visualization: Exploring complex datasets (for example financial data, scientific simulations) in interactive 3D space.
- Military and Aviation: Helmet-mounted displays providing flight data, targeting information, situational awareness overlays, night vision integration (for example IVAS).
Challenges and Limitations
Despite significant progress, HMD technology still faces challenges:
- Visual Fidelity: Achieving resolution and clarity that matches human vision ("retinal resolution" ≈ 60 PPD), wider FOV without distortion or edge artifacts, higher brightness and contrast (especially for outdoor AR), better dynamic range, and eliminating artifacts like screen-door effect, god rays, mura, and motion blur remain ongoing goals.[47]
- Comfort and Ergonomics: Reducing weight, improving balance (counterweights, lighter optics), managing heat dissipation, accommodating prescription glasses comfortably, and finding comfortable, hygienic long-term wear solutions (straps, facial interfaces) are critical for broader adoption.[48]
- Vergence-accommodation conflict: In most current HMDs, the eyes focus (accommodate) at a fixed distance determined by the optics, but converge based on the perceived depth of virtual objects. This mismatch can cause eye strain, fatigue, and inaccurate depth perception.[49] Solutions like varifocal and light field displays are complex and still largely experimental.
- Motion Sickness / Cybersickness: While greatly reduced compared to early systems due to low latency and high refresh rates, discrepancies between visual motion and vestibular input, tracking inaccuracies, or poorly designed software can still induce nausea, dizziness, and discomfort in susceptible individuals.[50][51]
- Tracking Robustness: Inside-out tracking can struggle in poorly lit or overly bright environments, on large featureless surfaces (blank walls), with reflective surfaces (mirrors), or during very fast head/body movements. Outside-in tracking requires external sensor setup and has a limited, fixed tracking volume.
- Content Ecosystem: The availability of high-quality, compelling, and diverse applications and experiences ("killer apps") is crucial for driving HMD adoption beyond early adopters and specific niches.
- Cost: High-end HMDs remain expensive (>$1000), although capable standalone VR headsets have become more affordable (~$300-$500). Advanced AR/MR devices often cost several thousand dollars.
- Social Acceptance: Wearing bulky headsets, especially in public or social settings, remains a significant barrier for AR/MR aiming for all-day use. Privacy concerns related to onboard cameras are also relevant.[52]
- Health and Safety: Long-term effects of prolonged use on vision (especially in children) are still being studied. Physical risks include collision with real-world objects while immersed. Eye safety standards (for example IEC 60825-1 for lasers in depth sensors) must be followed.[53][54] Psychological effects regarding immersion, dissociation, or addiction potential warrant consideration.[55]
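The vergence-accommodation conflict listed above can be quantified in diopters (the reciprocal of distance in meters). A small sketch assuming a fixed 1.5 m optical focal distance, a common but by no means universal design point:

```python
def vac_mismatch_diopters(virtual_distance_m: float,
                          focal_distance_m: float = 1.5) -> float:
    """Vergence-accommodation mismatch in diopters: the eyes converge
    on the virtual object's distance but must focus at the headset's
    fixed optical distance (assumed 1.5 m here)."""
    return abs(1 / virtual_distance_m - 1 / focal_distance_m)

# An object rendered at arm's length (0.4 m) on 1.5 m optics:
print(round(vac_mismatch_diopters(0.4), 2))  # 1.83
# Mismatches well under 1 D are generally reported as more comfortable,
# which is why close-up UI is a common source of eye strain.
```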
Future Trends and Developments
The HMD landscape continues to evolve rapidly, with several promising developments on the horizon:
Display Advancements
- Varifocal Displays: Systems that dynamically adjust focal depth based on where the user is looking (using eye tracking) or based on scene content, addressing the vergence-accommodation conflict. Technologies include movable lenses/displays, liquid crystal lenses, Alvarez lenses, and multi-focal plane displays.[56]
- Light Field Displays: Generate a more complete representation of light, allowing the eye to focus naturally at different depths within the virtual scene. Still complex and computationally intensive.[57]
- Holographic Displays: Aim to reconstruct the wavefront of light from a 3D scene, potentially offering the most natural 3D viewing experience without conflicts. True holographic HMDs are still highly experimental.
- Higher resolution, brightness, efficiency via Micro-OLED and MicroLED maturation.
Form Factor Evolution
- Lightweight Designs: Advanced optics (pancake, metalenses, HOEs) and display technologies are enabling significantly thinner, lighter headsets (sub-300g or even sub-100g).
- AR Glasses: The long-term goal for AR is achieving normal eyeglass form factors with all-day wearability and significant compute/display capabilities. Projects like Project Aria (Meta research) and rumored Apple glasses point toward this future.[58]
Sensory Expansion
- Advanced Haptic Feedback: Beyond simple controller rumble, providing more nuanced tactile sensations via gloves (HaptX), bodysuits (bHaptics), ultrasound (Ultraleap), or other actuators to simulate touch, texture, and impact.
- Olfactory Displays: Systems that generate scents synchronized with virtual environments to enhance immersion (for example OVR Technology).
- Full-body Tracking: Moving beyond head and hands to track limb and torso movements for more complete avatar embodiment, using external trackers (Vive Tracker), webcam-based AI solutions, or integrated sensors.
Computational Capabilities
- Edge/Cloud Computing: Offloading demanding processing (rendering, AI) from standalone HMDs to nearby edge servers or the cloud to enable higher fidelity experiences while maintaining mobility (for example NVIDIA CloudXR, Plutosphere).[59]
- AI Integration: On-device AI for smarter environment understanding, more robust hand/eye tracking, predictive rendering, personalized experiences, intelligent virtual agents, and natural language interaction.
Interfaces
- Improved Hand/Eye/Face Tracking: Higher fidelity tracking of expressions and micro-movements for more realistic avatars and nuanced control.
- Neural Interfaces: Non-invasive BCIs (for example EMG wristbands, EEG sensors) may offer supplementary input channels in the future.
Market Outlook
Market analysis firms like IDC report fluctuating but generally growing shipments of AR/VR headsets. For instance, IDC forecasted global shipments to reach 9.1 million units in 2024, with projected growth to 22.9 million by 2028, driven significantly by the adoption of mixed-reality capable devices and maturing technology in both consumer and enterprise segments.[2]
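The growth rate implied by these two figures can be checked with a quick compound-annual-growth-rate calculation:

```python
def implied_cagr(start_units: float, end_units: float,
                 years: int) -> float:
    """Compound annual growth rate implied by two shipment figures."""
    return (end_units / start_units) ** (1 / years) - 1

# IDC's cited figures: 9.1M units (2024) growing to 22.9M (2028)
growth = implied_cagr(9.1, 22.9, 4)
print(f"{growth:.1%}")  # roughly 26% compounded annually
```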
See Also
- Augmented Reality (AR)
- Binocular vision
- Brain-Computer Interface
- Degrees of Freedom (DoF)
- Display technology
- Eye tracking
- Foveated rendering
- Field of View (FOV)
- Hand tracking
- Haptic technology
- Helmet-mounted display
- Immersion (virtual reality)
- Inside-out tracking
- Latency (engineering)
- Mixed Reality (MR)
- Motion Controllers
- Motion Sickness
- Optical head-mounted display (OHMD)
- Outside-in tracking
- Pancake lens
- Presence (virtual reality)
- Smartglasses
- Spatial Audio
- Stereoscopy
- Tracking (VR/AR context)
- Vergence-accommodation conflict
- Video passthrough
- Virtual Reality (VR)
- Virtual Reality Devices
- Waveguide (optics)
References
- ↑ Sutherland, Ivan E. (1968-12-09). "A head-mounted three dimensional display". ACM Digital Library. Retrieved 2023-10-27. https://dl.acm.org/doi/10.1145/1476589.1476686
- ↑ IDC (25 March 2025). "Growth Expected to Pause for AR/VR Headsets, according to IDC". Retrieved 2025-05-15. https://www.idc.com/getdoc.jsp?containerId=prUS53278025
- ↑ Heinrich, Jerome (assignee: Google Inc.) (2014-07-29). "Wearable display device". Google Patents. Retrieved 2023-10-27. https://patents.google.com/patent/US8791879B1/en
- ↑ Kress, Bernard C. & Starner, Thad (2018-11-01). "Optical see-through head-mounted displays: a review". Applied Optics. 57 (31): 9311-9325. doi:10.1364/AO.57.009311. Retrieved 2023-10-27. https://www.osapublishing.org/ao/abstract.cfm?uri=ao-57-31-9311
- ↑ Slater, Mel & Sanchez-Vives, Maria V. (2016). "Chapter 1: Immersive Virtual Reality". Enhancing Our Lives with Immersive Virtual Reality. Elsevier. ISBN 978-0128046377.
- ↑ Heilig, Morton L. (1960-10-04). "Stereoscopic-television apparatus for individual use". Google Patents. Retrieved 2023-10-27. https://patents.google.com/patent/US2955156A/en
- ↑ USPTO (28 June 1960). "US 2,955,156 — Stereoscopic-television apparatus for individual use". Retrieved 2024-05-15. https://patents.google.com/patent/US2955156A
- ↑ Wikipedia (20 April 2025). "The Sword of Damocles". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/The_Sword_of_Damocles_(virtual_reality)
- ↑ Fisher, S. S.; McGreevy, M.; Humphries, J.; Robinett, W. (1986-01-01). "Virtual Environment Display System". NASA Technical Reports Server. Retrieved 2023-10-27. https://ntrs.nasa.gov/citations/19860018487
- ↑ Rheingold, Howard (1991). Virtual Reality. Simon & Schuster. ISBN 978-0671693633.
- ↑ Edwards, Benj (2015-08-21). "Unraveling The Enigma Of Nintendo’s Virtual Boy, 20 Years Later". Fast Company. Retrieved 2023-10-27. https://www.fastcompany.com/3050016/unraveling-the-enigma-of-nintendos-virtual-boy-20-years-later
- ↑ Kickstarter. "Oculus Rift: Step Into the Game". Retrieved 2023-10-27. https://www.kickstarter.com/projects/1523379957/oculus-rift-step-into-the-game
- ↑ Time Magazine (28 Jan 2016). "Google’s New Head of Virtual Reality on What They’re Planning Next". Retrieved 2024-05-15. https://time.com/4193755/google-cardboard-virtual-reality-clay-bavor-vr/
- ↑ Wikipedia (24 Apr 2025). "Samsung Gear VR". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/Samsung_Gear_VR
- ↑ Wikipedia (15 Apr 2025). "Oculus Rift". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/Oculus_Rift
- ↑ Valve Software (12 Feb 2025). "Valve Index Base Stations". Retrieved 2024-05-15. https://www.valvesoftware.com/index/base-stations
- ↑ Meta Reality Labs (14 Aug 2019). "The Story Behind Oculus Insight Technology". Retrieved 2024-05-15. https://tech.facebook.com/reality-labs/2019/8/the-story-behind-oculus-insight-technology/
- ↑ Meta Newsroom (11 October 2022). "Meta Connect 2022: Meta Quest Pro, More Social VR and a Look Into the Future". https://about.fb.com/news/2022/10/meta-quest-pro-social-vr-connect-2022/
- ↑ Road to VR (12 June 2023). "Quest 3 Features Confirmed in First Hands-on". https://www.roadtovr.com/quest-3-features-hands-on-preview/
- ↑ UploadVR (16 October 2023). "Quest 3 Review: Excellent VR With Limited Mixed Reality". https://www.uploadvr.com/quest-3-review/
- ↑ Apple Newsroom (08 Jan 2024). "Apple Vision Pro available in the U.S. on February 2". Retrieved 2024-05-15. https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/
- ↑ Oculus Developer Documentation. "Distortion Correction". Retrieved 2023-10-27. https://developer.oculus.com/documentation/native/pc/dg-render-distortion/
- ↑ Howard, Ian P. & Rogers, Brian J. (1995). Binocular Vision and Stereopsis. Oxford University Press. ISBN 978-0195084764.
- ↑ LaViola Jr., Joseph J. (2000). "A discussion of cybersickness in virtual environments". ACM SIGCHI Bulletin. 32 (1): 47-56. doi:10.1145/333329.333033. Retrieved 2023-10-27. https://ieeexplore.ieee.org/document/947376
- ↑ Pell, Oliver (2017-07-12). "Use of IMU in Virtual Reality Systems". Analog Dialogue, Analog Devices. Retrieved 2023-10-27. https://www.analog.com/en/technical-articles/imu-in-virtual-reality-systems.html
- ↑ Wikipedia (20 Apr 2025). "Inertial measurement unit". Retrieved 2024-05-15. https://en.wikipedia.org/wiki/Inertial_measurement_unit
- ↑ XinReality Wiki. "Lighthouse". Retrieved 2023-10-27. https://xinreality.com/wiki/Lighthouse
- ↑ Yousif, K.; Bab-Hadiashar, A.; Hand, S. (2019-07-30). "A Review on SLAM Techniques for Virtual and Augmented Reality Applications". Sensors. 19 (15): 3338. doi:10.3390/s19153338. PMC 6696193. Retrieved 2023-10-27. https://www.mdpi.com/1424-8220/19/15/3338
- ↑ Abrash, Michael (2014-01-15). "What VR could, should, and almost certainly will be within two years". Steam Dev Days. Retrieved 2024-05-15. https://www.youtube.com/watch?v=G-2dQoeqVVo
- ↑ MDPI Sensors (10 Aug 2022). "A Study on Sensor System Latency in VR Motion Sickness". Retrieved 2024-05-15. https://www.mdpi.com/2224-2708/10/3/53
- ↑ AR/VR Tips. "LCD vs OLED VR Headsets: Which Screen is Best?". Retrieved 2023-10-27. https://arvrtips.com/lcd-vs-oled-vr-headsets/
- ↑ OLED-Info. "MicroOLED displays". Retrieved 2023-10-27. https://www.oled-info.com/microoled
- ↑ Systems Contractor News (23 Apr 2025). "Dual micro-OLED displays grow within the AR/VR headset market". Retrieved 2024-05-15. https://www.svconline.com/proav-today/dual-micro-oled-displays-grow-within-the-ar-vr-headset-market
- ↑ Abrash, Michael (2014-07-28). "Understanding Low Persistence on the DK2". Oculus Developer Blog. Retrieved 2023-10-27. https://developer.oculus.com/blog/understanding-low-persistence-on-the-dk2/
- ↑ VR Compare. "Headset Feature: Field of View". Retrieved 2023-10-27. https://vr-compare.com/headsetfeature/fieldofview
- ↑ Guttag, Karl (2021-12-09). "VR Optics (Part 1) - Brief History and Pancake Lenses". KGOnTech. Retrieved 2023-10-27. https://kguttag.com/2021/12/09/vr-optics-part-1-brief-history-and-pancake-lenses/
- ↑ Expand Reality (05 Oct 2023). "Pancake vs Fresnel Lenses in VR Headsets". Retrieved 2024-05-15. https://landing.expandreality.io/pancake-vs.-fresnel-lenses-in-vr-headsets-advanced-optics-for-vr
- ↑ Radiant Vision Systems (11 Jan 2022). "Ride the Wave: AR Devices Rely on Waveguides". Retrieved 2024-05-15. https://www.radiantvisionsystems.com/blog/ride-wave-augmented-reality-devices-rely-waveguides
- ↑ Guttag, Karl (2019-04-01). "HoloLens 2 (HL2) and AR Optics in General (Part 1)". KGOnTech. Retrieved 2023-10-27. https://kguttag.com/2019/04/01/hololens-2-hl2-and-ar-optics-in-general-part-1/
- ↑ NVIDIA Developer. "NVIDIA Variable Rate Shading (VRS) & Foveated Rendering". Retrieved 2023-10-27. https://developer.nvidia.com/vrworks/graphics/foveatedrendering
- ↑ Heaney, David (2022-01-20). "Wireless PC VR Comparison: Air Link vs Virtual Desktop vs Vive Wireless". UploadVR. Retrieved 2023-10-27. https://uploadvr.com/wireless-pc-vr-comparison/
- ↑ Lang, Ben (2023-02-15). "VR Headset Passthrough AR Explained". Road to VR. Retrieved 2023-10-27. https://www.roadtovr.com/vr-headset-passthrough-ar-explained-quest-2-pro-index-vive-pro-2/
- ↑ Microsoft Learn. "HoloLens 2 hardware". Retrieved 2023-10-27. https://learn.microsoft.com/en-us/hololens/hololens2-hardware
- ↑ Freina, Laura & Ott, Michela (2015). "Virtual reality for simulation and training". E-Learning and Digital Media. 12 (3-4): 368-383. doi:10.1177/2042753015591756. Retrieved 2023-10-27. https://link.springer.com/article/10.1007/s10055-016-0293-5
- ↑ Autodesk. "Virtual Reality in Design and Manufacturing". Retrieved 2023-10-27. https://www.autodesk.com/solutions/virtual-reality
- ↑ Rizzo, Albert "Skip" & Kim, Giyoung (2019-11-15). "Applications of Virtual Reality for Clinical Neuropsychology: A Review". Journal of Medical Internet Research. 21 (11): e14190. doi:10.2196/14190. PMID 31730019. PMC 6880643. Retrieved 2023-10-27. https://jmir.org/2019/11/e14190/
- ↑ Kim, J.; Jeong, Y.; Stengel, M.; et al. (2019). "Foveated AR: dynamically-foveated augmented reality display". ACM Transactions on Graphics. 38 (4): 1-15. doi:10.1145/3306346.3322983.
- ↑ Talsma, S. W.; Usmani, S. A.; Chen, P. Y. (2020). "Critical factors in comfort, cognitive load, and performance for consumer head-mounted displays". Journal of the Society for Information Display. 28 (11): 841-850. doi:10.1002/jsid.943.
- ↑ Hoffman, David M.; Girshick, Ahna R.; Akeley, Kurt; Banks, Martin S. (2008-03-18). "The vergence-accommodation conflict: Practical consequences and solutions". Journal of Vision. 8 (3): 33. doi:10.1167/8.3.33. Retrieved 2023-10-27. https://jov.arvojournals.org/article.aspx?articleid=2193631
- ↑ Weech, S.; Kenny, S.; Barnett-Cowan, M. (2019). "Presence and cybersickness in virtual reality are negatively related: a review". Frontiers in Psychology. 10: 158. doi:10.3389/fpsyg.2019.00158. PMID 30787884. PMC 6374254.
- ↑ Frontiers in Virtual Reality (20 Mar 2020). "Factors Associated With Virtual Reality Sickness in Head-Mounted Displays". Retrieved 2024-05-15. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7145389/
- ↑ Koelle, M.; Ananthanarayan, S.; Boll, S. (2020). "Social acceptability in HCI: A survey of methods, measures, and design strategies". Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems: 1-19. doi:10.1145/3313831.3376101.
- ↑ Turnbull, P. R. & Phillips, J. R. (2017). "Ocular effects of virtual reality headset wear in young adults". Scientific Reports. 7 (1): 1-11. doi:10.1038/s41598-017-14811-x.
- ↑ Laser Institute of America (02 Dec 2023). "ANSI Z136.1 — Safe Use of Lasers". Retrieved 2024-05-15. https://www.lia.org/resources/laser-safety-standards/ansi-z1361-safe-use-lasers
- ↑ Madary, M. & Metzinger, T. K. (2016). "Real virtuality: A code of ethical conduct. Recommendations for good scientific practice and the consumers of VR-technology". Frontiers in Robotics and AI. 3: 3. doi:10.3389/frobt.2016.00003.
- ↑ Huang, F. C.; Chen, K.; Wetzstein, G. (2015). "The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues". ACM Transactions on Graphics. 34 (4): 1-12. doi:10.1145/2766949.
- ↑ Delaney, K. (2021). "The race toward human-centered AR glasses". IEEE Computer Graphics and Applications. 41 (5): 112-115. doi:10.1109/MCG.2021.3097740.
- ↑ Liu, L.; Li, H.; Gruteser, M. (2019). "Edge assisted real-time object detection for mobile augmented reality". Proceedings of the 25th Annual International Conference on Mobile Computing and Networking: 1-16. doi:10.1145/3300061.3345431.