{{DISPLAYTITLE:Head-mounted display}}
{{short description|Near-eye wearable display for virtual, augmented and mixed reality}}
{{Good article}}
A '''head-mounted display''' ('''HMD''') is a [[display]] [[device]], worn on the head or as part of a [[helmet]] (see [[Helmet-mounted display]]), that has a small display optic in front of one ([[monocular]] HMD) or each eye ([[binocular]] HMD). HMDs serve various purposes in gaming, aviation, engineering, and medicine, and are the primary delivery systems for [[Virtual Reality]] (VR), [[Augmented Reality]] (AR), and [[Mixed Reality]] (MR) experiences, particularly when a seamless blend of physical and digital elements is required.<ref name="Sutherland1968">{{cite web |url=https://dl.acm.org/doi/10.1145/1476589.1476686 |title=A head-mounted three dimensional display |last=Sutherland |first=Ivan E. |date=1968-12-09 |website=ACM Digital Library |access-date=2023-10-27}}</ref><ref name="idc2025">{{cite web|url=https://www.idc.com/getdoc.jsp?containerId=prUS53278025|title=Growth Expected to Pause for AR/VR Headsets, according to IDC|publisher=IDC|date=25 March 2025}}</ref>
HMDs function by presenting imagery, data, or a combination thereof directly to the wearer's visual field. Many modern HMDs are [[stereoscopic]], featuring separate displays or distinct images rendered for each eye to create a sense of depth through [[binocular disparity]]. Examples include VR headsets like the [[Meta Quest 3]] and [[Valve Index]]. Other HMDs, particularly earlier AR devices or specialized notification displays like the original [[Google Glass]], may be monocular, presenting information over only one eye.<ref name="GoogleGlassPatent">{{cite web |url=https://patents.google.com/patent/US8791879B1/en |title=Wearable display device |last=Heinrich |first=Jerome |assignee=Google Inc. |date=2014-07-29 |website=Google Patents |access-date=2023-10-27}}</ref>
==History==
The concept of a head-mounted display dates back further than often realized. One of the earliest precursors was Morton Heilig's "Telesphere Mask" patented in 1960, a non-computerized, photographic-based stereoscopic viewing device intended for individual use.<ref name="HeiligPatent">{{cite web |url=https://patents.google.com/patent/US2955156A/en |title=Stereoscopic-television apparatus for individual use |last=Heilig |first=Morton L. |date=1960-10-04 |website=Google Patents |access-date=2023-10-27}}</ref><ref name="heilig1960">{{cite web|url=https://patents.google.com/patent/US2955156A|title=US 2,955,156 — Stereoscopic-television apparatus for individual use|publisher=USPTO|date=28 June 1960}}</ref>
However, the first true HMD connected to a computer is widely credited to [[Ivan Sutherland]] and his student Bob Sproull at Harvard University and later the University of Utah, around 1968. Dubbed the "[[Sword of Damocles]]" due to its imposing size and the heavy machinery suspended from the ceiling required to support its weight and track head movement, it presented simple wireframe graphics in stereo. This system pioneered many concepts still fundamental to VR and AR today, including head tracking and stereoscopic viewing.<ref name="Sutherland1968"/><ref name="sutherland1968">{{cite web|url=https://en.wikipedia.org/wiki/The_Sword_of_Damocles_(virtual_reality)|title=The Sword of Damocles|publisher=Wikipedia|date=20 April 2025}}</ref>
Throughout the 1970s and 1980s, HMD development continued primarily in military (especially for aviator helmet-mounted displays) and academic research labs, driven by organizations like the US Air Force and [[NASA]].<ref name="NASA_HMD">{{cite web |url=https://ntrs.nasa.gov/citations/19860018487 |title=Virtual Environment Display System |last=Fisher |first=S. S. |last2=McGreevy |first2=M. |last3=Humphries |first3=J. |last4=Robinett |first4=W. |date=1986-01-01 |website=NASA Technical Reports Server |access-date=2023-10-27}}</ref> The late 1980s and early 1990s saw a "first wave" of commercial VR interest, with companies like VPL Research, founded by [[Jaron Lanier]], popularizing the term "Virtual Reality" and developing HMDs like the "[[EyePhone]]". However, technology limitations (low [[resolution]], high [[latency]], limited processing power, high cost) prevented widespread adoption.<ref name="RheingoldVR">{{cite book |last=Rheingold |first=Howard |title=Virtual Reality |publisher=Simon & Schuster |year=1991 |isbn=978-0671693633}}</ref> Nintendo's [[Virtual Boy]] (1995), while technically an HMD, used red LED displays and lacked head tracking, failing commercially but remaining a notable early attempt at consumer VR.<ref name="VirtualBoyHistory">{{cite web |url=https://www.fastcompany.com/3050016/unraveling-the-enigma-of-nintendos-virtual-boy-20-years-later |title=Unraveling The Enigma Of Nintendo’s Virtual Boy, 20 Years Later |last=Edwards |first=Benj |date=2015-08-21 |website=Fast Company |access-date=2023-10-27}}</ref>
The modern era of consumer VR HMDs was effectively kickstarted by [[Palmer Luckey]]'s prototype [[Oculus Rift]] in the early 2010s, which demonstrated that high-quality, low-latency VR was becoming feasible with modern mobile display panels and [[sensor]]s. Its subsequent Kickstarter success and acquisition by [[Facebook]] (now [[Meta Platforms|Meta]]) spurred renewed industry-wide investment.<ref name="OculusKickstarter">{{cite web |url=https://www.kickstarter.com/projects/1523379957/oculus-rift-step-into-the-game |title=Oculus Rift: Step Into the Game |website=Kickstarter |access-date=2023-10-27}}</ref> This led to the release of numerous consumer HMDs:
* 2014 – [[Google Cardboard]] popularized low-cost, smartphone-driven VR viewers.<ref name="cardboard2014">{{cite web|url=https://time.com/4193755/google-cardboard-virtual-reality-clay-bavor-vr/|title=Google’s New Head of Virtual Reality on What They’re Planning Next|publisher=Time Magazine|date=28 Jan 2016}}</ref>
* 2015 – [[Samsung Gear VR]] improved on the smartphone HMD concept with better optics and integrated controls.<ref name="gearvr2015">{{cite web|url=https://en.wikipedia.org/wiki/Samsung_Gear_VR|title=Samsung Gear VR|publisher=Wikipedia|date=24 Apr 2025}}</ref>
* 2016 – The consumer [[Oculus Rift]] (CV1) and [[HTC Vive]] established high-end, PC-tethered VR with wide FOV and robust external tracking systems.<ref name="rift2016">{{cite web|url=https://en.wikipedia.org/wiki/Oculus_Rift|title=Oculus Rift|publisher=Wikipedia|date=15 Apr 2025}}</ref><ref name="lighthouse">{{cite web|url=https://www.valvesoftware.com/index/base-stations|title=Valve Index Base Stations|publisher=Valve Software|date=12 Feb 2025}}</ref> [[PlayStation VR]] brought tethered VR to the console market.
* 2019 – [[Oculus Quest]] pioneered high-quality standalone (untethered) 6DoF VR using inside-out camera tracking (Oculus Insight).<ref name="insight2019">{{cite web|url=https://tech.facebook.com/reality-labs/2019/8/the-story-behind-oculus-insight-technology/|title=The Story Behind Oculus Insight Technology|publisher=Meta Reality Labs|date=14 Aug 2019}}</ref> [[Valve Index]] pushed fidelity in the PC VR space.
* 2023 – [[Meta Quest 3]] debuted [[Pancake lens|pancake lenses]], high-resolution color passthrough MR, and improved processing.<ref name="quest3">{{cite web|url=https://en.wikipedia.org/wiki/Meta_Quest_3|title=Meta Quest 3|publisher=Wikipedia|date=18 Apr 2025}}</ref> [[PlayStation VR2]] integrated eye tracking into the console VR experience.
* 2024 – [[Apple Vision Pro]] launched as a premium "spatial computer" featuring high-resolution Micro-OLED displays, advanced eye and hand tracking, and spatial video capabilities.<ref name="visionpro">{{cite web|url=https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/|title=Apple Vision Pro available in the U.S. on February 2|publisher=Apple Newsroom|date=08 Jan 2024}}</ref>
==Core Concepts and Principles==
* Focus: They collimate the light or set the focal plane, typically at a distance of 1.5-2 meters or optical infinity, reducing eye strain compared to focusing on a screen inches away.
* Distortion Correction: Simple magnification often introduces optical distortion (like pincushion distortion). The rendered image is typically pre-distorted (barrel distortion) in software to counteract the lens distortion, resulting in a geometrically correct view for the user (see the sketch after this list).<ref name="LensDistortionVR">{{cite web |url=https://developer.oculus.com/documentation/native/pc/dg-render-distortion/ |title=Distortion Correction |website=Oculus Developer Documentation |access-date=2023-10-27}}</ref>
4. '''Eyes''': The light (photons) carrying the image information passes through the lenses and enters the user's pupils, forming an image on the retina.
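The pre-distortion step can be illustrated with a minimal, simplified sketch. The radial polynomial and coefficients below are illustrative placeholders rather than any particular headset's calibration; real compositors obtain per-device (and per-color-channel) coefficients from the vendor's runtime.

<syntaxhighlight lang="python">
# Illustrative distortion coefficients; real values come from per-device calibration.
K1, K2 = 0.22, 0.24

def barrel_predistort(u, v, center=(0.5, 0.5)):
    """Map an output-texture coordinate to the coordinate to sample in the
    rendered eye image, applying the barrel pre-warp that cancels the
    lens's pincushion distortion."""
    du, dv = u - center[0], v - center[1]
    r2 = du * du + dv * dv                 # squared radius from the lens center
    scale = 1.0 + K1 * r2 + K2 * r2 * r2   # radial polynomial
    return center[0] + du * scale, center[1] + dv * scale

# Example: a point near the edge samples further out, which shrinks the
# rendered image toward the center (barrel distortion).
print(barrel_predistort(0.9, 0.5))
</syntaxhighlight>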
===Stereoscopic Vision===
* [[Accelerometer]]: Measures linear acceleration (and gravity).
* [[Gyroscope]]: Measures angular velocity.
* [[Magnetometer]]: Measures the local magnetic field (like a compass), used to correct for gyroscope drift, especially in yaw. [[Sensor fusion]] algorithms combine data from these sensors to provide a stable orientation estimate (see the sketch after this list).<ref name="IMU_VR">{{cite web |url=https://www.analog.com/en/technical-articles/imu-in-virtual-reality-systems.html |title=Use of IMU in Virtual Reality Systems |last=Pell |first=Oliver |date=2017-07-12 |website=Analog Dialogue |publisher=Analog Devices |access-date=2023-10-27}}</ref><ref name="imu">{{cite web|url=https://en.wikipedia.org/wiki/Inertial_measurement_unit|title=Inertial measurement unit|publisher=Wikipedia|date=20 Apr 2025}}</ref>
* '''[[Positional Tracking]] (6DoF)''': Tracks both orientation (3DoF) and translation (movement through space: forward/backward, left/right, up/down). This allows the user to physically walk around, lean, crouch, and dodge within the virtual environment, significantly enhancing immersion and interaction. 6DoF tracking is achieved through various methods:
* '''[[Outside-in tracking]]''': External sensors (cameras or infrared emitters/detectors like [[Lighthouse (tracking system)|Valve's Lighthouse system]]) are placed in the room to track markers (passive reflective or active IR LED) on the HMD and controllers. Examples: Original Oculus Rift (Constellation), HTC Vive/Valve Index (Lighthouse).<ref name="LighthouseExplained">{{cite web |url=https://xinreality.com/wiki/Lighthouse |title=Lighthouse |website=XinReality Wiki |access-date=2023-10-27}}</ref><ref name="lighthouse" />
* '''[[Inside-out tracking]]''': Cameras mounted on the HMD itself observe the surrounding environment. Computer vision algorithms, often employing [[Simultaneous Localization and Mapping]] (SLAM) techniques, identify features in the room and track the HMD's movement relative to them. This eliminates the need for external sensors, making setup easier and enabling larger, unrestricted tracking volumes. Most modern standalone and many tethered HMDs use inside-out tracking. Examples: Meta Quest series, HTC Vive Cosmos, Windows Mixed Reality headsets.<ref name="SLAM_VRAR">{{cite journal |url=https://www.mdpi.com/1424-8220/19/15/3338 |title=A Review on SLAM Techniques for Virtual and Augmented Reality Applications |last1=Yousif |first1=K |last2=Bab-Hadiashar |first2=A |last3=Hand |first3=S |date=2019-07-30 |journal=Sensors |volume=19 |issue=15 |pages=3338 |doi=10.3390/s19153338 |pmc=6696193 |access-date=2023-10-27}}</ref><ref name="insight2019" />
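As a rough illustration of the sensor fusion mentioned above, the sketch below blends integrated gyroscope data with an accelerometer-derived tilt estimate using a simple complementary filter for a single axis. The blend factor and sample rate are arbitrary; production HMDs use more sophisticated filters (Kalman- or Madgwick/Mahony-style) across all three axes plus the magnetometer.

<syntaxhighlight lang="python">
import math

ALPHA = 0.98       # trust the gyro short-term, the accelerometer long-term
DT = 1.0 / 1000.0  # illustrative IMU sample period (1 kHz)

def fuse_pitch(pitch, gyro_rate, accel_y, accel_z):
    """One complementary-filter step for the pitch angle (radians).
    gyro_rate: angular velocity about the pitch axis (rad/s).
    accel_y, accel_z: accelerometer components used to estimate tilt from gravity."""
    gyro_pitch = pitch + gyro_rate * DT          # integrate angular velocity (drifts over time)
    accel_pitch = math.atan2(accel_y, accel_z)   # gravity direction gives an absolute tilt reference
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch

# Example: a slightly biased gyro is continually pulled back toward the accelerometer reference.
pitch = 0.0
for _ in range(1000):
    pitch = fuse_pitch(pitch, gyro_rate=0.01, accel_y=0.0, accel_z=9.81)
print(round(pitch, 4))
</syntaxhighlight>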
===Latency===
[[Motion-to-photon latency]] – the time delay between a user's physical movement and the corresponding visual update on the display – is a critical factor for comfort and immersion. High latency is strongly correlated with cybersickness. Modern VR systems aim for latency below 20 milliseconds (ms), with many achieving closer to 10 ms under optimal conditions.<ref name="AbrashMTP">{{cite web |url=https://www.youtube.com/watch?v=G-2dQoeqVVo |title=What VR could, should, and almost certainly will be within two years |last=Abrash |first=Michael |publisher=Steam Dev Days |date=2014-01-15 |access-date=2024-05-15}}</ref><ref name="latency2022">{{cite web|url=https://www.mdpi.com/2224-2708/10/3/53|title=A Study on Sensor System Latency in VR Motion Sickness|publisher=MDPI Sensors|date=10 Aug 2022}}</ref>
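A common way to hide much of this latency is to predict the head pose a few milliseconds ahead, so the frame reaching the display roughly matches where the head will be when its photons are emitted. The sketch below shows the idea for a single rotation axis with made-up numbers; real compositors predict the full 3D pose and refine it again just before scan-out (timewarp/reprojection).

<syntaxhighlight lang="python">
def predict_yaw(yaw_deg, yaw_rate_deg_s, motion_to_photon_s):
    """Extrapolate head yaw to the expected display time (constant-rate model)."""
    return yaw_deg + yaw_rate_deg_s * motion_to_photon_s

# Head turning at 200 deg/s with ~15 ms motion-to-photon latency:
# rendering for the *current* pose would lag by roughly 3 degrees.
current = 30.0
predicted = predict_yaw(current, 200.0, 0.015)
print(predicted - current)   # ~3.0 degrees of error avoided by prediction
</syntaxhighlight>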
==Key Technical Specifications==
* '''Display Technology''': The type of display panel used significantly impacts image quality. Common types include:
* [[LCD]] (Liquid Crystal Display): Often offers higher pixel density (reducing the [[screen-door effect]]) and potentially lower cost, but may have slower response times and lower contrast compared to OLED. Modern LCDs in VR often use fast-switching technologies and [[Quantum dot display|quantum dots]] for better color.<ref name="LCDvsOLED_VR">{{cite web |url=https://arvrtips.com/lcd-vs-oled-vr-headsets/ |title=LCD vs OLED VR Headsets: Which Screen is Best? |website=AR/VR Tips |access-date=2023-10-27}}</ref>
* [[OLED]] (Organic Light-Emitting Diode): Provides true blacks (infinite contrast ratio), vibrant colors, and very fast pixel response times (reducing motion blur or ghosting). Can be more susceptible to "[[Screen burn-in|burn-in]]" over long periods and may use [[PenTile matrix family|PenTile]] [[Subpixel rendering|subpixel layouts]] affecting perceived sharpness.
* [[Micro-OLED]] / [[OLEDoS]] (OLED-on-Silicon): Very small, high-resolution OLED displays built directly onto silicon wafers. Offer extremely high pixel densities (PPD) and brightness, often used in high-end or compact HMDs (e.g., [[Bigscreen Beyond]], [[Apple Vision Pro]]).<ref name="MicroOLED_Intro">{{cite web |url=https://www.oled-info.com/microoled |title=MicroOLED displays |website=OLED-Info |access-date=2023-10-27}}</ref><ref name="microoled2025">{{cite web|url=https://www.svconline.com/proav-today/dual-micro-oled-displays-grow-within-the-ar-vr-headset-market|title=Dual micro-OLED displays grow within the AR/VR headset market|publisher=Systems Contractor News|date=23 Apr 2025}}</ref>
* [[MicroLED]]: An emerging technology promising high brightness, efficiency, contrast, and longevity, potentially surpassing both LCD and OLED for HMDs.
* '''[[Refresh Rate]]''': The number of times per second the display updates the image, measured in Hertz (Hz). Higher refresh rates (e.g., 90Hz, 120Hz, 144Hz) lead to smoother motion, reduced flicker, and can help mitigate motion sickness. Low persistence displays (where pixels are | * '''[[Resolution]]''': The number of [[pixel]]s on the display(s), usually specified per eye (e.g., 2064 x 2208 per eye for Meta Quest 3) or sometimes as a total resolution. Higher resolution reduces the [[screen-door effect]] (the visible grid pattern between pixels) and increases image sharpness. [[Pixels Per Degree]] (PPD) is often a more perceptually relevant metric, combining resolution and FOV. Human visual acuity corresponds to roughly 60 PPD; current consumer VR is typically in the 20-35 PPD range, while high-end devices like Vision Pro exceed 40 PPD centrally. | ||
* '''[[Refresh Rate]]''': The number of times per second the display updates the image, measured in Hertz (Hz). Higher refresh rates (e.g., 90Hz, 120Hz, 144Hz) lead to smoother motion, reduced flicker, and can help mitigate motion sickness. 90Hz is often considered a comfortable minimum for VR. Low persistence displays (where pixels are illuminated only for a fraction of the refresh cycle) are crucial in VR to reduce motion blur during head movements.<ref name="LowPersistence">{{cite web |url=https://developer.oculus.com/blog/understanding-low-persistence-on-the-dk2/ |title=Understanding Low Persistence on the DK2 |last=Abrash |first=Michael |date=2014-07-28 |website=Oculus Developer Blog |access-date=2023-10-27}}</ref>
* '''[[Field of View]] (FOV)''': The extent of the visual field visible through the HMD, usually measured horizontally, vertically, and/or diagonally in degrees. Human binocular vision covers roughly 200-220° horizontally (with ~120° stereoscopic overlap). VR HMDs aim for a wide FOV (typically 100°-110° horizontally for consumer devices, sometimes wider like [[Pimax]] headsets) to enhance immersion. AR OHMDs often have a much narrower FOV (e.g., 30°-55°) due to the challenges of see-through optics.<ref name="VR_FOV_Comparison">{{cite web |url=https://vr-compare.com/headsetfeature/fieldofview |title=Headset Feature: Field of View |website=VR Compare |access-date=2023-10-27}}</ref>
* '''Optics / [[Lens|Lenses]]''': The lenses used heavily influence FOV, image sharpness (center-to-edge), [[Chromatic aberration|chromatic aberration]], geometric distortion, and physical characteristics like size and weight.
* [[Aspheric lens|Aspheric Lenses]]: Simple, often used in early or budget HMDs. Can be bulky.
* [[Fresnel lens|Fresnel Lenses]]: Use concentric rings to reduce thickness and weight compared to simple aspheric lenses while maintaining a short focal length. Common in many VR HMDs (e.g., Rift CV1, Vive, Quest 2), but can introduce visual artifacts like concentric rings and "[[God rays]]" (stray light scattering off the ridges).
* [[Pancake lens|Pancake Lenses]]: A newer, more complex folded optic design using polarization. Allow for significantly shorter distances between the display and lens, enabling much slimmer and lighter HMD designs. Often offer improved edge-to-edge clarity but can be less light-efficient, requiring brighter displays. Used in devices like Meta Quest Pro, Pico 4, Bigscreen Beyond.<ref name="PancakeOptics">{{cite web |url=https://kguttag.com/2021/12/09/vr-optics-part-1-brief-history-and-pancake-lenses/ |title=VR Optics (Part 1) – Brief History and Pancake Lenses |last=Guttag |first=Karl |date=2021-12-09 |website=KGOnTech |access-date=2023-10-27}}</ref><ref name="optics2023">{{cite web|url=https://landing.expandreality.io/pancake-vs.-fresnel-lenses-in-vr-headsets-advanced-optics-for-vr|title=Pancake vs Fresnel Lenses in VR Headsets|publisher=Expand Reality|date=05 Oct 2023}}</ref>
* [[Waveguide (optics)|Waveguides]] (AR): Used in many see-through OHMDs (e.g., HoloLens, Magic Leap). Light from a microdisplay is injected into a thin piece of glass or plastic and then directed out towards the eye using [[Diffractive optics|diffractive]] or reflective elements, allowing the user to see the real world through the waveguide. Achieving wide FOV and high efficiency with waveguides is challenging.<ref name="AROpticsReview"/><ref name="waveguide2022">{{cite web|url=https://www.radiantvisionsystems.com/blog/ride-wave-augmented-reality-devices-rely-waveguides|title=Ride the Wave: AR Devices Rely on Waveguides|publisher=Radiant Vision Systems|date=11 Jan 2022}}</ref>
* [[Beam splitter|Beam Splitters / Birdbaths]] (AR): A simpler see-through optic where a partially reflective mirror combines light from a display with the view of the real world. Often bulkier and may have a smaller FOV or less uniform transparency than waveguides. Used in devices like Google Glass (using a prism variant) and Nreal/XREAL Air.<ref name="BirdbathOptics">{{cite web |url=https://kguttag.com/2019/04/01/hololens-2-hl2-and-ar-optics-in-general-part-1/ |title=HoloLens 2 (HL2) and AR Optics in General (Part 1) |last=Guttag |first=Karl |date=2019-04-01 |website=KGOnTech |access-date=2023-10-27}}</ref>
* [[Holographic Optical Elements]] (HOEs): Thin, lightweight optical components created using holographic recording techniques, capable of performing complex functions like focusing, diffusion, or beam steering. Used in some advanced AR displays.
* '''[[Interpupillary distance]] (IPD) Adjustment''': The distance between the centers of the pupils varies between individuals (typically 54-72mm for adults). HMDs need to accommodate this for optimal clarity, comfort, and correct stereo rendering. Adjustment can be:
* Physical/Manual: Lenses can be moved closer together or further apart, often via a slider or dial (e.g., Valve Index, Quest 3). Continuous adjustment allows finer tuning.
* Stepped: Some HMDs offer discrete IPD steps (e.g., original Quest, Quest 2).
* Software-based: The rendering viewpoint separation is adjusted in software (less common or effective for major mismatches without physical lens movement).
* Automatic: High-end systems might use eye-tracking to measure IPD and adjust automatically or prompt the user.
* '''[[Eye tracking]]''': Sensors (typically small infrared cameras) inside the HMD track the user's gaze direction. This enables:
* [[Foveated rendering]]: Rendering the area where the user is looking at full resolution, and the periphery at lower resolution, saving significant computational power.<ref name="FoveatedRenderingNvidia">{{cite web |url=https://developer.nvidia.com/vrworks/graphics/foveatedrendering |title=NVIDIA Variable Rate Shading (VRS) & Foveated Rendering |website=NVIDIA Developer |access-date=2023-10-27}}</ref>
* Improved Social Interaction: [[Avatar]]s can mimic the user's eye movements, enhancing realism in social VR.
* Automatic IPD adjustment.
* Gaze-based interaction as an input method.
Examples: [[Meta Quest Pro]], [[PlayStation VR2]], [[HTC Vive Pro Eye]], [[Apple Vision Pro]].
* '''Connectivity''': How the HMD connects to the processing unit (if not standalone).
* Wired: Typically [[USB]] (often Type-C for power/data) and [[DisplayPort]] or [[HDMI]] for high bandwidth video. Offers the highest fidelity and lowest latency but restricts movement.
* Wireless: Uses [[Wi-Fi]] (often Wi-Fi 6/6E/7) or proprietary radio frequencies (e.g., older WiGig solutions) to stream video and data. Offers freedom of movement but requires video compression (potentially affecting quality) and can introduce latency. Examples: [[HTC Vive Wireless Adapter]], [[Meta Air Link]], [[Virtual Desktop]].<ref name="WirelessVRComparison">{{cite web |url=https://uploadvr.com/wireless-pc-vr-comparison/ |title=Wireless PC VR Comparison: Air Link vs Virtual Desktop vs Vive Wireless |last=Heaney |first=David |date=2022-01-20 |website=UploadVR |access-date=2023-10-27}}</ref>
* '''Audio''': Sound is crucial for immersion. HMDs may feature:
* [[Spatial Audio]]: 3D audio rendering techniques that make sounds appear to come from specific locations in the virtual environment. Supported via various output methods.
* Integrated [[Speaker]]s: Often open-ear near-field speakers built into the strap or near the ears, providing spatial audio without blocking external sounds (e.g., Valve Index, Quest series).
* [[Headphone]] Jack (3.5mm): Allows users to connect their own headphones or earbuds.
* Integrated Headphones: High-fidelity on-ear or over-ear headphones attached to the HMD (e.g., original Rift CV1, HP Reverb G2).
* [[Microphone Arrays]]: Multiple microphones for clear voice input, communication in multiplayer apps, voice commands, and potentially noise cancellation.
* '''[[Ergonomics]]''': Factors affecting comfort during extended use:
* Weight: Lighter is generally better (most consumer HMDs are 400-700g).
* Weight Distribution: Balanced weight (front-to-back) is often more important than total weight. Battery placement in standalone HMDs (e.g., rear-mounted) can improve balance.
* Strap Design: Different mechanisms (soft elastic straps, rigid "halo" straps, top straps) distribute pressure differently.
* Facial Interface: Foam or fabric padding, material breathability, light blocking. Options/space for glasses wearers or [[prescription lens]] inserts.
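As a back-of-the-envelope illustration of the pixels-per-degree figure referenced in the Resolution entry above, average PPD can be approximated as horizontal pixels per eye divided by horizontal FOV in degrees. The numbers below are illustrative, and real optics spread pixels non-uniformly across the field:

<syntaxhighlight lang="python">
def pixels_per_degree(h_pixels_per_eye, h_fov_deg):
    """Average horizontal pixels per degree (ignores lens distortion, which in
    practice concentrates more pixels near the center of the view)."""
    return h_pixels_per_eye / h_fov_deg

# Illustrative figures: a 2064-pixel-wide panel per eye behind ~104 degree optics
# averages about 20 PPD, still well short of the ~60 PPD often quoted for
# normal human visual acuity.
print(round(pixels_per_degree(2064, 104), 1))
</syntaxhighlight>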
==Types of HMDs==
====[[Discrete HMD]] (Tethered HMD)====
These HMDs contain displays, optics, sensors, and audio, but rely on an external processing unit – typically a powerful [[Personal Computer|PC]] or a game [[console]] – connected via cables (or sometimes a dedicated wireless adapter). They generally offer the highest fidelity graphics and performance due to leveraging powerful external GPUs.
* '''PC VR Examples''': [[Valve Index]], [[HTC Vive Pro 2]], [[HP Reverb G2]], original [[Oculus Rift]], [[Oculus Rift S]], [[Varjo Aero]], [[Pimax]] series.
* '''Console VR Examples''': [[PlayStation VR]], [[PlayStation VR2]] (connects to PlayStation consoles).
====[[Integrated HMD]] (Standalone HMD)====
Also known as All-in-One (AIO) HMDs, these devices contain all necessary components – displays, optics, sensors, processing (CPU/GPU, often based on mobile chipsets like the Qualcomm Snapdragon XR series), memory, storage, battery, and tracking – within the headset itself. They require no external PC or console, offering greater freedom of movement and ease of use. Processing power is typically lower than high-end PC VR setups. Many standalone HMDs can optionally connect to a PC via cable (e.g., Meta Link) or wirelessly (e.g., Air Link, Virtual Desktop) to function as a PC VR headset.
* '''Examples''': [[Meta Quest 2]], [[Meta Quest 3]], [[Meta Quest Pro]], [[Pico 4]], [[Pico Neo 3 Link]], [[HTC Vive Focus 3]], [[HTC Vive XR Elite]].
====[[Slide-on HMD]] (Smartphone HMD)====
These were an early, low-cost entry point to VR, consisting of a simple enclosure (often plastic or cardboard) with lenses, into which a compatible [[smartphone]] was inserted. The smartphone provided the display, processing, and basic 3DoF tracking (using its internal IMU). While popular initially due to accessibility (e.g., [[Google Cardboard]], [[Samsung Gear VR]], [[Google Daydream View]]), they suffered from limitations like lower display quality, higher latency, potential overheating, limited interaction (often just a single button or touchpad), inconsistent experiences across different phones, and generally only 3DoF tracking. This category is now largely obsolete, superseded by standalone HMDs.
* '''Examples''': [[Google Cardboard]], [[Samsung Gear VR]], [[Google Daydream View]], [[Zeiss VR One]], [[Merge VR/AR Goggles]].
===Augmented Reality (AR) / Mixed Reality (MR) HMDs===
These devices overlay digital information onto the user's view of the real world, or blend real and virtual views.
====[[Optical head-mounted display]] (OHMD)====
These HMDs use transparent optical elements (like waveguides or beam splitters) placed in front of the user's eyes. A small projector or microdisplay generates the digital image, which is then directed through the optics and combined with the light from the real world, allowing the user to see both simultaneously. Achieving a wide FOV, high brightness, good opacity for virtual objects, and an unobtrusive form factor are major challenges. OHMDs are often targeted at enterprise, industrial, or specific professional use cases due to cost and complexity.
* '''Examples''': [[Microsoft HoloLens]], [[Microsoft HoloLens 2]], [[Magic Leap 1]], [[Magic Leap 2]], [[Vuzix Blade]], [[RealWear Navigator]], [[Google Glass Enterprise Edition]], [[Nreal Light]] (now [[XREAL Light]]). Simple notification-style displays like the original [[Google Glass]] also fall under this category.
====Video Passthrough AR/MR HMDs====
These utilize opaque displays, essentially functioning like VR HMDs, but incorporate outward-facing cameras. The live video feed from these cameras is processed (often correcting for distortion and perspective) and displayed on the internal screens, with digital elements rendered on top (a minimal compositing sketch follows the examples below). This allows users to see their surroundings digitally, blended with virtual content. Modern implementations increasingly use high-resolution, low-latency, color cameras, aiming to create a more seamless blend ("Mixed Reality"). While not optically transparent, they can offer a wider FOV for the AR/MR content than many current OHMDs and better occlusion of real objects by virtual ones.
* '''Examples''': [[Meta Quest Pro]], [[Meta Quest 3]], [[Apple Vision Pro]], [[HTC Vive XR Elite]], [[Varjo XR-3]], [[Lynx R1]].<ref name="PassthroughExplained">{{cite web |url=https://www.roadtovr.com/vr-headset-passthrough-ar-explained-quest-2-pro-index-vive-pro-2/ |title=VR Headset Passthrough AR Explained |last=Lang |first=Ben |date=2023-02-15 |website=Road to VR |access-date=2023-10-27}}</ref>
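Conceptually, the final mixed-reality frame is the passthrough camera image with rendered content composited on top using per-pixel alpha. The sketch below shows only that blend step and omits the lens-distortion correction, perspective reprojection, and latency compensation that real passthrough pipelines also perform:

<syntaxhighlight lang="python">
import numpy as np

def composite_passthrough(camera_rgb, virtual_rgba):
    """Alpha-blend a rendered RGBA layer over the passthrough camera frame.
    camera_rgb: HxWx3 float array in [0, 1]; virtual_rgba: HxWx4 float array."""
    alpha = virtual_rgba[..., 3:4]   # per-pixel opacity of the virtual content
    return virtual_rgba[..., :3] * alpha + camera_rgb * (1.0 - alpha)

# Tiny 2x2 example: one fully opaque virtual pixel replaces the camera pixel,
# the rest show the real world through.
cam = np.full((2, 2, 3), 0.5)
virt = np.zeros((2, 2, 4))
virt[0, 0] = [1.0, 0.0, 0.0, 1.0]
out = composite_passthrough(cam, virt)
print(out[0, 0], out[1, 1])
</syntaxhighlight>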
==Components of HMDs==
While varying significantly based on type and purpose, most modern HMDs incorporate several key components:
* '''Display Panels & Optics''': Generate the visual output and direct it to the eyes (see Key Technical Specifications above).
* '''[[Sensor]]s''': Detect movement, user input, and sometimes the environment.
* [[Inertial Measurement Unit]] (IMU): Core component for rotational (3DoF) tracking. Essential for low-latency orientation updates.
* [[Camera]]s: Crucial for modern HMDs. Can include:
* Visible light cameras: Used for inside-out positional (6DoF) tracking, [[video passthrough]] (especially color passthrough), environment mapping, and [[Hand tracking|hand tracking]].
* [[Infrared]] (IR) cameras: Often used for inside-out tracking (less susceptible to ambient light changes), controller tracking (detecting IR LEDs on controllers), and [[eye tracking]].
* [[Depth-sensing Cameras|Depth Sensors]] (e.g., [[Time-of-flight camera|Time-of-Flight]], Structured Light, Active Stereo IR): Used in some AR/MR HMDs (like HoloLens, Vision Pro) for accurate [[Spatial analysis|spatial mapping]], environment understanding, hand tracking, and occlusion.
* [[Eye tracking|Eye Tracking Cameras]]: Small internal IR cameras pointed at the user's eyes to monitor gaze direction and pupil characteristics.
* '''Processors''':
* CPU/GPU: Handle rendering, tracking calculations, and application logic (essential in standalone HMDs, often a mobile [[System on a chip|SoC]] like the Qualcomm Snapdragon XR series).
* Specialized Processors: May include [[Vision Processing Unit]]s (VPUs) or [[Neural Processing Unit]]s (NPUs) to efficiently handle computer vision tasks (SLAM, hand/eye tracking) or [[Artificial intelligence|AI]] workloads, offloading the main CPU/GPU. Microsoft's [[Holographic Processing Unit]] (HPU) in HoloLens is an example.<ref name="HoloLensSensors">{{cite web |url=https://learn.microsoft.com/en-us/hololens/hololens2-hardware |title=HoloLens 2 hardware |website=Microsoft Learn |access-date=2023-10-27}}</ref>
* '''Memory & Storage''': [[RAM]] for active processing and onboard [[Flash memory|storage]] (in standalone HMDs) for the operating system, applications, and media.
* '''Audio System''': Integrated speakers, microphones, headphone jacks (see Key Technical Specifications above).
* '''Connectivity Hardware''': Wi-Fi, [[Bluetooth]] radios (especially in standalone HMDs), [[USB]] ports, video input ports (in tethered HMDs).
* '''Power System''': [[Battery]] (in standalone or wireless HMDs), power regulation circuitry, charging ports (often USB-C).
* '''Mechanical Structure & Ergonomics''': Housing, straps, facial interface, IPD adjustment mechanisms, thermal management (fans, heat sinks).
* '''Input Mechanisms''': How the user interacts with the system.
* [[Motion Controllers]]: Handheld devices tracked in 3D space (usually 6DoF), typically including buttons, triggers, joysticks/touchpads, and [[Haptic technology|haptic feedback]]. Primary input for most VR systems (e.g., Meta Touch controllers, Valve Index controllers, PS VR2 Sense controllers).
* [[Hand tracking]]: Camera-based systems that track the user's bare hands and finger movements without requiring controllers. Offers natural interaction but lacks physical buttons and haptic feedback. Increasingly standard on standalone VR/MR headsets (Quest series, Vision Pro).
* [[Eye tracking]]: Used for gaze-based selection, foveated rendering, and social presence (see the sketch after this list).
* [[Voice Commands]]: Using built-in microphones and [[Speech recognition|speech recognition software]] for hands-free control.
* [[Brain-Computer Interface|Brain-Computer Interfaces]] (BCI): Experimental interfaces reading neural signals, potentially via electrodes integrated into the HMD, for direct thought control. Still largely in research phases for consumer HMDs (e.g., [[NextMind]], [[CTRL-labs]] research).
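As an illustration of the gaze-based selection mentioned in the list above, the sketch below casts a ray along the reported gaze direction and tests it against spherical targets; a pinch or dwell event would then confirm the selection. The data structures are hypothetical and do not correspond to any particular runtime's API:

<syntaxhighlight lang="python">
import numpy as np

def gaze_pick(origin, direction, targets, radius=0.1):
    """Return the index of the nearest target sphere hit by the gaze ray, or None.
    origin, direction: 3-vectors; targets: list of sphere centers; radius in meters."""
    d = direction / np.linalg.norm(direction)
    best, best_t = None, np.inf
    for i, c in enumerate(targets):
        oc = np.asarray(c, dtype=float) - origin
        t = float(np.dot(oc, d))              # distance along the ray to the closest approach
        if t <= 0:
            continue                          # target is behind the user
        miss = np.linalg.norm(oc - t * d)     # perpendicular distance from ray to sphere center
        if miss <= radius and t < best_t:
            best, best_t = i, t
    return best

# Example: looking straight ahead selects the target 2 m in front (index 0).
print(gaze_pick(np.zeros(3), np.array([0.0, 0.0, -1.0]),
                [(0.0, 0.0, -2.0), (1.0, 0.0, -2.0)]))
</syntaxhighlight>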
==Applications==
HMDs enable a wide range of applications across various fields:
* '''Gaming and Entertainment''': Immersive video games ([[Beat Saber]], [[Half-Life: Alyx]]), virtual cinemas, [[Social VR]] platforms ([[VRChat]], [[Rec Room]]), [[Location-based entertainment|location-based VR experiences]], [[Virtual tourism]].
* '''[[Training]] and [[Simulation]]''': Flight simulation, [[Surgical simulation|surgical training]], military exercises, emergency response training, complex machinery operation training, workplace safety drills.<ref name="VRSimulationTraining">{{cite journal |url=https://link.springer.com/article/10.1007/s10055-016-0293-5 |title=Virtual reality for simulation and training |last=Freina |first=Laura |last2=Ott |first2=Michela |date=2015 |journal=E-Learning and Digital Media |volume=12 |issue=3-4 |pages=368–383 |doi=10.1177/2042753015591756 |access-date=2023-10-27}}</ref>
* '''Design and Engineering''': [[Computer-Aided Design]] (CAD) review and collaboration, [[Architectural rendering|architectural visualization]] (virtual walkthroughs), virtual prototyping, ergonomic assessments, [[Digital twin]] interaction.<ref name="VR_CAD">{{cite web |url=https://www.autodesk.com/solutions/virtual-reality |title=Virtual Reality in Design and Manufacturing |website=Autodesk |access-date=2023-10-27}}</ref>
* '''[[Telepresence]] and [[Virtual collaboration|Collaboration]]''': Virtual meetings with embodied avatars, remote assistance (especially using AR overlays for "see-what-I-see" guidance), shared virtual workspaces ([[Spatial (software)|Spatial]], [[Horizon Workrooms]]).
* '''Medical''': Surgical planning and visualization (overlaying [[medical imaging]] onto patients via AR), therapy (e.g., [[Virtual reality therapy|exposure therapy]] for phobias/[[PTSD]], pain management/distraction), rehabilitation, medical education.<ref name="VR_Medicine">{{cite journal |url=https://jmir.org/2019/11/e14190/ |title=Applications of Virtual Reality for Clinical Neuropsychology: A Review |last1=Rizzo |first1=Albert "Skip" |last2=Kim |first2=Giyoung |date=2019-11-15 |journal=Journal of Medical Internet Research |volume=21 |issue=11 |pages=e14190 |doi=10.2196/14190 |pmid=31730019 |pmc=6880643 |access-date=2023-10-27}}</ref>
* '''Education''': Virtual field trips, interactive science experiments (virtual labs), historical reconstructions, immersive language learning.
* '''[[Information visualization|Data Visualization]]''': Exploring complex datasets (e.g., financial data, scientific simulations) in interactive 3D space.
* '''Military and Aviation''': Helmet-mounted displays providing flight data, targeting information, situational awareness overlays, night vision integration (e.g., [[Integrated Visual Augmentation System|IVAS]]).
==Challenges and Limitations==
Despite significant progress, HMD technology still faces challenges:
* '''Visual Fidelity''': Achieving resolution and clarity that matches human vision ("[[Retinal projector|retinal resolution]]" ≈ 60 PPD), wider FOV without distortion or edge artifacts, higher brightness and contrast (especially for outdoor AR), better dynamic range, and eliminating artifacts like screen-door effect, god rays, [[Mura defect|mura]], and motion blur remain ongoing goals.<ref name="Kim2019FoveatedAR">{{cite journal |title=Foveated AR: dynamically-foveated augmented reality display |last1=Kim |first1=J. |last2=Jeong |first2=Y. |last3=Stengel |first3=M. |etal=et al. |date=2019 |journal=ACM Transactions on Graphics |volume=38 |issue=4 |pages=1–15 |doi=10.1145/3306346.3322983}}</ref>
* '''Comfort and Ergonomics''': Reducing weight, improving balance (counterweights, lighter optics), managing heat dissipation, accommodating prescription glasses comfortably, and finding comfortable, hygienic long-term wear solutions (straps, facial interfaces) are critical for broader adoption.<ref name="TalsmaComfort2020">{{cite journal |title=Critical factors in comfort, cognitive load, and performance for consumer head-mounted displays |last1=Talsma |first1=S. W. |last2=Usmani |first2=S. A. |last3=Chen |first3=P. Y. |date=2020 |journal=Journal of the Society for Information Display |volume=28 |issue=11 |pages=841–850 |doi=10.1002/jsid.943}}</ref>
* '''[[Vergence-accommodation conflict]]''': In most current HMDs, the eyes focus (accommodate) at a fixed distance determined by the optics, but converge based on the perceived depth of virtual objects. This mismatch can cause eye strain, fatigue, and inaccurate depth perception (see the illustrative calculation after this list).<ref name="VAC_Review">{{cite journal |url=https://jov.arvojournals.org/article.aspx?articleid=2193631 |title=The vergence-accommodation conflict: Practical consequences and solutions |last1=Hoffman |first1=David M. |last2=Girshick |first2=Ahna R. |last3=Akeley |first3=Kurt |last4=Banks |first4=Martin S. |date=2008-03-18 |journal=Journal of Vision |volume=8 |issue=3 |pages=33 |doi=10.1167/8.3.33 |access-date=2023-10-27}}</ref> Solutions like [[Varifocal display|varifocal]] and [[Light field|light field]] displays are complex and still largely experimental.
* '''Motion Sickness / Cybersickness''': While greatly reduced compared to early systems due to low latency and high refresh rates, discrepancies between visual motion and [[Vestibular system|vestibular]] input, tracking inaccuracies, or poorly designed software can still induce nausea, dizziness, and discomfort in susceptible individuals.<ref name="Weech2019Cybersickness">{{cite journal |title=Presence and cybersickness in virtual reality are negatively related: a review |last1=Weech |first1=S. |last2=Kenny |first2=S. |last3=Barnett-Cowan |first3=M. |date=2019 |journal=Frontiers in Psychology |volume=10 |pages=158 |doi=10.3389/fpsyg.2019.00158 |pmid=30787884 |pmc=6374254}}</ref><ref name="vrsickness">{{cite web|url=https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7145389/|title=Factors Associated With Virtual Reality Sickness in Head-Mounted Displays|publisher=Frontiers in Virtual Reality|date=20 Mar 2020}}</ref>
* '''Tracking Robustness''': Inside-out tracking can struggle in poorly lit or overly bright environments, on large featureless surfaces (blank walls), with reflective surfaces (mirrors), or during very fast head/body movements. Outside-in tracking requires external sensor setup and has a limited, fixed tracking volume.
* '''Content Ecosystem''': The availability of high-quality, compelling, and diverse applications and experiences ("killer apps") is crucial for driving HMD adoption beyond early adopters and specific niches.
* '''Cost''': High-end HMDs remain expensive (over US$1,000), although capable standalone VR headsets have become more affordable (roughly US$300–500). Advanced AR/MR devices often cost several thousand dollars.
* '''Social Acceptance''': Wearing bulky headsets, especially in public or social settings, remains a significant barrier for AR/MR devices aiming at all-day use. Privacy concerns related to onboard cameras are also relevant.<ref name="Koelle2020SocialAcceptability">{{cite conference |title=Social acceptability in HCI: A survey of methods, measures, and design strategies |last1=Koelle |first1=M. |last2=Ananthanarayan |first2=S. |last3=Boll |first3=S. |booktitle=Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems |year=2020 |pages=1–19 |doi=10.1145/3313831.3376101}}</ref>
* '''Health and Safety''': Long-term effects of prolonged use on vision (especially in children) are still being studied. Physical risks include collision with real-world objects while immersed. Eye safety standards (e.g., [[IEC 60825-1]] for lasers in depth sensors) must be followed.<ref name="Turnbull2017OcularEffects">{{cite journal |title=Ocular effects of virtual reality headset wear in young adults |last1=Turnbull |first1=P. R. |last2=Phillips |first2=J. R. |date=2017 |journal=Scientific Reports |volume=7 |issue=1 |pages=1–11 |doi=10.1038/s41598-017-14811-x}}</ref><ref name="laser">{{cite web|url=https://www.lia.org/resources/laser-safety-standards/ansi-z1361-safe-use-lasers|title=ANSI Z136.1 — Safe Use of Lasers|publisher=Laser Institute of America|date=02 Dec 2023}}</ref> Psychological effects regarding immersion, dissociation, or addiction potential warrant consideration.<ref name="MadaryMetzingerEthics2016">{{cite journal |title=Real virtuality: A code of ethical conduct. Recommendations for good scientific practice and the consumers of VR-technology |last1=Madary |first1=M. |last2=Metzinger |first2=T. K. |date=2016 |journal=Frontiers in Robotics and AI |volume=3 |pages=3 |doi=10.3389/frobt.2016.00003}}</ref> | |||
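The visual fidelity and vergence-accommodation items above can be made concrete with two back-of-the-envelope figures: average angular pixel density (pixels per degree, PPD) and the dioptric mismatch between a headset's fixed focal plane and a virtual object. The Python sketch below is illustrative only; the panel width, field of view, focal-plane distance, and object distance are assumed example values in the range of current consumer hardware, not the specification of any particular device, and real optics concentrate pixels toward the center of the lens.

<syntaxhighlight lang="python">
# Back-of-the-envelope metrics for two HMD visual challenges.
# All numbers are illustrative assumptions, not specs of any specific headset.

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average angular pixel density across the field of view."""
    return horizontal_pixels / horizontal_fov_deg

def vac_mismatch_diopters(focal_plane_m: float, object_distance_m: float) -> float:
    """Vergence-accommodation mismatch: difference between where the eyes must
    converge (virtual object) and where they must focus (fixed focal plane)."""
    return abs(1.0 / object_distance_m - 1.0 / focal_plane_m)

if __name__ == "__main__":
    ppd = pixels_per_degree(2064, 110.0)  # ~19 PPD vs. a ~60 PPD "retinal" target
    print(f"Average pixel density: {ppd:.1f} PPD")

    mismatch = vac_mismatch_diopters(focal_plane_m=2.0, object_distance_m=0.5)
    print(f"VAC mismatch for an object at 0.5 m: {mismatch:.2f} D")
    # Mismatches much beyond roughly 0.5 D are commonly associated with eye strain.
</syntaxhighlight>

With these example numbers the average density comes out near 19 PPD, well short of the roughly 60 PPD target, and the 1.5 D mismatch lies well outside the zone usually considered comfortable; varifocal and light field displays aim to drive that mismatch toward zero.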
==Future Trends and Developments== | |||
The HMD landscape continues to evolve rapidly, with several promising developments on the horizon: | |||
* '''Display Advancements''': | |||
** [[Varifocal display|Varifocal Displays]]: Systems that dynamically adjust focal depth based on where the user is looking (using eye tracking) or based on scene content, addressing the vergence-accommodation conflict. Technologies include movable lenses/displays, [[Liquid crystal lens|liquid crystal lenses]], [[Alvarez lens|Alvarez lenses]], and multi-focal plane displays.<ref name="Rathinavel2018Varifocal">{{cite journal |title=An extended depth-at-field volumetric near-eye augmented reality display |last1=Rathinavel |first1=K. |etal=et al. |date=2018 |journal=IEEE Transactions on Visualization and Computer Graphics |volume=24 |issue=11 |pages=2857–2866 |doi=10.1109/TVCG.2018.2868565}}</ref>
** [[Light field|Light Field Displays]]: Generate a more complete representation of light, allowing the eye to focus naturally at different depths within the virtual scene. Still complex and computationally intensive.<ref name="Huang2015LightFieldStereoscope">{{cite journal |title=The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues |last1=Huang |first1=F. C. |last2=Chen |first2=K. |last3=Wetzstein |first3=G. |date=2015 |journal=ACM Transactions on Graphics |volume=34 |issue=4 |pages=1–12 |doi=10.1145/2766949}}</ref>
** [[Holographic display|Holographic Displays]]: Aim to reconstruct the wavefront of light from a 3D scene, potentially offering the most natural 3D viewing experience without conflicts. True holographic HMDs are still highly experimental.
** Higher resolution, brightness, and power efficiency as Micro-OLED and [[MicroLED]] display technologies mature.
* '''Form Factor Evolution''': | |||
** [[Lightweight Designs]]: Advanced optics (pancake lenses, [[Metalens|metalenses]], holographic optical elements) and display technologies are enabling significantly thinner, lighter headsets (sub-300 g or even sub-100 g designs).
** [[Smartglasses|AR Glasses]]: The long-term goal for AR is achieving normal eyeglass form factors with all-day wearability and significant compute/display capabilities. Projects like [[Project Aria]] (Meta research) and rumored [[Apple smart glasses|Apple glasses]] point toward this future.<ref name="Delaney2021ARGlasses">{{cite journal |title=The race toward human-centered AR glasses |last=Delaney |first=K. |date=2021 |journal=IEEE Computer Graphics and Applications |volume=41 |issue=5 |pages=112–115 |doi=10.1109/MCG.2021.3097740}}</ref>
* '''Sensory Expansion''': | |||
** [[Haptic technology|Advanced Haptic Feedback]]: Beyond simple controller rumble, providing more nuanced tactile sensations via gloves ([[HaptX]]), bodysuits ([[bHaptics]]), ultrasound ([[Ultraleap]]), or other actuators to simulate touch, texture, and impact.
** [[Digital scent technology|Olfactory Displays]]: Systems that generate scents synchronized with virtual environments to enhance immersion (e.g., [[OVR Technology]]).
** [[Motion capture|Full-body Tracking]]: Moving beyond head and hands to track limb and torso movements for more complete avatar embodiment, using external trackers ([[Vive Tracker]]), webcam-based AI solutions, or integrated sensors.
* '''Computational Capabilities''': | |||
** [[Edge computing|Edge/Cloud Computing]]: Offloading demanding processing (rendering, AI) from standalone HMDs to nearby edge servers or the cloud to enable higher fidelity experiences while maintaining mobility (e.g., [[NVIDIA CloudXR]], [[Plutosphere]]); a rough latency budget for this approach is sketched after this list.<ref name="Liu2019EdgeAR">{{cite conference |title=Edge assisted real-time object detection for mobile augmented reality |last1=Liu |first1=L. |last2=Li |first2=H. |last3=Gruteser |first3=M. |booktitle=Proceedings of the 25th Annual International Conference on Mobile Computing and Networking |year=2019 |pages=1–16 |doi=10.1145/3300061.3345431}}</ref>
** [[Artificial intelligence|AI Integration]]: On-device AI for smarter environment understanding, more robust hand/eye tracking, predictive rendering, personalized experiences, intelligent virtual agents, and natural language interaction.
* '''Interfaces''': | |||
** Improved Hand/Eye/Face Tracking: Higher-fidelity tracking of expressions and micro-movements for more realistic avatars and nuanced control.
** [[Brain-Computer Interface|Neural Interfaces]]: Non-invasive BCIs (e.g., EMG wristbands, EEG sensors) may offer supplementary input channels in the future.
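To illustrate why the edge/cloud rendering approach mentioned above is so latency-sensitive, the sketch below totals a purely hypothetical motion-to-photon budget for a remotely rendered frame and compares it with the roughly 20 ms figure commonly cited for comfortable VR. Every stage timing is an assumption chosen for illustration, not a measurement of NVIDIA CloudXR or any other service.

<syntaxhighlight lang="python">
# Hypothetical motion-to-photon (MTP) budget for edge/cloud-rendered VR.
# All stage timings below are illustrative assumptions.

MTP_TARGET_MS = 20.0  # comfort threshold often cited for VR

stages_ms = {
    "pose sampling + prediction":     1.0,
    "uplink (pose to server)":        4.0,
    "remote render + encode":         8.0,
    "downlink (video to HMD)":        4.0,
    "decode + reproject + scan-out":  5.0,
}

total = sum(stages_ms.values())
print(f"Estimated MTP latency: {total:.1f} ms (target {MTP_TARGET_MS:.0f} ms or less)")
for stage, ms in stages_ms.items():
    print(f"  {stage:32s} {ms:4.1f} ms")

if total > MTP_TARGET_MS:
    print("Over budget: on-headset late-stage reprojection must hide the difference.")
</syntaxhighlight>

Streaming systems of this kind typically tolerate budgets above the threshold by applying late-stage reprojection (timewarp) on the headset itself, so that head rotation is still reflected with minimal delay even when the streamed frame arrives late.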
==Market Outlook== | |||
Market analysis firms like [[International Data Corporation|IDC]] report fluctuating but generally growing shipments of AR/VR headsets. For instance, IDC forecast global shipments to reach 9.1 million units in 2024, with projected growth to 22.9 million by 2028, driven significantly by the adoption of mixed-reality-capable devices and maturing technology in both consumer and enterprise segments.<ref name="idc2025" />
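For context, the IDC figures cited above imply a compound annual growth rate (CAGR) of roughly 26% between 2024 and 2028. The short calculation below shows how that figure is derived from the forecast shipment numbers:

<syntaxhighlight lang="python">
# Implied compound annual growth rate (CAGR) from the IDC shipment forecast
# cited above: 9.1 million units in 2024 growing to 22.9 million units in 2028.

start_units_m, end_units_m = 9.1, 22.9
years = 2028 - 2024

cagr = (end_units_m / start_units_m) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # approximately 26% per year
</syntaxhighlight>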
==References==
==See Also==
* [[Augmented Reality]] (AR)
* [[Binocular vision]]
* [[Brain-Computer Interface]]
* [[Degrees of Freedom]] (DoF)
* [[Display technology]]
* [[Eye tracking]]
* [[Foveated rendering]]
* [[Field of View]] (FOV) | |||
* [[Hand tracking]] | |||
* [[Haptic technology]] | * [[Haptic technology]] | ||
* [[Helmet-mounted display]] | * [[Helmet-mounted display]] | ||
* [[Immersion (virtual reality)]] | |||
* [[Inside-out tracking]] | |||
* [[Latency (engineering)]] | |||
* [[Mixed Reality]] (MR) | |||
* [[Motion Controllers]] | |||
* [[Motion Sickness]] | |||
* [[Optical head-mounted display]] (OHMD) | |||
* [[Outside-in tracking]] | |||
* [[Pancake lens]] | |||
* [[Presence (virtual reality)]] | |||
* [[Smartglasses]] | |||
* [[Spatial Audio]] | |||
* [[Stereoscopy]] | |||
* [[Tracking system|Tracking]] (VR/AR context) | |||
* [[Vergence-accommodation conflict]] | |||
* [[Video passthrough]] | |||
* [[Virtual Reality]] (VR) | |||
* [[Virtual Reality Devices]] | |||
* [[Waveguide (optics)]] | |||
[[Category:Hardware]]
[[Category:Augmented Reality]]
[[Category:Input devices]]
[[Category:Head-mounted displays]] | |||
[[Category:Terms]]