Head-mounted display: Difference between revisions
*[[Magnetometer]]: Measures the local magnetic field (like a compass), used to correct for gyroscope drift, especially in yaw. [[Sensor fusion]] algorithms combine data from these sensors to provide a stable orientation estimate.<ref name="IMU_VR">Pell, Oliver (2017-07-12). "Use of IMU in Virtual Reality Systems". Analog Dialogue, Analog Devices. Retrieved 2023-10-27. [https://www.analog.com/en/technical-articles/imu-in-virtual-reality-systems.html]</ref><ref name="imu">Wikipedia (20 Apr 2025). "Inertial measurement unit". Retrieved 2024-05-15. [https://en.wikipedia.org/wiki/Inertial_measurement_unit]</ref>
*'''[[Positional Tracking]] (6DoF)''': Tracks both orientation (3DoF) and translation (movement through space: forward/backward, left/right, up/down). This allows the user to physically walk around, lean, crouch, and dodge within the virtual environment, significantly enhancing immersion and interaction. 6DoF tracking is achieved through various methods:
**'''[[Outside-in tracking]]''': External sensors (cameras or infrared emitters/detectors like [[Lighthouse (tracking system)|Valve's Lighthouse system]]) are placed in the room to track markers (passive reflective or active IR LED) on the HMD and controllers. Examples: Original Oculus Rift (Constellation), HTC Vive/Valve Index (Lighthouse).<ref name="LighthouseExplained">XinReality Wiki. "Lighthouse". Retrieved 2023-10-27. [https://xinreality.com/wiki/Lighthouse]</ref><ref name="lighthouse" />
**'''[[Inside-out tracking]]''': Cameras mounted on the HMD itself observe the surrounding environment. Computer vision algorithms, often employing [[Simultaneous Localization and Mapping]] (SLAM) techniques, identify features in the room and track the HMD's movement relative to them. This eliminates the need for external sensors, making setup easier and enabling larger, unrestricted tracking volumes. Most modern standalone and many tethered HMDs use inside-out tracking. Examples: Meta Quest series, HTC Vive Cosmos, Windows Mixed Reality headsets.<ref name="SLAM_VRAR">Yousif, K.; Bab-Hadiashar, A.; Hand, S. (2019-07-30). "A Review on SLAM Techniques for Virtual and Augmented Reality Applications". ''Sensors''. '''19''' (15): 3338. doi:10.3390/s19153338. PMC 6696193. Retrieved 2023-10-27. [https://www.mdpi.com/1424-8220/19/15/3338]</ref><ref name="insight2019" />
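The drift correction described above can be sketched as a one-axis complementary filter. This is an illustrative simplification (real HMDs fuse all three axes in quaternion form, and the gain value here is an assumption), but it shows how the magnetometer's absolute heading bounds the gyroscope's integration drift:

```python
def fuse_yaw(prev_yaw, gyro_rate, mag_yaw, dt, alpha=0.98):
    """One complementary-filter step: integrate the fast-but-drifting gyro,
    then pull the estimate toward the slow-but-absolute magnetometer yaw.
    alpha (an assumed gain) sets how much the gyro is trusted each step."""
    integrated = prev_yaw + gyro_rate * dt  # dead-reckoned yaw, degrees
    return alpha * integrated + (1 - alpha) * mag_yaw

# A gyro with a constant 1 deg/s bias would drift a full degree per second
# on its own; the magnetometer term keeps the fused estimate bounded.
yaw = 0.0
for _ in range(1000):  # one simulated second at 1 kHz
    yaw = fuse_yaw(yaw, gyro_rate=1.0, mag_yaw=0.0, dt=0.001)
print(round(yaw, 3))  # settles near 0.049 deg instead of drifting to 1.0
```

The fixed point of the update is alpha·bias·dt / (1 − alpha), which is why raising alpha trades faster gyro response against slower drift correction.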
===Latency===
[[Motion-to-photon latency]] is the time delay between a user's physical movement and the corresponding visual update on the display. It is a critical factor for comfort and immersion. High latency is strongly correlated with cybersickness. Modern VR systems aim for latency below 20 milliseconds (ms), with many achieving closer to 10 ms under optimal conditions.<ref name="AbrashMTP">Abrash, Michael (2014-01-15). "What VR could, should, and almost certainly will be within two years". Steam Dev Days. Retrieved 2024-05-15. [https://www.youtube.com/watch?v=G-2dQoeqVVo]</ref><ref name="latency2022">MDPI Sensors (10 Aug 2022). "A Study on Sensor System Latency in VR Motion Sickness". Retrieved 2024-05-15. [https://www.mdpi.com/2224-2708/10/3/53]</ref>
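As a back-of-the-envelope illustration of that 20 ms budget, the pipeline can be summed stage by stage. The stage timings below are assumed round numbers for illustration, not measurements of any particular headset:

```python
# Assumed, illustrative stage timings in milliseconds; real pipelines
# overlap stages and use techniques such as late-latching and timewarp
# to shave these down further.
stages_ms = {
    "sensor sampling and fusion": 2.0,
    "render one 90 Hz frame":     11.1,
    "compositor / timewarp":      1.5,
    "display scan-out":           4.0,
}
total_ms = sum(stages_ms.values())
print(f"motion-to-photon ~ {total_ms:.1f} ms")  # 18.6 ms, under the 20 ms target
```

The render stage dominates, which is why reprojection techniques that update orientation after rendering are so effective at cutting perceived latency.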
==Key Technical Specifications==
===Display Technology===
The type of display panel used significantly impacts image quality. Common types include:
*[[LCD]] (Liquid Crystal Display): Often offers higher pixel density (reducing the [[screen-door effect]]) and potentially lower cost, but may have slower response times and lower contrast compared to OLED. Modern LCDs in VR often use fast-switching technologies and [[Quantum dot display|quantum dots]] for better color.<ref name="LCDvsOLED_VR">AR/VR Tips. "LCD vs OLED VR Headsets: Which Screen is Best?". Retrieved 2023-10-27. [https://arvrtips.com/lcd-vs-oled-vr-headsets/]</ref>
*[[OLED]] (Organic Light-Emitting Diode): Provides true blacks (infinite contrast ratio), vibrant colors, and very fast pixel response times (reducing motion blur or ghosting). Can be more susceptible to "[[Screen burn-in|burn-in]]" over long periods and may use [[PenTile matrix family|PenTile]] [[Subpixel rendering|subpixel layouts]] affecting perceived sharpness.
*[[Micro-OLED]] / [[OLEDoS]] (OLED-on-Silicon): Very small, high-resolution OLED displays built directly onto silicon wafers. Offer extremely high pixel densities (PPD) and brightness, often used in high-end or compact HMDs (for example [[Bigscreen Beyond]], [[Apple Vision Pro]]).<ref name="MicroOLED_Intro">OLED-Info. "MicroOLED displays". Retrieved 2023-10-27. [https://www.oled-info.com/microoled]</ref><ref name="microoled2025">Systems Contractor News (23 Apr 2025). "Dual micro-OLED displays grow within the AR/VR headset market". Retrieved 2024-05-15. [https://www.svconline.com/proav-today/dual-micro-oled-displays-grow-within-the-ar-vr-headset-market]</ref>
*[[MicroLED]]: An emerging technology promising high brightness, efficiency, contrast, and longevity, potentially surpassing both LCD and OLED for HMDs.
===[[Resolution]]===
The number of [[pixel]]s on the display(s), usually specified per eye (for example 2064 x 2208 per eye for Meta Quest 3) or sometimes as a total resolution. Higher resolution reduces the [[screen-door effect]] (the visible grid pattern between pixels) and increases image sharpness. [[Pixels Per Degree]] (PPD) is often a more perceptually relevant metric, combining resolution and FOV. Human visual acuity corresponds to roughly 60 PPD; current consumer VR is typically in the 20-35 PPD range, while high-end devices like Vision Pro exceed 40 PPD centrally.
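A rough PPD figure follows directly from per-eye resolution and FOV. This is a uniform average (real lenses concentrate pixels toward the center, so manufacturer-quoted figures differ), and the 100° FOV below is an assumed example value:

```python
def pixels_per_degree(h_pixels, h_fov_deg):
    """Average horizontal pixels per degree of visual angle across the FOV."""
    return h_pixels / h_fov_deg

# A 2064-pixel-wide per-eye panel spread over an assumed ~100 deg
# horizontal FOV lands in the typical consumer range quoted above.
print(round(pixels_per_degree(2064, 100), 1))  # 20.6 PPD
```

The same arithmetic shows why "retinal" 60 PPD at a 100° FOV would demand roughly 6000 horizontal pixels per eye.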
===[[Refresh Rate]]===
The number of times per second the display updates the image, measured in Hertz (Hz). Higher refresh rates (for example 90Hz, 120Hz, 144Hz) lead to smoother motion, reduced flicker, and can help mitigate motion sickness. 90Hz is often considered a comfortable minimum for VR. Low persistence displays (where pixels are illuminated only for a fraction of the refresh cycle) are crucial in VR to reduce motion blur during head movements.<ref name="LowPersistence">Abrash, Michael (2014-07-28). "Understanding Low Persistence on the DK2". Oculus Developer Blog. Retrieved 2023-10-27. [https://developer.oculus.com/blog/understanding-low-persistence-on-the-dk2/]</ref>
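The persistence fraction is easy to quantify: at 90 Hz each frame lasts about 11.1 ms, and a low-persistence display lights the pixels for only a small slice of that. The 2 ms pulse length below is an assumed illustrative value, not a specification of any particular panel:

```python
def lit_fraction(pulse_ms, refresh_hz):
    """Fraction of each refresh cycle during which pixels are illuminated."""
    frame_ms = 1000.0 / refresh_hz  # duration of one refresh cycle
    return pulse_ms / frame_ms

# A 2 ms pulse at 90 Hz: pixels are dark for most of each frame, which
# is what suppresses smear during head rotation.
print(round(lit_fraction(2.0, 90), 2))  # 0.18 -> lit ~18% of the time
```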
===[[Field of View]] (FOV)===
The extent of the visual field visible through the HMD, usually measured horizontally, vertically, and/or diagonally in degrees. Human binocular vision covers roughly 200-220° horizontally (with ~120° stereoscopic overlap). VR HMDs aim for a wide FOV (typically 100°-110° horizontally for consumer devices, sometimes wider like [[Pimax]] headsets) to enhance immersion. AR OHMDs often have a much narrower FOV (for example 30°-55°) due to the challenges of see-through optics.<ref name="VR_FOV_Comparison">VR Compare. "Headset Feature: Field of View". Retrieved 2023-10-27. [https://vr-compare.com/headsetfeature/fieldofview]</ref>
===Optics / [[Lens|Lenses]]===
The lenses used heavily influence FOV, image sharpness (center-to-edge), [[Chromatic aberration|chromatic aberration]], geometric distortion, and physical characteristics like size and weight.
*[[Aspheric lens|Aspheric Lenses]]: Simple, often used in early or budget HMDs. Can be bulky.
*[[Fresnel lens|Fresnel Lenses]]: Use concentric rings to reduce thickness and weight compared to simple aspheric lenses while maintaining a short focal length. Common in many VR HMDs (for example Rift CV1, Vive, Quest 2), but can introduce visual artifacts like concentric rings and "[[God rays]]" (stray light scattering off the ridges).
*[[Pancake lens|Pancake Lenses]]: A newer, more complex folded optic design using polarization. Allow for significantly shorter distances between the display and lens, enabling much slimmer and lighter HMD designs. Often offer improved edge-to-edge clarity but can be less light-efficient, requiring brighter displays. Used in devices like Meta Quest Pro, Pico 4, Bigscreen Beyond.<ref name="PancakeOptics">Guttag, Karl (2021-12-09). "VR Optics (Part 1) - Brief History and Pancake Lenses". KGOnTech. Retrieved 2023-10-27. [https://kguttag.com/2021/12/09/vr-optics-part-1-brief-history-and-pancake-lenses/]</ref><ref name="optics2023">Expand Reality (05 Oct 2023). "Pancake vs Fresnel Lenses in VR Headsets". Retrieved 2024-05-15. [https://landing.expandreality.io/pancake-vs.-fresnel-lenses-in-vr-headsets-advanced-optics-for-vr]</ref>
*[[Waveguide (optics)|Waveguides]] (AR): Used in many see-through OHMDs (for example HoloLens, Magic Leap). Light from a microdisplay is injected into a thin piece of glass or plastic and then directed out towards the eye using [[Diffractive optics|diffractive]] or reflective elements, allowing the user to see the real world through the waveguide. Achieving wide FOV and high efficiency with waveguides is challenging.<ref name="AROpticsReview"/><ref name="waveguide2022">Radiant Vision Systems (11 Jan 2022). "Ride the Wave: AR Devices Rely on Waveguides". Retrieved 2024-05-15. [https://www.radiantvisionsystems.com/blog/ride-wave-augmented-reality-devices-rely-waveguides]</ref>
*[[Beam splitter|Beam Splitters / Birdbaths]] (AR): A simpler see-through optic where a partially reflective mirror combines light from a display with the view of the real world. Often bulkier and may have a smaller FOV or less uniform transparency than waveguides. Used in devices like Google Glass (using a prism variant) and Nreal/XREAL Air.<ref name="BirdbathOptics">Guttag, Karl (2019-04-01). "HoloLens 2 (HL2) and AR Optics in General (Part 1)". KGOnTech. Retrieved 2023-10-27. [https://kguttag.com/2019/04/01/hololens-2-hl2-and-ar-optics-in-general-part-1/]</ref>
*[[Holographic Optical Elements]] (HOEs): Thin, lightweight optical components created using holographic recording techniques, capable of performing complex functions like focusing, diffusion, or beam steering. Used in some advanced AR displays.
===[[Interpupillary distance]] (IPD) Adjustment===
===[[Eye tracking]]===
Sensors (typically small infrared cameras) inside the HMD track the user's gaze direction. This enables:
*[[Foveated rendering]]: Rendering the area where the user is looking at full resolution, and the periphery at lower resolution, saving significant computational power.<ref name="FoveatedRenderingNvidia">NVIDIA Developer. "NVIDIA Variable Rate Shading (VRS) & Foveated Rendering". Retrieved 2023-10-27. [https://developer.nvidia.com/vrworks/graphics/foveatedrendering]</ref>
*Improved Social Interaction: [[Avatar]]s can mimic the user's eye movements, enhancing realism in social VR.
*Automatic IPD adjustment.
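The foveated-rendering idea above can be sketched as a lookup that coarsens the shading rate with angular distance from the tracked gaze point. The tier boundaries and the 20 pixels-per-degree conversion are illustrative assumptions, not values from any particular runtime:

```python
import math

def shading_rate(px, py, gaze_px, gaze_py, ppd=20.0):
    """Return a (width, height) coarse-shading block size for a pixel,
    based on its angular distance from the gaze point (ppd converts
    pixel distance to degrees of visual angle)."""
    eccentricity_deg = math.hypot(px - gaze_px, py - gaze_py) / ppd
    if eccentricity_deg < 5.0:
        return (1, 1)   # foveal region: shade every pixel
    if eccentricity_deg < 15.0:
        return (2, 2)   # mid-periphery: one shade per 2x2 block
    return (4, 4)       # far periphery: one shade per 4x4 block

print(shading_rate(0, 0, 0, 0))    # (1, 1) at the gaze point
print(shading_rate(800, 0, 0, 0))  # (4, 4) at 40 degrees eccentricity
```

In a real pipeline these tiers map onto hardware variable-rate-shading modes, so the peripheral savings come almost for free once gaze is known.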
How the HMD connects to the processing unit (if not standalone).
*Wired: Typically [[USB]] (often Type-C for power/data) and [[DisplayPort]] or [[HDMI]] for high bandwidth video. Offers highest fidelity and lowest latency but restricts movement.
*Wireless: Uses [[Wi-Fi]] (often Wi-Fi 6/6E/7) or proprietary radio frequencies (for example older WiGig solutions) to stream video and data. Offers freedom of movement but requires video compression (potentially affecting quality) and can introduce latency. Examples: [[HTC Vive Wireless Adapter]], [[Meta Air Link]], [[Virtual Desktop]].<ref name="WirelessVRComparison">Heaney, David (2022-01-20). "Wireless PC VR Comparison: Air Link vs Virtual Desktop vs Vive Wireless". UploadVR. Retrieved 2023-10-27. [https://uploadvr.com/wireless-pc-vr-comparison/]</ref>
===Audio===
Sound is crucial for immersion. HMDs may feature:
====Video Passthrough AR/MR HMDs====
These utilize opaque displays, essentially functioning like VR HMDs, but incorporate outward-facing cameras. The live video feed from these cameras is processed (often correcting for distortion and perspective) and displayed on the internal screens, with digital elements rendered on top. This allows users to see their surroundings digitally, blended with virtual content. Modern implementations increasingly use high-resolution, low-latency, color cameras, aiming to create a more seamless blend ("Mixed Reality"). While not optically transparent, they can offer wider FOV for the AR/MR content compared to many current OHMDs and better occlusion of real objects by virtual ones.
* '''Examples''': [[Meta Quest Pro]], [[Meta Quest 3]], [[Apple Vision Pro]], [[HTC Vive XR Elite]], [[Varjo XR-3]], [[Lynx R1]].<ref name="PassthroughExplained">Lang, Ben (2023-02-15). "VR Headset Passthrough AR Explained". Road to VR. Retrieved 2023-10-27. [https://www.roadtovr.com/vr-headset-passthrough-ar-explained-quest-2-pro-index-vive-pro-2/]</ref>
==Components of HMDs==
*'''Processors''':
**CPU/GPU: Handle rendering, tracking calculations, application logic (essential in standalone HMDs, often a mobile [[System on a chip|SoC]] like [[Qualcomm Snapdragon|Snapdragon XR series]]).
**Specialized Processors: May include [[Vision Processing Unit]]s (VPUs) or [[Neural Processing Unit]]s (NPUs) to efficiently handle computer vision tasks (SLAM, hand/eye tracking) or [[Artificial intelligence|AI]] workloads, offloading the main CPU/GPU. Microsoft's [[Holographic Processing Unit]] (HPU) in HoloLens is an example.<ref name="HoloLensSensors">Microsoft Learn. "HoloLens 2 hardware". Retrieved 2023-10-27. [https://learn.microsoft.com/en-us/hololens/hololens2-hardware]</ref>
*'''Memory & Storage''': [[RAM]] for active processing and onboard [[Flash memory|storage]] (in standalone HMDs) for the operating system, applications, and media.
*'''Audio System''': Integrated speakers, microphones, headphone jacks (see Key Technical Specifications above).
* '''Gaming and Entertainment''': Immersive video games ([[Beat Saber]], [[Half-Life: Alyx]]), virtual cinemas, [[Social VR]] platforms ([[VRChat]], [[Rec Room]]), [[Location-based entertainment|location-based VR experiences]], [[Virtual tourism]].
* '''[[Training]] and [[Simulation]]''': Flight simulation, [[Surgical simulation|surgical training]], military exercises, emergency response training, complex machinery operation training, workplace safety drills.<ref name="VRSimulationTraining">Freina, Laura & Ott, Michela (2015). "Virtual reality for simulation and training". ''E-Learning and Digital Media''. '''12''' (3-4): 368-383. doi:10.1177/2042753015591756. Retrieved 2023-10-27. [https://link.springer.com/article/10.1007/s10055-016-0293-5]</ref>
* '''Design and Engineering''': [[Computer-Aided Design]] (CAD) review and collaboration, [[Architectural rendering|architectural visualization]] (virtual walkthroughs), virtual prototyping, ergonomic assessments, [[Digital twin]] interaction.<ref name="VR_CAD">Autodesk. "Virtual Reality in Design and Manufacturing". Retrieved 2023-10-27. [https://www.autodesk.com/solutions/virtual-reality]</ref>
* '''[[Telepresence]] and [[Virtual collaboration|Collaboration]]''': Virtual meetings with embodied avatars, remote assistance (especially using AR overlays for "see-what-I-see" guidance), shared virtual workspaces ([[Spatial (software)|Spatial]], [[Horizon Workrooms]]).
* '''Medical''': Surgical planning and visualization (overlaying [[medical imaging]] onto patients via AR), therapy (for example [[Virtual reality therapy|exposure therapy]] for phobias/[[PTSD]], pain management/distraction), rehabilitation, medical education.<ref name="VR_Medicine">Rizzo, Albert "Skip" & Kim, Giyoung (2019-11-15). "Applications of Virtual Reality for Clinical Neuropsychology: A Review". ''Journal of Medical Internet Research''. '''21''' (11): e14190. doi:10.2196/14190. PMID 31730019. PMC 6880643. Retrieved 2023-10-27. [https://jmir.org/2019/11/e14190/]</ref>
* '''Education''': Virtual field trips, interactive science experiments (virtual labs), historical reconstructions, immersive language learning.
* '''[[Information visualization|Data Visualization]]''': Exploring complex datasets (for example financial data, scientific simulations) in interactive 3D space.
* '''Visual Fidelity''': Achieving resolution and clarity that matches human vision ("[[Retinal projector|retinal resolution]]" ≈ 60 PPD), wider FOV without distortion or edge artifacts, higher brightness and contrast (especially for outdoor AR), better dynamic range, and eliminating artifacts like screen-door effect, god rays, [[Mura defect|mura]], and motion blur remain ongoing goals.<ref name="Kim2019FoveatedAR">Kim, J.; Jeong, Y.; Stengel, M.; et al. (2019). "Foveated AR: dynamically-foveated augmented reality display". ''ACM Transactions on Graphics''. '''38''' (4): 1-15. doi:10.1145/3306346.3322983.</ref>
* '''Comfort and Ergonomics''': Reducing weight, improving balance (counterweights, lighter optics), managing heat dissipation, accommodating prescription glasses comfortably, and finding comfortable, hygienic long-term wear solutions (straps, facial interfaces) are critical for broader adoption.<ref name="TalsmaComfort2020">Talsma, S. W.; Usmani, S. A.; Chen, P. Y. (2020). "Critical factors in comfort, cognitive load, and performance for consumer head-mounted displays". ''Journal of the Society for Information Display''. '''28''' (11): 841-850. doi:10.1002/jsid.943.</ref>
* '''[[Vergence-accommodation conflict]]''': In most current HMDs, the eyes focus (accommodate) at a fixed distance determined by the optics, but converge based on the perceived depth of virtual objects. This mismatch can cause eye strain, fatigue, and inaccurate depth perception.<ref name="VAC_Review">Hoffman, David M.; Girshick, Ahna R.; Akeley, Kurt; Banks, Martin S. (2008-03-18). "The vergence-accommodation conflict: Practical consequences and solutions". ''Journal of Vision''. '''8''' (3): 33. doi:10.1167/8.3.33. Retrieved 2023-10-27. [https://jov.arvojournals.org/article.aspx?articleid=2193631]</ref> Solutions like [[Varifocal display|varifocal]] and [[Light field|light field]] displays are complex and still largely experimental.
* '''Motion Sickness / Cybersickness''': While greatly reduced compared to early systems due to low latency and high refresh rates, discrepancies between visual motion and [[Vestibular system|vestibular]] input, tracking inaccuracies, or poorly designed software can still induce nausea, dizziness, and discomfort in susceptible individuals.<ref name="Weech2019Cybersickness">Weech, S.; Kenny, S.; Barnett-Cowan, M. (2019). "Presence and cybersickness in virtual reality are negatively related: a review". ''Frontiers in Psychology''. '''10''': 158. doi:10.3389/fpsyg.2019.00158. PMID 30787884. PMC 6374254.</ref><ref name="vrsickness">Frontiers in Virtual Reality (20 Mar 2020). "Factors Associated With Virtual Reality Sickness in Head-Mounted Displays". Retrieved 2024-05-15. [https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7145389/]</ref>
* '''Tracking Robustness''': Inside-out tracking can struggle in poorly lit or overly bright environments, on large featureless surfaces (blank walls), with reflective surfaces (mirrors), or during very fast head/body movements. Outside-in tracking requires external sensor setup and has a limited, fixed tracking volume.
* '''Content Ecosystem''': The availability of high-quality, compelling, and diverse applications and experiences ("killer apps") is crucial for driving HMD adoption beyond early adopters and specific niches.
* '''Cost''': High-end HMDs remain expensive (>$1000), although capable standalone VR headsets have become more affordable (~$300-$500). Advanced AR/MR devices often cost several thousand dollars.
* '''Social Acceptance''': Wearing bulky headsets, especially in public or social settings, remains a significant barrier for AR/MR devices aimed at all-day use. Privacy concerns related to onboard cameras are also relevant.<ref name="Koelle2020SocialAcceptability">Koelle, M.; Ananthanarayan, S.; Boll, S. (2020). "Social acceptability in HCI: A survey of methods, measures, and design strategies". ''Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems'': 1-19. doi:10.1145/3313831.3376101.</ref>
* '''Health and Safety''': Long-term effects of prolonged use on vision (especially in children) are still being studied. Physical risks include collision with real-world objects while immersed. Eye safety standards (for example [[IEC 60825-1]] for lasers in depth sensors) must be followed.<ref name="Turnbull2017OcularEffects">Turnbull, P. R. & Phillips, J. R. (2017). "Ocular effects of virtual reality headset wear in young adults". ''Scientific Reports''. '''7''' (1): 1-11. doi:10.1038/s41598-017-14811-x.</ref><ref name="laser">Laser Institute of America (02 Dec 2023). "ANSI Z136.1 — Safe Use of Lasers". Retrieved 2024-05-15. [https://www.lia.org/resources/laser-safety-standards/ansi-z1361-safe-use-lasers]</ref> Psychological effects regarding immersion, dissociation, or addiction potential warrant consideration.<ref name="MadaryMetzingerEthics2016">Madary, M. & Metzinger, T. K. (2016). "Real virtuality: A code of ethical conduct. Recommendations for good scientific practice and the consumers of VR-technology". ''Frontiers in Robotics and AI''. '''3''': 3. doi:10.3389/frobt.2016.00003.</ref>
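Two of the figures above can be made concrete with back-of-envelope arithmetic: angular resolution in pixels per degree (PPD) is roughly the horizontal pixel count divided by the horizontal FOV, and the vergence angle the eyes must adopt for an object follows from the interpupillary distance (IPD) and the object's depth. The sketch below is purely illustrative; the 2064-pixel/100° panel figures and the 63 mm IPD are assumed example values, not specifications of any particular headset.

```python
import math

def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    # Crude angular resolution estimate: pixels spread evenly over the FOV
    # (real optics distribute pixels non-uniformly across the field).
    return h_pixels / h_fov_deg

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.063) -> float:
    # Angle between the two eyes' lines of sight when fixating a point
    # at the given distance; 63 mm is an assumed average adult IPD.
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# A hypothetical panel with 2064 horizontal pixels across a ~100-degree FOV
# falls well short of the ~60 PPD "retinal resolution" target:
ppd = pixels_per_degree(2064, 100)   # ~20.6 PPD

# Vergence-accommodation conflict: optics with a fixed focal distance of
# ~1.3 m, while a virtual object rendered at 0.5 m demands a noticeably
# larger vergence angle than the focal plane does.
v_object = vergence_angle_deg(0.5)   # ~7.2 degrees
v_focus = vergence_angle_deg(1.3)    # ~2.8 degrees
```

The gap between `v_object` and `v_focus` is the mismatch the vergence-accommodation bullet describes: convergence is driven by the rendered depth while accommodation stays locked to the optics' focal plane.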
==Future Trends and Developments==