The fundamental operation of an HMD involves generating an image and directing it into the user's eyes. This typically follows a path:
#'''[[Central Processing Unit]] (CPU) / [[Graphics Processing Unit]] (GPU)''': Process application logic, handle [[tracking]] data, and render the images (frames) intended for display. In [[Integrated HMD|standalone HMDs]], these processors are inside the headset; in [[Discrete HMD|tethered HMDs]], they are in a connected PC or console.
#'''Display Panel(s)''': Small, high-resolution screens (for example, [[LCD]], [[OLED]], [[Micro-OLED]]) receive the rendered images from the GPU. Binocular HMDs typically use either one panel displaying side-by-side images or two separate panels, one for each eye.
#'''Optics ([[Lens|Lenses]])''': Placed between the display panels and the user's eyes, lenses serve multiple crucial functions:
#*Magnification: They enlarge the small display image to fill a significant portion of the user's [[Field of View]] (FOV).
*[[LCD]] (Liquid Crystal Display): Often offers higher pixel density (reducing the [[screen-door effect]]) and potentially lower cost, but may have slower response times and lower contrast compared to OLED. Modern LCDs in VR often use fast-switching technologies and [[Quantum dot display|quantum dots]] for better color.<ref name="LCDvsOLED_VR">AR/VR Tips. "LCD vs OLED VR Headsets: Which Screen is Best?". Retrieved 2023-10-27. [https://arvrtips.com/lcd-vs-oled-vr-headsets/ Link]</ref>
*[[OLED]] (Organic Light-Emitting Diode): Provides true blacks (infinite contrast ratio), vibrant colors, and very fast pixel response times (reducing motion blur or ghosting). Can be more susceptible to "[[Screen burn-in|burn-in]]" over long periods and may use [[PenTile matrix family|PenTile]] [[Subpixel rendering|subpixel layouts]] affecting perceived sharpness.
*[[Micro-OLED]] / [[OLEDoS]] (OLED-on-Silicon): Very small, high-resolution OLED displays built directly onto silicon wafers. They offer extremely high pixel densities and brightness, and are often used in high-end or compact HMDs (for example, [[Bigscreen Beyond]], [[Apple Vision Pro]]).<ref name="MicroOLED_Intro">OLED-Info. "MicroOLED displays". Retrieved 2023-10-27. [https://www.oled-info.com/microoled Link]</ref><ref name="microoled2025">Systems Contractor News (23 Apr 2025). "Dual micro-OLED displays grow within the AR/VR headset market". Retrieved 2024-05-15. [https://www.svconline.com/proav-today/dual-micro-oled-displays-grow-within-the-ar-vr-headset-market Link]</ref>
*[[MicroLED]]: An emerging technology promising high brightness, efficiency, contrast, and longevity, potentially surpassing both LCD and OLED for HMDs.
===[[Resolution]]===
The number of [[pixel]]s on the display(s), usually specified per eye (for example, 2064 × 2208 per eye for the Meta Quest 3) or sometimes as a total resolution. Higher resolution reduces the [[screen-door effect]] (the visible grid pattern between pixels) and increases image sharpness. [[Pixels Per Degree]] (PPD) is often a more perceptually relevant metric, combining resolution and FOV. Human visual acuity corresponds to roughly 60 PPD; current consumer VR is typically in the 20-35 PPD range, while high-end devices like the Vision Pro exceed 40 PPD centrally.
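A crude average PPD follows from dividing per-eye horizontal pixels by horizontal FOV. A minimal sketch (the function name is illustrative and the 100° FOV is an assumed figure; real optics concentrate pixels toward the center, so central PPD runs higher than this average):

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Average PPD: horizontal pixels spread evenly over the horizontal FOV.

    A simplification: real lenses are not uniform, so PPD at the center
    of the view is usually higher than this average.
    """
    return h_pixels / h_fov_deg

# Per-eye width from the Quest 3 example above, with an assumed 100-degree FOV.
print(round(pixels_per_degree(2064, 100), 1))  # 20.6
```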
===[[Refresh Rate]]===
The number of times per second the display updates the image, measured in hertz (Hz). Higher refresh rates (for example, 90 Hz, 120 Hz, or 144 Hz) lead to smoother motion, reduced flicker, and can help mitigate motion sickness. 90 Hz is often considered a comfortable minimum for VR. Low-persistence displays (where pixels are illuminated only for a fraction of the refresh cycle) are crucial in VR to reduce motion blur during head movements.<ref name="LowPersistence">Abrash, Michael (2014-07-28). "Understanding Low Persistence on the DK2". Oculus Developer Blog. Retrieved 2023-10-27. [https://developer.oculus.com/blog/understanding-low-persistence-on-the-dk2/ Link]</ref>
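The persistence figures involved are simple arithmetic on the refresh interval; a sketch (the 20% duty cycle is an illustrative assumption, not a published specification):

```python
def illumination_time_ms(refresh_hz: float, duty_cycle: float) -> float:
    """Milliseconds each frame's pixels stay lit.

    duty_cycle is the fraction of the refresh interval the pixels are
    illuminated; 1.0 means full persistence.
    """
    return (1000.0 / refresh_hz) * duty_cycle

# A full-persistence 90 Hz panel keeps each frame lit for the whole interval.
print(round(illumination_time_ms(90, 1.0), 1))  # 11.1
# A low-persistence mode at an assumed 20% duty cycle flashes it briefly.
print(round(illumination_time_ms(90, 0.2), 1))  # 2.2
```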


===[[Field of View]] (FOV)===
The extent of the visual field visible through the HMD, usually measured horizontally, vertically, and/or diagonally in degrees. Human binocular vision covers roughly 200-220° horizontally (with ~120° stereoscopic overlap). VR HMDs aim for a wide FOV (typically 100°-110° horizontally for consumer devices, sometimes wider like [[Pimax]] headsets) to enhance immersion. AR OHMDs often have a much narrower FOV (for example, 30°-55°) due to the challenges of see-through optics.<ref name="VR_FOV_Comparison">VR Compare. "Headset Feature: Field of View". Retrieved 2023-10-27. [https://vr-compare.com/headsetfeature/fieldofview Link]</ref>
===Optics / [[Lens|Lenses]]===
The lenses used heavily influence FOV, image sharpness (center-to-edge), [[Chromatic aberration|chromatic aberration]], geometric distortion, and physical characteristics like size and weight.
*[[Aspheric lens|Aspheric Lenses]]: Simple, often used in early or budget HMDs. Can be bulky.
*[[Fresnel lens|Fresnel Lenses]]: Use concentric rings to reduce thickness and weight compared to simple aspheric lenses while maintaining a short focal length. Common in many VR HMDs (for example, Rift CV1, Vive, Quest 2), but can introduce visual artifacts like concentric rings and "[[God rays]]" (stray light scattering off the ridges).
*[[Pancake lens|Pancake Lenses]]: A newer, more complex folded optic design using polarization. Allow for significantly shorter distances between the display and lens, enabling much slimmer and lighter HMD designs. Often offer improved edge-to-edge clarity but can be less light-efficient, requiring brighter displays. Used in devices like Meta Quest Pro, Pico 4, Bigscreen Beyond.<ref name="PancakeOptics">Guttag, Karl (2021-12-09). "VR Optics (Part 1) - Brief History and Pancake Lenses". KGOnTech. Retrieved 2023-10-27. [https://kguttag.com/2021/12/09/vr-optics-part-1-brief-history-and-pancake-lenses/ Link]</ref><ref name="optics2023">Expand Reality (05 Oct 2023). "Pancake vs Fresnel Lenses in VR Headsets". Retrieved 2024-05-15. [https://landing.expandreality.io/pancake-vs.-fresnel-lenses-in-vr-headsets-advanced-optics-for-vr Link]</ref>
*[[Waveguide (optics)|Waveguides]] (AR): Used in many see-through OHMDs (for example, HoloLens, Magic Leap). Light from a microdisplay is injected into a thin piece of glass or plastic and then directed out towards the eye using [[Diffractive optics|diffractive]] or reflective elements, allowing the user to see the real world through the waveguide. Achieving wide FOV and high efficiency with waveguides is challenging.<ref name="AROpticsReview"/><ref name="waveguide2022">Radiant Vision Systems (11 Jan 2022). "Ride the Wave: AR Devices Rely on Waveguides". Retrieved 2024-05-15. [https://www.radiantvisionsystems.com/blog/ride-wave-augmented-reality-devices-rely-waveguides Link]</ref>
*[[Beam splitter|Beam Splitters / Birdbaths]] (AR): A simpler see-through optic where a partially reflective mirror combines light from a display with the view of the real world. Often bulkier and may have a smaller FOV or less uniform transparency than waveguides. Used in devices like Google Glass (using a prism variant) and Nreal/XREAL Air.<ref name="BirdbathOptics">Guttag, Karl (2019-04-01). "HoloLens 2 (HL2) and AR Optics in General (Part 1)". KGOnTech. Retrieved 2023-10-27. [https://kguttag.com/2019/04/01/hololens-2-hl2-and-ar-optics-in-general-part-1/ Link]</ref>
*[[Holographic Optical Elements]] (HOEs): Thin, lightweight optical components created using holographic recording techniques, capable of performing complex functions like focusing, diffusion, or beam steering. Used in some advanced AR displays.
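Whatever the lens type, headset runtimes typically counteract its geometric distortion by warping the rendered image in the opposite direction before it reaches the panel. A minimal sketch using the common polynomial radial-distortion model (the coefficients are invented for illustration; real values are calibrated per lens):

```python
def predistort(x: float, y: float, k1: float, k2: float):
    """Radially scale a normalized image coordinate (lens center at the
    origin) so the lens's own distortion cancels it out.

    Implements the common polynomial model r' = r * (1 + k1*r^2 + k2*r^4).
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The center is unchanged; with negative k1 the edges are pulled inward
# (barrel pre-distortion to cancel the lens's pincushion distortion).
cx, cy = predistort(0.0, 0.0, k1=-0.2, k2=0.05)
ex, ey = predistort(1.0, 0.0, k1=-0.2, k2=0.05)
print((cx, cy), round(ex, 2))  # (0.0, 0.0) 0.85
```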
===[[Interpupillary distance]] (IPD) Adjustment===
The distance between the centers of the pupils varies between individuals (typically 54-72mm for adults). HMDs need to accommodate this for optimal clarity, comfort, and correct stereo rendering. Adjustment can be:
*Physical/Manual: Lenses can be moved closer together or further apart, often via a slider or dial (for example, Valve Index, Quest 3). Continuous adjustment allows finer tuning.
*Stepped: Some HMDs offer discrete IPD steps (for example, the original Quest and Quest 2).
*Software-based: The rendering viewpoint separation is adjusted in software (less common, and less effective for major mismatches without physical lens movement).
*Automatic: High-end systems might use eye-tracking to measure IPD and adjust automatically or prompt the user.
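However the IPD is obtained, stereo renderers apply it by placing the two virtual cameras half the IPD to either side of the head center, so rendered parallax matches the user's eyes. A minimal sketch (the function name is illustrative; 63 mm is close to the adult average):

```python
def eye_offsets_m(ipd_mm: float):
    """Left/right horizontal camera offsets in meters for a given IPD.

    Each virtual eye camera sits half the interpupillary distance
    from the head center along the horizontal axis.
    """
    half = (ipd_mm / 1000.0) / 2.0
    return -half, +half

left, right = eye_offsets_m(63.0)
print(round(left, 4), round(right, 4))  # -0.0315 0.0315
```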
How the HMD connects to the processing unit (if not standalone).
*Wired: Typically [[USB]] (often Type-C for power/data) and [[DisplayPort]] or [[HDMI]] for high bandwidth video. Offers highest fidelity and lowest latency but restricts movement.
*Wireless: Uses [[Wi-Fi]] (often Wi-Fi 6/6E/7) or proprietary radio frequencies (for example, older WiGig solutions) to stream video and data. Offers freedom of movement but requires video compression (potentially affecting quality) and can introduce latency. Examples: [[HTC Vive Wireless Adapter]], [[Meta Air Link]], [[Virtual Desktop]].<ref name="WirelessVRComparison">Heaney, David (2022-01-20). "Wireless PC VR Comparison: Air Link vs Virtual Desktop vs Vive Wireless". UploadVR. Retrieved 2023-10-27. [https://uploadvr.com/wireless-pc-vr-comparison/ Link]</ref>
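The need for compression follows from simple arithmetic: a raw stereo video stream at the resolutions quoted earlier dwarfs practical Wi-Fi throughput. A sketch (assumes 24 bits per pixel and the per-eye resolution cited in the Resolution section):

```python
def uncompressed_gbps(width: int, height: int, eyes: int,
                      refresh_hz: float, bits_per_pixel: int = 24) -> float:
    """Raw (uncompressed) video bitrate in gigabits per second."""
    return width * height * eyes * refresh_hz * bits_per_pixel / 1e9

# Two 2064x2208 panels at 90 Hz: roughly 19.7 Gbps raw, an order of
# magnitude beyond typical real-world Wi-Fi 6E throughput, hence the
# aggressive video compression used when streaming PC VR wirelessly.
print(round(uncompressed_gbps(2064, 2208, 2, 90), 1))  # 19.7
```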
===Audio===
Sound is crucial for immersion. HMDs may feature:
*[[Spatial Audio]]: 3D audio rendering techniques that make sounds appear to come from specific locations in the virtual environment. Supported via various output methods.
*Integrated [[Speaker]]s: Often open-ear near-field speakers built into the strap or near the ears, providing spatial audio without blocking external sounds (for example, Valve Index, Quest series).
*[[Headphone]] Jack (3.5mm): Allows users to connect their own headphones or earbuds.
*Integrated Headphones: High-fidelity on-ear or over-ear headphones attached to the HMD (for example, the original Rift CV1 and HP Reverb G2).
*[[Microphone Arrays]]: Multiple microphones for clear voice input, communication in multiplayer apps, voice commands, and potentially noise cancellation.
===[[Ergonomics]]===
Factors affecting comfort during extended use:
*Weight: Lighter is generally better (most consumer HMDs are 400-700g).
*Weight Distribution: Balanced weight (front-to-back) is often more important than total weight. Battery placement in standalone HMDs (for example, rear-mounted) can improve balance.
*Strap Design: Different mechanisms (soft elastic straps, rigid "halo" straps, top straps) distribute pressure differently.
*Facial Interface: Foam or fabric padding, material breathability, light blocking. Options/space for glasses wearers or [[prescription lens]] inserts.


====[[Integrated HMD]] (Standalone HMD)====
Also known as All-in-One (AIO) HMDs, these devices contain all necessary components within the headset itself: [[displays]], [[optics]], [[sensors]], processing ([[CPU]]/[[GPU]], often based on mobile chipsets such as the [[Qualcomm Snapdragon XR]] series), [[memory]], [[storage]], [[battery]], and [[tracking]]. They require no external PC or console, offering greater freedom of movement and ease of use. Processing power is typically lower than that of high-end [[PC VR]] setups. Many [[standalone HMDs]] can optionally connect to a PC via cable (for example, [[Meta Link]]) or wirelessly (for example, [[Air Link]], [[Virtual Desktop]]) to function as a PC VR headset.
*'''Examples''': [[Meta Quest 2]], [[Meta Quest 3]], [[Meta Quest Pro]], [[Pico 4]], [[Pico Neo 3 Link]], [[HTC Vive Focus 3]], [[HTC Vive XR Elite]].


====[[Slide-on HMD]] (Smartphone HMD)====
These were an early, low-cost entry point to VR, consisting of a simple enclosure (often plastic or cardboard) with lenses, into which a compatible [[smartphone]] was inserted. The smartphone provided the display, processing, and basic 3[[DoF]] tracking (using its internal [[IMU]]). While popular initially due to accessibility (for example, [[Google Cardboard]], [[Samsung Gear VR]], [[Google Daydream View]]), they suffered from limitations such as lower display quality, higher latency, potential overheating, limited interaction (often just a single button or touchpad), inconsistent experiences across different phones, and generally only 3DoF tracking. This category is now largely obsolete, superseded by standalone HMDs.
*'''Examples''': [[Google Cardboard]], [[Samsung Gear VR]], [[Google Daydream View]], [[Zeiss VR One]], [[Merge VR/AR Goggles]].
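The 3DoF orientation tracking these devices derived from the phone's IMU is commonly built on sensor fusion such as a complementary filter, which blends the gyroscope's fast-but-drifting integration with the accelerometer's noisy-but-drift-free gravity reference. A one-axis sketch (the blend factor is a typical but arbitrary tuning value):

```python
def complementary_filter(angle_deg: float, gyro_dps: float,
                         accel_angle_deg: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One-axis orientation update fusing gyro and accelerometer.

    The gyro term tracks fast motion; the small accelerometer term
    continuously corrects the gyro's slow drift.
    """
    return alpha * (angle_deg + gyro_dps * dt) + (1.0 - alpha) * accel_angle_deg

# At rest, with both sensors agreeing on 10 degrees, the estimate holds.
print(round(complementary_filter(10.0, 0.0, 10.0, dt=0.016), 6))  # 10.0
```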


***Visible light cameras: Used for inside-out positional (6DoF) tracking, [[video passthrough]] (especially color passthrough), environment mapping, [[Hand tracking|hand tracking]].
***[[Infrared]] (IR) cameras: Often used for inside-out tracking (less susceptible to ambient light changes), controller tracking (detecting IR LEDs on controllers), and [[eye tracking]].
**[[Depth-sensing Cameras|Depth Sensors]]: (for example, [[Time-of-flight camera|Time-of-Flight]], Structured Light, Active Stereo IR). Used in some AR/MR HMDs (like HoloLens, Vision Pro) for accurate [[Spatial analysis|spatial mapping]], environment understanding, hand tracking, and occlusion.
**[[Eye tracking|Eye Tracking Cameras]]: Small internal IR cameras pointed at the user's eyes to monitor gaze direction and pupil characteristics.
*'''Processors''':
*'''Mechanical Structure & Ergonomics''': Housing, straps, facial interface, IPD adjustment mechanisms, thermal management (fans, heat sinks).
*'''Input Mechanisms''': How the user interacts with the system.
**[[Motion Controllers]]: Handheld devices tracked in 3D space (usually 6DoF), typically including buttons, triggers, joysticks/touchpads, and [[Haptic technology|haptic feedback]]. Primary input for most VR systems (for example, Meta Touch controllers, Valve Index controllers, PS VR2 Sense controllers).
**[[Hand tracking]]: Camera-based systems that track the user's bare hands and finger movements without requiring controllers. Offers natural interaction but lacks physical buttons and haptic feedback. Increasingly standard on standalone VR/MR headsets (Quest series, Vision Pro).
**[[Eye tracking]]: Used for gaze-based selection, foveated rendering, and social presence.
**[[Voice Commands]]: Using built-in microphones and [[Speech recognition|speech recognition software]] for hands-free control.
**[[Brain-Computer Interface|Brain-Computer Interfaces]] (BCI): Experimental interfaces reading neural signals, potentially via electrodes integrated into the HMD, for direct thought control. Still largely in research phases for consumer HMDs (for example, [[NextMind]], [[CTRL-labs]] research).
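The foveated rendering mentioned under eye tracking works by keeping shading resolution full near the gaze point and letting it fall off with angular distance from it. A toy falloff curve (all thresholds and rates here are illustrative; shipping systems use tuned, GPU-specific rate maps):

```python
def foveated_scale(eccentricity_deg: float) -> float:
    """Rendering-resolution scale versus angular distance from gaze.

    Full resolution within a ~10-degree fovea, then a linear falloff
    down to a 25% floor in the far periphery. All numbers illustrative.
    """
    if eccentricity_deg <= 10.0:
        return 1.0
    return max(0.25, 1.0 - (eccentricity_deg - 10.0) / 40.0)

print(foveated_scale(5.0))   # 1.0  (full resolution at the gaze point)
print(foveated_scale(30.0))  # 0.5  (mid-periphery, half resolution)
print(foveated_scale(90.0))  # 0.25 (floor in the far periphery)
```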


==Applications==
*  '''Design and Engineering''': [[Computer-Aided Design]] (CAD) review and collaboration, [[Architectural rendering|architectural visualization]] (virtual walkthroughs), virtual prototyping, ergonomic assessments, [[Digital twin]] interaction.<ref name="VR_CAD">Autodesk. "Virtual Reality in Design and Manufacturing". Retrieved 2023-10-27. [https://www.autodesk.com/solutions/virtual-reality Link]</ref>
*  '''[[Telepresence]] and [[Virtual collaboration|Collaboration]]''': Virtual meetings with embodied avatars, remote assistance (especially using AR overlays for "see-what-I-see" guidance), shared virtual workspaces ([[Spatial (software)|Spatial]], [[Horizon Workrooms]]).
*  '''Medical''': Surgical planning and visualization (overlaying [[medical imaging]] onto patients via AR), therapy (for example, [[Virtual reality therapy|exposure therapy]] for phobias/[[PTSD]], pain management/distraction), rehabilitation, and medical education.<ref name="VR_Medicine">Rizzo, Albert "Skip" & Kim, Giyoung (2019-11-15). "Applications of Virtual Reality for Clinical Neuropsychology: A Review". ''Journal of Medical Internet Research''. '''21''' (11): e14190. doi:10.2196/14190. PMID 31730019. PMC 6880643. Retrieved 2023-10-27. [https://jmir.org/2019/11/e14190/ Link]</ref>
*  '''Education''': Virtual field trips, interactive science experiments (virtual labs), historical reconstructions, immersive language learning.
*  '''[[Information visualization|Data Visualization]]''': Exploring complex datasets (for example, financial data, scientific simulations) in interactive 3D space.
*  '''Military and Aviation''': Helmet-mounted displays providing flight data, targeting information, situational awareness overlays, night vision integration (for example, [[Integrated Visual Augmentation System|IVAS]]).
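Immersive data visualization, mentioned above, ultimately comes down to mapping data attributes onto spatial coordinates. The following sketch (with invented sample data) normalizes three numeric columns of a dataset into unit-cube positions that a VR rendering engine could then place as points in a 3D scene; it is an illustration of the idea, not any particular product's pipeline.

```python
# Map rows of a dataset to normalized (x, y, z) positions in a unit cube,
# the basic step behind an immersive 3D scatter plot. Sample data is invented.

def normalize(col):
    """Scale a numeric column to the [0, 1] range (0.5 if the column is constant)."""
    lo, hi = min(col), max(col)
    return [(v - lo) / (hi - lo) if hi > lo else 0.5 for v in col]

def to_3d_points(rows):
    """rows: list of (a, b, c) numeric triples -> list of (x, y, z) in [0, 1]."""
    a, b, c = zip(*rows)
    return list(zip(normalize(a), normalize(b), normalize(c)))

# Three example rows; each numeric column becomes one spatial axis.
points = to_3d_points([(0, 10, 5), (50, 20, 5), (100, 30, 15)])
```

A real application would hand these normalized positions to a game engine or WebXR scene graph for rendering and interaction.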

==Challenges and Limitations==
*  '''Cost''': High-end HMDs remain expensive (>$1000), although capable standalone VR headsets have become more affordable (~$300-$500). Advanced AR/MR devices often cost several thousand dollars.
*  '''Social Acceptance''': Wearing bulky headsets, especially in public or social settings, remains a significant barrier for AR/MR aiming for all-day use. Privacy concerns related to onboard cameras are also relevant.<ref name="Koelle2020SocialAcceptability">Koelle, M.; Ananthanarayan, S.; Boll, S. (2020). "Social acceptability in HCI: A survey of methods, measures, and design strategies". ''Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems'': 1-19. doi:10.1145/3313831.3376101.</ref>
*  '''Health and Safety''': Long-term effects of prolonged use on vision (especially in children) are still being studied. Physical risks include collision with real-world objects while immersed. Eye safety standards (for example, [[IEC 60825-1]] for lasers in depth sensors) must be followed.<ref name="Turnbull2017OcularEffects">Turnbull, P. R. & Phillips, J. R. (2017). "Ocular effects of virtual reality headset wear in young adults". ''Scientific Reports''. '''7''' (1): 1-11. doi:10.1038/s41598-017-14811-x.</ref><ref name="laser">Laser Institute of America (2023-12-02). "ANSI Z136.1 — Safe Use of Lasers". Retrieved 2024-05-15. [https://www.lia.org/resources/laser-safety-standards/ansi-z1361-safe-use-lasers Link]</ref> Psychological effects regarding immersion, dissociation, or addiction potential warrant consideration.<ref name="MadaryMetzingerEthics2016">Madary, M. & Metzinger, T. K. (2016). "Real virtuality: A code of ethical conduct. Recommendations for good scientific practice and the consumers of VR-technology". ''Frontiers in Robotics and AI''. '''3''': 3. doi:10.3389/frobt.2016.00003.</ref>

==Future Trends and Developments==
===Sensory Expansion===
*'''[[Haptic technology|Advanced Haptic Feedback]]''': Beyond simple controller rumble, providing more nuanced tactile sensations via gloves ([[HaptX]]), bodysuits ([[bHaptics]]), ultrasound ([[Ultraleap]]), or other actuators to simulate touch, texture, and impact.
*'''[[Digital scent technology|Olfactory Displays]]''': Systems that generate scents synchronized with virtual environments to enhance immersion (for example, [[OVR Technology]]).
*'''[[Motion capture|Full-body Tracking]]''': Moving beyond head and hands to track limb and torso movements for more complete avatar embodiment, using external trackers ([[Vive Tracker]]), webcam-based AI solutions, or integrated sensors.
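As a rough illustration of how the haptic feedback described above is typically driven in software, a common pattern is to map an event's intensity to an actuator amplitude and pulse duration. The function and parameters below are invented for this sketch; real SDKs (for example OpenXR haptic actions) expose different interfaces.

```python
# Hypothetical mapping from a collision force to a haptic pulse.
# Names and constants are illustrative, not from any real SDK.

def impact_to_haptics(impact_force, max_force=100.0):
    """Map a collision force to (amplitude in 0..1, duration in seconds)."""
    amplitude = min(impact_force / max_force, 1.0)   # clamp to actuator range
    duration = 0.05 + 0.10 * amplitude               # harder hits buzz longer
    return amplitude, round(duration, 3)

# A mid-strength impact yields a half-amplitude, 100 ms pulse.
amp, dur = impact_to_haptics(50.0)
```

The clamping step matters in practice: forces beyond the actuator's range would otherwise request impossible amplitudes.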
===Computational Capabilities===
*'''[[Edge computing|Edge/Cloud Computing]]''': Offloading demanding processing (rendering, AI) from standalone HMDs to nearby edge servers or the cloud to enable higher-fidelity experiences while maintaining mobility (for example, [[NVIDIA CloudXR]], [[Plutosphere]]).<ref name="Liu2019EdgeAR">Liu, L.; Li, H.; Gruteser, M. (2019). "Edge assisted real-time object detection for mobile augmented reality". ''Proceedings of the 25th Annual International Conference on Mobile Computing and Networking'': 1-16. doi:10.1145/3300061.3345431.</ref>
*'''[[Artificial intelligence|AI Integration]]''': On-device AI for smarter environment understanding, more robust hand/eye tracking, predictive rendering, personalized experiences, intelligent virtual agents, and natural language interaction.
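Offloading rendering as described above introduces network round-trip latency, which streaming systems commonly hide by rendering against a ''predicted'' head pose so the remote frame matches where the head will be when it arrives. The sketch below shows simple linear extrapolation of yaw; it is an illustration of the general technique, not the actual CloudXR API, and all names are invented.

```python
# Illustrative pose prediction for remote rendering: extrapolate the head's
# yaw forward by the network round-trip time using measured angular velocity.

from dataclasses import dataclass

@dataclass
class Pose:
    yaw_deg: float          # current head yaw, degrees
    yaw_vel_deg_s: float    # measured angular velocity, degrees/second

def predict_pose(pose: Pose, round_trip_s: float) -> float:
    """Extrapolate yaw to the moment the remotely rendered frame is displayed."""
    return pose.yaw_deg + pose.yaw_vel_deg_s * round_trip_s

# Head turning at 90 deg/s with a 50 ms round trip: predict 4.5 deg ahead.
predicted = predict_pose(Pose(yaw_deg=10.0, yaw_vel_deg_s=90.0), round_trip_s=0.050)
```

Real systems combine such prediction with late-stage reprojection on the headset to correct any remaining error just before display.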
===Interfaces===
*'''Improved Hand/Eye/Face Tracking''': Higher fidelity tracking of expressions and micro-movements for more realistic avatars and nuanced control.
*'''[[Brain-Computer Interface|Neural Interfaces]]''': Non-invasive BCIs (for example, EMG wristbands, EEG sensors) may offer supplementary input channels in the future.
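As a hypothetical illustration of the EMG wristband idea above, a minimal input pipeline rectifies the raw muscle signal, smooths it with a moving average, and thresholds the resulting envelope to detect a contraction. Real devices use learned classifiers rather than fixed thresholds; all names and values here are invented.

```python
# Toy EMG gesture detector: rectify -> moving-average envelope -> threshold.
# Thresholds and window size are arbitrary illustrative values.

def emg_to_gesture(samples, window=4, threshold=0.5):
    """Return True if the smoothed, rectified EMG envelope exceeds threshold."""
    rectified = [abs(s) for s in samples]        # EMG is bipolar; rectify first
    if len(rectified) < window:
        return False                             # not enough data to decide
    envelope = sum(rectified[-window:]) / window # moving-average envelope
    return envelope > threshold

resting     = [0.05, -0.02, 0.04, -0.03]   # low activity: no gesture
contraction = [0.9, -0.8, 1.1, -0.95]      # strong activity: gesture detected
```

In a shipping product this binary signal might map to a "pinch" or "select" action, with per-user calibration replacing the fixed threshold.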

==Market Outlook==