Head-mounted display
HMDs function by presenting imagery, data, or a combination thereof directly to the wearer's visual field. Many modern HMDs are [[stereoscopic]], featuring separate displays or distinct images rendered for each eye to create a sense of depth through [[binocular disparity]]. Examples include VR headsets like the [[Meta Quest 3]] and [[Valve Index]]. Other HMDs, particularly earlier AR devices or specialized notification displays like the original [[Google Glass]], may be monocular, presenting information over only one eye.<ref name="GoogleGlassPatent">Heinrich, Jerome (assignee: Google Inc.) (2014-07-29). "Wearable display device". Google Patents. Retrieved 2023-10-27. [https://patents.google.com/patent/US8791879B1/en Link]</ref>
The vast majority of consumer and enterprise VR and AR systems rely on HMDs. In AR applications, the display system is typically designed to be see-through, allowing digital information to be superimposed onto the user's view of the real world. These are often specifically termed [[Optical head-mounted display]]s (OHMDs), utilizing technologies like [[Waveguide (optics)|waveguides]] or [[beam splitter]]s.<ref name="AROpticsReview">Kress, Bernard C. & Starner, Thad (2018-11-01). "Optical see-through head-mounted displays: a review". ''Applied Optics''. '''57''' (31): 9311-9325. doi:10.1364/AO.57.009311. Retrieved 2023-10-27. [https://www.osapublishing.org/ao/abstract.cfm?uri=ao-57-31-9311 Link]</ref> In VR applications, the display system is opaque, completely blocking the user's view of the real world and replacing it with a computer-generated virtual environment, aiming for high levels of [[immersion]] and [[presence]].<ref name="VRBookSlater">Slater, Mel & Sanchez-Vives, Maria V. (2016). "Chapter 1: Immersive Virtual Reality". ''Enhancing Our Lives with Immersive Virtual Reality''. Elsevier. ISBN 978-0128046377.</ref> Some modern VR HMDs incorporate external [[camera]]s to provide [[video passthrough]] capabilities, enabling a form of AR or "Mixed Reality" where the real world is viewed digitally on the opaque screens with virtual elements overlaid.
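Binocular disparity shrinks as objects get farther away, which is what makes it a depth cue. As a rough illustration under a pinhole-camera assumption (the 63 mm IPD and 1000 px focal length below are illustrative values, not figures from any particular headset):

```python
def eye_offsets(ipd_m=0.063):
    """Per-eye horizontal offsets (metres) from the head-centre pose.

    Applied along the head's local x axis before rendering each eye's
    view, these offsets give each eye a slightly different viewpoint,
    producing the binocular disparity that yields stereoscopic depth.
    """
    half = ipd_m / 2.0
    return -half, +half  # left eye, right eye

def screen_disparity(depth_m, ipd_m=0.063, focal_px=1000.0):
    """Approximate horizontal pixel disparity of a point at a given depth,
    using the pinhole model: disparity = focal * IPD / depth.
    Nearer points have larger disparity."""
    return focal_px * ipd_m / depth_m

print(screen_disparity(1.0))   # 63 px of disparity at 1 m
print(screen_disparity(10.0))  # only 6.3 px at 10 m
```

The inverse relationship explains why stereoscopy contributes little depth information beyond a few metres, where HMDs rely on other cues such as motion parallax.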
==History==
The modern era of consumer VR HMDs was effectively kickstarted by [[Palmer Luckey]]'s prototype [[Oculus Rift]] in the early 2010s, which demonstrated that high-quality, low-latency VR was becoming feasible with modern mobile display panels and [[sensor]]s. Its subsequent Kickstarter success and acquisition by [[Facebook]] (now [[Meta Platforms|Meta]]) spurred renewed industry-wide investment.<ref name="OculusKickstarter">Kickstarter. "Oculus Rift: Step Into the Game". Retrieved 2023-10-27. [https://www.kickstarter.com/projects/1523379957/oculus-rift-step-into-the-game Link]</ref> This led to the release of numerous consumer HMDs:
*2014 - [[Google Cardboard]] popularised low-cost, smartphone-driven VR viewers.<ref name="cardboard2014">Time Magazine (28 Jan 2016). "Google’s New Head of Virtual Reality on What They’re Planning Next". Retrieved 2024-05-15. [https://time.com/4193755/google-cardboard-virtual-reality-clay-bavor-vr/ Link]</ref>
*2015 - [[Samsung Gear VR]] improved on the smartphone HMD concept with better optics and integrated controls.<ref name="gearvr2015">Wikipedia (24 Apr 2025). "Samsung Gear VR". Retrieved 2024-05-15. [https://en.wikipedia.org/wiki/Samsung_Gear_VR Link]</ref>
*2016 - The consumer [[Oculus Rift]] (CV1) and [[HTC Vive]] established high-end, PC-tethered VR with wide FOV and robust external tracking systems.<ref name="rift2016">Wikipedia (15 Apr 2025). "Oculus Rift". Retrieved 2024-05-15. [https://en.wikipedia.org/wiki/Oculus_Rift Link]</ref><ref name="lighthouse">Valve Software (12 Feb 2025). "Valve Index Base Stations". Retrieved 2024-05-15. [https://www.valvesoftware.com/index/base-stations Link]</ref> [[PlayStation VR]] brought tethered VR to the console market.
*2019 - [[Oculus Quest]] pioneered high-quality standalone (untethered) 6DoF VR using inside-out camera tracking (Oculus Insight).<ref name="insight2019">Meta Reality Labs (14 Aug 2019). "The Story Behind Oculus Insight Technology". Retrieved 2024-05-15. [https://tech.facebook.com/reality-labs/2019/8/the-story-behind-oculus-insight-technology/ Link]</ref> [[Valve Index]] pushed fidelity in the PC VR space.
*2023 - [[Meta Quest 3]] adopted the [[Pancake lens|pancake-style optics]] first introduced on the premium [[Meta Quest Pro]] (launched October 2022), and added high-resolution full-colour passthrough mixed reality plus a faster mobile chipset.<ref name="QuestProPancake">{{cite web |url=https://about.fb.com/news/2022/10/meta-quest-pro-social-vr-connect-2022/ |title=Meta Connect 2022: Meta Quest Pro, More Social VR and a Look Into the Future |website=Meta Newsroom |date=11 October 2022 |access-date=2025-04-29}}</ref><ref name="Quest3Features">{{cite web |url=https://www.roadtovr.com/quest-3-features-hands-on-preview/ |title=Quest 3 Features Confirmed in First Hands-on |website=Road to VR |date=12 June 2023 |access-date=2025-04-29}}</ref><ref name="Quest3Review">{{cite web |url=https://www.uploadvr.com/quest-3-review/ |title=Quest 3 Review: Excellent VR With Limited Mixed Reality |website=UploadVR |date=16 October 2023 |access-date=2025-04-29}}</ref>
*2024 - [[Apple Vision Pro]] launched as a premium "spatial computer" featuring high-resolution Micro-OLED displays, advanced eye and hand tracking, and spatial video capabilities.<ref name="visionpro">Apple Newsroom (08 Jan 2024). "Apple Vision Pro available in the U.S. on February 2". Retrieved 2024-05-15. [https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/ Link]</ref>
==Core Concepts and Principles==
===Tracking===
Tracking the user's head movement is fundamental to creating immersive and interactive experiences, particularly in VR. As the user moves their head, the system updates the rendered images accordingly, making the virtual world appear stable and allowing the user to look around naturally. Failure to track accurately and with low latency can lead to disorientation and [[Motion sickness]] (often termed "cybersickness" in VR/AR contexts).<ref name="LaViolaMotionSickness">LaViola Jr., Joseph J. (2000). "A discussion of cybersickness in virtual environments". ''ACM SIGCHI Bulletin''. '''32''' (1): 47-56. doi:10.1145/333329.333033. Retrieved 2023-10-27. [https://ieeexplore.ieee.org/document/947376 Link]</ref> Tracking operates in multiple [[Degrees of Freedom]] (DoF):
*'''[[Rotational Tracking]] (3DoF)''': Tracks orientation changes: pitch (nodding yes), yaw (shaking no), and roll (tilting head side-to-side). This is the minimum required for a basic VR experience where the user can look around from a fixed viewpoint. It is typically achieved using an [[Inertial Measurement Unit]] (IMU) within the HMD, containing sensors like:
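Whatever the exact sensor complement, 3DoF orientation is commonly estimated by fusing the IMU's gyroscope and accelerometer, for instance with a complementary filter. A minimal single-axis sketch (the sample rate, gyro bias, and blend factor are illustrative assumptions):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One pitch/roll update step of a complementary filter.

    The gyroscope term integrates angular rate (responsive, but any bias
    accumulates as drift); the accelerometer term supplies a
    gravity-referenced angle (drift-free, but noisy). Blending the two
    gives a stable estimate. Yaw has no gravity reference, so correcting
    its drift needs a magnetometer or camera-based tracking.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulate a stationary headset whose gyro reports a constant
# 0.01 rad/s bias, updating at 1 kHz for one second:
pitch = 0.0
for _ in range(1000):
    pitch = complementary_filter(pitch, gyro_rate=0.01,
                                 accel_angle=0.0, dt=0.001)
# Pure gyro integration would have drifted 0.01 rad by now; the
# accelerometer term holds the error below 0.0005 rad.
```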
===Latency===
[[Motion-to-photon latency]] is the time delay between a user's physical movement and the corresponding visual update on the display, and a critical factor for comfort and immersion. High latency is strongly correlated with cybersickness. Modern VR systems aim for latency below 20 milliseconds (ms), with many achieving closer to 10 ms under optimal conditions.<ref name="AbrashMTP">Abrash, Michael (2014-01-15). "What VR could, should, and almost certainly will be within two years". Steam Dev Days. Retrieved 2024-05-15. [https://www.youtube.com/watch?v=G-2dQoeqVVo Link]</ref><ref name="latency2022">MDPI Sensors (10 Aug 2022). "A Study on Sensor System Latency in VR Motion Sickness". Retrieved 2024-05-15. [https://www.mdpi.com/2224-2708/10/3/53 Link]</ref>
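As a back-of-the-envelope illustration, motion-to-photon latency can be treated as a budget summed across pipeline stages. The stage names and millisecond values below are assumptions for the sketch, not measurements of any real system:

```python
# Illustrative motion-to-photon budget in milliseconds. Real systems
# also use reprojection/timewarp to cut the *perceived* head-rotation
# latency below the raw pipeline total.
budget_ms = {
    "sensor sampling + fusion":     2.0,
    "application CPU frame":        4.0,
    "GPU render":                   5.0,
    "compositor + timewarp":        1.5,
    "display scanout/persistence":  5.5,
}

total = sum(budget_ms.values())
verdict = "within" if total < 20 else "over"
print(f"motion-to-photon ~ {total:.1f} ms ({verdict} the 20 ms comfort target)")
```

The point of the exercise is that no single stage dominates: meeting the 20 ms target requires shaving every step, which is why HMD pipelines use dedicated sensor paths, low-persistence displays, and late-stage reprojection.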
==Key Technical Specifications==
*[[Aspheric lens|Aspheric Lenses]]: Simple, often used in early or budget HMDs. Can be bulky.
*[[Fresnel lens|Fresnel Lenses]]: Use concentric rings to reduce thickness and weight compared to simple aspheric lenses while maintaining a short focal length. Common in many VR HMDs (e.g., Rift CV1, Vive, Quest 2), but can introduce visual artifacts like concentric rings and "[[God rays]]" (stray light scattering off the ridges).
*[[Pancake lens|Pancake Lenses]]: A newer, more complex folded optic design using polarization. Allow for significantly shorter distances between the display and lens, enabling much slimmer and lighter HMD designs. Often offer improved edge-to-edge clarity but can be less light-efficient, requiring brighter displays. Used in devices like Meta Quest Pro, Pico 4, Bigscreen Beyond.<ref name="PancakeOptics">Guttag, Karl (2021-12-09). "VR Optics (Part 1) - Brief History and Pancake Lenses". KGOnTech. Retrieved 2023-10-27. [https://kguttag.com/2021/12/09/vr-optics-part-1-brief-history-and-pancake-lenses/ Link]</ref><ref name="optics2023">Expand Reality (05 Oct 2023). "Pancake vs Fresnel Lenses in VR Headsets". Retrieved 2024-05-15. [https://landing.expandreality.io/pancake-vs.-fresnel-lenses-in-vr-headsets-advanced-optics-for-vr Link]</ref>
*[[Waveguide (optics)|Waveguides]] (AR): Used in many see-through OHMDs (e.g., HoloLens, Magic Leap). Light from a microdisplay is injected into a thin piece of glass or plastic and then directed out towards the eye using [[Diffractive optics|diffractive]] or reflective elements, allowing the user to see the real world through the waveguide. Achieving wide FOV and high efficiency with waveguides is challenging.<ref name="AROpticsReview"/><ref name="waveguide2022">Radiant Vision Systems (11 Jan 2022). "Ride the Wave: AR Devices Rely on Waveguides". Retrieved 2024-05-15. [https://www.radiantvisionsystems.com/blog/ride-wave-augmented-reality-devices-rely-waveguides Link]</ref>
*[[Beam splitter|Beam Splitters / Birdbaths]] (AR): A simpler see-through optic where a partially reflective mirror combines light from a display with the view of the real world. Often bulkier and may have a smaller FOV or less uniform transparency than waveguides. Used in devices like Google Glass (using a prism variant) and Nreal/XREAL Air.<ref name="BirdbathOptics">Guttag, Karl (2019-04-01). "HoloLens 2 (HL2) and AR Optics in General (Part 1)". KGOnTech. Retrieved 2023-10-27. [https://kguttag.com/2019/04/01/hololens-2-hl2-and-ar-optics-in-general-part-1/ Link]</ref>
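For the magnifier-style VR optics above, the trade-off between focal length and field of view follows from simple geometry. A sketch assuming the panel sits at the lens's focal plane, with purely illustrative dimensions:

```python
import math

def apparent_fov_deg(panel_width_mm, focal_length_mm):
    """Approximate monocular FOV for a simple magnifier with the display
    panel at the lens's focal plane: FOV = 2 * atan(width / (2 * focal)).

    Shorter focal lengths widen the FOV, which is why Fresnel and pancake
    designs chase short optical paths; pancake optics fold the path to
    get a short display-to-lens distance in a thin housing.
    """
    return math.degrees(2 * math.atan(panel_width_mm / (2 * focal_length_mm)))

# Illustrative numbers only, not a specific headset: a 90 mm wide panel
# behind a 40 mm focal-length lens already exceeds a 90-degree FOV.
print(round(apparent_fov_deg(90, 40), 1))
```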
These devices aim to fully immerse the user in a virtual world, blocking out the real environment.
====[[Discrete HMD]] (Tethered HMD)====
These HMDs contain displays, optics, sensors, and audio, but rely on an external processing unit, typically a powerful [[Personal Computer|PC]] or a game [[console]], connected via cables (or sometimes a dedicated wireless adapter). They generally offer the highest-fidelity graphics and performance because they can leverage powerful external GPUs.
*'''PC VR Examples''': [[Valve Index]], [[HTC Vive Pro 2]], [[HP Reverb G2]], original [[Oculus Rift]], [[Oculus Rift S]], [[Varjo Aero]], [[Pimax]] series.
*'''Console VR Examples''': [[PlayStation VR]], [[PlayStation VR2]] (connects to PlayStation consoles).
====[[Integrated HMD]] (Standalone HMD)====
Also known as All-in-One (AIO) HMDs, these devices contain all necessary components within the headset itself: displays, optics, sensors, processing (CPU/GPU, often based on mobile chipsets such as the Qualcomm Snapdragon XR series), memory, storage, battery, and tracking. They require no external PC or console, offering greater freedom of movement and ease of use. Processing power is typically lower than that of high-end PC VR setups. Many standalone HMDs can optionally connect to a PC via cable (e.g., Meta Link) or wirelessly (e.g., Air Link, Virtual Desktop) to function as a PC VR headset.
*'''Examples''': [[Meta Quest 2]], [[Meta Quest 3]], [[Meta Quest Pro]], [[Pico 4]], [[Pico Neo 3 Link]], [[HTC Vive Focus 3]], [[HTC Vive XR Elite]].
* '''Gaming and Entertainment''': Immersive video games ([[Beat Saber]], [[Half-Life: Alyx]]), virtual cinemas, [[Social VR]] platforms ([[VRChat]], [[Rec Room]]), [[Location-based entertainment|location-based VR experiences]], [[Virtual tourism]].
* '''[[Training]] and [[Simulation]]''': Flight simulation, [[Surgical simulation|surgical training]], military exercises, emergency response training, complex machinery operation training, workplace safety drills.<ref name="VRSimulationTraining">Freina, Laura & Ott, Michela (2015). "Virtual reality for simulation and training". ''E-Learning and Digital Media''. '''12''' (3-4): 368-383. doi:10.1177/2042753015591756. Retrieved 2023-10-27. [https://link.springer.com/article/10.1007/s10055-016-0293-5 Link]</ref>
* '''Design and Engineering''': [[Computer-Aided Design]] (CAD) review and collaboration, [[Architectural rendering|architectural visualization]] (virtual walkthroughs), virtual prototyping, ergonomic assessments, [[Digital twin]] interaction.<ref name="VR_CAD">Autodesk. "Virtual Reality in Design and Manufacturing". Retrieved 2023-10-27. [https://www.autodesk.com/solutions/virtual-reality Link]</ref>
* '''[[Telepresence]] and [[Virtual collaboration|Collaboration]]''': Virtual meetings with embodied avatars, remote assistance (especially using AR overlays for "see-what-I-see" guidance), shared virtual workspaces ([[Spatial (software)|Spatial]], [[Horizon Workrooms]]).
Despite significant progress, HMD technology still faces challenges:
* '''Visual Fidelity''': Achieving resolution and clarity that matches human vision ("[[Retinal projector|retinal resolution]]" ≈ 60 PPD), wider FOV without distortion or edge artifacts, higher brightness and contrast (especially for outdoor AR), better dynamic range, and eliminating artifacts like screen-door effect, god rays, [[Mura defect|mura]], and motion blur remain ongoing goals.<ref name="Kim2019FoveatedAR">Kim, J.; Jeong, Y.; Stengel, M.; et al. (2019). "Foveated AR: dynamically-foveated augmented reality display". ''ACM Transactions on Graphics''. '''38''' (4): 1-15. doi:10.1145/3306346.3322983.</ref>
* '''Comfort and Ergonomics''': Reducing weight, improving balance (counterweights, lighter optics), managing heat dissipation, accommodating prescription glasses comfortably, and finding comfortable, hygienic long-term wear solutions (straps, facial interfaces) are critical for broader adoption.<ref name="TalsmaComfort2020">Talsma, S. W.; Usmani, S. A.; Chen, P. Y. (2020). "Critical factors in comfort, cognitive load, and performance for consumer head-mounted displays". ''Journal of the Society for Information Display''. '''28''' (11): 841-850. doi:10.1002/jsid.943.</ref>
* '''[[Vergence-accommodation conflict]]''': In most current HMDs, the eyes focus (accommodate) at a fixed distance determined by the optics, but converge based on the perceived depth of virtual objects. This mismatch can cause eye strain, fatigue, and inaccurate depth perception.<ref name="VAC_Review">Hoffman, David M.; Girshick, Ahna R.; Akeley, Kurt; Banks, Martin S. (2008-03-18). "The vergence-accommodation conflict: Practical consequences and solutions". ''Journal of Vision''. '''8''' (3): 33. doi:10.1167/8.3.33. Retrieved 2023-10-27. [https://jov.arvojournals.org/article.aspx?articleid=2193631 Link]</ref> Solutions like [[Varifocal display|varifocal]] and [[Light field|light field]] displays are complex and still largely experimental.
* '''Motion Sickness / Cybersickness''': While greatly reduced compared to early systems due to low latency and high refresh rates, discrepancies between visual motion and [[Vestibular system|vestibular]] input, tracking inaccuracies, or poorly designed software can still induce nausea, dizziness, and discomfort in susceptible individuals.<ref name="Weech2019Cybersickness">Weech, S.; Kenny, S.; Barnett-Cowan, M. (2019). "Presence and cybersickness in virtual reality are negatively related: a review". ''Frontiers in Psychology''. '''10''': 158. doi:10.3389/fpsyg.2019.00158. PMID 30787884. PMC 6374254.</ref><ref name="vrsickness">Frontiers in Virtual Reality (20 Mar 2020). "Factors Associated With Virtual Reality Sickness in Head-Mounted Displays". Retrieved 2024-05-15. [https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7145389/ Link]</ref>
* '''Content Ecosystem''': The availability of high-quality, compelling, and diverse applications and experiences ("killer apps") is crucial for driving HMD adoption beyond early adopters and specific niches.
* '''Cost''': High-end HMDs remain expensive (>$1000), although capable standalone VR headsets have become more affordable (~$300-$500). Advanced AR/MR devices often cost several thousand dollars.
* '''Social Acceptance''': Wearing bulky headsets, especially in public or social settings, remains a significant barrier for AR/MR aiming for all-day use. Privacy concerns related to onboard cameras are also relevant.<ref name="Koelle2020SocialAcceptability">Koelle, M.; Ananthanarayan, S.; Boll, S. (2020). "Social acceptability in HCI: A survey of methods, measures, and design strategies". ''Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems'': 1-19. doi:10.1145/3313831.3376101.</ref>
* '''Health and Safety''': Long-term effects of prolonged use on vision (especially in children) are still being studied. Physical risks include collision with real-world objects while immersed. Eye safety standards (e.g., [[IEC 60825-1]] for lasers in depth sensors) must be followed.<ref name="Turnbull2017OcularEffects">Turnbull, P. R. & Phillips, J. R. (2017). "Ocular effects of virtual reality headset wear in young adults". ''Scientific Reports''. '''7''' (1): 1-11. doi:10.1038/s41598-017-14811-x.</ref><ref name="laser">Laser Institute of America (02 Dec 2023). "ANSI Z136.1 — Safe Use of Lasers". Retrieved 2024-05-15. [https://www.lia.org/resources/laser-safety-standards/ansi-z1361-safe-use-lasers Link]</ref> Psychological effects regarding immersion, dissociation, or addiction potential warrant consideration.<ref name="MadaryMetzingerEthics2016">Madary, M. & Metzinger, T. K. (2016). "Real virtuality: A code of ethical conduct. Recommendations for good scientific practice and the consumers of VR-technology". ''Frontiers in Robotics and AI''. '''3''': 3. doi:10.3389/frobt.2016.00003.</ref>
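The 60 PPD "retinal resolution" benchmark cited under Visual Fidelity can be put in context with a quick pixels-per-degree estimate. The eye-buffer width and FOV below are illustrative numbers, not the specifications of any particular headset:

```python
def pixels_per_degree(h_pixels, h_fov_deg):
    """Average angular resolution across the horizontal field of view.

    Real HMD optics are not uniform: lens distortion concentrates pixels
    toward the centre, so centre PPD is typically somewhat higher than
    this simple average.
    """
    return h_pixels / h_fov_deg

# Illustrative: a 2064-pixel-wide per-eye buffer spread over 104 degrees.
ppd = pixels_per_degree(2064, 104)
status = "at or above" if ppd >= 60 else "well below"
print(f"{ppd:.1f} PPD, {status} the ~60 PPD retinal-resolution target")
```

With numbers in this range, current consumer headsets land around a third of the retinal target, which is why resolution remains a headline challenge.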
==Future Trends and Developments==
===Display Advancements===
*'''[[Varifocal display|Varifocal Displays]]''': Systems that dynamically adjust focal depth based on where the user is looking (using eye tracking) or based on scene content, addressing the vergence-accommodation conflict. Technologies include movable lenses/displays, [[Liquid crystal lens|liquid crystal lenses]], [[Alvarez lens|Alvarez lenses]], and multi-focal plane displays.<ref name="Rathinavel2018Varifocal">Rathinavel, K.; et al. (2018). "An extended depth-at-field volumetric near-eye augmented reality display". ''IEEE Transactions on Visualization and Computer Graphics''. '''24''' (11): 2857-2866. doi:10.1109/TVCG.2018.2868565.</ref>
*'''[[Light field|Light Field Displays]]''': Generate a more complete representation of light, allowing the eye to focus naturally at different depths within the virtual scene. Still complex and computationally intensive.<ref name="Huang2015LightFieldStereoscope">Huang, F. C.; Chen, K.; Wetzstein, G. (2015). "The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues". ''ACM Transactions on Graphics''. '''34''' (4): 1-12. doi:10.1145/2766949.</ref>
*'''[[Holographic display|Holographic Displays]]''': Aim to reconstruct the wavefront of light from a 3D scene, potentially offering the most natural 3D viewing experience without conflicts. True holographic HMDs are still highly experimental.
*'''Display Maturation''': Higher resolution, brightness, and efficiency via Micro-OLED and [[MicroLED]] maturation.
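To drive a varifocal lens, a headset must first estimate where the two eyes converge. A minimal sketch of that step, using the standard closest-approach calculation between the two gaze rays reported by an eye tracker (all positions, directions, and the 64 mm IPD below are illustrative assumptions, not any vendor's API):

```python
import numpy as np

def vergence_depth(left_eye_pos, right_eye_pos, left_gaze, right_gaze):
    """Estimate the distance at which two gaze rays converge.

    Each eye contributes a ray origin (eye position) and a gaze
    direction; the midpoint of the shortest segment between the two
    rays approximates the 3D fixation point.
    """
    p1, p2 = np.asarray(left_eye_pos, float), np.asarray(right_eye_pos, float)
    d1 = np.asarray(left_gaze, float); d1 = d1 / np.linalg.norm(d1)
    d2 = np.asarray(right_gaze, float); d2 = d2 / np.linalg.norm(d2)

    # Standard ray-ray closest-approach: minimize |(p1 + t1*d1) - (p2 + t2*d2)|^2.
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b
    if abs(denom) < 1e-9:          # parallel gaze -> fixating at infinity
        return float("inf")
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    fixation = 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
    # Distance from the midpoint between the eyes to the fixation point.
    return float(np.linalg.norm(fixation - 0.5 * (p1 + p2)))

# Example: eyes 64 mm apart, both verging on a point 1 m straight ahead.
ipd = 0.064
target = np.array([0.0, 0.0, 1.0])
left, right = np.array([-ipd / 2, 0.0, 0.0]), np.array([ipd / 2, 0.0, 0.0])
depth = vergence_depth(left, right, target - left, target - right)
diopters = 1.0 / depth   # focal power a varifocal actuator would target
```

A real varifocal pipeline would filter this noisy estimate over time before commanding the lens, since raw gaze data jitters far faster than accommodation changes.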
===Form Factor Evolution===
*'''[[Lightweight Designs]]''': Advanced optics (pancake lenses, [[Metalens|metalenses]], holographic optical elements) and display technologies are enabling significantly thinner, lighter headsets (sub-300g or even sub-100g).
*'''[[Smartglasses|AR Glasses]]''': The long-term goal for AR is achieving normal eyeglass form factors with all-day wearability and significant compute/display capabilities. Projects like [[Project Aria]] (Meta research) and rumored [[Apple smart glasses|Apple glasses]] point toward this future.<ref name="Delaney2021ARGlasses">Delaney, K. (2021). "The race toward human-centered AR glasses". ''IEEE Computer Graphics and Applications''. '''41''' (5): 112-115. doi:10.1109/MCG.2021.3097740.</ref>
===Sensory Expansion===
*'''[[Haptic technology|Advanced Haptic Feedback]]''': Beyond simple controller rumble, providing more nuanced tactile sensations via gloves ([[HaptX]]), bodysuits ([[bHaptics]]), ultrasound ([[Ultraleap]]), or other actuators to simulate touch, texture, and impact.
*'''[[Motion capture|Full-body Tracking]]''': Moving beyond head and hands to track limb and torso movements for more complete avatar embodiment, using external trackers ([[Vive Tracker]]), webcam-based AI solutions, or integrated sensors.
===Computational Capabilities===
*'''[[Edge computing|Edge/Cloud Computing]]''': Offloading demanding processing (rendering, AI) from standalone HMDs to nearby edge servers or the cloud to enable higher fidelity experiences while maintaining mobility (e.g., [[NVIDIA CloudXR]], [[Plutosphere]]).<ref name="Liu2019EdgeAR">Liu, L.; Li, H.; Gruteser, M. (2019). "Edge assisted real-time object detection for mobile augmented reality". ''Proceedings of the 25th Annual International Conference on Mobile Computing and Networking'': 1-16. doi:10.1145/3300061.3345431.</ref>
*'''[[Artificial intelligence|AI Integration]]''': On-device AI for smarter environment understanding, more robust hand/eye tracking, predictive rendering, personalized experiences, intelligent virtual agents, and natural language interaction.
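The central constraint on remote rendering is the motion-to-photon latency budget: every offloaded frame pays for render, encode, network, decode, and display scanout. A back-of-the-envelope model (all stage timings below are illustrative assumptions for an edge setup, not measurements of any product):

```python
def motion_to_photon_ms(render_ms, encode_ms, network_rtt_ms,
                        decode_ms, display_scanout_ms):
    """Sum the pipeline stages between head motion and updated photons
    when rendering is offloaded to an edge server (simplified model:
    stages are assumed strictly sequential)."""
    return render_ms + encode_ms + network_rtt_ms + decode_ms + display_scanout_ms

# Illustrative stage timings for a local-edge link (assumed, not measured).
latency = motion_to_photon_ms(render_ms=5, encode_ms=3, network_rtt_ms=8,
                              decode_ms=2, display_scanout_ms=4)
comfortable = latency <= 20   # ~20 ms is a commonly cited VR comfort target
```

Even these optimistic numbers sum to 22 ms, over the target, which is why cloud/edge XR systems lean on late-stage reprojection on the headset to correct the displayed frame with the freshest head pose.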
===Interfaces===
==Market Outlook==
Market analysis firms like [[International Data Corporation|IDC]] report fluctuating but generally growing shipments of AR/VR headsets. For instance, IDC forecasted global shipments to reach 9.1 million units in 2024, with projected growth to 22.9 million by 2028, driven significantly by the adoption of mixed-reality capable devices and maturing technology in both consumer and enterprise segments.<ref name="idc2025" />
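The compound annual growth rate implied by these IDC figures can be checked directly (a quick sanity calculation, not an IDC-published number):

```python
# Implied CAGR from IDC's forecast: 9.1M units (2024) -> 22.9M units (2028),
# i.e. growth compounded over 4 years.
start_units, end_units, years = 9.1, 22.9, 4
cagr = (end_units / start_units) ** (1 / years) - 1
print(f"{cagr:.1%}")   # → 26.0%
```

That is, the forecast corresponds to shipments growing roughly 26% per year over the period.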
==See Also==
* [[Waveguide (optics)]]
==References==
<references />
[[Category:Terms]]