{{See also|Input Devices}}
'''Input''' in [[virtual reality]] ([[VR]]) and [[augmented reality]] ([[AR]]) refers to the various methods and technologies that allow a user to interact with, control, and provide data to a computer-generated environment.<ref name="forwork_meta_guide">Virtual Reality Guide. Meta for Work. https://forwork.meta.com/blog/virtual-reality-guide/</ref> Unlike traditional computing that primarily relies on a [[keyboard and mouse]], [[extended reality]] (XR) input encompasses a wide spectrum of devices and techniques designed to create a sense of [[immersion]] and presence by translating a user's physical actions into digital ones.<ref name="naimark_io">VR / AR Fundamentals - 4) Input & Interactivity. Michael Naimark, March 2, 2018. https://michaelnaimark.medium.com/vr-ar-fundamentals-4-input-interactivity-8d6d066c954e</ref>
Input methods range from traditional devices like [[gamepad]]s to sophisticated [[motion controller]]s that track hand movements, and increasingly, to more natural interfaces such as controller-free [[hand tracking]], [[eye tracking]], and [[voice command]]s.<ref name="gitbook">Input Method and Interaction Design. The Design of Virtual and Augmented Reality. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design</ref> Modern VR/AR systems typically support multiple input modalities simultaneously, allowing users to seamlessly switch between controllers, hand gestures, gaze-based selection, and voice commands depending on the task and context.<ref name="fiveable">Input methods and interaction paradigms in VR/AR. Fiveable. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi</ref>
==Definition and Technical Overview==
Input in immersive environments refers to mechanisms that capture user actions and translate them into commands within virtual or augmented spaces. Unlike traditional computing interfaces limited to keyboards and mice, VR/AR input systems capture [[6DOF|six degrees of freedom]] for position and orientation tracking, hand and finger poses with 26+ joint positions, [[eye tracking|eye gaze vectors]] with sub-degree precision, and [[voice input|voice commands]] processed through [[natural language processing]].<ref name="fiveable"/>
The technical architecture comprises three layers: hardware devices including [[sensors]], [[cameras]], [[controllers]], and [[tracking systems]] that capture user actions; transfer functions that convert human output into digital input through algorithms and [[machine learning]] models; and tracking systems that measure spatial position with metrics including [[degrees of freedom]], accuracy (typically 1-5mm for commercial systems), precision, update rate (60-120Hz), and [[latency]] (target below 20ms to prevent [[motion sickness]]).<ref name="pubmedvive">The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC5439658/</ref>
Modern systems employ [[sensor fusion]], combining multiple data sources for robust tracking. A typical VR controller integrates infrared LEDs for [[optical tracking]], [[IMU|inertial measurement units]] with [[accelerometer|accelerometers]] and [[gyroscope|gyroscopes]] for motion sensing, [[capacitive sensor|capacitive sensors]] for touch detection, and force sensors for grip pressure.<ref name="sagejournal">The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. Sage Journals. https://journals.sagepub.com/doi/full/10.1177/2041669517708205</ref>
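The optical–inertial fusion described above can be illustrated with a minimal sketch: a complementary filter that dead-reckons position from high-rate IMU acceleration and corrects drift whenever a lower-rate optical fix arrives. This is illustrative only, not any vendor's actual algorithm; the update rates, blend factor, and sample data are assumptions.

<syntaxhighlight lang="python">
import numpy as np

class ComplementaryPositionFilter:
    """Toy optical + IMU fusion for one axis of a tracked controller.

    IMU acceleration is integrated at a high rate (prone to drift);
    each optical fix pulls the estimate back toward the measured position.
    """

    def __init__(self, blend=0.98):
        self.blend = blend          # weight given to the IMU prediction
        self.position = 0.0         # metres
        self.velocity = 0.0         # metres / second

    def imu_update(self, accel, dt):
        """Dead-reckon from accelerometer data (e.g. 1000 Hz)."""
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def optical_update(self, measured_position):
        """Correct drift with an optical fix (e.g. 60 Hz camera/LED sample)."""
        self.position = (self.blend * self.position
                         + (1.0 - self.blend) * measured_position)

# Example: 1 kHz IMU samples with an optical correction roughly every 16 ms.
f = ComplementaryPositionFilter()
for step in range(1000):
    f.imu_update(accel=0.05 + np.random.normal(0, 0.01), dt=0.001)
    if step % 16 == 0:
        f.optical_update(measured_position=0.5 * 0.05 * (step * 0.001) ** 2)
print(round(f.position, 4))
</syntaxhighlight>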
==Historical Evolution== | |||
The history of input devices in VR and AR spans over a century, evolving from early stereoscopic viewers to sophisticated tracking systems and brain interfaces. Key milestones focus on improving interactivity, control precision, and user immersion. | |||
===Early Precursors (1838–1980s)=== | |||
The earliest roots of immersive input can be traced to the 19th century with the invention of the [[stereoscope]] by [[Charles Wheatstone]] in 1838. This device used twin mirrors to project a separate image to each eye, creating a sense of 3D depth and immersion from static images, establishing the core principle of stereoscopic vision that underpins modern VR headsets.<ref name="vrshistory">History Of Virtual Reality. Virtual Reality Society. https://www.vrs.org.uk/virtual-reality/history.html</ref> | |||
The first interactive simulators emerged in the early 20th century: | |||
* '''1849''': [[David Brewster]] develops the lenticular stereoscope, the first portable 3D viewer using optics for user interaction with images.<ref name="hqsoftware_history">A Brief History of AR and VR: Virtual Reality Timeline. HQSoftware. https://hqsoftwarelab.com/blog/the-history-of-ar-and-vr-a-timeline-of-notable-milestones/</ref> | |||
* '''1929''': [[Edwin Link]] creates the [[Link Trainer]] flight simulator, an electromechanical flight simulator that responded to a pilot's manipulation of its controls, demonstrating an early form of interactive, simulation-based input.<ref name="vrshistory"/> | |||
* '''1952''': [[Morton Heilig]] invents [[Sensorama]], a multi-sensory machine with stereoscopic display and physical feedback for immersive interaction, an arcade-style cabinet that stimulated multiple senses including sight, sound, smell, and touch via a vibrating chair.<ref name="hqsoftware_history"/><ref name="coursera_history_vr">History of Virtual Reality: From the 1800s to the 21st Century. Coursera, July 12, 2023. https://www.coursera.org/articles/history-of-virtual-reality</ref> | |||
The direct lineage of modern VR input began in the 1960s: | |||
* '''1961''': Philco Corporation engineers developed the '''Headsight''', the first [[head-mounted display]] (HMD), which featured a magnetic motion tracking system where head movements would control a remote camera, allowing for intuitive remote viewing.<ref name="vrshistory"/> | |||
* '''1963''': [[Ivan Sutherland]] created the first interactive computer graphics input system with [[Sketchpad]], using a light pen for real-time line drawings on a TX-2 computer at [[MIT]].<ref name="ieeespectrum">The Tremendous VR and CG Systems-of the 1960s. IEEE Spectrum. https://spectrum.ieee.org/sketchpad</ref> | |||
* '''1965''': Sutherland conceptualized the "[[Ultimate Display]]," a theoretical room that could simulate reality so perfectly that a user could not differentiate it from the real world, including not just visual and auditory simulation but also [[haptic technology|haptic feedback]] and interaction with virtual objects.<ref name="sutherland_ultimate_display">The Ultimate Display. I. E. Sutherland, 1965. Proceedings of IFIP Congress 1965, Volume 2, pages 506-508.</ref> | |||
* '''1968''': Sutherland and his student Bob Sproull built the first actual VR/AR HMD, nicknamed "[[Sword of Damocles]]." The device was connected to a computer and used a mechanical or ultrasonic head-tracking system to update the user's perspective in real-time as they moved their head, marking the first instance of interactive, computer-generated immersive graphics.<ref name="sutherland_hmd_paper">A head-mounted three dimensional display. I. E. Sutherland, 1968. Proceedings of the Fall Joint Computer Conference, Volume 33, pages 757-764.</ref> | |||
* '''1969''': [[Myron Krueger]] develops responsive computer-generated environments, termed "artificial reality."<ref name="hqsoftware_history"/> | |||
* '''1974–1975''': Krueger builds [[Videoplace]], enabling tracker-free interaction in simulated environments and allowing interaction with virtual objects via gestures at University of Wisconsin-Madison.<ref name="hqsoftware_history"/><ref name="wikivr">Virtual reality. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality</ref> | |||
===Commercial VR Peripherals Era (1980s–1990s)=== | |||
The 1980s saw the commercialization of specialized VR input devices: | |||
* '''1980''': [[Steve Mann]] creates a wearable computer with vision overlays for AR input.<ref name="wikiar">Augmented reality. Wikipedia. https://en.wikipedia.org/wiki/Augmented_reality</ref> | |||
* '''1982''': Daniel Sandin and Thomas DeFanti invented the '''Sayre Glove''', which used optical sensors to track finger movements.<ref name="vrshistory"/> | |||
* '''1984''': [[Jaron Lanier]] founds [[VPL Research]], developing the EyePhone HMD and DataGlove for gesture input.<ref name="hqsoftware_history"/> | |||
* '''1987''': Lanier coins "virtual reality"; [[VPL Research]] releases the [[DataGlove]], which used [[fiber optic]] sensors to detect finger flexure, tracking 256 positions per finger at $9,000 per glove. [[NASA]] adopted DataGloves for astronaut training simulations.<ref name="wikivrp">VPL Research. Wikipedia. https://en.wikipedia.org/wiki/VPL_Research</ref><ref name="lumen_and_forge_history">The History of Virtual Reality. Lumen & Forge. https://lumenandforge.com/the-history-of-virtual-reality</ref> | |||
* '''1989''': VPL licenses DataGlove technology to [[Mattel]] for the [[Power Glove]], bringing gesture input to consumers at $100. Despite nearly one million units sold, poor precision (only 4 finger positions versus the DataGlove's 256) led to commercial failure. VPL also develops the DataSuit for full-body tracking.<ref name="hqsoftware_history"/><ref name="wikivr"/>
* '''1990''': [[Tom Caudell]] coins "augmented reality" at Boeing, using HMDs for schematic overlays.<ref name="hqsoftware_history"/> | |||
* '''1991''': [[Virtuality Group]] creates VR arcade machines with controllers and trackers; [[Sega]] develops VR headset with inertial sensors.<ref name="hqsoftware_history"/> | |||
* '''1992''': Louis Rosenberg develops Virtual Fixtures AR system; Virtuality systems use exoskeleton gloves; [[CAVE]] created for multi-user interaction.<ref name="hqsoftware_history"/><ref name="wikivr"/> | |||
* '''1995''': [[Nintendo]]'s [[Virtual Boy]] for home VR gaming; University of Massachusetts develops vision-based tracking.<ref name="hqsoftware_history"/> | |||
===Modern VR Era (2000s–Present)=== | |||
* '''2000''': ARToolKit released for marker-based AR; ARQuake, first mobile AR game, uses HMD, tracker, GPS, and gun controller.<ref name="hqsoftware_history"/> | |||
* '''2003''': Sony's EyeToy for gesture and motion control via camera.<ref name="hqsoftware_history"/> | |||
* '''2010''': [[Palmer Luckey]] prototypes Oculus VR HMD with head tracking.<ref name="hqsoftware_history"/> | |||
* '''2012''': [[Oculus Rift]] Kickstarter campaign raised $2.4 million and catalyzed the modern VR era; [[Google Glass]] for optical AR.<ref name="hqsoftware_history"/> | |||
* '''2013''': [[Leap Motion]] Controller brought camera-based [[hand tracking]] to consumers as a USB peripheral with two monochromatic infrared cameras tracking hands at 200 frames per second within a 3-foot hemispherical area.<ref name="wikileap">Leap Motion. Wikipedia. https://en.wikipedia.org/wiki/Leap_Motion</ref><ref name="leapmedium">How Does the Leap Motion Controller Work? Medium. https://medium.com/@LeapMotion/how-does-the-leap-motion-controller-work-9503124bfa04</ref> | |||
* '''2015''': [[Google Cardboard]] uses smartphone sensors; [[Microsoft HoloLens]] announced with gesture input; [[OSVR]] by Razer for open-source tracking.<ref name="wikivr"/> | |||
* '''2016''': [[HTC Vive]] introduced [[Lighthouse tracking]] using [[base station|base stations]] that emit infrared laser sweeps, with photosensors on the headset and controllers calculating position from laser timing. This [[outside-in tracking]] approach achieved sub-millimeter accuracy across room-scale spaces.<ref name="pubmedvive"/> [[Oculus Touch]] controllers pioneered [[capacitive sensor|capacitive touch sensors]] for finger presence detection, allowing natural hand gestures like pointing or thumbs-up without full hand tracking.<ref name="tomsguide">Oculus Touch Controllers Are A Lighter and Better Touch Than HTC Vive. Tom's Guide. https://www.tomsguide.com/us/oculus-touch-controllers,review-4072.html</ref> [[PlayStation VR]] launched.<ref name="hqsoftware_history"/>
* '''2017''': Apple's [[ARKit]] and Google's [[ARCore]] released for smartphone-based AR; Facebook announces brain-computer interface research; immersive VR therapy for phantom limb pain uses myoelectric controls.<ref name="hqsoftware_history"/><ref name="naimark_io"/>
* '''2018''': [[Magic Leap One]] with hand tracking; Leap Motion's Project North Star open-source AR headset.<ref name="hqsoftware_history"/> | |||
* '''2019''': [[Meta Quest]] introduced hand tracking via software update in December 2019, marking the first mainstream standalone headset offering controller-free input. [[Valve Index]] Controllers featured 87 sensors per controller tracking hand position, finger positions, motion, and grip pressure with adjustable hand straps allowing users to completely open their hands without dropping the controllers.<ref name="valveindex">Controllers - Valve Index® - Upgrade your experience. Valve Corporation. https://www.valvesoftware.com/en/index/controllers</ref><ref name="wikiindex">Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref> [[HoloLens 2]] released.<ref name="wikivr"/> | |||
* '''2020''': [[Meta Quest 2]] launches as a standalone 6DoF headset with inside-out tracking and Touch controllers.<ref name="wikivr"/>
* '''2022''': [[Meta Quest Pro]] with face/eye tracking and [[Touch Pro]] controllers achieving self-tracking with onboard cameras; [[PlayStation VR2]] with haptics.<ref name="wikivr"/> | |||
* '''2023''': Meta Quest 3 with "Direct Touch" update allowed tapping virtual interfaces directly with fingers; [[Apple Vision Pro]] with eye/hand tracking announced; WebXR for browser-based input.<ref name="hqsoftware_history"/><ref name="metahand">Hand tracking technology & haptic feedback. Meta for Work. https://forwork.meta.com/blog/hand-tracking-technology-and-haptic-feedback-mr/</ref> | |||
* '''2024''': [[Apple Vision Pro]] launched in February 2024 as the first major headset without bundled controllers, validating controller-free as a primary interaction paradigm. The system combines high-precision [[eye tracking]] for targeting with pinch gestures for confirmation, processed by the dedicated R1 chip with 12ms latency.<ref name="applevp">Introducing Apple Vision Pro: Apple's first spatial computer. Apple. https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/</ref> | |||
* '''2025''': AI integration enhances object recognition, natural language, and tracking for intuitive controls.<ref name="hqsoftware_history"/> | |||
==Core Concepts==
Modern VR/AR input is built upon several fundamental concepts that define how users can move and interact within a virtual space.
===Degrees of Freedom (DoF)===
[[Degrees of freedom]] refers to the number of ways a rigid body can move in 3D space. This is a critical concept for understanding the capabilities of a VR system's tracking.<ref name="google_dof">Degrees of freedom. Google VR. https://developers.google.com/vr/discover/degrees-of-freedom</ref>
* '''Three Degrees of Freedom (3DoF):''' This tracks rotational movement only: when a user looks up/down (pitch), turns left/right (yaw), and tilts their head side-to-side (roll). A 3DoF headset or controller can track these rotations but cannot track the user's physical movement through space. Early mobile VR headsets like the [[Google Cardboard]] and [[Samsung Gear VR]] were 3DoF systems.<ref name="strivr_dof">6DoF vs 3DoF: Degrees of freedom in VR. Strivr. https://www.strivr.com/blog/6dof-vs-3dof-understanding-importance</ref>
* '''Six Degrees of Freedom (6DoF):''' This tracks both rotational and translational movement. In addition to the three rotational axes, 6DoF systems can track movement forward/backward (surging), left/right (swaying), and up/down (heaving). This allows a user to physically walk around, duck, and lean within the virtual environment, which is essential for true immersion and is the standard for modern VR systems like the [[Meta Quest 3]] and [[Valve Index]].<ref name="varjo_dof">Degrees of freedom in VR/XR. Varjo. https://varjo.com/learning-hub/degrees-of-freedom-in-vr-xr/</ref>
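The distinction can be made concrete with a small sketch (illustrative only, not any SDK's actual types): a 3DoF sample carries orientation alone, while a 6DoF sample adds position, which is why only a 6DoF system can respond when the user leans or walks.

<syntaxhighlight lang="python">
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Rotation-only tracking: pitch, yaw, roll in degrees."""
    pitch: float
    yaw: float
    roll: float

@dataclass
class Pose6DoF(Pose3DoF):
    """Adds translation, so leaning and walking change the rendered view."""
    x: float  # metres, left/right (sway)
    y: float  # metres, up/down (heave)
    z: float  # metres, forward/back (surge)

head_only = Pose3DoF(pitch=-5.0, yaw=30.0, roll=0.0)
room_scale = Pose6DoF(pitch=-5.0, yaw=30.0, roll=0.0, x=0.2, y=1.6, z=-0.4)
# A 3DoF renderer ignores any physical step the user takes;
# a 6DoF renderer offsets the camera by (x, y, z) every frame.
print(head_only, room_scale, sep="\n")
</syntaxhighlight>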
===Tracking Technologies=== | |||
Positional tracking is the technology that enables 6DoF by determining the real-time position and orientation of the headset and controllers. There are two primary methods for achieving this.<ref name="wikipedia_vr_methods">Virtual reality - Forms and methods. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality</ref> | |||
* '''Outside-In Tracking:''' This method uses external sensors (for example cameras or infrared emitters called "base stations" or "lighthouses") placed in the physical environment to track the position of the headset and controllers. These external sensors monitor markers (often infrared LEDs) on the tracked devices. Systems like the original [[HTC Vive]], [[Oculus Rift CV1]], and the [[Valve Index]] use outside-in tracking. This method can provide highly accurate and stable tracking but requires a more complex setup and a dedicated play space.<ref name="unity_tracking">What is Inside-Out/Outside-In Tracking. Unity. https://unity.com/glossary/Insideout-outsidein-tracking</ref><ref name="wikipedia_valve_index">Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref> | |||
* '''Inside-Out Tracking:''' This method places the tracking sensors, typically cameras, directly on the headset itself. These cameras observe the surrounding environment and use computer vision algorithms (such as [[SLAM|simultaneous localization and mapping]]) to calculate the headset's position and orientation relative to fixed points in the room.<ref name="pimax_tracking">Pose Tracking Methods: Outside-in VS Inside-out Tracking in VR. Pimax. https://pimax.com/blogs/blogs/pose-tracking-methods-outside-in-vs-inside-out-tracking-in-vr</ref> Controllers are tracked by these same headset cameras observing their infrared LEDs. This approach is used by all modern standalone headsets, such as the [[Meta Quest]] series and the [[Pico 4]], as it eliminates the need for external hardware, making setup much simpler and allowing the system to be used in any location.<ref name="zilliz_tracking">What types of tracking systems are used in VR (for example inside-out vs. outside-in)? Zilliz. https://zilliz.com/ai-faq/what-types-of-tracking-systems-are-used-in-vr-eg-insideout-vs-outsidein</ref> | |||
* '''Self-Tracking (Inside-Out on Controller):''' A newer hybrid approach places cameras directly onto the controllers themselves, as seen with the [[Meta Quest Touch Pro]] controllers. Each controller has its own onboard cameras and a [[Qualcomm Snapdragon]] 662 processor per controller, allowing it to track its own position in 3D space independently of the headset's cameras. This provides more robust tracking, preventing loss of tracking when the controllers are outside the headset's field of view (for example behind the user's back).<ref name="meta_controllers_pro">Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/help/quest/667591367977925/</ref> | |||
===Input Modalities=== | |||
VR and AR input combine multiple modalities for effective interaction, including: | |||
* '''Gestures''': Recognizing predefined or natural hand movements. | |||
* '''Buttons and Triggers''': Physical inputs on controllers for discrete actions. | |||
* '''Haptics''': Tactile feedback to simulate touch or resistance. | |||
* '''Voice Commands''': Speech recognition for command input. | |||
* '''Gaze''': Eye tracking for targeting and selection. | |||
==Types of Input Methods== | |||
Input methods in VR and AR can be categorized by the sensing channel they rely on: motion sensors, computer vision, sound, neural signals, or multimodal combinations of these. Together they detect human effectors such as the hands, eyes, voice, and brain.<ref name="naimark_io"/><ref name="gitbook"/>
===Motion-Tracked Controllers=== | |||
Motion controllers are handheld devices that translate the user's hand and finger movements into the virtual environment.<ref name="milvus_motion_controllers">What role do motion controllers play in VR, and how do you support them? Milvus. https://milvus.io/ai-quick-reference/what-role-do-motion-controllers-play-in-vr-and-how-do-you-support-them</ref> They are the most common input method for 6DoF VR, typically featuring a combination of buttons, triggers, thumbsticks, and tracking sensors.<ref name="synergyxr_controllers_review">VR Controllers: A Comprehensive Review. SynergyXR. https://synergyxr.com/resources/learn/blogs/vr-controllers-a-comprehensive-review/</ref> | |||
{| class="wikitable"
|+ Comparison of Major VR Motion Controllers
! Feature
! Meta Quest 2 Touch
! Meta Quest Touch Plus
! Meta Quest Touch Pro
! [[Valve Index Controller]]
! [[PlayStation VR2 Sense]]
|-
! Primary System(s)
| [[Meta Quest 2]]
| [[Meta Quest 3]]
| [[Meta Quest Pro]], Quest 2/3
| [[Valve Index]], any [[SteamVR]] system
| [[PlayStation VR2]] with [[PlayStation 5]]
|-
! Tracking Method
| Inside-out (via headset)<ref name="wevolver_quest2">Meta Quest 2. Wevolver. https://www.wevolver.com/specs/meta-quest-2</ref>
| Inside-out (via headset)<ref name="wikipedia_quest3">Meta Quest 3. Wikipedia. https://en.wikipedia.org/wiki/Meta_Quest_3</ref>
| Self-contained inside-out<ref name="meta_touchpro_accessories">Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/quest/accessories/quest-touch-pro-controllers-and-charging-dock/</ref>
| Outside-in ([[Lighthouse tracking|Lighthouse]])<ref name="valveindex"/>
| Inside-out (via headset)<ref name="wikipsvr2">PlayStation VR2. Wikipedia. https://en.wikipedia.org/wiki/PlayStation_VR2</ref>
|-
! Weight (w/ battery)
| ~150 g<ref name="reddit_controller_weights">Controller weights comparison... How do you have the stamina to swing fast? Reddit. https://www.reddit.com/r/beatsaber/comments/1b0zmo9/controller_weights_comparisonhow_do_you_have_the/</ref>
| ~126 g<ref name="gsmarena_quest3_review">Meta Quest 3 Review. GSMArena. https://www.gsmarena.com/meta_quest_3_review-news-60375.php</ref>
| ~164 g<ref name="reddit_quest2_weight">The Quest Touch Pro Controllers weigh ~164g each. Reddit. https://www.reddit.com/r/oculus/comments/y545yh/the_quest_touch_pro_controllers_weigh_164g_each/</ref>
| ~196 g<ref name="reddit_controller_weights"/>
| ~168 g<ref name="reddit_controller_weights"/>
|-
! Haptics
| Standard vibrotactile
| TruTouch variable haptics<ref name="meta_touchplus_specs">Meta Quest Touch Plus Controller. Meta. https://www.meta.com/quest/accessories/quest-touch-plus-controller/</ref>
| Localized TruTouch haptics<ref name="meta_touchpro_accessories"/>
| HD LRA haptics<ref name="valveindex"/>
| Advanced haptics, Adaptive Triggers<ref name="psvr2_specs_se">PlayStation VR2 tech specs. PlayStation. https://www.playstation.com/en-se/ps-vr2/ps-vr2-tech-specs/</ref>
|-
! Finger Sensing
| Capacitive (thumb, index)<ref name="giessen_quest2_specs">Meta Quest 2 Specifications. University of Giessen. https://www.uni-giessen.de/de/studium/lehre/projekte/nidit/goals/quest2/specifications_quest-2.pdf</ref>
| Capacitive (thumb, index)<ref name="meta_touchplus_specs"/>
| Capacitive (thumb, index), Precision pinch<ref name="meta_touchpro_accessories"/>
| Full 5-finger tracking, Grip force<ref name="valveindex"/>
| Capacitive (thumb, index, middle)<ref name="psvr2_specs_se"/>
|-
! Key Features
| Tracking ring, ergonomic grip
| Ringless design, improved haptics
| Self-tracking, Stylus tip, Rechargeable
| Hand strap for open-hand interaction, per-finger tracking
| Adaptive triggers, advanced haptics
|-
! Power Source
| 1x AA Battery (~30 hrs)<ref name="gsmarena_quest3_review"/>
| 1x AA Battery (~30 hrs)<ref name="gsmarena_quest3_review"/>
| Rechargeable (~8 hrs)<ref name="komete_touchpro">Meta Quest Touch Pro Controllers. Komete XR. https://komete-xr.com/en/products/meta-quest-touch-pro-controllers</ref>
| Rechargeable (7+ hrs)<ref name="valveindex"/>
| Rechargeable (~4-5 hrs)<ref name="wikipsvr2"/>
|} | |||
====Meta Quest Touch Controllers==== | |||
The [[Meta Quest]] ecosystem features multiple controller generations. [[Touch Plus]] controllers (2023) for [[Quest 3]] eliminated the tracking ring, placing infrared LEDs directly on the controller face. [[Hybrid tracking]] combines optical LED detection when in camera view with [[IMU]] motion sensing and AI-enhanced hand tracking fusion when occluded. [[TruTouch]] variable haptics provide realistic sensations from subtle taps to heavy impacts.<ref name="quest3">Meta Quest 3 - VR & AR Wiki. https://vrarwiki.com/wiki/Meta_Quest_3</ref> | |||
[[Touch Pro]] controllers (2022) for [[Quest Pro]] achieved self-tracking with onboard cameras and a [[Qualcomm Snapdragon]] 662 processor per controller. This eliminates dependence on headset line-of-sight, enabling reliable tracking when controllers are behind the user. The pressure sensor enables pinch detection and stylus tip capability for precision drawing.<ref name="wikitouchpro">Oculus Touch. Wikipedia. https://en.wikipedia.org/wiki/Oculus_Touch</ref><ref name="metahand"/> | |||
====Valve Index Controllers==== | |||
[[Valve Index]] Controllers demonstrate premium capability with 87 sensors per controller, including capacitive sensors that detect each finger's position, analog grip sensing that measures squeeze force from 0-100%, and force-sensitive triggers. The 1100mAh battery provides 7+ hours of use with USB-C fast charging.<ref name="valveindex"/>
[[Lighthouse 2.0]] tracking achieves submillimeter positional accuracy by detecting the precise timing of laser sweeps from [[base station|base stations]]. Each base station emits horizontal and vertical infrared laser planes at known rotation rates. When the lasers hit controller sensors, the device calculates its 3D position from the sweep timing. Base stations support tracking volumes up to 33 feet by 33 feet.<ref name="wikiindex"/>
The adjustable hand strap allows users to completely open their hands during gameplay, enabling natural throwing, catching, and two-handed weapon handling. This makes Index Controllers a favorite among VR enthusiasts despite the $279 price and the external base station requirement.<ref name="valveindex"/>
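The sweep-timing principle can be illustrated with a small calculation. The sketch below is a simplification (real Lighthouse tracking fuses many sensors and solves a full pose, and the rotation period and hit times here are assumptions): it converts the delay between a base station's sync event and the moment a laser plane crosses a photodiode into a sweep angle, then combines the horizontal and vertical angles into an approximate direction to the sensor.

<syntaxhighlight lang="python">
import numpy as np

ROTATION_PERIOD = 1.0 / 60.0   # assumed: one laser sweep every 1/60 s

def sweep_angle(t_sync, t_hit, period=ROTATION_PERIOD):
    """Angle of the rotating laser plane when it struck the photodiode."""
    return 2.0 * np.pi * ((t_hit - t_sync) % period) / period

def direction_from_sweeps(h_angle, v_angle):
    """Unit ray from the base station toward the sensor, built from the
    horizontal and vertical sweep angles (simplified pinhole-style model)."""
    d = np.array([np.tan(h_angle - np.pi / 2), np.tan(v_angle - np.pi / 2), 1.0])
    return d / np.linalg.norm(d)

# A photodiode hit 4.2 ms after the horizontal sync and 4.6 ms after the vertical one.
h = sweep_angle(t_sync=0.0, t_hit=0.0042)
v = sweep_angle(t_sync=0.0, t_hit=0.0046)
print(direction_from_sweeps(h, v))
# With rays from two base stations (or many sensors on one rigid body),
# the device can solve for its full 6DoF pose.
</syntaxhighlight>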
====PlayStation VR2 Sense Controllers==== | |||
[[PlayStation VR2]] [[Sense controllers]] adapted PlayStation 5 [[DualSense]] technology for VR with [[adaptive triggers]] featuring variable resistance. The R2 and L2 triggers simulate tension of drawing a bowstring, resistance of pulling a trigger, or pressure of squeezing a brake. Dedicated [[haptic actuator|haptic actuators]] deliver tailored sensations including impact of raindrops, texture of surfaces, and recoil of weapons.<ref name="psvr2blog">PlayStation VR2 and PlayStation VR2 Sense controller. PlayStation Blog. https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/</ref> | |||
[[Inside-out tracking]] via four cameras on the PSVR2 headset captures LED tracking rings, with 6-axis motion sensing providing continuous updates. Sony announced hand tracking support at SIGGRAPH 2024, positioning PSVR2 as the first PlayStation system offering controller-free gameplay.<ref name="wikipsvr2"/> | |||
====Other Controller Systems==== | |||
[[HTC Vive]] controllers evolved through multiple generations. Original Vive wand controllers (2016) featured 24 sensors with circular trackpads tracked by Lighthouse 1.0. [[Vive Pro]] controllers (2018) added Lighthouse 2.0 compatibility for 10-meter tracking volumes. [[Cosmos]] controllers (2019) shifted to [[inside-out tracking]] with thumbsticks and face buttons.<ref name="wikihive">HTC Vive. Wikipedia. https://en.wikipedia.org/wiki/HTC_Vive</ref> | |||
[[Windows Mixed Reality]] controllers (2017) established Microsoft's specification for OEM partners including [[Acer]], [[HP]], [[Lenovo]], [[Samsung]], [[Dell]], and [[Asus]]. The design combined Vive-style circular touchpads with Touch-style thumbsticks, tracked by visible-light LEDs on circular rings.<ref name="wikihive"/> | |||
===Hand and Finger Tracking=== | |||
Controller-free hand tracking allows users to interact with virtual environments using only their natural hand movements, without holding any physical device.<ref name="meta_quest3_specs">Meta Quest 3: Tech Specs. Meta. https://www.meta.com/quest/quest-3/</ref> This technology is primarily camera-based. | |||
====Camera-Based Vision Systems==== | |||
Modern hand tracking relies on [[computer vision]] algorithms processing camera feeds in real-time. [[Meta Quest]] hand tracking uses headset cameras with [[machine learning]] models trained on millions of hand images to generate 26-point skeletal hand models at 30-90Hz. The Hands 2.2 update delivered 40% latency reduction through optimized [[neural networks]].<ref name="metahandstracking">All Hands on Deck: Crank up Hand Responsiveness. Meta for Developers. https://developers.meta.com/horizon/blog/hand-tracking-22-response-time-meta-quest-developers/</ref><ref name="pubmedhandtrack">A methodological framework to assess the accuracy of virtual reality hand-tracking systems. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10830632/</ref> | |||
[[Ultraleap]] (formerly [[Leap Motion]]) uses two infrared cameras and infrared LEDs illuminating hands with near-infrared light. The [[computer vision]] pipeline employs a [[Single Shot Detector]] neural network for palm detection, then a regression model outputs 3D coordinates for 21 keypoints per hand. The system tracks fingers even when partially hidden through predictive modeling.<ref name="leapmedium"/><ref name="ultraleapdocs">Ultraleap Hand Tracking Overview. Ultraleap Documentation. https://docs.ultraleap.com/hand-tracking/</ref> | |||
[[Apple Vision Pro]] employs high-resolution cameras transmitting over one billion pixels per second processed by the R1 chip within 12ms. Multiple infrared flood illuminators with camera arrays track hands from various angles, enabling reliable detection when hands overlap. The privacy-first architecture requires apps to explicitly request hand structure permissions.<ref name="applevp"/> | |||
====Computer Vision Algorithms==== | |||
[[MediaPipe Hands]], Google's open-source solution, demonstrates state-of-the-art pose estimation. The two-stage pipeline runs lightweight palm detection followed by regression predicting 21 3D hand landmarks. The model achieves real-time performance on mobile devices using efficient [[MobileNet]] architectures.<ref name="mediumhand">Hand Detection Tracking in Python using OpenCV and MediaPipe. Medium. https://gautamaditee.medium.com/hand-recognition-using-opencv-a7b109941c88</ref>
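To make the pipeline concrete, the short sketch below uses the open-source MediaPipe Hands and OpenCV Python packages to read the 21 landmarks from a webcam and derive a simple pinch gesture from the thumb-tip/index-tip distance. The 0.05 pinch threshold is an arbitrary assumption, and headset hand tracking uses device-specific SDKs rather than this desktop pipeline.

<syntaxhighlight lang="python">
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def is_pinching(hand_landmarks, threshold=0.05):
    """Pinch when thumb tip (landmark 4) and index tip (landmark 8) are close.
    Landmark coordinates are normalized to the image, so this is approximate."""
    thumb, index = hand_landmarks.landmark[4], hand_landmarks.landmark[8]
    dist = ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5
    return dist < threshold

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for hand in results.multi_hand_landmarks or []:
            print("pinch" if is_pinching(hand) else "open")
        if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
            break
cap.release()
</syntaxhighlight>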
Advanced approaches combine Tracking-by-Detection fusing [[Kernelized Correlation Filters]] for frame-to-frame tracking with [[Single Shot Detection]] for recovery from failures. [[Deep learning]] methods extract features using [[Convolutional Neural Networks]], while classical techniques like skin color segmentation, [[optical flow]], and depth sensing from [[Time-of-Flight]] sensors provide complementary information.<ref name="pubmedhandgesture">Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC8321080/</ref> | |||
Hand tracking is ideal for social VR, menu navigation, and applications where intuitive, simple gestures like pointing, pinching, and grabbing are sufficient. However, it currently has several limitations compared to physical controllers: it lacks the tactile feedback of a button press or trigger pull, making interactions feel less tangible; its precision can be lower, especially for fast movements; and tracking can be lost if hands are occluded from the camera's view or move outside the tracking zone.<ref name="mixed_news_hand_tracking">I tested Quest 3's hand tracking with a complete VR novice. MIXED News. https://mixed-news.com/en/meta-quest-3-hand-tracking-experiment/</ref> | |||
====Haptic Gloves==== | |||
[[HaptX Gloves]] G1 feature 135 microfluidic actuators providing true contact haptics with 0.9mm spatial resolution on fingers. The system delivers up to 40 pounds resistive force per hand through an integrated force feedback exoskeleton. Proprietary [[magnetic motion capture]] tracks all hand degrees of freedom. At $5,495 per pair, HaptX targets enterprise training applications.<ref name="haptx">Home | HaptX. https://haptx.com/</ref> | |||
[[bHaptics]] [[TactGlove DK2]] (2023) offers affordable alternative at $269 per pair with twelve HD [[Linear Resonant Actuators]] at fingertips plus wrist positions. The soft elastic material achieves 90% of bare hand tracking performance with Meta Quest 3.<ref name="bhaptics">Buy next generation full body haptic suit - bHaptics TactSuit. bHaptics. https://www.bhaptics.com/en/tactsuit/tactglove-dk2/</ref> | |||
[[SenseGlove]] Nova 2 (2023) introduced Active Contact Feedback in palm, complementing force feedback on fingers. The $5,000-7,000 enterprise solution uses four sensors for finger tracking with external [[SteamVR]] trackers for hand position. The Royal Netherlands Army, NASA, Emirates, and Procter & Gamble employ Nova 2 for training.<ref name="senseglove">Find out about our New Nova 2 Glove. SenseGlove. https://www.senseglove.com/product/nova-2/</ref> | |||
Carnegie Mellon University's Fluid Reality haptic gloves (2024) use [[electroosmotic pump|electroosmotic pumps]] enabling 0.2kg weight versus 17kg for alternatives. Thirty-two independent pressure actuators per finger pad fit in penny-sized arrays. Estimated commercial pricing around "a few hundred dollars" could bring haptic gloves to consumer VR.<ref name="cmugloves">Fluid Reality Haptic Gloves Bring Ultra-Sensitive Touch to VR. Carnegie Mellon University. https://www.cs.cmu.edu/news/2024/haptic-gloves</ref> | |||
===Eye Tracking=== | |||
[[Eye tracking]] in VR/AR employs infrared LEDs and cameras arranged between eyes and displays. Invisible infrared light projects patterns onto eyes, with cameras capturing pupil center and [[corneal reflection|corneal reflections]]. [[Machine learning]] algorithms process images at 100-200Hz to calculate [[gaze direction]], pupil size, and eye openness.<ref name="imotionseye">What is VR Eye Tracking? iMotions. https://imotions.com/blog/learning/best-practice/vr-eye-tracking/</ref> | |||
Eye tracking serves several key functions as an input modality: | |||
* '''Gaze-Based Interaction:''' It allows for a fast and intuitive way to target and select objects or UI elements. A user can simply look at a button and then perform a confirmation action (like a hand pinch or controller button press) to activate it. This can significantly speed up interaction compared to pointing with a controller.<ref name="meta_eye_tracking">Learn about Eye Tracking on Meta Quest Pro. Meta. https://www.meta.com/help/quest/8107387169303764/</ref> | |||
* '''[[Foveated rendering|Foveated Rendering]]:''' This is a powerful optimization technique that leverages how human vision works. The human eye only sees a very small area (the [[fovea]]) in high detail at any given moment. With eye tracking, the VR system can render only the part of the scene the user is directly looking at in full resolution, while progressively lowering the resolution in the peripheral vision. This can lead to massive performance savings (up to 70%) without the user perceiving any loss in visual quality, allowing for more complex graphics on less powerful hardware.<ref name="frontiers_eye_tracking">Eye Tracking in Virtual Reality: a Broad Review of Applications and Challenges. Frontiers in Virtual Reality, 2024. https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2024.1343773/full</ref><ref name="tobii_vr_component">Eye tracking in VR – A vital component. Tobii, February 16, 2024. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component</ref> | |||
* '''Social Presence:''' By tracking eye movements, blinks, and pupil dilation, avatars in social VR can replicate a user's expressions much more realistically, leading to more natural and engaging social interactions.<ref name="meta_eye_tracking"/> | |||
* '''Analytics:''' In training, research, and marketing, eye tracking provides invaluable data on user attention and behavior, showing what users look at, in what order, and for how long. | |||
[[Tobii]] dominates commercial VR eye tracking, providing technology for [[PlayStation VR2]], [[HTC Vive Pro Eye]], [[Pimax Crystal]], and [[Varjo]] headsets. Integration enables [[foveated rendering]], concentrating GPU resources on high-resolution foveal region while rendering periphery at lower detail. PlayStation VR2 achieves 3.6x faster GPU performance through foveated rendering.<ref name="tobii">Eye tracking in VR – A vital component. Tobii. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component</ref><ref name="pimax">Eye Tracking on VR (Virtual Reality) headsets. Pimax. https://pimax.com/blogs/blogs/eye-tracking-on-vr-virtual-reality-headsets</ref> | |||
[[Apple Vision Pro]]'s eye tracking serves as the primary targeting mechanism, functioning like a mouse cursor. High-performance infrared cameras and LEDs project patterns onto the eyes that are analyzed between display frames. Measured accuracy reaches 1.11 degrees in mixed reality mode and 0.93 degrees in VR mode within the central field of view. The "look and pinch" interaction model eliminates the need for a pointing device.<ref name="applevpeye">How You Control Apple Vision Pro With Your Eyes & Hands. UploadVR. https://www.uploadvr.com/apple-vision-pro-gesture-controls/</ref><ref name="pubmedeye">Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10136368/</ref>
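The "look and pinch" pattern generalizes to any headset that exposes a gaze ray and a pinch event. The sketch below is an illustration only (not Apple's or any vendor's API; the targets, tolerance, and poses are invented): it finds which target the gaze ray points at and activates it when a pinch is detected.

<syntaxhighlight lang="python">
import numpy as np

def gaze_hit(gaze_origin, gaze_dir, targets, max_angle_deg=2.0):
    """Return the target whose centre lies closest to the gaze ray,
    if it falls within the angular tolerance (degrees)."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_angle = None, max_angle_deg
    for name, centre in targets.items():
        to_target = centre - gaze_origin
        to_target = to_target / np.linalg.norm(to_target)
        angle = np.degrees(np.arccos(np.clip(gaze_dir @ to_target, -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

ui_targets = {"play_button": np.array([0.0, 1.5, -2.0]),
              "close_button": np.array([0.4, 1.6, -2.0])}

gaze_origin = np.array([0.0, 1.6, 0.0])      # eye position (metres)
gaze_dir = np.array([0.0, -0.05, -1.0])      # looking slightly down and forward
pinch_detected = True                        # event from the hand-tracking layer

looked_at = gaze_hit(gaze_origin, gaze_dir, ui_targets)
if pinch_detected and looked_at:
    print(f"activate {looked_at}")           # gaze targets, pinch confirms
</syntaxhighlight>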
[[HTC Vive Focus Vision]] (2024) integrated eye tracking as standard feature with 1-degree accuracy, using it for automatic [[IPD|interpupillary distance]] adjustment. [[Foveated rendering]] support and gaze input for UI complement hand tracking and controllers.<ref name="vivefocus">VIVE Focus Vision - New Standalone PC VR Headset. VIVE United States. https://www.vive.com/us/product/vive-focus-vision/overview/</ref> | |||
===Haptic Feedback Technology=== | |||
[[Haptic technology]] provides the sense of touch, applying forces, vibrations, or motions to the user to simulate interactions with virtual objects.<ref name="meta_haptics_overview">Haptics. Meta Horizon OS Developers. https://developers.meta.com/horizon/design/haptics-overview/</ref> It is a critical component for immersion, providing the physical confirmation that an action has occurred, such as feeling the impact of a virtual sword or the texture of a surface. | |||
* '''Vibrotactile Feedback:''' This is the most common form of haptics, using small motors to create vibrations. Modern controllers use [[Linear Resonant Actuator|Linear Resonant Actuators]] (LRAs) or Voice Coil Actuators (VCAs) to produce more precise and varied vibrations than the older Eccentric Rotating Mass (ERM) motors found in older gamepads.<ref name="arvrhub_haptics">How does Haptic Feedback Work in VR? ARVR Hub. https://arvrhub.com/haptic-feedback/</ref> | |||
* '''Kinesthetic (Force) Feedback:''' This type of feedback applies resistive forces to the user's body, simulating weight, inertia, and solidity. For example, a force-feedback joystick might resist being pushed, or a haptic glove might stop the user's fingers from closing when they grab a solid virtual object.<ref name="researchgate_haptics_pdf">Haptic Feedback for Virtual Reality. G. C. Burdea, 1999. Proceedings of the International Workshop on Virtual Reality and Prototyping. https://www.researchgate.net/publication/2356993_Haptic_Feedback_for_Virtual_Reality</ref> This is technologically complex and often requires large, grounded robotic arms or exoskeleton devices. | |||
* '''Tactile Feedback:''' This aims to simulate more subtle sensations like surface texture, pressure, temperature, and slippage. This is an active area of research, with various emerging technologies: | |||
** '''Microfluidics:''' Used in devices like [[HaptX Gloves]], which have tiny inflatable pockets (actuators) that are rapidly filled with air or liquid to create pressure points on the skin, simulating the shape and texture of an object.<ref name="xrtoday_haptx_review">HaptX Gloves G1 Review: Getting in Touch with VR. XR Today. https://www.xrtoday.com/reviews/haptx-gloves-g1-review-getting-in-touch-with-vr/</ref> | |||
** '''Electrotactile Stimulation:''' Applies small electrical currents to the skin to stimulate nerve endings, creating a variety of tactile sensations.<ref name="pmc_haptics_review">Haptic Sensing and Feedback Techniques toward Virtual Reality. Advanced Intelligent Systems, 2024. https://onlinelibrary.wiley.com/doi/10.1002/aisy.202300645</ref> | |||
** '''Ultrasonic Haptics:''' Uses arrays of ultrasonic transducers to focus sound waves in mid-air, creating pressure points that a user can feel on their bare skin without any wearable device.<ref name="simx_haptics">What Is Haptic Feedback? | Virtual Reality Medical Simulation. SimX. https://www.simxvr.com/glossary/haptic-feedback-definition/</ref> | |||
** '''Thermal Feedback:''' Uses [[Peltier effect|Peltier elements]] to rapidly heat or cool a surface that is in contact with the user's skin, simulating touching hot or cold objects.<ref name="senseglove_haptics_types">The Different Types of Haptic Feedback. SenseGlove, May 15, 2023. https://www.senseglove.com/what-are-the-different-types-of-haptic-feedback/</ref> | |||
Devices range from the integrated haptics in controllers to specialized [[haptic glove]]s, vests, and full-body [[haptic suit]]s that provide more comprehensive feedback.<ref name="arvrhub_haptics"/>
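Vibrotactile effects like those described above are typically authored as short amplitude envelopes driving an LRA near its resonant frequency. The sketch below is a generic illustration, not any controller SDK; the sample rate, 170 Hz resonance, and envelope shapes are assumptions. It synthesizes a sharp "click" and a softer "rumble" buffer that a haptics engine could stream to an actuator.

<syntaxhighlight lang="python">
import numpy as np

SAMPLE_RATE = 8000          # assumed haptic drive sample rate (Hz)
LRA_RESONANCE = 170.0       # typical LRA resonant frequency (Hz), assumed

def haptic_burst(duration_s, attack_s, decay_s, strength=1.0):
    """Sine carrier at the LRA resonance, shaped by an attack/decay envelope."""
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * LRA_RESONANCE * t)
    envelope = np.minimum(np.minimum(t / attack_s, 1.0),
                          np.maximum((duration_s - t) / decay_s, 0.0))
    return strength * envelope * carrier

click = haptic_burst(duration_s=0.02, attack_s=0.002, decay_s=0.008, strength=1.0)
rumble = haptic_burst(duration_s=0.30, attack_s=0.050, decay_s=0.150, strength=0.4)
# Sharper envelopes read as impacts; longer low-amplitude ones read as textures.
print(len(click), len(rumble), round(float(np.max(np.abs(click))), 2))
</syntaxhighlight>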
===Voice Input=== | |||
[[Voice input]] relies on [[automatic speech recognition]] converting spoken words to text, combined with [[natural language processing]] understanding user intent. Modern systems employ cloud-based or on-device processing using wake words like "Hey Meta" for [[Meta Quest]] or Cortana for [[Microsoft HoloLens]].<ref name="msvoice">Voice input - Mixed Reality. Microsoft Learn. https://learn.microsoft.com/en-us/windows/mixed-reality/design/voice-input</ref> | |||
[[Meta Quest]] voice commands enable over 100 commands including "Take a picture," "Start casting," and "Open [app name]." The [[Meta AI]] assistant introduced in 2024 extends capabilities to natural language queries.<ref name="questvoice">Meta Quest Voice Commands: The Ultimate Guide. AR/VR Tips. https://arvrtips.com/meta-quest-voice-commands/</ref> | |||
[[Microsoft HoloLens]] pioneered the "See It, Say It" model where voice-enabled buttons display tooltips when gazed at. Commands include hologram manipulation ("Bigger," "Face me"), device control ("Brightness up," "Volume down"), and queries ("What's my IP address?"). [[Dynamics 365 Remote Assist]] uses voice for hands-free field service.<ref name="hololensvoice">Use your voice to operate HoloLens. Microsoft Learn. https://learn.microsoft.com/en-us/hololens/hololens-cortana</ref> | |||
[[Cirrus Logic]]'s SoundClear technology provides hardware foundation with low-power, always-on voice processors featuring multi-mic noise reduction and wake word recognition from one foot to across-room distances.<ref name="cirrus">AR/VR Headsets. Cirrus Logic. https://www.cirrus.com/applications/wearables/ar-vr-headsets/</ref> | |||
Voice input leverages the built-in microphones in most VR headsets to allow for hands-free control and interaction. By using [[voice command]]s, users can navigate menus, search for content, dictate text, and control applications without using controllers.<ref name="ucalgary_voice_paper">Voice-Augmented Virtual Reality Interface for Serious Games. University of Calgary, 2024. https://cspages.ucalgary.ca/~richard.zhao1/publications/2024cog-voice_augmented_VR_interface.pdf</ref> The technology pipeline involves a [[speech recognition]] engine to transcribe spoken words into text, followed by a [[Natural Language Processing]] (NLP) model to interpret the user's intent from that text.<ref name="meta_voice_sdk_overview">Voice SDK Overview. Meta for Developers. https://developers.meta.com/horizon/documentation/unity/voice-sdk-overview/</ref>
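As a minimal illustration of the transcribe-then-interpret pipeline, the sketch below matches an already-transcribed utterance against a small intent table after checking for a wake word. The wake word, command patterns, and intents here are assumptions for illustration, not Meta's or Microsoft's actual grammars.

<syntaxhighlight lang="python">
import re

WAKE_WORD = "hey headset"           # assumed wake word for illustration

INTENT_PATTERNS = {
    "take_photo":  re.compile(r"\btake a (picture|photo)\b"),
    "open_app":    re.compile(r"\bopen (?P<app>[\w ]+)"),
    "set_volume":  re.compile(r"\bvolume (?P<dir>up|down)\b"),
}

def interpret(transcript: str):
    """Map an ASR transcript to (intent, slots); returns None without the wake word."""
    text = transcript.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None
    text = text[len(WAKE_WORD):].strip()
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(text)
        if match:
            return intent, match.groupdict()       # slots such as the app name
    return "unknown", {}

print(interpret("Hey headset, open beat saber"))   # ('open_app', {'app': 'beat saber'})
print(interpret("hey headset volume up"))          # ('set_volume', {'dir': 'up'})
</syntaxhighlight>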
===Body Tracking=== | |||
[[Full-body tracking]] extends immersion beyond head and hands. | |||
====Full-Body Tracking==== | |||
While most VR systems natively track the head and hands, full-body tracking aims to capture the movement of the entire body, including the torso, legs, and feet, for a more complete and expressive avatar representation. | |||
* '''Marker-based Tracking:''' This is the traditional method used in [[motion capture]] for film and games. It involves the user wearing a suit covered in reflective markers, which are tracked by multiple external infrared cameras. While highly accurate, it is expensive and complex.<ref name="naimark_io"/> | |||
* '''Accessory-based Tracking:''' [[HTC Vive Tracker]] 3.0 attaches to body parts via elastic straps, tracked by [[SteamVR]] Lighthouse 2.0 with submillimeter accuracy. At 33% smaller and 15% lighter with 7.5-hour battery life, the tracker enables 6DOF tracking of feet, waist, chest, elbows, or shoulders. [[VRChat]] supports up to 11 tracking points for full-body avatar representation.<ref name="vivetracker">VIVE Tracker (3.0). VIVE United States. https://www.vive.com/us/accessory/tracker3/</ref> | |||
* '''Markerless Tracking:''' This emerging method uses computer vision and AI to estimate a user's body pose directly from camera data, without requiring any markers or additional trackers. This can be done with external depth-sensing cameras (like the [[Microsoft Kinect]]) or, increasingly, with the cameras already on the VR headset itself.<ref name="usc_markerless_paper">Markerless Full Body Tracking: Depth-Sensing Technology within Virtual Environments. USC Institute for Creative Technologies. https://ict.usc.edu/pubs/Markerless%20Full%20Body%20Tracking-%20Depth-Sensing%20Technology%20within%20Virtual%20Environments.pdf</ref><ref name="vr_collective_markerless">How Markerless Mocap is Transforming Location-Based VR Experiences. The VR Collective. https://thevrcollective.com/how-markerless-mocap-is-transforming-location-based-vr-experiences/</ref> | |||
[[Vive Ultimate Tracker]] (2024) eliminated base station requirement through self-tracking with onboard cameras. Two wide-angle cameras per tracker enable 6DOF [[inside-out tracking]], with up to five trackers connecting wirelessly.<ref name="viveultimate">VIVE Ultimate Tracker - Full-Body Tracking. VIVE. https://www.vive.com/us/accessory/vive-ultimate-tracker/</ref> | |||
[[SlimeVR]] pioneered affordable [[IMU]]-based full-body tracking using 9-axis sensors (accelerometer, gyroscope, magnetometer) sending rotation data via 2.4GHz WiFi. A 5-tracker lower-body set includes chest, two thighs, and two ankles for approximately $200 with 10-15 hour battery life. IMU tracking avoids occlusion issues but suffers from yaw drift requiring periodic recalibration.<ref name="slimevr">SlimeVR Full-Body Trackers. SlimeVR Official. https://slimevr.dev/</ref> | |||
[[HaritoraX]] 2 (2024) improved IMU tracking with built-in [[LiDAR]] sensors in ankle trackers detecting foot position relative to floor, plus geomagnetic compensation reducing rotational drift. Ultra-compact sensors enable up to 50 hours battery life.<ref name="haritorax">HaritoraX 2 - Fully wireless full-body tracking device. Shiftall. https://en.shiftall.net/products/haritorax2</ref> | |||
Research validates tracking accuracy. HTC Vive achieves approximately 2mm positional error and less than 1-degree orientation error. [[Oculus Quest]] 2 inside-out tracking shows 1.66mm ± 0.74mm translation accuracy and 0.34 ± 0.38 degrees rotation accuracy, comparable to external tracking systems.<ref name="acmtracking">Comparing the Accuracy and Precision of SteamVR Tracking 2.0 and Oculus Quest 2. ACM Digital Library. https://dl.acm.org/doi/fullHtml/10.1145/3463914.3463921</ref>
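Accuracy figures like these are obtained by comparing tracked poses against a ground-truth reference such as an optical motion-capture system. The sketch below (illustrative, with made-up sample values) computes translation error as a Euclidean distance and rotation error as the geodesic angle between two unit quaternions.

<syntaxhighlight lang="python">
import numpy as np

def translation_error_mm(p_tracked, p_reference):
    """Euclidean distance between tracked and reference positions, in mm."""
    return 1000.0 * np.linalg.norm(np.asarray(p_tracked) - np.asarray(p_reference))

def rotation_error_deg(q_tracked, q_reference):
    """Geodesic angle between two unit quaternions (w, x, y, z), in degrees."""
    dot = abs(float(np.dot(q_tracked, q_reference)))   # abs() handles double cover
    return np.degrees(2.0 * np.arccos(np.clip(dot, -1.0, 1.0)))

# Made-up single sample: tracked pose vs. motion-capture ground truth (metres).
print(round(translation_error_mm([0.5012, 1.2010, -0.3005], [0.5000, 1.2000, -0.3000]), 2))
print(round(rotation_error_deg(np.array([0.9999, 0.0, 0.0123, 0.0]),
                               np.array([1.0, 0.0, 0.0, 0.0])), 2))
</syntaxhighlight>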
====Locomotion====
[[Locomotion (virtual reality)|Locomotion]] refers to the methods used to move around within a virtual environment that is larger than the physical play space.
* '''Physical Locomotion:'''
** '''Room-scale:''' Users physically walk around a defined, tracked area. This is the most immersive method but is limited by the size of the physical room.
** '''[[Omnidirectional treadmill]]s:''' These devices allow a user to walk, run, and jump in any direction while remaining in a fixed spot. They typically consist of a low-friction, concave platform where the user, wearing special shoes, can slide their feet to simulate walking. Sensors track the foot movement and translate it into in-game motion. Companies like [[KAT Walk]] and [[Virtuix]] are leading providers of consumer and arcade-level treadmills.<ref name="vrs_input_devices">Virtual Reality Input Devices. Virtual Reality Society. https://www.vrs.org.uk/virtual-reality-gear/input-devices.html</ref><ref name="unboundxr_treadmills">Compare the Omni Directional Treadmills. Unbound XR. https://unboundxr.com/blogs/compare-vr-treadmills</ref>
* '''Artificial Locomotion:'''
** '''Teleportation:''' Users point a controller to a desired location and instantly appear there (a simple arc-targeting sketch follows this list). This method is highly effective at preventing [[simulator sickness]] but can break the sense of presence and spatial awareness.<ref name="mdpi_locomotion">VR Locomotion in the New Era of VR: A Study of Techniques and Comparative Review. Multimodal Technologies and Interaction, 2019. https://www.mdpi.com/2414-4088/3/2/24</ref>
** '''Smooth Locomotion:''' Users glide through the environment using the thumbstick on their controller, similar to a traditional first-person video game. While this provides a continuous sense of movement, it is a primary cause of simulator sickness for many users due to the disconnect between visual motion and the body's [[vestibular system]].<ref name="mdpi_locomotion"/>
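Teleport targeting is commonly implemented by casting a parabolic arc from the controller and teleporting to where it lands. The sketch below is a minimal illustration under assumed values (launch speed, gravity, a flat floor at y = 0), not any engine's actual API.

<syntaxhighlight lang="python">
import numpy as np

def teleport_target(origin, forward, speed=6.0, gravity=9.81, step=0.01):
    """March along a parabolic arc from the controller until it hits the floor
    (y = 0); returns the landing point used as the teleport destination."""
    forward = forward / np.linalg.norm(forward)
    velocity = speed * forward
    pos = np.asarray(origin, dtype=float)
    for _ in range(2000):
        pos = pos + velocity * step
        velocity = velocity - np.array([0.0, gravity, 0.0]) * step
        if pos[1] <= 0.0:                # crossed the floor plane
            return pos
    return None                          # arc never landed (aimed too far up)

controller_pos = np.array([0.0, 1.2, 0.0])     # hand height in metres
controller_dir = np.array([0.0, 0.3, -1.0])    # aimed slightly upward, forward
print(teleport_target(controller_pos, controller_dir))
# The user instantly relocates to the returned point, avoiding the vestibular
# mismatch that smooth thumbstick locomotion can cause.
</syntaxhighlight>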
===Brain-Computer Interfaces=== | |||
[[Brain-computer interface|Brain-computer interfaces]] detect electrical signals from the brain or nervous system, translating neural activity into digital commands. Non-invasive BCIs use [[electroencephalography]] to measure brain waves from scalp electrodes, while invasive approaches implant electrodes in brain tissue. [[Electromyography]] offers a middle ground, measuring muscle activation signals with skin-surface sensors.<ref name="frontiersbci">Editorial: Brain-Computer Interfaces and Augmented/Virtual Reality. Frontiers in Human Neuroscience. https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2020.00144/full</ref>
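As a toy illustration of how such signals become commands (synthetic data and a generic classifier; this is not Meta's, EMOTIV's, or any vendor's pipeline), the sketch below extracts a root-mean-square feature from short EMG windows and trains a classifier to separate a "pinch" from "rest".

<syntaxhighlight lang="python">
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_windows(n, amplitude):
    """Synthetic 200-sample EMG windows: noise whose strength reflects activation."""
    return rng.normal(0.0, amplitude, size=(n, 200))

def rms_feature(windows):
    """Root-mean-square of each window, a classic EMG activation feature."""
    return np.sqrt((windows ** 2).mean(axis=1, keepdims=True))

rest, pinch = make_windows(100, 0.05), make_windows(100, 0.25)
X = rms_feature(np.vstack([rest, pinch]))
y = np.array([0] * 100 + [1] * 100)          # 0 = rest, 1 = pinch

clf = LogisticRegression().fit(X, y)
new_window = make_windows(1, 0.22)            # an unseen burst of muscle activity
print("pinch" if clf.predict(rms_feature(new_window))[0] == 1 else "rest")
</syntaxhighlight>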
[[Meta]]'s [[EMG wristband]] (developed by acquired [[CTRL-labs]]) detects electrical signals from forearm muscles as motor neurons transmit movement commands. Signals are detected before fingers physically move, enabling negative latency. A July 2024 Nature paper demonstrated machine learning models working without user-specific calibration, the first generalizable neural interface.<ref name="ctrlabs">Facebook agrees to acquire brain-computing start-up CTRL-labs. CNBC. https://www.cnbc.com/2019/09/23/facebook-announces-acquisition-of-brain-computing-start-up-ctrl-labs.html</ref><ref name="metaemg">Meta Details EMG Wristband Gestures. UploadVR. https://www.uploadvr.com/meta-semg-wristband-gestures-nature-paper/</ref> | |||
Mark Zuckerberg stated neural wristbands will ship "in the next few years," with leaked roadmaps indicating 2025-2027 launch alongside third-generation [[Ray-Ban]] smart glasses. The wristband enables handwriting in air, typing on surfaces, and precise finger tracking in any lighting without cameras.<ref name="zuckerbergwristband">Zuckerberg: Neural Wristband To Ship In 'Next Few Years'. UploadVR. https://www.uploadvr.com/zuckerberg-neural-wristband-will-ship-in-the-next-few-years/</ref> | |||
[[Valve]] and [[OpenBCI]] collaborated on the [[Galea]] headset (beta 2022), integrating EEG, EMG, EOG, EDA, PPG, and [[Tobii]] eye tracking into [[Valve Index]] modification. The open-source platform enables passive BCIs monitoring user state for adaptive VR experiences.<ref name="valvebci">Valve, OpenBCI & Tobii to Launch VR Brain-computer Interface 'Galea'. Road to VR. https://www.roadtovr.com/valve-openbci-immersive-vr-games/</ref> | |||
[[EMOTIV]] offers consumer/professional headsets including EPOC X (14-channel EEG), Insight (5-channel), and MN8 (32-channel research cap). The EmotivBCI software enables direct brain-computer interfacing with real-time monitoring of attention, workload, emotions, and stress.<ref name="emotiv">How BCI can elevate the AR/VR experience. EMOTIV. https://www.emotiv.com/blogs/news/bci-applications-for-vr-ar</ref> | |||
[[Neuralink]] received FDA approval in 2023 and implanted its first human patient in January 2024, who controls laptop cursor and plays video games via thought. [[Synchron]] takes less invasive approach, with 2024 demonstrations showing compatibility with [[Apple Vision Pro]] for thought-controlled VR/AR.<ref name="synchron">Brain Implant Hooked Up to Control VR Headset. Futurism. https://futurism.com/neoscope/synchron-brain-computer-interface-control-vr-headset</ref> | |||
===Specialized and Traditional Peripherals=== | |||
For certain applications, especially simulations, specialized peripherals offer a level of immersion and control that general-purpose motion controllers cannot match. | |||
* '''Flight Simulators:''' [[HOTAS]] (Hands On Throttle-And-Stick) systems, which replicate the joystick and throttle controls of an aircraft, are essential for flight simulation in VR. Popular models include the [[Thrustmaster]] HOTAS Warthog and T.Flight series.<ref name="thrustmaster_hotas_one">T.FLIGHT HOTAS ONE. Thrustmaster. https://eshop.thrustmaster.com/en_us/t-flight-hotas-one.html</ref><ref name="walmart_warthog">Thrustmaster HOTAS Warthog Flight Stick and Throttle for PC, VR. Walmart. https://www.walmart.com/ip/Thrustmaster-HOTAS-Warthog-Flight-Stick-and-Throttle-for-PC-VR/15268503</ref> | |||
* '''Racing Simulators:''' A [[racing wheel]] and pedal set is crucial for a realistic driving experience. High-end models from companies like Thrustmaster and [[MOZA]] feature powerful force feedback motors that simulate the torque on the steering wheel and the feel of the road.<ref name="thrustmaster_racing">Racing. Thrustmaster. https://www.thrustmaster.com/en-us/universe/racing/</ref><ref name="moza_racing">MOZA Racing Global. MOZA Racing. https://mozaracing.com/</ref> These are often mounted in dedicated racing cockpits for maximum stability and immersion.<ref name="nextlevel_racing">Racing Simulator Cockpits. Next Level Racing. https://nextlevelracing.com/racing-cockpits/</ref> | |||
* '''Traditional Peripherals:''' [[Keyboard and mouse]] and traditional [[gamepad]]s can still be used in VR, typically for seated experiences, ports of non-VR games, or for productivity tasks. Some platforms, like Meta Quest, have begun to integrate tracking for specific models of physical keyboards (for example the [[Logitech K830]]), allowing users to see a virtual representation of their keyboard and hands while typing, which greatly improves usability for work and text entry in VR.<ref name="medium_k830">The Logitech K830 Keyboard And Typing In VR. Medium, August 25, 2021. https://medium.com/xrlo-extended-reality-lowdown/the-logitech-k830-keyboard-and-typing-in-vr-556e2740c48d</ref><ref name="reddit_kb_mouse_vr">Can I use mouse and keyboard with the vr headset on pc for vr games instead of controllers? Reddit. https://www.reddit.com/r/oculus/comments/10946c5/can_i_use_mouse_and_keyboard_with_the_vr_headset/</ref> | |||
==Applications of Input Across Industries== | |||
The diversity of input methods in XR has enabled a wide range of applications beyond gaming, transforming how professionals in various fields train, design, and interact with digital data. | |||
===Gaming and Entertainment=== | |||
Gaming remains the primary driver of the consumer VR market, and input methods are integral to defining gameplay experiences. Motion controllers allow for direct, physical interaction, making games like ''[[Beat Saber]]'' (slashing blocks with virtual sabers) and ''[[Half-Life: Alyx]]'' (manipulating objects, reloading weapons, and solving puzzles with virtual hands) highly immersive.<ref name="cavendish_gaming">The Evolution of VR and AR in Gaming: A Historical Perspective. Cavendish Professionals. https://www.cavendishprofessionals.com/the-evolution-of-vr-and-ar-in-gaming-a-historical-perspective/</ref> Specialized peripherals cater to dedicated simulation genres: flight simulators like ''[[Microsoft Flight Simulator]]'' are best experienced with a HOTAS setup, while racing games like ''[[Assetto Corsa]]'' achieve maximum realism with a force-feedback racing wheel and pedals.<ref name="steam_community_ed">How to play in VR with Mouse and keyboard? Steam Community. https://steamcommunity.com/app/359320/discussions/0/5311389137862908260/?l=tchinese</ref>
===Training and Simulation=== | |||
VR provides a safe, cost-effective, and repeatable environment for training in high-stakes professions. The input method is chosen to best replicate the real-world task. | |||
* '''Healthcare and Medical Training:''' Surgeons use VR simulations with advanced haptic devices to practice complex procedures. These systems can simulate the resistance and texture of different human tissues, allowing for realistic practice without risk to patients.<ref name="ijrpr_ar_vr">A Comprehensive Review of Augmented Reality and Virtual Reality. International Journal of Research and Presentations, 2023. https://ijrpr.com/uploads/V4ISSUE4/IJRPR12239.pdf</ref><ref name="thors_vr_training">The Role of Virtual Reality in Technical Training. THORS. https://thors.com/the-role-of-virtual-reality-in-technical-training/</ref> VR is also used for therapy, such as treating phobias or [[PTSD]], by exposing patients to triggering stimuli in a controlled environment.<ref name="ryan_funding_history">The History and Applications of Virtual Reality Headsets. Ryan. https://funding.ryan.com/blog/business-strategy/applications-and-history-of-vr-headsets/</ref> | |||
* '''Aerospace and Military:''' Flight simulation is one of the oldest and most mature applications of VR. Pilots train in highly realistic virtual cockpits, often using exact replicas of the physical HOTAS controls and panels.<ref name="mdpi_applications">Applications of Virtual Reality Simulations and Machine. MDPI. https://www.mdpi.com/2673-4591/100/1/19</ref> Similarly, military forces use VR for tactical training, combat simulations, and vehicle operation.<ref name="ijrpr_ar_vr"/> | |||
* '''Industrial and Technical Training:''' VR allows workers to learn how to operate heavy machinery, perform maintenance on complex equipment, or practice assembly line tasks in a virtual factory. This hands-on learning in a risk-free environment improves skill retention and safety.<ref name="synergyxr_education">VR Simulations in Education: Transforming Learning. SynergyXR. https://synergyxr.com/resources/learn/blogs/vr-simulations-in-education/</ref><ref name="thors_vr_training"/> | |||
===Creative and Design Tools=== | |||
VR is transforming digital content creation by moving it from 2D screens into an immersive 3D space. | |||
* '''3D Art and Sculpting:''' Applications like [[Adobe Substance 3D Modeler]] (successor to Oculus Medium) and [[Tilt Brush]] allow artists to use motion controllers to sculpt, paint, and create in three dimensions. This provides a more intuitive and physical connection to the creative process, akin to working with real-world materials.<ref name="searchmyexpert_vr_art">VR Art and Creativity: Unleashing the Power of Virtual Reality. SearchMyExpert. https://www.searchmyexpert.com/resources/ar-vr-development/vr-art-creativity</ref> | |||
* '''Architecture and Industrial Design:''' Architects, engineers, and designers use VR to visualize their creations at a 1:1 scale. By "walking through" a virtual building or examining a digital prototype of a car, they can gain a much deeper understanding of space, scale, and ergonomics.<ref name="wikipedia_vr_applications">Virtual reality applications. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality_applications</ref> Using motion controllers, designers can directly manipulate elements of the model, enabling a rapid and iterative design process from within the virtual environment itself.<ref name="dla_paper_vr_design">The Use of Immersive Virtual Reality as a Design Input Tool. Digital Landscape Architecture, 2017. https://gispoint.de/fileadmin/user_upload/paper_gis_open/DLA_2017/537629026.pdf</ref> | |||
===Accessibility=== | |||
The variety of input modalities in XR offers new avenues for accessibility. For users with mobility impairments who cannot use traditional controllers, alternative inputs like voice commands and eye tracking can provide full control over the virtual environment. Gaze-based selection can replace hand pointing, and voice commands can execute complex actions, making immersive experiences accessible to a wider audience.<ref name="ucalgary_voice_paper"/><ref name="ixrlabs_haptics_education">Why is Haptic Feedback important for VR Education? iXR Labs. https://www.ixrlabs.com/blog/why-haptic-feedback-important-for-vr-education/</ref> | |||
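To make the gaze-replaces-pointing idea concrete, the sketch below implements dwell-based selection in TypeScript: if the gaze ray rests on the same element for a set time, that element is activated. This is a minimal illustration rather than any particular SDK's API; the <code>GazeSample</code> shape and the <code>onSelect</code> callback are assumptions.
<syntaxhighlight lang="typescript">
// Dwell-based gaze selection: activate a target when the user's gaze
// rests on it for `dwellMs` milliseconds. The GazeSample type and the
// way targets are hit-tested are illustrative assumptions.
interface GazeSample {
  targetId: string | null; // id of the UI element currently under the gaze ray
  timestampMs: number;     // time the sample was taken
}

class DwellSelector {
  private currentTarget: string | null = null;
  private gazeStartMs = 0;

  constructor(
    private dwellMs: number,
    private onSelect: (targetId: string) => void,
  ) {}

  // Call once per tracking frame with the latest gaze sample.
  update(sample: GazeSample): void {
    if (sample.targetId !== this.currentTarget) {
      // Gaze moved to a new target (or to empty space): restart the dwell timer.
      this.currentTarget = sample.targetId;
      this.gazeStartMs = sample.timestampMs;
      return;
    }
    if (
      this.currentTarget !== null &&
      sample.timestampMs - this.gazeStartMs >= this.dwellMs
    ) {
      this.onSelect(this.currentTarget);
      // Require the gaze to leave and return before selecting again.
      this.currentTarget = null;
    }
  }
}

// Example: select after one second of steady gaze.
const selector = new DwellSelector(1000, (id) => console.log(`activated ${id}`));
</syntaxhighlight>
A dwell timer like this is often paired with a visible progress ring so the user can abort a selection simply by looking away.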
==Challenges and Design Considerations== | |||
Despite rapid advancements, designing effective and comfortable input for VR and AR presents unique challenges that are not present in traditional 2D interface design. These challenges span human physiology, technical limitations, and new design paradigms. | |||
===Human Factors and User Comfort=== | |||
* '''Simulator Sickness:''' Often cited as the biggest barrier to VR adoption, simulator sickness is a form of motion sickness that occurs when there is a conflict between the visual motion perceived by the eyes and the lack of physical motion detected by the body's vestibular system.<ref name="viroreact_design_principles">VR Design Principles. ViroReact. https://viro-community.readme.io/docs/vr-design-principles</ref> This is most commonly caused by artificial locomotion methods (like smooth locomotion with a thumbstick) and can be exacerbated by low frame rates or high latency. Input design principles to mitigate this include prioritizing teleportation over smooth locomotion, avoiding artificial camera acceleration, and ensuring the application maintains a consistently high frame rate (typically 90 [[Hz]] or higher).<ref name="medium_vr_challenges">The Biggest Challenges in AR/VR Design and How to Overcome Them. Medium. https://medium.com/cva-design/the-biggest-challenges-in-ar-vr-design-and-how-to-overcome-them-25210d435a79</ref><ref name="appypie_vr_limitations">The Limitations of Virtual Reality. Appy Pie. https://www.appypie.com/blog/virtual-reality-limitations</ref> A minimal teleport-locomotion sketch follows this list.
* '''Ergonomics and Physical Fatigue:''' Unlike using a mouse, VR input often requires physical movement of the arms, hands, and body. Heavy or poorly balanced controllers can lead to arm and wrist strain over long sessions.<ref name="synergyxr_controllers_review"/> Controller-free hand tracking can lead to "gorilla arm" syndrome, where users become fatigued from holding their arms up in the air to interact with interfaces. Good design practice involves placing frequently used UI elements within a comfortable, resting range of motion.<ref name="viroreact_design_principles"/>
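The teleport-locomotion principle referenced above can be sketched in a few lines. The following TypeScript is an illustrative sketch, not engine code: <code>Vec3</code>, <code>intersectFloor</code>, and the <code>fadeScreen</code> callback are assumptions, and a real implementation would also validate the destination against the scene's walkable geometry.
<syntaxhighlight lang="typescript">
// Teleport locomotion: instantly reposition the player at a validated
// floor point rather than smoothly accelerating the camera.
// Vec3, intersectFloor and fadeScreen are illustrative placeholders.
type Vec3 = { x: number; y: number; z: number };

const MAX_TELEPORT_DISTANCE = 8; // metres; clamp to keep jumps predictable

// Intersect a pointing ray with the horizontal floor plane (y = 0).
function intersectFloor(origin: Vec3, direction: Vec3): Vec3 | null {
  if (direction.y >= 0) return null;  // pointing level or upward: no hit
  const t = -origin.y / direction.y;  // distance along the ray to y = 0
  return {
    x: origin.x + direction.x * t,
    y: 0,
    z: origin.z + direction.z * t,
  };
}

async function teleport(
  player: { position: Vec3 },
  rayOrigin: Vec3,
  rayDirection: Vec3,
  fadeScreen: (toBlack: boolean) => Promise<void>,
): Promise<void> {
  const hit = intersectFloor(rayOrigin, rayDirection);
  if (!hit) return;
  const dx = hit.x - player.position.x;
  const dz = hit.z - player.position.z;
  if (Math.hypot(dx, dz) > MAX_TELEPORT_DISTANCE) return; // reject far targets
  await fadeScreen(true);   // brief fade hides the instantaneous jump
  player.position = hit;    // no intermediate camera motion, so no vection
  await fadeScreen(false);
}
</syntaxhighlight>
Because the camera never translates smoothly, the eyes never report motion that the vestibular system cannot confirm, which is the core of the mitigation described above.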
===Technical Hurdles=== | |||
* '''Tracking Fidelity and Occlusion:''' While modern tracking systems are robust, they are not flawless. Inside-out systems can lose track of controllers when they are held outside the headset cameras' field of view (for example behind the back or too close to the face).<ref name="pimax_tracking"/> Hand tracking can be unreliable during fast movements, complex finger interactions, or when one hand occludes the other.<ref name="mixed_news_hand_tracking"/> These tracking failures can break immersion and cause user frustration. | |||
* '''Haptic Fidelity and Cost:''' The haptic feedback in most consumer VR controllers is limited to simple vibrations. Creating realistic tactile sensations, such as the texture of a surface, the weight of an object, or the precise feeling of pressure, is extremely challenging.<ref name="flatirons_haptics_disadvantages">Disadvantages of Haptic Technology. Flatirons. https://flatirons.com/blog/disadvantages-of-haptic-technology/</ref> Advanced haptic devices like force-feedback exoskeletons or microfluidic gloves exist, but they are currently very expensive, bulky, and largely confined to research and enterprise applications.<ref name="boreas_bad_haptics">How Bad Haptic Feedback Can Ruin the User Experience. Boreas Technologies. https://pages.boreas.ca/blog/how-bad-haptic-feedback-can-ruin-the-user-experience</ref> | |||
* '''Hardware Constraints:''' Standalone VR headsets operate under significant power and thermal constraints. The onboard processing power limits the complexity of the physics simulations, the number of tracked objects, and the sophistication of the rendering, which in turn affects the realism of interactions. Limited battery life also curtails the duration of untethered VR sessions.<ref name="medium_vr_challenges"/><ref name="vcd_vr_challenges">Exploring the Challenges and Limitations of Virtual Reality. VCD Social Club. https://vcdsocialclub.co.uk/exploring-the-challenges-and-limitations-of-virtual-reality</ref> | |||
===Interaction Design Paradigms=== | |||
Designing a [[user interface]] (UI) for a 3D space requires a fundamental rethinking of principles from 2D design. | |||
* '''Spatial UI:''' UI elements cannot be fixed to the screen; they must exist within the 3D world. Designers must consider the optimal placement of menus and information to be within the user's "comfort zone", typically a 94° horizontal and 32° vertical arc in front of the user, and at a comfortable viewing distance (generally between 0.5 meters and 10 meters) to avoid eye strain and maintain stereoscopic depth perception (a small placement-check sketch follows this list).<ref name="viroreact_design_principles"/>
* '''Interaction Abstraction:''' A core challenge is deciding on the level of abstraction for an interaction. A "natural" interaction, like picking up an object with tracked hands, is intuitive but can be imprecise and lacks tactile feedback. An "abstract" interaction, like pressing a button to grab an object, is reliable and provides clear feedback but is less immersive.<ref name="ramotion_vr_ux">VR in UX Design: Basic Guidelines. Ramotion. https://www.ramotion.com/blog/vr-in-ux-design/</ref> Designers must constantly balance the trade-offs between intuitiveness, reliability, and user comfort for every interaction.
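The comfort-zone numbers above translate directly into a simple placement check. The sketch below is a minimal TypeScript illustration that assumes head-relative coordinates in metres with +z pointing forward; the angle and distance limits come from the guidelines cited above, and everything else is hypothetical.
<syntaxhighlight lang="typescript">
// Check whether a UI anchor point (in head-relative coordinates, metres,
// +z forward and +y up) sits inside the viewing comfort zone:
// within ±47° horizontally, ±16° vertically, and 0.5-10 m away.
interface Point3 { x: number; y: number; z: number; }

const HALF_HORIZONTAL_DEG = 94 / 2;
const HALF_VERTICAL_DEG = 32 / 2;
const MIN_DISTANCE_M = 0.5;
const MAX_DISTANCE_M = 10;

function isInComfortZone(p: Point3): boolean {
  const distance = Math.hypot(p.x, p.y, p.z);
  if (distance < MIN_DISTANCE_M || distance > MAX_DISTANCE_M) return false;
  if (p.z <= 0) return false; // behind the user
  const horizontalDeg = Math.abs(Math.atan2(p.x, p.z)) * (180 / Math.PI);
  const verticalDeg =
    Math.abs(Math.atan2(p.y, Math.hypot(p.x, p.z))) * (180 / Math.PI);
  return horizontalDeg <= HALF_HORIZONTAL_DEG && verticalDeg <= HALF_VERTICAL_DEG;
}

// Example: a menu 2 m ahead and slightly below eye level is comfortable.
console.log(isInComfortZone({ x: 0.3, y: -0.2, z: 2 })); // true
</syntaxhighlight>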
==Technical Standards== | |||
===OpenXR=== | |||
The [[Khronos Group]] released [[OpenXR]] 1.0 in July 2019, providing the first truly cross-platform API for XR applications. OpenXR abstracts hardware differences behind a unified interface, enabling developers to write code once and deploy across [[Meta Quest]], [[SteamVR]], [[Windows Mixed Reality]], [[HTC Vive]], [[Varjo]], [[Magic Leap]], and most other major platforms, with Apple being the notable exception.<ref name="openxr">OpenXR - High-performance access to AR and VR. Khronos Group. https://www.khronos.org/openxr/</ref><ref name="wikiopenxr">OpenXR. Wikipedia. https://en.wikipedia.org/wiki/OpenXR</ref>
Version 1.1 (April 2024) consolidated proven extensions into the core specification. OpenXR's action-based input system lets runtimes translate abstract actions such as "grab" into platform-specific button configurations. Major runtimes, including Meta Quest OpenXR, SteamVR, Windows Mixed Reality, PICO, and Varjo, are officially conformant.<ref name="openxr"/>
The extension system balances standardization with innovation: core features work everywhere, while vendor extensions such as `XR_FB_foveation` for Meta's [[foveated rendering]] or `XR_FB_passthrough` for [[mixed reality]] expose platform-specific capabilities when available.<ref name="openxr"/>
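The action model can be pictured with a small sketch. The following TypeScript is not the OpenXR C API; it is a simplified illustration of the same idea, in which the application declares abstract actions and suggested bindings per interaction profile, and the "runtime" resolves the concrete input path for whatever hardware is present. The profile and input paths mirror OpenXR path syntax but are included only as examples.
<syntaxhighlight lang="typescript">
// Simplified illustration of OpenXR-style action mapping (not the actual
// C API): the app defines abstract actions and suggested bindings per
// interaction profile; the runtime resolves them for the active device.
type Action = "grab" | "teleport" | "menu";

// Suggested bindings keyed by interaction profile (paths are illustrative).
const suggestedBindings: Record<string, Record<Action, string>> = {
  "/interaction_profiles/oculus/touch_controller": {
    grab: "/user/hand/right/input/squeeze/value",
    teleport: "/user/hand/right/input/thumbstick/click",
    menu: "/user/hand/left/input/menu/click",
  },
  "/interaction_profiles/valve/index_controller": {
    grab: "/user/hand/right/input/squeeze/value",
    teleport: "/user/hand/right/input/thumbstick/click",
    menu: "/user/hand/left/input/b/click",
  },
};

// The "runtime" side: given the connected hardware, return the concrete
// input path that should drive each abstract action.
function resolveBinding(activeProfile: string, action: Action): string | undefined {
  return suggestedBindings[activeProfile]?.[action];
}

// Application code only ever asks about abstract actions, never specific buttons.
const grabPath = resolveBinding(
  "/interaction_profiles/oculus/touch_controller",
  "grab",
);
console.log(`"grab" is driven by ${grabPath}`);
</syntaxhighlight>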
===WebXR=== | |||
The [[W3C]] Immersive Web Working Group developed the [[WebXR Device API]] as the successor to [[WebVR]]; it has reached Candidate Recommendation Draft status and is implemented in Chrome/Edge 79+, Opera 66+, Samsung Internet 12+, the [[Oculus Browser]], and Safari on visionOS. The JavaScript API provides browser-based VR/AR without requiring a native application installation.<ref name="webxr">WebXR Device API. W3C. https://www.w3.org/TR/webxr/</ref><ref name="mdnwebxr">WebXR Device API - Web APIs | MDN. MDN Web Docs. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API</ref>
[[WebGL]] and [[WebGPU]] integration enables hardware-accelerated 3D rendering. Related specifications include the WebXR Augmented Reality Module for hit testing, the WebXR Layers API for performance optimization, the WebXR Gamepads Module for controller input, and the WebXR Hand Input Module for hand-tracking access.<ref name="webxr"/>
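A minimal input loop with the WebXR Device API might look like the following TypeScript sketch. It assumes WebXR type definitions (for example the `@types/webxr` package), a secure (HTTPS) context, and a user gesture to start the session; it requests an immersive VR session with optional hand tracking and logs controller grip poses and index-fingertip positions each frame.
<syntaxhighlight lang="typescript">
// Minimal WebXR input loop: enter an immersive VR session and poll
// controller grip poses and (where supported) index-fingertip joints.
// Assumes WebXR type definitions (e.g. @types/webxr) are available.
async function startXr(): Promise<void> {
  // In a real page this must run in response to a user gesture (e.g. a click).
  const xr = navigator.xr;
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    console.warn("immersive-vr not supported in this browser");
    return;
  }
  const session = await xr.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking"],
  });
  const refSpace = await session.requestReferenceSpace("local-floor");

  const onFrame = (_time: number, frame: XRFrame) => {
    for (const source of session.inputSources) {
      // Tracked controllers expose a grip space; buttons arrive via source.gamepad.
      if (source.gripSpace) {
        const pose = frame.getPose(source.gripSpace, refSpace);
        if (pose) console.log(source.handedness, "grip", pose.transform.position);
      }
      // With the Hand Input Module, articulated hands expose named joints.
      const tip = source.hand?.get("index-finger-tip");
      if (tip) {
        const jointPose = frame.getJointPose(tip, refSpace);
        if (jointPose) console.log(source.handedness, "fingertip", jointPose.transform.position);
      }
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
</syntaxhighlight>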
==Comparison of Input Methods==
{| class="wikitable"
! Input Method !! Accuracy !! Latency !! Advantages !! Disadvantages
|-
| Motion Controllers || 1-2 mm || <25 ms || Highest precision, haptic feedback, reliable tracking || Learning curve, battery management, occupies the hands
|-
| Hand Tracking || 5-10 mm || 30-50 ms || Natural interaction, no hardware needed, intuitive || Lower precision, occlusion issues, no haptic feedback
|-
| Eye Tracking || 0.5-1.0° || 5-10 ms || Fast targeting, foveated rendering, natural selection || Calibration required, privacy concerns, vergence issues
|-
| Voice Input || N/A || 100-300 ms || Hands-free, accessible || Environmental noise, privacy concerns, social awkwardness
|-
| EMG Wristband || Sub-mm || Negative (predictive) || Works in the dark, subtle input, negative latency || Requires tight fit, limited gestures, interference issues
|-
| Full Body Tracking || 2 mm || <20 ms || Complete avatar representation, immersive || Setup complexity, cost, space requirements
|}
==Current State and Future Trends== | |||
The 2024-2025 period represents an inflection point for VR/AR input. The [[Apple Vision Pro]] launched in February 2024 as the first major headset to ship without bundled controllers, validating controller-free interaction. The [[Meta Quest 3S]] (September 2024) brought high-quality hand tracking to the $300 price point, and the [[HTC Vive Focus Vision]] (September 2024) demonstrated enterprise commitment to multi-modal input by supporting controllers, hand tracking, and eye tracking simultaneously.<ref name="vivefocus"/>
===Advanced Haptics=== | |||
The next major leap in immersion will likely come from advancements in haptic feedback. The goal is to move beyond simple vibrations to provide rich, nuanced tactile information. Key areas of research and development include: | |||
* '''High-Fidelity Tactile Displays:''' Devices that can simulate a wide range of surface textures, pressures, and temperatures. This may be achieved through [[haptic glove]]s using arrays of microfluidic actuators, electrostimulation, or piezoelectric materials that deform when a current is applied.<ref name="pmc_haptics_review"/><ref name="senseglove_haptics_types"/> | |||
* '''Kinesthetic and Force Feedback:''' Efforts are underway to create more compact, affordable, and ungrounded force-feedback devices. This includes exoskeletal gloves that can apply resistive force to individual fingers and wrist-mounted devices that use gyroscopic or other principles to simulate weight and inertia. | |||
* '''Mid-Air Haptics:''' Technologies using phased arrays of ultrasonic transducers to project tactile sensations onto a user's bare hands in mid-air are maturing. This could allow users to "feel" virtual buttons and textures without wearing any device. | |||
===Neural Interfaces=== | |||
[[EMG wristband|EMG wristbands]] represent the most significant emerging input technology, with Meta planning a 2025-2027 launch alongside third-generation Ray-Ban glasses. The July 2024 Nature paper demonstrating generalizable models that work without per-user calibration removes a major commercialization barrier.<ref name="metaemg"/>
The long-term, paradigm-shifting future of input lies in [[brain-computer interface]]s (BCIs), also known as neural interfaces. These technologies aim to establish a direct communication pathway between the brain and a computer, potentially allowing users to control virtual objects or navigate interfaces through thought alone.<ref name="naimark_io"/> Companies like [[Neuralink]] are developing invasive BCIs for medical applications, which involve surgically implanted electrodes to read neural signals with high fidelity.<ref name="neuralink_homepage">Neuralink - Pioneering Brain Computer Interfaces. Neuralink. https://neuralink.com/</ref> | |||
===Convergence and Multi-Modal Input=== | |||
Enterprise haptic gloves have found viability at $5,000-7,000 price points for training applications, while Carnegie Mellon's Fluid Reality prototype promises consumer pricing of around "a few hundred dollars" if manufacturing scales.<ref name="cmugloves"/>
Eye tracking is transitioning from a premium feature to a standard one, with [[PlayStation VR2]], [[Apple Vision Pro]], and [[HTC Vive Focus Vision]] including it as core functionality rather than an add-on. [[Tobii]]'s licensing model enables rapid market expansion across platforms.<ref name="tobii"/>
The industry is converging on multi-modal input that supports the simultaneous use of controllers, hand tracking, eye tracking, and voice commands, with users switching seamlessly between methods depending on the task: controllers for gaming precision, hand tracking for social interaction, eye tracking for UI targeting, and voice for explicit commands. The most significant near-term trend is the fusion of these input streams into a single, cohesive interaction model: instead of relying on any one method, future systems will intelligently combine data from eye tracking, hand tracking, voice commands, and biometric sensors to build a more holistic, context-aware understanding of user intent.
The [[Apple Vision Pro]]'s primary interaction model is a prominent example of this trend. It uses eye tracking to determine what a user is looking at (the "target") and hand tracking to detect a simple pinch gesture as the confirmation "click."<ref name="youtube_controller_tierlist">Ranking Every VR Controller Ever Made. YouTube. https://www.youtube.com/watch?v=uk1oqcEAm6o</ref> This fusion of two separate input modalities creates an interaction that is fast, intuitive, and requires minimal physical effort. Future systems will likely expand on this, using voice commands to modify properties of the object a user is looking at, or using biometric data to adapt a virtual environment based on a user's emotional state. This multi-modal approach promises to make interaction in XR feel less like operating a computer and more like a natural extension of the user's own body and mind. | |||
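A toy version of this look-and-pinch pattern illustrates the fusion. In the TypeScript sketch below, the <code>GazeTracker</code> and <code>HandTracker</code> interfaces are assumed stand-ins for whatever SDK supplies gaze targets and hand joints; the eyes choose the target, and the rising edge of a pinch confirms the selection.
<syntaxhighlight lang="typescript">
// Look-and-pinch selection: the eyes choose the target, a pinch confirms it.
// GazeTracker and HandTracker are illustrative stand-ins for real SDK APIs.
interface GazeTracker {
  // Returns the id of the object currently under the user's gaze, if any.
  currentTargetId(): string | null;
}

interface HandTracker {
  // Distance in metres between the thumb tip and index fingertip.
  pinchDistance(): number;
}

const PINCH_THRESHOLD_M = 0.015; // ~1.5 cm counts as a pinch

class LookAndPinch {
  private wasPinching = false;

  constructor(
    private gaze: GazeTracker,
    private hand: HandTracker,
    private onSelect: (targetId: string) => void,
  ) {}

  // Call once per frame. Fires onSelect on the pinch *down* edge only,
  // using whatever the eyes were targeting at that instant.
  update(): void {
    const isPinching = this.hand.pinchDistance() < PINCH_THRESHOLD_M;
    if (isPinching && !this.wasPinching) {
      const target = this.gaze.currentTargetId();
      if (target) this.onSelect(target);
    }
    this.wasPinching = isPinching;
  }
}
</syntaxhighlight>
Sampling the gaze target at the moment the pinch begins, rather than when it ends, is what makes this style of selection feel instantaneous in practice.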
==See Also== | |||
* [[Input Devices]] | |||
* [[Controllers]] | |||
* [[Hand Tracking]] | |||
* [[Eye Tracking]] | |||
* [[Body Tracking]] | |||
* [[Brain-Computer Interface]] | |||
* [[OpenXR]] | |||
* [[Positional Tracking]] | |||
* [[6DOF]] | |||
* [[Haptic Technology]] | |||
* [[Motion Controller]] | |||
* [[Voice Command]] | |||
* [[Immersion (virtual reality)]] | |||
* [[Foveated Rendering]] | |||
* [[Locomotion (virtual reality)]] | |||
==References== | |||
<references> | |||
<ref name="forwork_meta_guide">Virtual Reality Guide. Meta for Work. https://forwork.meta.com/blog/virtual-reality-guide/</ref> | |||
<ref name="naimark_io">VR / AR Fundamentals - 4) Input & Interactivity. Michael Naimark, March 2, 2018. https://michaelnaimark.medium.com/vr-ar-fundamentals-4-input-interactivity-8d6d066c954e</ref> | |||
<ref name="gitbook">Input Method and Interaction Design. The Design of Virtual and Augmented Reality. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design</ref> | |||
<ref name="fiveable">Input methods and interaction paradigms in VR/AR. Fiveable. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi</ref> | |||
<ref name="pubmedvive">The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC5439658/</ref> | |||
<ref name="sagejournal">The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. Sage Journals. https://journals.sagepub.com/doi/full/10.1177/2041669517708205</ref> | |||
<ref name="vrshistory">History Of Virtual Reality. Virtual Reality Society. https://www.vrs.org.uk/virtual-reality/history.html</ref> | |||
<ref name="hqsoftware_history">A Brief History of AR and VR: Virtual Reality Timeline. HQSoftware. https://hqsoftwarelab.com/blog/the-history-of-ar-and-vr-a-timeline-of-notable-milestones/</ref> | |||
<ref name="coursera_history_vr">History of Virtual Reality: From the 1800s to the 21st Century. Coursera, July 12, 2023. https://www.coursera.org/articles/history-of-virtual-reality</ref> | |||
<ref name="ieeespectrum">The Tremendous VR and CG Systems-of the 1960s. IEEE Spectrum. https://spectrum.ieee.org/sketchpad</ref> | |||
<ref name="sutherland_ultimate_display">The Ultimate Display. I. E. Sutherland, 1965. Proceedings of IFIP Congress 1965, Volume 2, pages 506-508.</ref> | |||
<ref name="sutherland_hmd_paper">A head-mounted three dimensional display. I. E. Sutherland, 1968. Proceedings of the Fall Joint Computer Conference, Volume 33, pages 757-764.</ref> | |||
<ref name="wikivr">Virtual reality. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality</ref> | |||
<ref name="wikiar">Augmented reality. Wikipedia. https://en.wikipedia.org/wiki/Augmented_reality</ref> | |||
<ref name="wikivrp">VPL Research. Wikipedia. https://en.wikipedia.org/wiki/VPL_Research</ref> | |||
<ref name="lumen_and_forge_history">The History of Virtual Reality. Lumen & Forge. https://lumenandforge.com/the-history-of-virtual-reality</ref> | |||
<ref name="tomsguide">Oculus Touch Controllers Are A Lighter and Better Touch Than HTC Vive. Tom's Guide. https://www.tomsguide.com/us/oculus-touch-controllers,review-4072.html</ref> | |||
<ref name="valveindex">Controllers - Valve Index® - Upgrade your experience. Valve Corporation. https://www.valvesoftware.com/en/index/controllers</ref> | |||
<ref name="wikiindex">Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref> | |||
<ref name="wikileap">Leap Motion. Wikipedia. https://en.wikipedia.org/wiki/Leap_Motion</ref> | |||
<ref name="leapmedium">How Does the Leap Motion Controller Work? Medium. https://medium.com/@LeapMotion/how-does-the-leap-motion-controller-work-9503124bfa04</ref> | |||
<ref name="metahand">Hand tracking technology & haptic feedback. Meta for Work. https://forwork.meta.com/blog/hand-tracking-technology-and-haptic-feedback-mr/</ref> | |||
<ref name="applevp">Introducing Apple Vision Pro: Apple's first spatial computer. Apple. https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/</ref> | |||
<ref name="quest3">Meta Quest 3 - VR & AR Wiki. https://vrarwiki.com/wiki/Meta_Quest_3</ref> | |||
<ref name="wikitouchpro">Oculus Touch. Wikipedia. https://en.wikipedia.org/wiki/Oculus_Touch</ref> | |||
<ref name="psvr2blog">PlayStation VR2 and PlayStation VR2 Sense controller. PlayStation Blog. https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/</ref> | |||
<ref name="wikipsvr2">PlayStation VR2. Wikipedia. https://en.wikipedia.org/wiki/PlayStation_VR2</ref> | |||
<ref name="wikihive">HTC Vive. Wikipedia. https://en.wikipedia.org/wiki/HTC_Vive</ref> | |||
<ref name="google_dof">Degrees of freedom. Google VR. https://developers.google.com/vr/discover/degrees-of-freedom</ref> | |||
<ref name="strivr_dof">6DoF vs 3DoF: Degrees of freedom in VR. Strivr. https://www.strivr.com/blog/6dof-vs-3dof-understanding-importance</ref> | |||
<ref name="varjo_dof">Degrees of freedom in VR/XR. Varjo. https://varjo.com/learning-hub/degrees-of-freedom-in-vr-xr/</ref> | |||
<ref name="wikipedia_vr_methods">Virtual reality - Forms and methods. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality</ref> | |||
<ref name="unity_tracking">What is Inside-Out/Outside-In Tracking. Unity. https://unity.com/glossary/Insideout-outsidein-tracking</ref> | |||
<ref name="wikipedia_valve_index">Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref> | |||
<ref name="pimax_tracking">Pose Tracking Methods: Outside-in VS Inside-out Tracking in VR. Pimax. https://pimax.com/blogs/blogs/pose-tracking-methods-outside-in-vs-inside-out-tracking-in-vr</ref> | |||
<ref name="zilliz_tracking">What types of tracking systems are used in VR (for example inside-out vs. outside-in)? Zilliz. https://zilliz.com/ai-faq/what-types-of-tracking-systems-are-used-in-vr-eg-insideout-vs-outsidein</ref> | |||
<ref name="meta_controllers_pro">Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/help/quest/667591367977925/</ref> | |||
<ref name="milvus_motion_controllers">What role do motion controllers play in VR, and how do you support them? Milvus. https://milvus.io/ai-quick-reference/what-role-do-motion-controllers-play-in-vr-and-how-do-you-support-them</ref> | |||
<ref name="synergyxr_controllers_review">VR Controllers: A Comprehensive Review. SynergyXR. https://synergyxr.com/resources/learn/blogs/vr-controllers-a-comprehensive-review/</ref> | |||
<ref name="wevolver_quest2">Meta Quest 2. Wevolver. https://www.wevolver.com/specs/meta-quest-2</ref> | |||
<ref name="wikipedia_quest3">Meta Quest 3. Wikipedia. https://en.wikipedia.org/wiki/Meta_Quest_3</ref> | |||
<ref name="meta_touchpro_accessories">Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/quest/accessories/quest-touch-pro-controllers-and-charging-dock/</ref> | |||
<ref name="reddit_controller_weights">Controller weights comparison... How do you have the stamina to swing fast? Reddit. https://www.reddit.com/r/beatsaber/comments/1b0zmo9/controller_weights_comparisonhow_do_you_have_the/</ref> | |||
<ref name="reddit_quest2_weight">The Quest Touch Pro Controllers weigh ~164g each. Reddit. https://www.reddit.com/r/oculus/comments/y545yh/the_quest_touch_pro_controllers_weigh_164g_each/</ref> | |||
<ref name="gsmarena_quest3_review">Meta Quest 3 Review. GSMArena. https://www.gsmarena.com/meta_quest_3_review-news-60375.php</ref> | |||
<ref name="meta_touchplus_specs">Meta Quest Touch Plus Controller. Meta. https://www.meta.com/quest/accessories/quest-touch-plus-controller/</ref> | |||
<ref name="psvr2_specs_se">PlayStation VR2 tech specs. PlayStation. https://www.playstation.com/en-se/ps-vr2/ps-vr2-tech-specs/</ref> | |||
<ref name="giessen_quest2_specs">Meta Quest 2 Specifications. University of Giessen. https://www.uni-giessen.de/de/studium/lehre/projekte/nidit/goals/quest2/specifications_quest-2.pdf</ref> | |||
<ref name="komete_touchpro">Meta Quest Touch Pro Controllers. Komete XR. https://komete-xr.com/en/products/meta-quest-touch-pro-controllers</ref> | |||
<ref name="meta_quest3_specs">Meta Quest 3: Tech Specs. Meta. https://www.meta.com/quest/quest-3/</ref> | |||
<ref name="metahandstracking">All Hands on Deck: Crank up Hand Responsiveness. Meta for Developers. https://developers.meta.com/horizon/blog/hand-tracking-22-response-time-meta-quest-developers/</ref> | |||
<ref name="pubmedhandtrack">A methodological framework to assess the accuracy of virtual reality hand-tracking systems. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10830632/</ref> | |||
<ref name="ultraleapdocs">Ultraleap Hand Tracking Overview. Ultraleap Documentation. https://docs.ultraleap.com/hand-tracking/</ref> | |||
<ref name="mediumhand">Hand Detection Tracking in Python using OpenCV and MediaPipe. Medium. https://gautamaditee.medium.com/hand-recognition-using-opencv-a7b109941c88</ref> | |||
<ref name="pubmedhandgesture">Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC8321080/</ref> | |||
<ref name="mixed_news_hand_tracking">I tested Quest 3's hand tracking with a complete VR novice. MIXED News. https://mixed-news.com/en/meta-quest-3-hand-tracking-experiment/</ref> | |||
<ref name="haptx">Home | HaptX. https://haptx.com/</ref> | |||
<ref name="bhaptics">Buy next generation full body haptic suit - bHaptics TactSuit. bHaptics. https://www.bhaptics.com/en/tactsuit/tactglove-dk2/</ref> | |||
<ref name="senseglove">Find out about our New Nova 2 Glove. SenseGlove. https://www.senseglove.com/product/nova-2/</ref> | |||
<ref name="cmugloves">Fluid Reality Haptic Gloves Bring Ultra-Sensitive Touch to VR. Carnegie Mellon University. https://www.cs.cmu.edu/news/2024/haptic-gloves</ref> | |||
<ref name="imotionseye">What is VR Eye Tracking? iMotions. https://imotions.com/blog/learning/best-practice/vr-eye-tracking/</ref> | |||
<ref name="meta_eye_tracking">Learn about Eye Tracking on Meta Quest Pro. Meta. https://www.meta.com/help/quest/8107387169303764/</ref> | |||
<ref name="frontiers_eye_tracking">Eye Tracking in Virtual Reality: a Broad Review of Applications and Challenges. Frontiers in Virtual Reality, 2024. https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2024.1343773/full</ref> | |||
<ref name="tobii_vr_component">Eye tracking in VR – A vital component. Tobii, February 16, 2024. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component</ref> | |||
<ref name="tobii">Eye tracking in VR – A vital component. Tobii. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component</ref> | |||
<ref name="pimax">Eye Tracking on VR (Virtual Reality) headsets. Pimax. https://pimax.com/blogs/blogs/eye-tracking-on-vr-virtual-reality-headsets</ref> | |||
<ref name="applevpeye">How You Control Apple Vision Pro With Your Eyes & Hands. UploadVR. https://www.uploadvr.com/apple-vision-pro-gesture-controls/</ref> | |||
<ref name="pubmedeye">Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10136368/</ref> | |||
<ref name="vivefocus">VIVE Focus Vision - New Standalone PC VR Headset. VIVE United States. https://www.vive.com/us/product/vive-focus-vision/overview/</ref> | |||
<ref name="meta_haptics_overview">Haptics. Meta Horizon OS Developers. https://developers.meta.com/horizon/design/haptics-overview/</ref> | |||
<ref name="arvrhub_haptics">How does Haptic Feedback Work in VR? ARVR Hub. https://arvrhub.com/haptic-feedback/</ref> | |||
<ref name="researchgate_haptics_pdf">Haptic Feedback for Virtual Reality. G. C. Burdea, 1999. Proceedings of the International Workshop on Virtual Reality and Prototyping. https://www.researchgate.net/publication/2356993_Haptic_Feedback_for_Virtual_Reality</ref> | |||
<ref name="xrtoday_haptx_review">HaptX Gloves G1 Review: Getting in Touch with VR. XR Today. https://www.xrtoday.com/reviews/haptx-gloves-g1-review-getting-in-touch-with-vr/</ref> | |||
<ref name="pmc_haptics_review">Haptic Sensing and Feedback Techniques toward Virtual Reality. Advanced Intelligent Systems, 2024. https://onlinelibrary.wiley.com/doi/10.1002/aisy.202300645</ref> | |||
<ref name="simx_haptics">What Is Haptic Feedback? | Virtual Reality Medical Simulation. SimX. https://www.simxvr.com/glossary/haptic-feedback-definition/</ref> | |||
<ref name="senseglove_haptics_types">The Different Types of Haptic Feedback. SenseGlove, May 15, 2023. https://www.senseglove.com/what-are-the-different-types-of-haptic-feedback/</ref> | |||
<ref name="msvoice">Voice input - Mixed Reality. Microsoft Learn. https://learn.microsoft.com/en-us/windows/mixed-reality/design/voice-input</ref> | |||
<ref name="questvoice">Meta Quest Voice Commands: The Ultimate Guide. AR/VR Tips. https://arvrtips.com/meta-quest-voice-commands/</ref> | |||
<ref name="hololensvoice">Use your voice to operate HoloLens. Microsoft Learn. https://learn.microsoft.com/en-us/hololens/hololens-cortana</ref> | |||
<ref name="cirrus">AR/VR Headsets. Cirrus Logic. https://www.cirrus.com/applications/wearables/ar-vr-headsets/</ref> | |||
<ref name="ucalgary_voice_paper">Voice-Augmented Virtual Reality Interface for Serious Games. University of Calgary, 2024. https://cspages.ucalgary.ca/~richard.zhao1/publications/2024cog-voice_augmented_VR_interface.pdf</ref> | |||
<ref name="meta_voice_sdk_overview">Voice SDK Overview. Meta for Developers. https://developers.meta.com/horizon/documentation/unity/voice-sdk-overview/</ref> | |||
<ref name="usc_markerless_paper">Markerless Full Body Tracking: Depth-Sensing Technology within Virtual Environments. USC Institute for Creative Technologies. https://ict.usc.edu/pubs/Markerless%20Full%20Body%20Tracking-%20Depth-Sensing%20Technology%20within%20Virtual%20Environments.pdf</ref> | |||
<ref name="vr_collective_markerless">How Markerless Mocap is Transforming Location-Based VR Experiences. The VR Collective. https://thevrcollective.com/how-markerless-mocap-is-transforming-location-based-vr-experiences/</ref> | |||
<ref name="vivetracker">VIVE Tracker (3.0). VIVE United States. https://www.vive.com/us/accessory/tracker3/</ref> | |||
<ref name="viveultimate">VIVE Ultimate Tracker - Full-Body Tracking. VIVE. https://www.vive.com/us/accessory/vive-ultimate-tracker/</ref> | |||
<ref name="slimevr">SlimeVR Full-Body Trackers. SlimeVR Official. https://slimevr.dev/</ref> | |||
<ref name="haritorax">HaritoraX 2 - Fully wireless full-body tracking device. Shiftall. https://en.shiftall.net/products/haritorax2</ref> | |||
<ref name="acmtracking">Comparing the Accuracy and Precision of SteamVR Tracking 2.0 and Oculus Quest 2. ACM Digital Library. https://dl.acm.org/doi/fullHtml/10.1145/3463914.3463921</ref> | |||
<ref name="vrs_input_devices">Virtual Reality Input Devices. Virtual Reality Society. https://www.vrs.org.uk/virtual-reality-gear/input-devices.html</ref> | |||
<ref name="unboundxr_treadmills">Compare the Omni Directional Treadmills. Unbound XR. https://unboundxr.com/blogs/compare-vr-treadmills</ref> | |||
<ref name="mdpi_locomotion">VR Locomotion in the New Era of VR: A Study of Techniques and Comparative Review. Multimodal Technologies and Interaction, 2019. https://www.mdpi.com/2414-4088/3/2/24</ref> | |||
<ref name="frontiersbci">Editorial: Brain-Computer Interfaces and Augmented/Virtual Reality. Frontiers in Human Neuroscience. https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2020.00144/full</ref> | |||
<ref name="ctrlabs">Facebook agrees to acquire brain-computing start-up CTRL-labs. CNBC. https://www.cnbc.com/2019/09/23/facebook-announces-acquisition-of-brain-computing-start-up-ctrl-labs.html</ref> | |||
<ref name="metaemg">Meta Details EMG Wristband Gestures. UploadVR. https://www.uploadvr.com/meta-semg-wristband-gestures-nature-paper/</ref> | |||
<ref name="zuckerbergwristband">Zuckerberg: Neural Wristband To Ship In 'Next Few Years'. UploadVR. https://www.uploadvr.com/zuckerberg-neural-wristband-will-ship-in-the-next-few-years/</ref> | |||
<ref name="valvebci">Valve, OpenBCI & Tobii to Launch VR Brain-computer Interface 'Galea'. Road to VR. https://www.roadtovr.com/valve-openbci-immersive-vr-games/</ref> | |||
<ref name="emotiv">How BCI can elevate the AR/VR experience. EMOTIV. https://www.emotiv.com/blogs/news/bci-applications-for-vr-ar</ref> | |||
<ref name="synchron">Brain Implant Hooked Up to Control VR Headset. Futurism. https://futurism.com/neoscope/synchron-brain-computer-interface-control-vr-headset</ref> | |||
<ref name="thrustmaster_hotas_one">T.FLIGHT HOTAS ONE. Thrustmaster. https://eshop.thrustmaster.com/en_us/t-flight-hotas-one.html</ref> | |||
<ref name="walmart_warthog">Thrustmaster HOTAS Warthog Flight Stick and Throttle for PC, VR. Walmart. https://www.walmart.com/ip/Thrustmaster-HOTAS-Warthog-Flight-Stick-and-Throttle-for-PC-VR/15268503</ref> | |||
<ref name="thrustmaster_racing">Racing. Thrustmaster. https://www.thrustmaster.com/en-us/universe/racing/</ref> | |||
<ref name="moza_racing">MOZA Racing Global. MOZA Racing. https://mozaracing.com/</ref> | |||
<ref name="nextlevel_racing">Racing Simulator Cockpits. Next Level Racing. https://nextlevelracing.com/racing-cockpits/</ref> | |||
<ref name="medium_k830">The Logitech K830 Keyboard And Typing In VR. Medium, August 25, 2021. https://medium.com/xrlo-extended-reality-lowdown/the-logitech-k830-keyboard-and-typing-in-vr-556e2740c48d</ref> | |||
<ref name="reddit_kb_mouse_vr">Can I use mouse and keyboard with the vr headset on pc for vr games instead of controllers? Reddit. https://www.reddit.com/r/oculus/comments/10946c5/can_i_use_mouse_and_keyboard_with_the_vr_headset/</ref> | |||
<ref name="cavendish_gaming">The Evolution of VR and AR in Gaming: A Historical Perspective. Cavendish Professionals. https://www.cavendishprofessionals.com/the-evolution-of-vr-and-ar-in-gaming-a-historical-perspective/</ref> | |||
<ref name="steam_community_ed">How to play in VR with Mouse and keyboard? Steam Community. https://steamcommunity.com/app/359320/discussions/0/5311389137862908260/?l=tchinese</ref> | |||
<ref name="ijrpr_ar_vr">A Comprehensive Review of Augmented Reality and Virtual Reality. International Journal of Research and Presentations, 2023. https://ijrpr.com/uploads/V4ISSUE4/IJRPR12239.pdf</ref> | |||
<ref name="thors_vr_training">The Role of Virtual Reality in Technical Training. THORS. https://thors.com/the-role-of-virtual-reality-in-technical-training/</ref> | |||
<ref name="ryan_funding_history">The History and Applications of Virtual Reality Headsets. Ryan. https://funding.ryan.com/blog/business-strategy/applications-and-history-of-vr-headsets/</ref> | |||
<ref name="mdpi_applications">Applications of Virtual Reality Simulations and Machine. MDPI. https://www.mdpi.com/2673-4591/100/1/19</ref> | |||
<ref name="synergyxr_education">VR Simulations in Education: Transforming Learning. SynergyXR. https://synergyxr.com/resources/learn/blogs/vr-simulations-in-education/</ref> | |||
<ref name="searchmyexpert_vr_art">VR Art and Creativity: Unleashing the Power of Virtual Reality. SearchMyExpert. https://www.searchmyexpert.com/resources/ar-vr-development/vr-art-creativity</ref> | |||
<ref name="wikipedia_vr_applications">Virtual reality applications. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality_applications</ref> | |||
<ref name="dla_paper_vr_design">The Use of Immersive Virtual Reality as a Design Input Tool. Digital Landscape Architecture, 2017. https://gispoint.de/fileadmin/user_upload/paper_gis_open/DLA_2017/537629026.pdf</ref> | |||
<ref name="ixrlabs_haptics_education">Why is Haptic Feedback important for VR Education? iXR Labs. https://www.ixrlabs.com/blog/why-haptic-feedback-important-for-vr-education/</ref> | |||
<ref name="viroreact_design_principles">VR Design Principles. ViroReact. https://viro-community.readme.io/docs/vr-design-principles</ref> | |||
<ref name="medium_vr_challenges">The Biggest Challenges in AR/VR Design and How to Overcome Them. Medium. https://medium.com/cva-design/the-biggest-challenges-in-ar-vr-design-and-how-to-overcome-them-25210d435a79</ref> | |||
<ref name="appypie_vr_limitations">The Limitations of Virtual Reality. Appy Pie. https://www.appypie.com/blog/virtual-reality-limitations</ref> | |||
<ref name="flatirons_haptics_disadvantages">Disadvantages of Haptic Technology. Flatirons. https://flatirons.com/blog/disadvantages-of-haptic-technology/</ref> | |||
<ref name="boreas_bad_haptics">How Bad Haptic Feedback Can Ruin the User Experience. Boreas Technologies. https://pages.boreas.ca/blog/how-bad-haptic-feedback-can-ruin-the-user-experience</ref> | |||
<ref name="vcd_vr_challenges">Exploring the Challenges and Limitations of Virtual Reality. VCD Social Club. https://vcdsocialclub.co.uk/exploring-the-challenges-and-limitations-of-virtual-reality</ref> | |||
<ref name="ramotion_vr_ux">VR in UX Design: Basic Guidelines. Ramotion. https://www.ramotion.com/blog/vr-in-ux-design/</ref> | |||
<ref name="openxr">OpenXR - High-performance access to AR and VR. Khronos Group. https://www.khronos.org/openxr/</ref> | |||
<ref name="wikiopenxr">OpenXR. Wikipedia. https://en.wikipedia.org/wiki/OpenXR</ref> | |||
<ref name="webxr">WebXR Device API. W3C. https://www.w3.org/TR/webxr/</ref> | |||
<ref name="mdnwebxr">WebXR Device API - Web APIs | MDN. MDN Web Docs. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API</ref> | |||
<ref name="neuralink_homepage">Neuralink - Pioneering Brain Computer Interfaces. Neuralink. https://neuralink.com/</ref> | |||
<ref name="youtube_controller_tierlist">Ranking Every VR Controller Ever Made. YouTube. https://www.youtube.com/watch?v=uk1oqcEAm6o</ref> | |||
</references> | |||
[[Category:Terms]]
[[Category:Technical Terms]]
[[Category:Input Methods]]
[[Category:VR Technology]]
[[Category:AR Technology]]