{{stub}}
{{See also|Input Devices}}
[[Input]] is the method of control and interaction in [[VR|Virtual Reality]] and [[AR|Augmented Reality]] environments. Input methods range from traditional devices like [[mouse]], [[keyboard]], and [[gamepad]] to specialized [[Input Devices|VR/AR input devices]] including [[positional tracking|motion-tracked]] [[controllers]] like [[Touch]] and [[SteamVR Controllers]], camera-based [[:Category:Hands/Fingers Tracking|hand]] and [[:Category:Hands/Fingers Tracking|finger tracking]], [[eye tracking]], [[voice input]], [[:Category:Body Tracking|full body tracking]], and emerging [[brain-computer interface|neural interfaces]]. Modern VR/AR systems typically support multiple input modalities simultaneously, allowing users to seamlessly switch between controllers, hand gestures, gaze-based selection, and voice commands depending on the task and context.<ref name="gitbook">Input Method and Interaction Design. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design</ref>
==Definition and Technical Overview==
Input in immersive environments refers to mechanisms that capture user actions and translate them into commands within virtual or augmented spaces. Unlike traditional computing interfaces limited to keyboards and mice, VR/AR input systems capture [[6DOF|six degrees of freedom]] for position and orientation tracking, hand and finger poses with 26+ joint positions, [[eye tracking|eye gaze vectors]] with sub-degree precision, and [[voice input|voice commands]] processed through [[natural language processing]].<ref name="fiveable">Input methods and interaction paradigms in VR/AR. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi</ref>
The technical architecture comprises three layers: hardware devices, including [[sensors]], [[cameras]], [[controllers]], and [[tracking systems]], that capture user actions; transfer functions that convert human actions into digital input through algorithms and [[machine learning]] models; and tracking systems that measure spatial position and are characterized by metrics such as [[degrees of freedom]], accuracy (typically 1-5 mm for commercial systems), precision, update rate (60-120 Hz), and [[latency]] (targeted below 20 ms to prevent [[motion sickness]]).<ref name="pubmedvive">The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. https://pmc.ncbi.nlm.nih.gov/articles/PMC5439658/</ref>
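For illustration, the latency and update-rate figures above can be made concrete with a minimal sketch; the names and thresholds below are assumptions for illustration, not taken from any particular SDK.
<syntaxhighlight lang="python">
import time
from dataclasses import dataclass

LATENCY_BUDGET_S = 0.020    # ~20 ms motion-to-photon target cited above
UPDATE_INTERVAL_S = 1 / 90  # one sample at an assumed 90 Hz update rate

@dataclass
class PoseSample:
    """One 6DOF tracking sample: position (metres), orientation (quaternion), capture time."""
    position: tuple       # (x, y, z)
    orientation: tuple    # (w, x, y, z)
    captured_at: float    # seconds, on the same clock as time.monotonic()

def within_latency_budget(sample: PoseSample) -> bool:
    """True if the sample is fresh enough to render without perceptible lag."""
    return (time.monotonic() - sample.captured_at) <= LATENCY_BUDGET_S
</syntaxhighlight>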
Modern systems employ [[sensor fusion]], combining multiple data sources for robust tracking. A typical VR controller integrates infrared LEDs for [[optical tracking]], [[IMU|inertial measurement units]] with [[accelerometer|accelerometers]] and [[gyroscope|gyroscopes]] for motion sensing, [[capacitive sensor|capacitive sensors]] for touch detection, and force sensors for grip pressure.<ref name="sagejournal">The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. https://journals.sagepub.com/doi/full/10.1177/2041669517708205</ref>
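As a rough illustration of the fusion principle, the sketch below blends a high-rate gyroscope prediction with an intermittent absolute optical measurement using a complementary filter; the blend factor and function names are assumptions, not any vendor's algorithm.
<syntaxhighlight lang="python">
ALPHA = 0.98  # assumed blend factor: trust the IMU short-term, the optical fix long-term

def fuse_yaw(prev_yaw_deg, gyro_rate_dps, dt_s, optical_yaw_deg=None):
    """Complementary filter: integrate the gyroscope every step, and pull the
    estimate toward an absolute optical measurement whenever one is available
    (e.g. the controller's LEDs are in camera view). When occluded,
    optical_yaw_deg is None and the estimate coasts on the IMU alone,
    slowly accumulating drift."""
    predicted = prev_yaw_deg + gyro_rate_dps * dt_s   # dead reckoning from the gyro
    if optical_yaw_deg is None:
        return predicted
    return ALPHA * predicted + (1.0 - ALPHA) * optical_yaw_deg
</syntaxhighlight>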
==Historical Evolution==
===Early Pioneers (1960s-1980s)===
[[Ivan Sutherland]] created the first interactive computer graphics input system in 1963 with [[Sketchpad]], using a light pen for real-time line drawings on a TX-2 computer at [[MIT]]. His 1968 [[Sword of Damocles]] [[head-mounted display]] at [[Harvard]] pioneered position tracking via mechanical linkages suspended from the ceiling.<ref name="vrshistory">History Of Virtual Reality. https://www.vrs.org.uk/virtual-reality/history.html</ref><ref name="ieeespectrum">The Tremendous VR and CG Systems—of the 1960s. https://spectrum.ieee.org/sketchpad</ref>
The 1970s brought [[Myron Krueger]]'s VIDEOPLACE system at the University of Wisconsin-Madison, which introduced [[computer vision]] for body tracking, using floor sensors and cameras to detect silhouettes and enable interaction without worn devices.<ref name="wikivr">Virtual reality - Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality</ref>
[[VPL Research]], founded by [[Jaron Lanier]] in 1984, commercialized the first VR input devices. The [[DataGlove]] (1987) used [[fiber optic]] sensors to detect finger flexure, tracking 256 positions per finger at $9,000 per glove. [[NASA]] adopted DataGloves for astronaut training simulations.<ref name="wikivrp">VPL Research - Wikipedia. https://en.wikipedia.org/wiki/VPL_Research</ref><ref name="coursera">A Brief History of Virtual Reality: Major Events and Ideas. https://www.coursera.org/articles/history-of-virtual-reality</ref>
[[Nintendo]] brought gesture input to consumers with the [[Power Glove]] (1989), a simplified DataGlove derivative selling for $100. Despite nearly one million units sold, poor precision (only 4 finger positions versus DataGlove's 256) led to commercial failure, demonstrating that consumers demanded both affordability and accuracy.<ref name="aimagazine">The 1960s to the VR revolution: The history of VR headsets. https://aimagazine.com/articles/the-1960s-to-the-vr-revolution-the-history-of-vr-headsets</ref>
===Modern VR Era (2012-Present)===
[[Palmer Luckey]]'s [[Oculus Rift]] Kickstarter campaign in 2012 raised $2.4 million and catalyzed the modern VR era. [[HTC Vive]] (2016) introduced [[Lighthouse tracking]] using [[base station|base stations]] that emit infrared laser sweeps, with photosensors on the headset and controllers calculating position from laser timing. This [[outside-in tracking]] approach achieved sub-millimeter accuracy across room-scale spaces.<ref name="pubmedvive"/>
[[Oculus Touch]] controllers (2016) pioneered [[capacitive sensor|capacitive touch sensors]] for finger presence detection, allowing natural hand gestures like pointing or thumbs-up without full hand tracking. The crescent-shaped ergonomic design with analog sticks, face buttons, triggers, and grip buttons became the template for modern VR controllers.<ref name="tomsguide">Oculus Touch Controllers Are A Lighter and Better Touch Than HTC Vive. https://www.tomsguide.com/us/oculus-touch-controllers,review-4072.html</ref>
[[Valve Index]] Controllers (2019) featured 87 sensors per controller tracking hand position, finger positions, motion, and grip pressure. The adjustable hand straps allowed users to completely open their hands without dropping the controllers, enabling natural grabbing motions.<ref name="valveindex">Controllers - Valve Index. https://www.valvesoftware.com/en/index/controllers</ref><ref name="wikiindex">Valve Index - Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref>
===Controller-Free Era (2013-Present)===
[[Leap Motion]] Controller (2013) brought camera-based [[hand tracking]] to consumers as a USB peripheral with two monochromatic infrared cameras tracking hands at 200 frames per second within a 3-foot hemispherical area.<ref name="wikileap">Leap Motion - Wikipedia. https://en.wikipedia.org/wiki/Leap_Motion</ref><ref name="leapmedium">How Does the Leap Motion Controller Work? https://medium.com/@LeapMotion/how-does-the-leap-motion-controller-work-9503124bfa04</ref>
[[Meta Quest]] introduced hand tracking via software update in December 2019, marking the first mainstream standalone headset offering controller-free input. The February 2023 "Direct Touch" update allowed tapping virtual interfaces directly with fingers, making hand tracking practical for UI interaction.<ref name="metahand">Hand tracking technology & haptic feedback. https://forwork.meta.com/blog/hand-tracking-technology-and-haptic-feedback-mr/</ref>
[[Apple Vision Pro]] (February 2024) launched as the first major headset without bundled controllers, validating controller-free as a primary interaction paradigm. The system combines high-precision [[eye tracking]] for targeting with pinch gestures for confirmation, processed by the dedicated R1 chip with 12ms latency.<ref name="applevp">Introducing Apple Vision Pro: Apple's first spatial computer. https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/</ref>
==Motion-Tracked Controllers==
===Meta Quest Touch Controllers===
The [[Meta Quest]] ecosystem features multiple controller generations. [[Touch Plus]] controllers (2023) for [[Quest 3]] eliminated the tracking ring, placing infrared LEDs directly on the controller face. [[Hybrid tracking]] combines optical LED detection when in camera view with [[IMU]] motion sensing and AI-enhanced hand tracking fusion when occluded. [[TruTouch]] variable haptics provide realistic sensations from subtle taps to heavy impacts.<ref name="quest3">Meta Quest 3 - VR & AR Wiki. https://vrarwiki.com/wiki/Meta_Quest_3</ref>
[[Touch Pro]] controllers (2022) for [[Quest Pro]] achieved self-tracking with onboard cameras and a [[Qualcomm Snapdragon]] 662 processor per controller. This eliminates dependence on headset line-of-sight, enabling reliable tracking when controllers are behind the user. The pressure sensor enables pinch detection and stylus tip capability for precision drawing.<ref name="wikitouchpro">Oculus Touch - Wikipedia. https://en.wikipedia.org/wiki/Oculus_Touch</ref><ref name="metahand"/>
===Valve Index Controllers===
[[Valve Index]] Controllers demonstrate premium capability with 87 sensors including capacitive sensors detecting each finger's position, analog pressure sensing on grip measuring squeeze force from 0-100%, and force-sensitive triggers. The 1100mAh battery provides 7+ hours with USB-C fast charging.<ref name="valveindex"/>
[[Lighthouse 2.0]] tracking achieves submillimeter positional accuracy by detecting precise timing of laser sweeps from [[base station|base stations]]. Each base station emits horizontal and vertical infrared laser planes at known rotation rates. When lasers hit controller sensors, the device calculates exact 3D position from sweep timing. Base stations support tracking volumes up to 33 feet by 33 feet.<ref name="wikiindex"/>
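The timing-to-angle step can be sketched as follows. The sweep frequency and helper names are illustrative assumptions, and real systems combine many sensor hits with calibration data, but the core relation is simply rotation rate multiplied by the time elapsed since the sync reference.
<syntaxhighlight lang="python">
import math

SWEEP_HZ = 60.0  # assumed rotor frequency; real base stations sweep at a fixed, known rate

def sweep_angle_rad(t_sync_s, t_hit_s):
    """Angle swept by the laser plane between the sync reference and the instant
    it crossed the photodiode, recovered purely from timing."""
    return 2.0 * math.pi * SWEEP_HZ * (t_hit_s - t_sync_s)

# A horizontal and a vertical sweep angle define a ray from the base station through
# the sensor; intersecting rays from a second station, or combining many sensors of
# known layout on one device, pins down the 3D position.
angle = sweep_angle_rad(0.0, 0.00139)  # ~0.52 rad (about 30 degrees)
</syntaxhighlight>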
The adjustable hand strap allows users to completely open their hands during gameplay, enabling natural throwing, catching, and two-handed weapon handling. This makes Index Controllers a favorite among VR enthusiasts despite the $279 price and the requirement for external base stations.<ref name="valveindex"/>
===PlayStation VR2 Sense Controllers===
[[PlayStation VR2]] [[Sense controllers]] adapted PlayStation 5 [[DualSense]] technology for VR with [[adaptive triggers]] featuring variable resistance. The R2 and L2 triggers simulate the tension of drawing a bowstring, the resistance of pulling a trigger, or the pressure of squeezing a brake. Dedicated [[haptic actuator|haptic actuators]] deliver tailored sensations such as the impact of raindrops, the texture of surfaces, and the recoil of weapons.<ref name="psvr2blog">PlayStation VR2 and PlayStation VR2 Sense controller. https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/</ref>
[[Inside-out tracking]] via four cameras on the PSVR2 headset captures LED tracking rings, with 6-axis motion sensing providing continuous updates. Sony announced hand tracking support at SIGGRAPH 2024, positioning PSVR2 as the first PlayStation system offering controller-free gameplay.<ref name="wikipsvr2">PlayStation VR2 - Wikipedia. https://en.wikipedia.org/wiki/PlayStation_VR2</ref>
===Other Controller Systems===
[[HTC Vive]] controllers evolved through multiple generations. Original Vive wand controllers (2016) featured 24 sensors with circular trackpads tracked by Lighthouse 1.0. [[Vive Pro]] controllers (2018) added Lighthouse 2.0 compatibility for 10-meter tracking volumes. [[Cosmos]] controllers (2019) shifted to [[inside-out tracking]] with thumbsticks and face buttons.<ref name="wikihive">HTC Vive - Wikipedia. https://en.wikipedia.org/wiki/HTC_Vive</ref>
[[Windows Mixed Reality]] controllers (2017) established Microsoft's specification for OEM partners including [[Acer]], [[HP]], [[Lenovo]], [[Samsung]], [[Dell]], and [[Asus]]. The design combined Vive-style circular touchpads with Touch-style thumbsticks, tracked by visible-light LEDs on circular rings.<ref name="wikihive"/>
==Hand and Finger Tracking==
===Camera-Based Vision Systems===
Modern hand tracking relies on [[computer vision]] algorithms processing camera feeds in real-time. [[Meta Quest]] hand tracking uses headset cameras with [[machine learning]] models trained on millions of hand images to generate 26-point skeletal hand models at 30-90Hz. The Hands 2.2 update delivered 40% latency reduction through optimized [[neural networks]].<ref name="metahandstracking">All Hands on Deck: Crank up Hand Responsiveness. https://developers.meta.com/horizon/blog/hand-tracking-22-response-time-meta-quest-developers/</ref><ref name="pubmedhandtrack">A methodological framework to assess the accuracy of virtual reality hand-tracking systems. https://pmc.ncbi.nlm.nih.gov/articles/PMC10830632/</ref>
[[Ultraleap]] (formerly [[Leap Motion]]) uses two infrared cameras and infrared LEDs illuminating hands with near-infrared light. The [[computer vision]] pipeline employs a [[Single Shot Detector]] neural network for palm detection, then a regression model outputs 3D coordinates for 21 keypoints per hand. The system tracks fingers even when partially hidden through predictive modeling.<ref name="leapmedium"/><ref name="ultraleapdocs">Ultraleap Hand Tracking Overview. https://docs.ultraleap.com/hand-tracking/</ref>
[[Apple Vision Pro]] employs high-resolution cameras transmitting over one billion pixels per second processed by the R1 chip within 12ms. Multiple infrared flood illuminators with camera arrays track hands from various angles, enabling reliable detection when hands overlap. The privacy-first architecture requires apps to explicitly request hand structure permissions.<ref name="applevp"/>
===Computer Vision Algorithms===
[[MediaPipe Hands]], Google's open-source solution, demonstrates state-of-the-art pose estimation. The two-stage pipeline runs lightweight palm detection followed by regression predicting 21 3D hand landmarks. The model achieves real-time performance on mobile devices using efficient [[MobileNet]] architectures.<ref name="mediumhand">Hand Detection Tracking in Python using OpenCV and MediaPipe. https://gautamaditee.medium.com/hand-recognition-using-opencv-a7b109941c88</ref>
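A minimal sketch of that two-stage pipeline using MediaPipe's legacy Python solutions API together with OpenCV for camera capture; the camera index and confidence thresholds are illustrative defaults.
<syntaxhighlight lang="python">
import cv2                 # pip install opencv-python mediapipe
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)  # default webcam; the camera index is an assumption

with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Palm detection plus landmark regression both run inside process();
        # MediaPipe expects RGB while OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # 21 landmarks per hand, normalized to [0, 1] image coordinates
                tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
                print(f"index fingertip: x={tip.x:.3f} y={tip.y:.3f} z={tip.z:.3f}")
cap.release()
</syntaxhighlight>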
Advanced approaches combine Tracking-by-Detection fusing [[Kernelized Correlation Filters]] for frame-to-frame tracking with [[Single Shot Detection]] for recovery from failures. [[Deep learning]] methods extract features using [[Convolutional Neural Networks]], while classical techniques like skin color segmentation, [[optical flow]], and depth sensing from [[Time-of-Flight]] sensors provide complementary information.<ref name="pubmedhandgesture">Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. https://pmc.ncbi.nlm.nih.gov/articles/PMC8321080/</ref>
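The tracking-by-detection loop can be sketched with OpenCV's KCF tracker: a detector seeds the tracker, cheap per-frame updates follow the region, and detection re-runs whenever tracking reports failure. Here <code>detect_hand()</code> is a hypothetical placeholder standing in for an SSD network, and the contrib build of OpenCV is assumed.
<syntaxhighlight lang="python">
import cv2  # requires opencv-contrib-python; the KCF tracker lives in the contrib build

def detect_hand(frame):
    """Hypothetical placeholder for a detector such as an SSD network.
    Must return an (x, y, w, h) box, or None if no hand is found."""
    raise NotImplementedError

def track_hand(video_path):
    cap = cv2.VideoCapture(video_path)
    tracker, box = None, None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        tracked = False
        if tracker is not None:
            tracked, box = tracker.update(frame)       # cheap frame-to-frame KCF update
        if not tracked:
            box = detect_hand(frame)                   # recover by re-running detection
            if box is not None:
                tracker = cv2.TrackerKCF_create()      # cv2.legacy.TrackerKCF_create on some builds
                tracker.init(frame, box)
    cap.release()
</syntaxhighlight>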
===Haptic Gloves===
[[HaptX Gloves]] G1 feature 135 microfluidic actuators providing true contact haptics with 0.9mm spatial resolution on the fingers. The system delivers up to 40 pounds of resistive force per hand through an integrated force-feedback exoskeleton. Proprietary [[magnetic motion capture]] tracks all hand degrees of freedom. At $5,495 per pair, HaptX targets enterprise training applications.<ref name="haptx">Home | HaptX. https://haptx.com/</ref>
[[bHaptics]] [[TactGlove DK2]] (2023) offers an affordable alternative at $269 per pair, with twelve HD [[Linear Resonant Actuators]] at the fingertips and wrists. The soft elastic material achieves 90% of bare-hand tracking performance with Meta Quest 3.<ref name="bhaptics">Buy next generation full body haptic suit - bHaptics TactSuit. https://www.bhaptics.com/en/tactsuit/tactglove-dk2/</ref>
[[SenseGlove]] Nova 2 (2023) introduced Active Contact Feedback in the palm, complementing force feedback on the fingers. The $5,000-7,000 enterprise solution uses four sensors for finger tracking with external [[SteamVR]] trackers for hand position. The Royal Netherlands Army, NASA, Emirates, and Procter & Gamble employ Nova 2 for training.<ref name="senseglove">Find out about our New Nova 2 Glove. https://www.senseglove.com/product/nova-2/</ref>
Carnegie Mellon University's Fluid Reality haptic gloves (2024) use [[electroosmotic pump|electroosmotic pumps]] enabling 0.2kg weight versus 17kg for alternatives. Thirty-two independent pressure actuators per finger pad fit in penny-sized arrays. Estimated commercial pricing around "a few hundred dollars" could bring haptic gloves to consumer VR.<ref name="cmugloves">Fluid Reality Haptic Gloves Bring Ultra-Sensitive Touch to VR. https://www.cs.cmu.edu/news/2024/haptic-gloves</ref>
==Eye Tracking==
[[Eye tracking]] in VR/AR employs infrared LEDs and cameras arranged between eyes and displays. Invisible infrared light projects patterns onto eyes, with cameras capturing pupil center and [[corneal reflection|corneal reflections]]. [[Machine learning]] algorithms process images at 100-200Hz to calculate [[gaze direction]], pupil size, and eye openness.<ref name="imotionseye">What is VR Eye Tracking? https://imotions.com/blog/learning/best-practice/vr-eye-tracking/</ref>
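One common formulation, sketched below, fits a polynomial mapping from the pupil-minus-corneal-reflection vector to gaze angles by least squares over calibration fixations; the quadratic feature set is an illustrative choice, not any specific vendor's model.
<syntaxhighlight lang="python">
import numpy as np

def fit_gaze_mapping(pupil_glint_xy, target_angles_deg):
    """Fit a quadratic polynomial map from the pupil-minus-corneal-reflection
    vector (x, y) to gaze angles, using calibration samples where the user
    fixated targets at known angles. Returns a (6, 2) coefficient matrix."""
    x, y = pupil_glint_xy[:, 0], pupil_glint_xy[:, 1]
    design = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    coeffs, *_ = np.linalg.lstsq(design, target_angles_deg, rcond=None)
    return coeffs

def gaze_angles_deg(coeffs, px, py):
    """Apply the fitted map to one pupil-glint measurement."""
    features = np.array([1.0, px, py, px * py, px ** 2, py ** 2])
    return features @ coeffs
</syntaxhighlight>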
[[Tobii]] dominates commercial VR eye tracking, providing technology for [[PlayStation VR2]], [[HTC Vive Pro Eye]], [[Pimax Crystal]], and [[Varjo]] headsets. Integration enables [[foveated rendering]]—concentrating GPU resources on high-resolution foveal region while rendering periphery at lower detail. PlayStation VR2 achieves 3.6x faster GPU performance through foveated rendering.<ref name="tobii">Eye tracking in VR – A vital component. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component</ref><ref name="pimax">Eye Tracking on VR (Virtual Reality) headsets. https://pimax.com/blogs/blogs/eye-tracking-on-vr-virtual-reality-headsets</ref>
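The rendering saving comes from lowering resolution with angular distance from the gaze point; the sketch below picks a shading scale per screen tile from its eccentricity, with thresholds that are illustrative rather than any headset's actual foveation profile.
<syntaxhighlight lang="python">
import math

def shading_scale(gaze_x_deg, gaze_y_deg, tile_x_deg, tile_y_deg):
    """Resolution scale for a screen tile based on its angular distance from the
    current gaze direction: full detail in the fovea, coarser toward the periphery."""
    eccentricity = math.hypot(tile_x_deg - gaze_x_deg, tile_y_deg - gaze_y_deg)
    if eccentricity < 5.0:    # assumed foveal radius, in degrees
        return 1.0            # render at full resolution
    if eccentricity < 15.0:   # assumed mid-periphery band
        return 0.5
    return 0.25               # far periphery
</syntaxhighlight>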
[[Apple Vision Pro]]'s eye tracking serves as the primary targeting mechanism, functioning like a mouse cursor. High-performance infrared cameras and LEDs project patterns that are analyzed between display frames. Accuracy reaches 1.11 degrees in mixed reality mode and 0.93 degrees in VR mode within the central field of view. The "look and pinch" interaction model eliminates the need for a pointing device.<ref name="applevpeye">How You Control Apple Vision Pro With Your Eyes & Hands. https://www.uploadvr.com/apple-vision-pro-gesture-controls/</ref><ref name="pubmedeye">Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy. https://pmc.ncbi.nlm.nih.gov/articles/PMC10136368/</ref>
[[HTC Vive Focus Vision]] (2024) integrated eye tracking as a standard feature with 1-degree accuracy, using it for automatic [[IPD|interpupillary distance]] adjustment. [[Foveated rendering]] support and gaze input for UI complement hand tracking and controllers.<ref name="vivefocus">VIVE Focus Vision - New Standalone PC VR Headset. https://www.vive.com/us/product/vive-focus-vision/overview/</ref>
==Body Tracking==
[[Full-body tracking]] extends immersion beyond head and hands. [[HTC Vive Tracker]] 3.0 attaches to body parts via elastic straps, tracked by [[SteamVR]] Lighthouse 2.0 with submillimeter accuracy. At 33% smaller and 15% lighter with 7.5-hour battery life, the tracker enables 6DOF tracking of feet, waist, chest, elbows, or shoulders. [[VRChat]] supports up to 11 tracking points for full-body avatar representation.<ref name="vivetracker">VIVE Tracker (3.0). https://www.vive.com/us/accessory/tracker3/</ref>
[[Vive Ultimate Tracker]] (2024) eliminated base station requirement through self-tracking with onboard cameras. Two wide-angle cameras per tracker enable 6DOF [[inside-out tracking]], with up to five trackers connecting wirelessly.<ref name="viveultimate">VIVE Ultimate Tracker - Full-Body Tracking. https://www.vive.com/us/accessory/vive-ultimate-tracker/</ref>
[[SlimeVR]] pioneered affordable [[IMU]]-based full-body tracking using 9-axis sensors (accelerometer, gyroscope, magnetometer) sending rotation data via 2.4GHz WiFi. A 5-tracker lower-body set includes chest, two thighs, and two ankles for approximately $200 with 10-15 hour battery life. IMU tracking avoids occlusion issues but suffers from yaw drift requiring periodic recalibration.<ref name="slimevr">SlimeVR Full-Body Trackers. https://slimevr.dev/</ref>
[[HaritoraX]] 2 (2024) improved IMU tracking with built-in [[LiDAR]] sensors in ankle trackers detecting foot position relative to floor, plus geomagnetic compensation reducing rotational drift. Ultra-compact sensors enable up to 50 hours battery life.<ref name="haritorax">HaritoraX 2 - Fully wireless full-body tracking device. https://en.shiftall.net/products/haritorax2</ref>
Research validates tracking accuracy. HTC Vive achieves approximately 2mm positional error and less than 1-degree orientation error. [[Oculus Quest]] 2 inside-out tracking shows 1.66mm ± 0.74mm translation accuracy and 0.34 ± 0.38 degrees rotation accuracy—comparable to external tracking systems.<ref name="acmtracking">Comparing the Accuracy and Precision of SteamVR Tracking 2.0 and Oculus Quest 2. https://dl.acm.org/doi/fullHtml/10.1145/3463914.3463921</ref>
==Voice Input==
[[Voice input]] relies on [[automatic speech recognition]] converting spoken words to text, combined with [[natural language processing]] understanding user intent. Modern systems employ cloud-based or on-device processing using wake words like "Hey Meta" for [[Meta Quest]] or Cortana for [[Microsoft HoloLens]].<ref name="msvoice">Voice input - Mixed Reality. https://learn.microsoft.com/en-us/windows/mixed-reality/design/voice-input</ref>
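A toy sketch of the wake-word-then-intent flow follows; the wake phrase and command strings are illustrative, and the matching logic is deliberately simplistic rather than any platform's actual grammar.
<syntaxhighlight lang="python">
WAKE_WORD = "hey meta"  # example wake phrase; matching here is deliberately simplistic

COMMANDS = {            # illustrative subset, not an official command list
    "take a picture": "CAPTURE_PHOTO",
    "start casting": "START_CAST",
}

def handle_transcript(transcript):
    """Gate on the wake word, then map the remaining utterance to a command.
    A real system uses speech recognition plus NLP to handle flexible phrasing."""
    text = transcript.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None                              # keep listening
    utterance = text[len(WAKE_WORD):].strip(" ,.")
    if utterance.startswith("open "):
        return "OPEN_APP:" + utterance[len("open "):]
    return COMMANDS.get(utterance)

print(handle_transcript("Hey Meta, take a picture"))  # -> CAPTURE_PHOTO
</syntaxhighlight>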
[[Meta Quest]] supports over 100 voice commands, including "Take a picture," "Start casting," and "Open [app name]." The [[Meta AI]] assistant introduced in 2024 extends these capabilities to natural language queries.<ref name="questvoice">Meta Quest Voice Commands: The Ultimate Guide. https://arvrtips.com/meta-quest-voice-commands/</ref>
[[Microsoft HoloLens]] pioneered the "See It, Say It" model where voice-enabled buttons display tooltips when gazed at. Commands include hologram manipulation ("Bigger," "Face me"), device control ("Brightness up," "Volume down"), and queries ("What's my IP address?"). [[Dynamics 365 Remote Assist]] uses voice for hands-free field service.<ref name="hololensvoice">Use your voice to operate HoloLens. https://learn.microsoft.com/en-us/hololens/hololens-cortana</ref>
[[Cirrus Logic]]'s SoundClear technology provides hardware foundation with low-power, always-on voice processors featuring multi-mic noise reduction and wake word recognition from one foot to across-room distances.<ref name="cirrus">AR/VR Headsets | Cirrus Logic. https://www.cirrus.com/applications/wearables/ar-vr-headsets/</ref>
==Brain-Computer Interfaces==
[[Brain-computer interface|Brain-computer interfaces]] detect electrical signals from brain or nervous system, translating neural activity into digital commands. Non-invasive BCIs use [[electroencephalography]] measuring brain waves from scalp electrodes, while invasive approaches implant electrodes in brain tissue. [[Electromyography]] offers middle ground, measuring muscle activation signals from skin surface sensors.<ref name="frontiersbci">Editorial: Brain-Computer Interfaces and Augmented/Virtual Reality. https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2020.00144/full</ref>
[[Meta]]'s [[EMG wristband]] (developed by acquired [[CTRL-labs]]) detects electrical signals from forearm muscles as motor neurons transmit movement commands. Signals are detected before fingers physically move, enabling negative latency. A July 2024 Nature paper demonstrated machine learning models working without user-specific calibration—the first generalizable neural interface.<ref name="ctrlabs">Facebook agrees to acquire brain-computing start-up CTRL-labs. https://www.cnbc.com/2019/09/23/facebook-announces-acquisition-of-brain-computing-start-up-ctrl-labs.html</ref><ref name="metaemg">Meta Details EMG Wristband Gestures. https://www.uploadvr.com/meta-semg-wristband-gestures-nature-paper/</ref>
Mark Zuckerberg stated neural wristbands will ship "in the next few years," with leaked roadmaps indicating 2025-2027 launch alongside third-generation [[Ray-Ban]] smart glasses. The wristband enables handwriting in air, typing on surfaces, and precise finger tracking in any lighting without cameras.<ref name="zuckerbergwristband">Zuckerberg: Neural Wristband To Ship In 'Next Few Years'. https://www.uploadvr.com/zuckerberg-neural-wristband-will-ship-in-the-next-few-years/</ref>
[[Valve]] and [[OpenBCI]] collaborated on the [[Galea]] headset (beta 2022), integrating EEG, EMG, EOG, EDA, PPG, and [[Tobii]] eye tracking into [[Valve Index]] modification. The open-source platform enables passive BCIs monitoring user state for adaptive VR experiences.<ref name="valvebci">Valve, OpenBCI & Tobii to Launch VR Brain-computer Interface 'Galea'. https://www.roadtovr.com/valve-openbci-immersive-vr-games/</ref>
[[EMOTIV]] offers consumer/professional headsets including EPOC X (14-channel EEG), Insight (5-channel), and MN8 (32-channel research cap). The EmotivBCI software enables direct brain-computer interfacing with real-time monitoring of attention, workload, emotions, and stress.<ref name="emotiv">How BCI can elevate the AR/VR experience. https://www.emotiv.com/blogs/news/bci-applications-for-vr-ar</ref>
[[Neuralink]] received FDA approval in 2023 and implanted its first human patient in January 2024, who controls a laptop cursor and plays video games by thought. [[Synchron]] takes a less invasive approach, with 2024 demonstrations showing compatibility with [[Apple Vision Pro]] for thought-controlled VR/AR.<ref name="synchron">Brain Implant Hooked Up to Control VR Headset. https://futurism.com/neoscope/synchron-brain-computer-interface-control-vr-headset</ref>
==Technical Standards==
===OpenXR===
The [[Khronos Group]] released [[OpenXR]] 1.0 in July 2019, providing the first truly cross-platform API for XR applications. OpenXR abstracts hardware differences behind a unified interface, enabling developers to write code once and deploy it across [[Meta Quest]], [[SteamVR]], [[Windows Mixed Reality]], [[HTC Vive]], [[Varjo]], [[Magic Leap]], and most major platforms except Apple.<ref name="openxr">OpenXR - High-performance access to AR and VR. https://www.khronos.org/openxr/</ref><ref name="wikiopenxr">OpenXR - Wikipedia. https://en.wikipedia.org/wiki/OpenXR</ref>
Version 1.1 (April 2024) consolidated proven extensions into core specification, with action-based input mapping letting runtimes translate abstract actions like "grab" to platform-specific button configurations. Major runtimes including Meta Quest OpenXR, SteamVR, Windows Mixed Reality, PICO, and Varjo are officially conformant.<ref name="openxr"/>
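The idea can be sketched in plain Python. This is a conceptual illustration of action-based mapping only, not the OpenXR C API or any language binding: the application exposes one abstract action, and suggested bindings map it onto concrete input paths per interaction profile.
<syntaxhighlight lang="python">
# Conceptual sketch only: plain Python, not the OpenXR C API or any binding.
# The application declares one abstract action; suggested bindings map it onto
# OpenXR-style input paths for each interaction profile, and the runtime selects
# whichever profile matches the hardware actually in use.
GRAB_ACTION = "grab"

SUGGESTED_BINDINGS = {
    "/interaction_profiles/valve/index_controller": {
        GRAB_ACTION: "/user/hand/right/input/squeeze/value",
    },
    "/interaction_profiles/oculus/touch_controller": {
        GRAB_ACTION: "/user/hand/right/input/squeeze/value",
    },
}

def resolve_binding(active_profile, action):
    """Application code only ever polls the abstract action; the runtime
    resolves it to a concrete input source for the active hardware."""
    return SUGGESTED_BINDINGS.get(active_profile, {}).get(action)

print(resolve_binding("/interaction_profiles/valve/index_controller", GRAB_ACTION))
</syntaxhighlight>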
The extension system balances standardization with innovation. Core features work everywhere, while extensions like <code>XR_FB_foveated_rendering</code> for Meta's [[foveated rendering]] or <code>XR_FB_passthrough</code> for [[mixed reality]] enable platform-specific capabilities when available.<ref name="openxr"/>
===WebXR===
The [[W3C]] Immersive Web Working Group developed the [[WebXR Device API]] as the successor to [[WebVR]], reaching Candidate Recommendation Draft status with implementations in Chrome/Edge 79+, Opera 66+, Samsung Internet 12+, [[Oculus Browser]], and Safari on visionOS. The JavaScript API provides browser-based VR/AR without requiring native application installation.<ref name="webxr">WebXR Device API. https://www.w3.org/TR/webxr/</ref><ref name="mdnwebxr">WebXR Device API - Web APIs | MDN. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API</ref>
[[WebGL]] and [[WebGPU]] integration enables hardware-accelerated 3D rendering. Related specifications include WebXR Augmented Reality Module for hit testing, WebXR Layers API for performance optimization, WebXR Gamepads Module for controller input, and WebXR Hand Input Module for hand tracking access.<ref name="webxr"/>
==Comparison of Input Methods==
{| class="wikitable"
! Input Method !! Accuracy !! Latency !! Advantages !! Disadvantages
|-
| Motion Controllers || 1-2mm || <25ms || Highest precision, haptic feedback, reliable tracking || Learning curve, battery management, occludes hands
|-
| Hand Tracking || 5-10mm || 30-50ms || Natural interaction, no hardware needed, intuitive || Lower precision, occlusion issues, no haptic feedback
|-
| Eye Tracking || 0.5-1.0° || 5-10ms || Fast targeting, foveated rendering, natural selection || Calibration required, privacy concerns, vergence issues
|-
| Voice Input || N/A || 100-300ms || Hands-free, accessible || Environmental noise, privacy concerns, social awkwardness
|-
| EMG Wristband || Sub-mm || Effectively negative (predictive) || Works in the dark, subtle input, negative latency || Requires tight fit, limited gestures, interference issues
|-
| Full Body Tracking || 2mm || <20ms || Complete avatar representation, immersive || Setup complexity, cost, space requirements
|}
==Current State and Future Trends==
The 2024-2025 period represents an inflection point for VR/AR input. [[Apple Vision Pro]] launched in February 2024 as the first major headset without bundled controllers, validating controller-free interaction. [[Meta Quest 3S]] (September 2024) brought high-quality hand tracking to a $300 price point. [[HTC Vive Focus Vision]] (September 2024) demonstrated enterprise commitment to multi-modal input by supporting controllers, hand tracking, and eye tracking simultaneously.<ref name="vivefocus"/>
[[EMG wristband|EMG wristbands]] represent the most significant emerging input technology, with Meta planning a 2025-2027 launch alongside third-generation Ray-Ban smart glasses. The July 2024 Nature paper demonstrating generalizable models that work without user calibration removes a major commercialization barrier.<ref name="metaemg"/>
Enterprise haptic gloves have found viability at $5,000-7,000 price points for training applications, while Carnegie Mellon's Fluid Reality prototype promises consumer pricing of around "a few hundred dollars" if manufacturing scales.<ref name="cmugloves"/>
Eye tracking is transitioning from a premium option to a standard feature, with [[PlayStation VR2]], [[Apple Vision Pro]], and [[HTC Vive Focus Vision]] including it as core functionality rather than an add-on. [[Tobii]]'s licensing model enables rapid market expansion across platforms.<ref name="tobii"/>
The industry is converging on multi-modal input supporting simultaneous use of controllers, hand tracking, eye tracking, and voice commands. Users switch seamlessly between input methods depending on the task: controllers for gaming precision, hand tracking for social interaction, eye tracking for UI targeting, and voice for explicit commands.
==See Also==
* [[Input Devices]]
* [[Controllers]]
* [[Hand Tracking]]
* [[Eye Tracking]]
* [[Body Tracking]]
* [[Brain-Computer Interface]]
* [[OpenXR]]
* [[Positional Tracking]]
* [[6DOF]]
==References==
<references>
<ref name="gitbook">Input Method and Interaction Design. The Design of Virtual and Augmented Reality. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design</ref>
<ref name="fiveable">Input methods and interaction paradigms in VR/AR. Fiveable. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi</ref>
<ref name="pubmedvive">Niehorster, D. C., Li, L., & Lappe, M. (2017). The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC5439658/</ref>
<ref name="sagejournal">Niehorster, D. C., Li, L., & Lappe, M. (2017). The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. Sage Journals. https://journals.sagepub.com/doi/full/10.1177/2041669517708205</ref>
<ref name="vrshistory">History Of Virtual Reality. Virtual Reality Society. https://www.vrs.org.uk/virtual-reality/history.html</ref>
<ref name="ieeespectrum">The Tremendous VR and CG Systems—of the 1960s. IEEE Spectrum. https://spectrum.ieee.org/sketchpad</ref>
<ref name="wikivr">Virtual reality. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality</ref>
<ref name="wikivrp">VPL Research. Wikipedia. https://en.wikipedia.org/wiki/VPL_Research</ref>
<ref name="coursera">A Brief History of Virtual Reality: Major Events and Ideas. Coursera. https://www.coursera.org/articles/history-of-virtual-reality</ref>
<ref name="aimagazine">The 1960s to the VR revolution: The history of VR headsets. AI Magazine. https://aimagazine.com/articles/the-1960s-to-the-vr-revolution-the-history-of-vr-headsets</ref>
<ref name="tomsguide">Oculus Touch Controllers Are A Lighter and Better Touch Than HTC Vive. Tom's Guide. https://www.tomsguide.com/us/oculus-touch-controllers,review-4072.html</ref>
<ref name="valveindex">Controllers - Valve Index® - Upgrade your experience. Valve Corporation. https://www.valvesoftware.com/en/index/controllers</ref>
<ref name="wikiindex">Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref>
<ref name="wikileap">Leap Motion. Wikipedia. https://en.wikipedia.org/wiki/Leap_Motion</ref>
<ref name="leapmedium">How Does the Leap Motion Controller Work? Medium. https://medium.com/@LeapMotion/how-does-the-leap-motion-controller-work-9503124bfa04</ref>
<ref name="metahand">Hand tracking technology & haptic feedback. Meta for Work. https://forwork.meta.com/blog/hand-tracking-technology-and-haptic-feedback-mr/</ref>
<ref name="applevp">Introducing Apple Vision Pro: Apple's first spatial computer. Apple. https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/</ref>
<ref name="quest3">Meta Quest 3 - VR & AR Wiki - Virtual Reality & Augmented Reality Wiki. https://vrarwiki.com/wiki/Meta_Quest_3</ref>
<ref name="wikitouchpro">Oculus Touch. Wikipedia. https://en.wikipedia.org/wiki/Oculus_Touch</ref>
<ref name="psvr2blog">PlayStation VR2 and PlayStation VR2 Sense controller: the next generation of VR gaming on PS5. PlayStation Blog. https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/</ref>
<ref name="wikipsvr2">PlayStation VR2. Wikipedia. https://en.wikipedia.org/wiki/PlayStation_VR2</ref>
<ref name="wikihive">HTC Vive. Wikipedia. https://en.wikipedia.org/wiki/HTC_Vive</ref>
<ref name="metahandstracking">All Hands on Deck: Crank up Hand Responsiveness and Unlock New Gameplay with Hands 2.2. Meta for Developers. https://developers.meta.com/horizon/blog/hand-tracking-22-response-time-meta-quest-developers/</ref>
<ref name="pubmedhandtrack">A methodological framework to assess the accuracy of virtual reality hand-tracking systems: A case study with the Meta Quest 2. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10830632/</ref>
<ref name="ultraleapdocs">Ultraleap Hand Tracking Overview. Ultraleap Documentation. https://docs.ultraleap.com/hand-tracking/</ref>
<ref name="mediumhand">Hand Detection Tracking in Python using OpenCV and MediaPipe. Medium. https://gautamaditee.medium.com/hand-recognition-using-opencv-a7b109941c88</ref>
<ref name="pubmedhandgesture">Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC8321080/</ref>
<ref name="haptx">Home | HaptX. https://haptx.com/</ref>
<ref name="bhaptics">Buy next generation full body haptic suit - bHaptics TactSuit. bHaptics. https://www.bhaptics.com/en/tactsuit/tactglove-dk2/</ref>
<ref name="senseglove">Find out about our New Nova 2 Glove. SenseGlove. https://www.senseglove.com/product/nova-2/</ref>
<ref name="cmugloves">Fluid Reality Haptic Gloves Bring Ultra-Sensitive Touch to VR. Carnegie Mellon University. https://www.cs.cmu.edu/news/2024/haptic-gloves</ref>
<ref name="imotionseye">What is VR Eye Tracking? [And How Does it Work?] iMotions. https://imotions.com/blog/learning/best-practice/vr-eye-tracking/</ref>
<ref name="tobii">Eye tracking in VR – A vital component. Tobii. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component</ref>
<ref name="pimax">Eye Tracking on VR (Virtual Reality) headsets. Pimax. https://pimax.com/blogs/blogs/eye-tracking-on-vr-virtual-reality-headsets</ref>
<ref name="applevpeye">How You Control Apple Vision Pro With Your Eyes & Hands. UploadVR. https://www.uploadvr.com/apple-vision-pro-gesture-controls/</ref>
<ref name="pubmedeye">Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy, Precision, and Calibration Reliability. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10136368/</ref>
<ref name="vivefocus">VIVE Focus Vision - New Standalone PC VR Headset for Gaming. VIVE United States. https://www.vive.com/us/product/vive-focus-vision/overview/</ref>
<ref name="vivetracker">VIVE Tracker (3.0). VIVE United States. https://www.vive.com/us/accessory/tracker3/</ref>
<ref name="viveultimate">VIVE Ultimate Tracker - Full-Body Tracking, SteamVR Support. VIVE. https://www.vive.com/us/accessory/vive-ultimate-tracker/</ref>
<ref name="slimevr">SlimeVR Full-Body Trackers. SlimeVR Official. https://slimevr.dev/</ref>
<ref name="haritorax">HaritoraX 2 - Fully wireless full-body tracking device. Shiftall. https://en.shiftall.net/products/haritorax2</ref>
<ref name="acmtracking">Comparing the Accuracy and Precision of SteamVR Tracking 2.0 and Oculus Quest 2 in a Room Scale Setup. ACM Digital Library. https://dl.acm.org/doi/fullHtml/10.1145/3463914.3463921</ref>
<ref name="msvoice">Voice input - Mixed Reality. Microsoft Learn. https://learn.microsoft.com/en-us/windows/mixed-reality/design/voice-input</ref>
<ref name="questvoice">Meta Quest Voice Commands: The Ultimate Guide [2025]. AR/VR Tips. https://arvrtips.com/meta-quest-voice-commands/</ref>
<ref name="hololensvoice">Use your voice to operate HoloLens. Microsoft Learn. https://learn.microsoft.com/en-us/hololens/hololens-cortana</ref>
<ref name="cirrus">AR/VR Headsets. Cirrus Logic. https://www.cirrus.com/applications/wearables/ar-vr-headsets/</ref>
<ref name="frontiersbci">Editorial: Brain-Computer Interfaces and Augmented/Virtual Reality. Frontiers in Human Neuroscience. https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2020.00144/full</ref>
<ref name="ctrlabs">Facebook agrees to acquire brain-computing start-up CTRL-labs. CNBC. https://www.cnbc.com/2019/09/23/facebook-announces-acquisition-of-brain-computing-start-up-ctrl-labs.html</ref>
<ref name="metaemg">Meta Details EMG Wristband Gestures You'll Use To Control Its HUD & AR Glasses. UploadVR. https://www.uploadvr.com/meta-semg-wristband-gestures-nature-paper/</ref>
<ref name="zuckerbergwristband">Zuckerberg: Neural Wristband To Ship In 'Next Few Years'. UploadVR. https://www.uploadvr.com/zuckerberg-neural-wristband-will-ship-in-the-next-few-years/</ref>
<ref name="valvebci">Valve, OpenBCI & Tobii to Launch VR Brain-computer Interface 'Galea' in Early 2022. Road to VR. https://www.roadtovr.com/valve-openbci-immersive-vr-games/</ref>
<ref name="emotiv">How BCI can elevate the AR/VR experience. EMOTIV. https://www.emotiv.com/blogs/news/bci-applications-for-vr-ar</ref>
<ref name="synchron">Brain Implant Hooked Up to Control VR Headset. Futurism. https://futurism.com/neoscope/synchron-brain-computer-interface-control-vr-headset</ref>
<ref name="openxr">OpenXR - High-performance access to AR and VR—collectively known as XR—platforms and devices. Khronos Group. https://www.khronos.org/openxr/</ref>
<ref name="wikiopenxr">OpenXR. Wikipedia. https://en.wikipedia.org/wiki/OpenXR</ref>
<ref name="webxr">WebXR Device API. W3C. https://www.w3.org/TR/webxr/</ref>
<ref name="mdnwebxr">WebXR Device API - Web APIs | MDN. MDN Web Docs. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API</ref>
</references>


[[Category:Terms]]
[[Category:Technical Terms]]
[[Category:Input Methods]]
[[Category:VR Technology]]
[[Category:AR Technology]]