
Input

This page is a stub. Please expand it if you have more information.
See also: Input Devices

Input is the method of control and interaction in Virtual Reality and Augmented Reality environments. Input methods range from traditional devices such as the mouse, keyboard, and gamepad to specialized VR/AR input devices, including motion-tracked controllers such as Oculus Touch and SteamVR controllers, camera-based hand and finger tracking, eye tracking, voice input, full body tracking, and emerging neural interfaces. Modern VR/AR systems typically support multiple input modalities simultaneously, allowing users to switch seamlessly between controllers, hand gestures, gaze-based selection, and voice commands depending on the task and context.[1]

Definition and Technical Overview

Input in immersive environments refers to the mechanisms that capture user actions and translate them into commands within virtual or augmented spaces. Unlike traditional computing interfaces limited to keyboards and mice, VR/AR input systems capture six-degrees-of-freedom position and orientation, hand and finger poses with 26 or more joint positions, eye gaze vectors with sub-degree precision, and voice commands processed through natural language processing.[2]

The technical architecture comprises three layers: hardware devices, including sensors, cameras, controllers, and tracking systems, that capture user actions; transfer functions, implemented as algorithms and machine learning models, that convert raw human motion into digital input; and tracking pipelines whose performance is characterized by degrees of freedom, accuracy (typically 1-5mm for commercial systems), precision, update rate (60-120Hz), and latency (targeting below 20ms to prevent motion sickness).[3]
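
These performance metrics can be checked programmatically. The following TypeScript sketch is purely illustrative (the PoseSample shape and thresholds are assumptions, not any SDK's API): it estimates the update rate from timestamped pose samples and counts samples that exceed a 20ms motion-to-photon budget.

```typescript
// Illustrative types and metrics; not tied to any specific SDK.
interface PoseSample {
  timestampMs: number;                      // time the sensor captured the pose
  displayedAtMs: number;                    // time the pose was used to render a frame
  position: [number, number, number];       // meters (3 translational DoF)
  orientation: [number, number, number, number]; // unit quaternion (3 rotational DoF)
}

// Mean update rate (Hz) from consecutive capture timestamps.
function updateRateHz(samples: PoseSample[]): number {
  if (samples.length < 2) return 0;
  const spanMs = samples[samples.length - 1].timestampMs - samples[0].timestampMs;
  return (samples.length - 1) / (spanMs / 1000);
}

// Count samples whose capture-to-display latency exceeds the ~20 ms comfort budget.
function overLatencyBudget(samples: PoseSample[], budgetMs = 20): number {
  return samples.filter(s => s.displayedAtMs - s.timestampMs > budgetMs).length;
}
```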

Modern systems employ sensor fusion, combining multiple data sources for robust tracking. A typical VR controller integrates infrared LEDs for optical tracking, inertial measurement units with accelerometers and gyroscopes for motion sensing, capacitive sensors for touch detection, and force sensors for grip pressure.[4]
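
A common way to fuse these sources is a complementary filter: fast but drift-prone IMU data is integrated every frame, and slower but absolute optical measurements pull the estimate back. The sketch below is a minimal, hypothetical illustration of that idea for position only (the gains, names, and 1D simplification are assumptions, not any headset's actual pipeline).

```typescript
// Minimal complementary filter fusing IMU dead reckoning with optical fixes (1D for clarity).
class FusedPosition {
  private position = 0;   // meters
  private velocity = 0;   // meters/second

  // High-rate IMU path: integrate acceleration (drifts over time).
  predictFromImu(accel: number, dtSeconds: number): void {
    this.velocity += accel * dtSeconds;
    this.position += this.velocity * dtSeconds;
  }

  // Low-rate optical path: blend toward the absolute measurement to cancel drift.
  correctFromOptical(measuredPosition: number, blend = 0.2): void {
    this.position += blend * (measuredPosition - this.position);
  }

  get estimate(): number {
    return this.position;
  }
}
```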

Historical Evolution

Early Pioneers (1960s-1980s)

Ivan Sutherland created the first interactive computer graphics input system in 1963 with Sketchpad, which used a light pen to draw lines in real time on the TX-2 computer at MIT Lincoln Laboratory. His 1968 Sword of Damocles head-mounted display at Harvard pioneered position tracking via mechanical linkages suspended from the ceiling.[5][6]

The 1970s brought Myron Krueger's VIDEOPLACE system at the University of Wisconsin-Madison, which used cameras and floor sensors to detect user silhouettes, introducing computer vision-based body tracking and enabling interaction without worn devices.[7]

VPL Research, founded by Jaron Lanier in 1984, commercialized the first VR input devices. The DataGlove (1987) used fiber optic sensors to detect finger flexure, tracking 256 positions per finger at $9,000 per glove. NASA adopted DataGloves for astronaut training simulations.[8][9]

Nintendo brought gesture input to consumers with the Power Glove (1989), a simplified DataGlove derivative selling for $100. Despite nearly one million units sold, poor precision (only 4 finger positions versus DataGlove's 256) led to commercial failure, demonstrating that consumers demanded both affordability and accuracy.[10]

Modern VR Era (2012-Present)

Palmer Luckey's Oculus Rift Kickstarter campaign in 2012 raised $2.4 million and catalyzed the modern VR era. HTC Vive (2016) introduced Lighthouse tracking using base stations that emit infrared laser sweeps, with photosensors on the headset and controllers calculating position from laser timing. This outside-in tracking approach achieved sub-millimeter accuracy across room-scale spaces.[3]

Oculus Touch controllers (2016) pioneered capacitive touch sensors for finger presence detection, allowing natural hand gestures like pointing or thumbs-up without full hand tracking. The crescent-shaped ergonomic design with analog sticks, face buttons, triggers, and grip buttons became the template for modern VR controllers.[11]

Valve Index Controllers (2019) featured 87 sensors per controller tracking hand position, finger positions, motion, and grip pressure. The adjustable hand straps allowed users to completely open their hands without dropping the controllers, enabling natural grabbing motions.[12][13]

Controller-Free Era (2013-Present)

Leap Motion Controller (2013) brought camera-based hand tracking to consumers as a USB peripheral with two monochromatic infrared cameras tracking hands at 200 frames per second within a 3-foot hemispherical area.[14][15]

Meta Quest introduced hand tracking via software update in December 2019, marking the first mainstream standalone headset offering controller-free input. The February 2023 "Direct Touch" update allowed tapping virtual interfaces directly with fingers, making hand tracking practical for UI interaction.[16]

Apple Vision Pro (February 2024) launched as the first major headset without bundled controllers, validating controller-free as a primary interaction paradigm. The system combines high-precision eye tracking for targeting with pinch gestures for confirmation, processed by the dedicated R1 chip with 12ms latency.[17]

Motion-Tracked Controllers

Meta Quest Touch Controllers

The Meta Quest ecosystem features multiple controller generations. Touch Plus controllers (2023) for Quest 3 eliminated the tracking ring, placing infrared LEDs directly on the controller face. Hybrid tracking combines optical LED detection when in camera view with IMU motion sensing and AI-enhanced hand tracking fusion when occluded. TruTouch variable haptics provide realistic sensations from subtle taps to heavy impacts.[18]

Touch Pro controllers (2022) for Quest Pro achieved self-tracking with onboard cameras and a Qualcomm Snapdragon 662 processor per controller. This eliminates dependence on headset line-of-sight, enabling reliable tracking when controllers are behind the user. The pressure sensor enables pinch detection and stylus tip capability for precision drawing.[19][16]

Valve Index Controllers

Valve Index Controllers demonstrate premium capability with 87 sensors, including capacitive sensors detecting each finger's position, analog grip pressure sensing measuring squeeze force from 0 to 100%, and force-sensitive triggers. The 1100mAh battery provides 7+ hours of use with USB-C fast charging.[12]

Lighthouse 2.0 tracking achieves submillimeter positional accuracy by detecting precise timing of laser sweeps from base stations. Each base station emits horizontal and vertical infrared laser planes at known rotation rates. When lasers hit controller sensors, the device calculates exact 3D position from sweep timing. Base stations support tracking volumes up to 33 feet by 33 feet.[13]
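
The geometry reduces to timing: if a base station's laser plane rotates at a known rate, the delay between the synchronization flash and the moment the sweep hits a photodiode gives that sensor's angle relative to the station. The snippet below is a simplified illustration of the calculation (the function name, example timings, and the fixed 60 revolutions-per-second rate are assumptions for the example, not Valve's implementation).

```typescript
// Convert sweep timing into an angle relative to the base station.
// Assumes the rotor completes one revolution in `rotationPeriodMs` and that the
// sync flash marks the start of the sweep.
function sweepAngleDegrees(
  syncFlashMs: number,
  laserHitMs: number,
  rotationPeriodMs = 1000 / 60 // roughly 60 revolutions per second
): number {
  const elapsed = laserHitMs - syncFlashMs;
  return 360 * (elapsed / rotationPeriodMs);
}

// One horizontal and one vertical sweep per station yield two angles per sensor;
// with several sensors at known positions on the controller, the pose can be solved.
const azimuth = sweepAngleDegrees(0, 4.1);   // example horizontal sweep timing
const elevation = sweepAngleDegrees(0, 3.0); // example vertical sweep timing
console.log({ azimuth, elevation });
```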

The adjustable hand strap allows users to completely open their hands during gameplay, enabling natural throwing, catching, and two-handed weapon handling. This makes Index Controllers a favorite among VR enthusiasts despite the $279 price and the external base station requirement.[12]

PlayStation VR2 Sense Controllers

PlayStation VR2 Sense controllers adapted PlayStation 5 DualSense technology for VR with adaptive triggers featuring variable resistance. The R2 and L2 triggers simulate tension of drawing a bowstring, resistance of pulling a trigger, or pressure of squeezing a brake. Dedicated haptic actuators deliver tailored sensations including impact of raindrops, texture of surfaces, and recoil of weapons.[20]

Inside-out tracking via four cameras on the PSVR2 headset captures LED tracking rings, with 6-axis motion sensing providing continuous updates. Sony announced hand tracking support at SIGGRAPH 2024, positioning PSVR2 as the first PlayStation system offering controller-free gameplay.[21]

Other Controller Systems

HTC Vive controllers evolved through multiple generations. Original Vive wand controllers (2016) featured 24 sensors with circular trackpads tracked by Lighthouse 1.0. Vive Pro controllers (2018) added Lighthouse 2.0 compatibility for 10-meter tracking volumes. Cosmos controllers (2019) shifted to inside-out tracking with thumbsticks and face buttons.[22]

Windows Mixed Reality controllers (2017) established Microsoft's specification for OEM partners including Acer, HP, Lenovo, Samsung, Dell, and Asus. The design combined Vive-style circular touchpads with Touch-style thumbsticks, tracked by visible-light LEDs on circular rings.[22]

Hand and Finger Tracking

Camera-Based Vision Systems

Modern hand tracking relies on computer vision algorithms processing camera feeds in real time. Meta Quest hand tracking uses headset cameras and machine learning models trained on millions of hand images to generate 26-point skeletal hand models at 30-90Hz. The Hands 2.2 update delivered a 40% latency reduction through optimized neural networks.[23][24]

Ultraleap (formerly Leap Motion) uses two infrared cameras and infrared LEDs illuminating hands with near-infrared light. The computer vision pipeline employs a Single Shot Detector neural network for palm detection, then a regression model outputs 3D coordinates for 21 keypoints per hand. The system tracks fingers even when partially hidden through predictive modeling.[15][25]

Apple Vision Pro employs high-resolution cameras transmitting over one billion pixels per second processed by the R1 chip within 12ms. Multiple infrared flood illuminators with camera arrays track hands from various angles, enabling reliable detection when hands overlap. The privacy-first architecture requires apps to explicitly request hand structure permissions.[17]

Computer Vision Algorithms

MediaPipe Hands, Google's open-source solution, demonstrates state-of-the-art pose estimation. The two-stage pipeline runs lightweight palm detection followed by regression predicting 21 3D hand landmarks. The model achieves real-time performance on mobile devices using efficient MobileNet architectures.[26]
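
Since MediaPipe Hands also ships as a JavaScript/TypeScript package, the pipeline can be exercised directly in the browser. The sketch below (assuming the `@mediapipe/hands` npm package and a `<video>` element showing the camera feed) runs detection on each frame and derives a simple pinch gesture from the thumb-tip and index-tip landmarks; the 0.03 distance threshold is an arbitrary assumption.

```typescript
import { Hands, Results } from "@mediapipe/hands";

// Landmark indices from the 21-point MediaPipe hand model.
const THUMB_TIP = 4;
const INDEX_TIP = 8;

const hands = new Hands({
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
});
hands.setOptions({
  maxNumHands: 2,
  modelComplexity: 1,
  minDetectionConfidence: 0.5,
  minTrackingConfidence: 0.5,
});

hands.onResults((results: Results) => {
  for (const landmarks of results.multiHandLandmarks ?? []) {
    const thumb = landmarks[THUMB_TIP];
    const index = landmarks[INDEX_TIP];
    // Landmarks are normalized to the image; distance below ~0.03 treated as a pinch.
    const distance = Math.hypot(thumb.x - index.x, thumb.y - index.y, thumb.z - index.z);
    if (distance < 0.03) {
      console.log("Pinch detected");
    }
  }
});

// Feed frames from a <video> element carrying the camera stream.
const video = document.querySelector("video")!;
async function onFrame() {
  await hands.send({ image: video });
  requestAnimationFrame(onFrame);
}
requestAnimationFrame(onFrame);
```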

Advanced approaches combine tracking-by-detection, fusing Kernelized Correlation Filters for frame-to-frame tracking with Single Shot Detection for recovery from failures. Deep learning methods extract features using Convolutional Neural Networks, while classical techniques such as skin color segmentation, optical flow, and depth sensing from Time-of-Flight sensors provide complementary information.[27]

Haptic Gloves

HaptX Gloves G1 feature 135 microfluidic actuators providing true contact haptics with 0.9mm spatial resolution on the fingers. The system delivers up to 40 pounds of resistive force per hand through an integrated force-feedback exoskeleton. Proprietary magnetic motion capture tracks all hand degrees of freedom. At $5,495 per pair, HaptX targets enterprise training applications.[28]

bHaptics TactGlove DK2 (2023) offers an affordable alternative at $269 per pair, with twelve HD Linear Resonant Actuators at the fingertip and wrist positions. The soft elastic material achieves 90% of bare-hand tracking performance with Meta Quest 3.[29]

SenseGlove Nova 2 (2023) introduced Active Contact Feedback in palm, complementing force feedback on fingers. The $5,000-7,000 enterprise solution uses four sensors for finger tracking with external SteamVR trackers for hand position. The Royal Netherlands Army, NASA, Emirates, and Procter & Gamble employ Nova 2 for training.[30]

Carnegie Mellon University's Fluid Reality haptic gloves (2024) use electroosmotic pumps enabling 0.2kg weight versus 17kg for alternatives. Thirty-two independent pressure actuators per finger pad fit in penny-sized arrays. Estimated commercial pricing around "a few hundred dollars" could bring haptic gloves to consumer VR.[31]

Eye Tracking

Eye tracking in VR/AR employs infrared LEDs and cameras arranged between eyes and displays. Invisible infrared light projects patterns onto eyes, with cameras capturing pupil center and corneal reflections. Machine learning algorithms process images at 100-200Hz to calculate gaze direction, pupil size, and eye openness.[32]

Tobii dominates commercial VR eye tracking, providing technology for PlayStation VR2, HTC Vive Pro Eye, Pimax Crystal, and Varjo headsets. Integration enables foveated rendering, which concentrates GPU resources on the high-resolution foveal region while rendering the periphery at lower detail. PlayStation VR2 achieves 3.6x faster GPU performance through foveated rendering.[33][34]

Apple Vision Pro's eye tracking serves as the primary targeting mechanism, functioning like a mouse cursor. High-performance infrared cameras and LEDs project patterns that are analyzed between display frames. Accuracy reaches 1.11 degrees in mixed reality mode and 0.93 degrees in VR mode within the central field of view. The "look and pinch" interaction model eliminates the need for pointing.[35][36]
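
Conceptually, "look and pinch" is a gaze ray cast against UI elements, with the pinch acting as the click. The sketch below is a platform-agnostic illustration only (visionOS does not expose raw gaze data to apps; the types and the sphere-based hit test here are assumptions for demonstration).

```typescript
type Vec3 = [number, number, number];

interface UiTarget {
  id: string;
  center: Vec3;   // world-space position of the element
  radius: number; // selection radius in meters
}

// Return the first target whose bounding sphere the gaze ray intersects.
function gazeHitTest(origin: Vec3, direction: Vec3, targets: UiTarget[]): UiTarget | null {
  for (const target of targets) {
    const toCenter: Vec3 = [
      target.center[0] - origin[0],
      target.center[1] - origin[1],
      target.center[2] - origin[2],
    ];
    // Project the center onto the gaze direction (assumed normalized).
    const along =
      toCenter[0] * direction[0] + toCenter[1] * direction[1] + toCenter[2] * direction[2];
    if (along < 0) continue; // behind the viewer
    const closest: Vec3 = [
      origin[0] + direction[0] * along,
      origin[1] + direction[1] * along,
      origin[2] + direction[2] * along,
    ];
    const miss = Math.hypot(
      closest[0] - target.center[0],
      closest[1] - target.center[1],
      closest[2] - target.center[2]
    );
    if (miss <= target.radius) return target;
  }
  return null;
}

// Pinch confirms whatever is currently under the gaze ray.
function onPinch(gazeOrigin: Vec3, gazeDirection: Vec3, targets: UiTarget[]): void {
  const hit = gazeHitTest(gazeOrigin, gazeDirection, targets);
  if (hit) console.log(`Selected ${hit.id}`);
}
```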

HTC Vive Focus Vision (2024) integrated eye tracking as standard feature with 1-degree accuracy, using it for automatic interpupillary distance adjustment. Foveated rendering support and gaze input for UI complement hand tracking and controllers.[37]

Body Tracking

Full-body tracking extends immersion beyond head and hands. HTC Vive Tracker 3.0 attaches to body parts via elastic straps, tracked by SteamVR Lighthouse 2.0 with submillimeter accuracy. At 33% smaller and 15% lighter with 7.5-hour battery life, the tracker enables 6DOF tracking of feet, waist, chest, elbows, or shoulders. VRChat supports up to 11 tracking points for full-body avatar representation.[38]

Vive Ultimate Tracker (2024) eliminated base station requirement through self-tracking with onboard cameras. Two wide-angle cameras per tracker enable 6DOF inside-out tracking, with up to five trackers connecting wirelessly.[39]

SlimeVR pioneered affordable IMU-based full-body tracking using 9-axis sensors (accelerometer, gyroscope, magnetometer) sending rotation data via 2.4GHz WiFi. A 5-tracker lower-body set includes chest, two thighs, and two ankles for approximately $200 with 10-15 hour battery life. IMU tracking avoids occlusion issues but suffers from yaw drift requiring periodic recalibration.[40]
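
The drift problem can be illustrated in a few lines: integrating gyroscope yaw accumulates error, while a magnetometer heading (or a periodic manual reset) nudges the estimate back. This is a simplified, hypothetical sketch, not SlimeVR's firmware.

```typescript
// Yaw estimation: gyro integration drifts, magnetometer heading corrects slowly.
class YawEstimator {
  private yawDegrees = 0;

  // Integrate the gyroscope's yaw rate each sample (fast, but error accumulates).
  integrateGyro(yawRateDegPerSec: number, dtSeconds: number): void {
    this.yawDegrees += yawRateDegPerSec * dtSeconds;
  }

  // Slowly blend toward the magnetometer heading to cancel accumulated drift.
  correctWithMagnetometer(headingDegrees: number, blend = 0.02): void {
    let error = headingDegrees - this.yawDegrees;
    // Wrap the error into [-180, 180] so correction takes the short way around.
    error = ((error + 540) % 360) - 180;
    this.yawDegrees += blend * error;
  }

  get yaw(): number {
    return this.yawDegrees;
  }
}
```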

HaritoraX 2 (2024) improved IMU tracking with built-in LiDAR sensors in ankle trackers detecting foot position relative to floor, plus geomagnetic compensation reducing rotational drift. Ultra-compact sensors enable up to 50 hours battery life.[41]

Research validates tracking accuracy. HTC Vive achieves approximately 2mm positional error and less than 1-degree orientation error. Oculus Quest 2 inside-out tracking shows 1.66mm ± 0.74mm translation accuracy and 0.34 ± 0.38 degrees rotation accuracy—comparable to external tracking systems.[42]

Voice Input

Voice input relies on automatic speech recognition converting spoken words to text, combined with natural language processing to understand user intent. Modern systems employ cloud-based or on-device processing, triggered by wake words such as "Hey Meta" on Meta Quest or "Hey Cortana" on Microsoft HoloLens.[43]
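
For a feel of how wake-word-style voice commands are wired up, the browser's Web Speech API offers a readily available analogue. The sketch below is not the Quest or HoloLens implementation; it simply listens for a hypothetical "hey demo" phrase and treats the rest of the utterance as a command.

```typescript
// Uses the browser Web Speech API (Chrome exposes it as webkitSpeechRecognition).
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

const recognizer = new SpeechRecognitionImpl();
recognizer.continuous = true;       // keep listening across utterances
recognizer.interimResults = false;  // only act on final transcripts
recognizer.lang = "en-US";

recognizer.onresult = (event: any) => {
  const latest = event.results[event.results.length - 1][0];
  const transcript: string = latest.transcript.trim().toLowerCase();
  // Hypothetical wake phrase; everything after it is treated as the command.
  if (transcript.startsWith("hey demo")) {
    const command = transcript.slice("hey demo".length).trim();
    if (command.includes("take a picture")) console.log("Capturing screenshot...");
    else if (command.includes("start casting")) console.log("Starting cast...");
    else console.log(`Unrecognized command: "${command}"`);
  }
};

recognizer.start();
```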

Meta Quest voice commands enable over 100 commands including "Take a picture," "Start casting," and "Open [app name]." The Meta AI assistant introduced in 2024 extends capabilities to natural language queries.[44]

Microsoft HoloLens pioneered the "See It, Say It" model where voice-enabled buttons display tooltips when gazed at. Commands include hologram manipulation ("Bigger," "Face me"), device control ("Brightness up," "Volume down"), and queries ("What's my IP address?"). Dynamics 365 Remote Assist uses voice for hands-free field service.[45]

Cirrus Logic's SoundClear technology provides hardware foundation with low-power, always-on voice processors featuring multi-mic noise reduction and wake word recognition from one foot to across-room distances.[46]

Brain-Computer Interfaces

Brain-computer interfaces detect electrical signals from brain or nervous system, translating neural activity into digital commands. Non-invasive BCIs use electroencephalography measuring brain waves from scalp electrodes, while invasive approaches implant electrodes in brain tissue. Electromyography offers middle ground, measuring muscle activation signals from skin surface sensors.[47]

Meta's EMG wristband (developed by the acquired startup CTRL-labs) detects electrical signals from forearm muscles as motor neurons transmit movement commands. Signals are detected before the fingers physically move, enabling effectively negative latency. A July 2024 Nature paper demonstrated machine learning models working without user-specific calibration, the first generalizable neural interface.[48][49]
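
At the signal level, a detector of this kind rectifies the raw EMG, smooths it into an envelope, and fires when the envelope crosses a threshold; Meta's actual models are far more sophisticated, so the sketch below is only a toy illustration with assumed units and thresholds.

```typescript
// Toy EMG activation detector: smooth with a moving RMS window, then threshold.
function detectActivation(
  samplesMicrovolts: number[],
  windowSize = 50,          // samples per RMS window
  thresholdMicrovolts = 80  // assumed activation threshold
): number[] {
  const onsets: number[] = [];
  let active = false;
  for (let i = windowSize; i <= samplesMicrovolts.length; i++) {
    const window = samplesMicrovolts.slice(i - windowSize, i);
    const rms = Math.sqrt(window.reduce((sum, v) => sum + v * v, 0) / windowSize);
    if (!active && rms > thresholdMicrovolts) {
      onsets.push(i);      // index where the muscle activation was first detected
      active = true;
    } else if (active && rms < thresholdMicrovolts) {
      active = false;
    }
  }
  return onsets;
}
```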

Mark Zuckerberg stated neural wristbands will ship "in the next few years," with leaked roadmaps indicating 2025-2027 launch alongside third-generation Ray-Ban smart glasses. The wristband enables handwriting in air, typing on surfaces, and precise finger tracking in any lighting without cameras.[50]

Valve and OpenBCI collaborated on the Galea headset (beta 2022), integrating EEG, EMG, EOG, EDA, PPG, and Tobii eye tracking into a modified Valve Index. The open-source platform enables passive BCIs that monitor user state for adaptive VR experiences.[51]

EMOTIV offers consumer/professional headsets including EPOC X (14-channel EEG), Insight (5-channel), and MN8 (32-channel research cap). The EmotivBCI software enables direct brain-computer interfacing with real-time monitoring of attention, workload, emotions, and stress.[52]

Neuralink received FDA approval in 2023 and implanted its first human patient in January 2024; the patient controls a laptop cursor and plays video games via thought. Synchron takes a less invasive approach, with 2024 demonstrations showing compatibility with Apple Vision Pro for thought-controlled VR/AR.[53]

Technical Standards

OpenXR

The Khronos Group released OpenXR 1.0 in July 2019, providing the first truly cross-platform API for XR applications. OpenXR abstracts hardware differences behind a unified interface, enabling developers to write code once and deploy it across Meta Quest, SteamVR, Windows Mixed Reality, HTC Vive, Varjo, Magic Leap, and most other major platforms except Apple.[54][55]

Version 1.1 (April 2024) consolidated proven extensions into core specification, with action-based input mapping letting runtimes translate abstract actions like "grab" to platform-specific button configurations. Major runtimes including Meta Quest OpenXR, SteamVR, Windows Mixed Reality, PICO, and Varjo are officially conformant.[54]
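
The action model can be pictured as a two-level mapping: applications declare abstract actions, and each runtime binds them to whatever physical inputs the connected hardware offers. The TypeScript below is purely conceptual shorthand for that idea (OpenXR itself is a C API, with calls such as xrCreateAction and xrSuggestInteractionProfileBindings); the names and paths are illustrative examples.

```typescript
// Conceptual model of OpenXR-style action binding (not the real C API).
type ActionType = "boolean" | "float" | "pose";

interface Action {
  name: string;       // abstract, device-independent action, e.g. "grab"
  type: ActionType;
}

const grab: Action = { name: "grab", type: "float" };

// Suggested bindings map each action to a hardware-specific input path
// for a given interaction profile; the runtime picks the matching profile.
const suggestedBindings: Record<string, Record<string, string>> = {
  "/interaction_profiles/oculus/touch_controller": {
    grab: "/user/hand/right/input/squeeze/value",
  },
  "/interaction_profiles/valve/index_controller": {
    grab: "/user/hand/right/input/squeeze/force",
  },
};

// At runtime, the same "grab" action resolves to whichever binding the
// active hardware profile provides.
function resolveBinding(action: Action, activeProfile: string): string | undefined {
  return suggestedBindings[activeProfile]?.[action.name];
}

console.log(resolveBinding(grab, "/interaction_profiles/valve/index_controller"));
```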

The extension system balances standardization with innovation. Core features work everywhere, while extensions like `XR_FB_foveated_rendering` for Meta's foveated rendering or `XR_FB_passthrough` for mixed reality enable platform-specific capabilities when available.[54]

WebXR

The W3C Immersive Web Working Group developed the WebXR Device API as the successor to WebVR; it has reached Candidate Recommendation Draft status, with implementations in Chrome/Edge 79+, Opera 66+, Samsung Internet 12+, Oculus Browser, and Safari on visionOS. The JavaScript API provides browser-based VR/AR without requiring native application installation.[56][57]

WebGL and WebGPU integration enables hardware-accelerated 3D rendering. Related specifications include WebXR Augmented Reality Module for hit testing, WebXR Layers API for performance optimization, WebXR Gamepads Module for controller input, and WebXR Hand Input Module for hand tracking access.[56]
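
A minimal WebXR session illustrates how these pieces fit together. The sketch below requests an immersive session, then reads controller poses (and hand joints where the Hand Input Module is supported) each frame; feature detection and error handling are trimmed for brevity, and `any` casts stand in for WebXR type definitions.

```typescript
// Minimal WebXR input loop (assumes a browser with WebXR and, optionally,
// the Hand Input Module).
async function startXr(): Promise<void> {
  const xr = (navigator as any).xr;
  const session = await xr.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking"],
  });
  const refSpace = await session.requestReferenceSpace("local");

  const onFrame = (_time: number, frame: any) => {
    for (const source of session.inputSources) {
      // Tracked controller: read its grip pose and gamepad buttons.
      if (source.gripSpace) {
        const pose = frame.getPose(source.gripSpace, refSpace);
        const triggerPressed = source.gamepad?.buttons[0]?.pressed ?? false;
        if (pose && triggerPressed) console.log("Trigger pressed at", pose.transform.position);
      }
      // Articulated hand (WebXR Hand Input Module): read a fingertip joint.
      if (source.hand) {
        const tip = source.hand.get("index-finger-tip");
        const jointPose = tip && frame.getJointPose(tip, refSpace);
        if (jointPose) console.log("Index tip at", jointPose.transform.position);
      }
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```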

Comparison of Input Methods

Input Method | Accuracy | Latency | Advantages | Disadvantages
Motion Controllers | 1-2mm | <25ms | Highest precision, haptic feedback, reliable tracking | Learning curve, battery management, occludes hands
Hand Tracking | 5-10mm | 30-50ms | Natural interaction, no hardware needed, intuitive | Lower precision, occlusion issues, no haptic feedback
Eye Tracking | 0.5-1.0° | 5-10ms | Fast targeting, foveated rendering, natural selection | Calibration required, privacy concerns, vergence issues
Voice Input | N/A | 100-300ms | Hands-free, accessible | Environmental noise, privacy concerns, social awkwardness
EMG Wristband | Sub-mm | Negative | Works in dark, subtle input, negative latency | Requires tight fit, limited gestures, interference issues
Full Body Tracking | 2mm | <20ms | Complete avatar representation, immersive | Setup complexity, cost, space requirements

Current State and Future Trends

The 2024-2025 period represents an inflection point for VR/AR input. Apple Vision Pro launched in February 2024 as the first major headset without bundled controllers, validating controller-free interaction. Meta Quest 3S (September 2024) brought high-quality hand tracking to a $300 price point. HTC Vive Focus Vision (September 2024) demonstrated enterprise commitment to multi-modal input, supporting controllers, hand tracking, and eye tracking simultaneously.[37]

EMG wristbands represent the most significant emerging technology, with Meta planning a 2025-2027 launch alongside third-generation Ray-Ban smart glasses. The July 2024 Nature paper demonstrating generalizable models that work without user calibration removes a major commercialization barrier.[49]

Enterprise haptic gloves found viability at $5,000-7,000 price points for training applications, while Carnegie Mellon's Fluid Reality prototype promises consumer pricing around "a few hundred dollars" if manufacturing scales.[31]

Eye tracking is transitioning from a premium to a standard feature, with PlayStation VR2, Apple Vision Pro, and HTC Vive Focus Vision including it as core functionality rather than an add-on. Tobii's licensing model enables rapid market expansion across platforms.[33]

The industry converges on multi-modal input supporting simultaneous use of controllers, hand tracking, eye tracking, and voice commands. Users seamlessly switch between input methods depending on task—controllers for gaming precision, hand tracking for social interaction, eye tracking for UI targeting, and voice for explicit commands.

See Also

Input Devices

References

  1. Input Method and Interaction Design. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design
  2. Input methods and interaction paradigms in VR/AR. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi
  3. The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. https://pmc.ncbi.nlm.nih.gov/articles/PMC5439658/
  4. The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. https://journals.sagepub.com/doi/full/10.1177/2041669517708205
  5. History Of Virtual Reality. https://www.vrs.org.uk/virtual-reality/history.html
  6. The Tremendous VR and CG Systems—of the 1960s. https://spectrum.ieee.org/sketchpad
  7. Virtual reality - Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality
  8. VPL Research - Wikipedia. https://en.wikipedia.org/wiki/VPL_Research
  9. A Brief History of Virtual Reality: Major Events and Ideas. https://www.coursera.org/articles/history-of-virtual-reality
  10. The 1960s to the VR revolution: The history of VR headsets. https://aimagazine.com/articles/the-1960s-to-the-vr-revolution-the-history-of-vr-headsets
  11. Oculus Touch Controllers Are A Lighter and Better Touch Than HTC Vive. https://www.tomsguide.com/us/oculus-touch-controllers,review-4072.html
  12. Controllers - Valve Index. https://www.valvesoftware.com/en/index/controllers
  13. Valve Index - Wikipedia. https://en.wikipedia.org/wiki/Valve_Index
  14. Leap Motion - Wikipedia. https://en.wikipedia.org/wiki/Leap_Motion
  15. How Does the Leap Motion Controller Work? https://medium.com/@LeapMotion/how-does-the-leap-motion-controller-work-9503124bfa04
  16. Hand tracking technology & haptic feedback. https://forwork.meta.com/blog/hand-tracking-technology-and-haptic-feedback-mr/
  17. Introducing Apple Vision Pro: Apple's first spatial computer. https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/
  18. Meta Quest 3 - VR & AR Wiki. https://vrarwiki.com/wiki/Meta_Quest_3
  19. Oculus Touch - Wikipedia. https://en.wikipedia.org/wiki/Oculus_Touch
  20. PlayStation VR2 and PlayStation VR2 Sense controller. https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/
  21. PlayStation VR2 - Wikipedia. https://en.wikipedia.org/wiki/PlayStation_VR2
  22. HTC Vive - Wikipedia. https://en.wikipedia.org/wiki/HTC_Vive
  23. All Hands on Deck: Crank up Hand Responsiveness. https://developers.meta.com/horizon/blog/hand-tracking-22-response-time-meta-quest-developers/
  24. A methodological framework to assess the accuracy of virtual reality hand-tracking systems. https://pmc.ncbi.nlm.nih.gov/articles/PMC10830632/
  25. Ultraleap Hand Tracking Overview. https://docs.ultraleap.com/hand-tracking/
  26. Hand Detection Tracking in Python using OpenCV and MediaPipe. https://gautamaditee.medium.com/hand-recognition-using-opencv-a7b109941c88
  27. Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. https://pmc.ncbi.nlm.nih.gov/articles/PMC8321080/
  28. Home | HaptX. https://haptx.com/
  29. Buy next generation full body haptic suit - bHaptics TactSuit. https://www.bhaptics.com/en/tactsuit/tactglove-dk2/
  30. Find out about our New Nova 2 Glove. https://www.senseglove.com/product/nova-2/
  31. Fluid Reality Haptic Gloves Bring Ultra-Sensitive Touch to VR. https://www.cs.cmu.edu/news/2024/haptic-gloves
  32. What is VR Eye Tracking? https://imotions.com/blog/learning/best-practice/vr-eye-tracking/
  33. Eye tracking in VR – A vital component. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component
  34. Eye Tracking on VR (Virtual Reality) headsets. https://pimax.com/blogs/blogs/eye-tracking-on-vr-virtual-reality-headsets
  35. How You Control Apple Vision Pro With Your Eyes & Hands. https://www.uploadvr.com/apple-vision-pro-gesture-controls/
  36. Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy. https://pmc.ncbi.nlm.nih.gov/articles/PMC10136368/
  37. VIVE Focus Vision - New Standalone PC VR Headset. https://www.vive.com/us/product/vive-focus-vision/overview/
  38. VIVE Tracker (3.0). https://www.vive.com/us/accessory/tracker3/
  39. VIVE Ultimate Tracker - Full-Body Tracking. https://www.vive.com/us/accessory/vive-ultimate-tracker/
  40. SlimeVR Full-Body Trackers. https://slimevr.dev/
  41. HaritoraX 2 - Fully wireless full-body tracking device. https://en.shiftall.net/products/haritorax2
  42. Comparing the Accuracy and Precision of SteamVR Tracking 2.0 and Oculus Quest 2. https://dl.acm.org/doi/fullHtml/10.1145/3463914.3463921
  43. Voice input - Mixed Reality. https://learn.microsoft.com/en-us/windows/mixed-reality/design/voice-input
  44. Meta Quest Voice Commands: The Ultimate Guide. https://arvrtips.com/meta-quest-voice-commands/
  45. Use your voice to operate HoloLens. https://learn.microsoft.com/en-us/hololens/hololens-cortana
  46. AR/VR Headsets | Cirrus Logic. https://www.cirrus.com/applications/wearables/ar-vr-headsets/
  47. Editorial: Brain-Computer Interfaces and Augmented/Virtual Reality. https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2020.00144/full
  48. Facebook agrees to acquire brain-computing start-up CTRL-labs. https://www.cnbc.com/2019/09/23/facebook-announces-acquisition-of-brain-computing-start-up-ctrl-labs.html
  49. Meta Details EMG Wristband Gestures. https://www.uploadvr.com/meta-semg-wristband-gestures-nature-paper/
  50. Zuckerberg: Neural Wristband To Ship In 'Next Few Years'. https://www.uploadvr.com/zuckerberg-neural-wristband-will-ship-in-the-next-few-years/
  51. Valve, OpenBCI & Tobii to Launch VR Brain-computer Interface 'Galea'. https://www.roadtovr.com/valve-openbci-immersive-vr-games/
  52. How BCI can elevate the AR/VR experience. https://www.emotiv.com/blogs/news/bci-applications-for-vr-ar
  53. Brain Implant Hooked Up to Control VR Headset. https://futurism.com/neoscope/synchron-brain-computer-interface-control-vr-headset
  54. OpenXR - High-performance access to AR and VR. https://www.khronos.org/openxr/
  55. OpenXR - Wikipedia. https://en.wikipedia.org/wiki/OpenXR
  56. WebXR Device API. https://www.w3.org/TR/webxr/
  57. WebXR Device API - Web APIs | MDN. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API