Input
- See also: Input Devices
Input in virtual reality (VR) and augmented reality (AR) refers to the various methods and technologies that allow a user to interact with, control, and provide data to a computer-generated environment.[1] Unlike traditional computing that primarily relies on a keyboard and mouse, extended reality (XR) input encompasses a wide spectrum of devices and techniques designed to create a sense of immersion and presence by translating a user's physical actions into digital ones.[2]
Input methods range from traditional devices like gamepads to sophisticated motion controllers that track hand movements, and increasingly, to more natural interfaces such as controller-free hand tracking, eye tracking, and voice commands.[3] Modern VR/AR systems typically support multiple input modalities simultaneously, allowing users to seamlessly switch between controllers, hand gestures, gaze-based selection, and voice commands depending on the task and context.[4]
Definition and Technical Overview
Input in immersive environments refers to mechanisms that capture user actions and translate them into commands within virtual or augmented spaces. Unlike traditional computing interfaces limited to keyboards and mice, VR/AR input systems capture six degrees of freedom for position and orientation tracking, hand and finger poses with 26+ joint positions, eye gaze vectors with sub-degree precision, and voice commands processed through natural language processing.[4]
The technical architecture comprises three layers: hardware devices including sensors, cameras, controllers, and tracking systems that capture user actions; transfer functions that convert human output into digital input through algorithms and machine learning models; and tracking systems that measure spatial position with metrics including degrees of freedom, accuracy (typically 1-5mm for commercial systems), precision, update rate (60-120Hz), and latency (target below 20ms to prevent motion sickness).[5]
Modern systems employ sensor fusion, combining multiple data sources for robust tracking. A typical VR controller integrates infrared LEDs for optical tracking, inertial measurement units with accelerometers and gyroscopes for motion sensing, capacitive sensors for touch detection, and force sensors for grip pressure.[6]
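As a rough illustration of this kind of sensor fusion, the sketch below (Python, with hypothetical names, gains, and rates) dead-reckons a controller's position from IMU acceleration between optical fixes and blends each camera measurement back in with a simple complementary filter; production trackers use far more sophisticated estimators such as Kalman filters.

```python
import numpy as np

class FusedTracker:
    """Toy complementary filter: integrate IMU acceleration between optical
    fixes, then pull the drifting estimate toward each camera measurement."""

    def __init__(self, blend=0.98):
        self.position = np.zeros(3)   # metres, world frame
        self.velocity = np.zeros(3)   # m/s
        self.blend = blend            # weight kept on the IMU prediction per fix

    def predict(self, accel_world, dt):
        """High-rate IMU step (e.g. ~1000 Hz); acceleration has gravity removed."""
        self.velocity += accel_world * dt
        self.position += self.velocity * dt

    def correct(self, optical_position):
        """Low-rate optical fix (e.g. 60 Hz) from LED/camera tracking."""
        self.position = self.blend * self.position + (1 - self.blend) * np.asarray(optical_position)

tracker = FusedTracker()
tracker.predict(np.array([0.0, 0.0, -0.5]), dt=0.001)   # one IMU sample
tracker.correct([0.0, 1.2, -0.3])                        # one camera measurement
```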
Historical Evolution
The history of input devices in VR and AR spans over a century, evolving from early stereoscopic viewers to sophisticated tracking systems and brain-computer interfaces. Key milestones focus on improving interactivity, control precision, and user immersion.
Early Precursors (1838–1980s)
The earliest roots of immersive input can be traced to the 19th century with the invention of the stereoscope by Charles Wheatstone in 1838. This device used twin mirrors to project a separate image to each eye, creating a sense of 3D depth and immersion from static images, establishing the core principle of stereoscopic vision that underpins modern VR headsets.[7]
The first interactive simulators emerged in the early 20th century:
- 1849: David Brewster develops the lenticular stereoscope, the first portable 3D viewer using optics for user interaction with images.[8]
- 1929: Edwin Link creates the Link Trainer flight simulator, an electromechanical flight simulator that responded to a pilot's manipulation of its controls, demonstrating an early form of interactive, simulation-based input.[7]
- 1952: Morton Heilig conceives the Sensorama (patented in 1962), an arcade-style cabinet that stimulated multiple senses, including sight, sound, smell, and touch, through stereoscopic film, speakers, scent emitters, and a vibrating chair.[8][9]
The direct lineage of modern VR input began in the 1960s:
- 1961: Philco Corporation engineers developed the Headsight, the first head-mounted display (HMD), which featured a magnetic motion tracking system where head movements would control a remote camera, allowing for intuitive remote viewing.[7]
- 1963: Ivan Sutherland created the first interactive computer graphics input system with Sketchpad, using a light pen to draw in real time on the TX-2 computer at MIT Lincoln Laboratory.[10]
- 1965: Sutherland conceptualized the "Ultimate Display," a theoretical room that could simulate reality so perfectly that a user could not differentiate it from the real world, including not just visual and auditory simulation but also haptic feedback and interaction with virtual objects.[11]
- 1968: Sutherland and his student Bob Sproull built the first actual VR/AR HMD, nicknamed "Sword of Damocles." The device was connected to a computer and used a mechanical or ultrasonic head-tracking system to update the user's perspective in real-time as they moved their head, marking the first instance of interactive, computer-generated immersive graphics.[12]
- 1969: Myron Krueger develops responsive computer-generated environments, termed "artificial reality."[8]
- 1974–1975: Krueger builds Videoplace, enabling tracker-free interaction in simulated environments and allowing interaction with virtual objects via gestures at University of Wisconsin-Madison.[8][13]
Commercial VR Peripherals Era (1980s–1990s)
The 1980s saw the commercialization of specialized VR input devices:
- 1980: Steve Mann creates a wearable computer with vision overlays for AR input.[14]
- 1977: Daniel Sandin and Thomas DeFanti (with Richard Sayre) invented the Sayre Glove, which used optical flex sensors to track finger movements.[7]
- 1984: Jaron Lanier founds VPL Research, developing the EyePhone HMD and DataGlove for gesture input.[8]
- 1987: Lanier coins "virtual reality"; VPL Research releases the DataGlove, which used fiber optic sensors to detect finger flexure, tracking 256 positions per finger at $9,000 per glove. NASA adopted DataGloves for astronaut training simulations.[15][16]
- 1989: VPL licenses DataGlove technology to Mattel for the Power Glove, bringing gesture input to consumers at roughly $100. Despite nearly one million units sold, poor precision (only 4 finger positions versus the DataGlove's 256) led to commercial failure. VPL also develops the DataSuit for full-body tracking.[8][13]
- 1990: Tom Caudell coins "augmented reality" at Boeing, using HMDs for schematic overlays.[8]
- 1991: Virtuality Group creates VR arcade machines with controllers and trackers; Sega develops VR headset with inertial sensors.[8]
- 1992: Louis Rosenberg develops Virtual Fixtures AR system; Virtuality systems use exoskeleton gloves; CAVE created for multi-user interaction.[8][13]
- 1995: Nintendo's Virtual Boy for home VR gaming; University of Massachusetts develops vision-based tracking.[8]
Modern VR Era (2000s–Present)
- 2000: ARToolKit released for marker-based AR; ARQuake, first mobile AR game, uses HMD, tracker, GPS, and gun controller.[8]
- 2003: Sony's EyeToy for gesture and motion control via camera.[8]
- 2010: Palmer Luckey prototypes Oculus VR HMD with head tracking.[8]
- 2012: Oculus Rift Kickstarter campaign raised $2.4 million and catalyzed the modern VR era; Google Glass for optical AR.[8]
- 2013: Leap Motion Controller brought camera-based hand tracking to consumers as a USB peripheral with two monochromatic infrared cameras tracking hands at 200 frames per second within a 3-foot hemispherical area.[17][18]
- 2015: Google Cardboard uses smartphone sensors; Microsoft HoloLens announced with gesture input; OSVR by Razer for open-source tracking.[13]
- 2016: HTC Vive introduced Lighthouse tracking using base stations that emit infrared laser sweeps, with photosensors on the headset and controllers calculating position from laser timing. This outside-in tracking approach achieved sub-millimeter accuracy across room-scale spaces.[5] Oculus Touch controllers pioneered capacitive touch sensors for finger presence detection, allowing natural hand gestures like pointing or thumbs-up without full hand tracking.[19] PlayStation VR launched.[8]
- 2017: Apple releases ARKit and Google announces ARCore, bringing markerless AR to smartphones;[8] Facebook announces brain-computer interface research; immersive VR therapy for phantom limb pain uses myoelectric controls.[2]
- 2018: Magic Leap One with hand tracking; Leap Motion's Project North Star open-source AR headset.[8]
- 2019: Meta Quest introduced hand tracking via software update in December 2019, making it the first mainstream standalone headset to offer controller-free input. Valve Index Controllers featured 87 sensors per controller tracking hand position, finger positions, motion, and grip pressure, with adjustable hand straps allowing users to completely open their hands without dropping the controllers.[20][21] HoloLens 2 released.[13]
- 2020: Meta Quest 2 launched as a mass-market standalone headset with inside-out tracking and optional controller-free hand tracking.[13]
- 2022: Meta Quest Pro with face/eye tracking and Touch Pro controllers achieving self-tracking with onboard cameras; PlayStation VR2 with haptics.[13]
- 2023: Meta's "Direct Touch" update allowed tapping virtual interfaces directly with the fingers; Meta Quest 3 launched; Apple Vision Pro with eye and hand tracking announced; WebXR matured for browser-based input.[8][22]
- 2024: Apple Vision Pro launched in February 2024 as the first major headset without bundled controllers, validating controller-free as a primary interaction paradigm. The system combines high-precision eye tracking for targeting with pinch gestures for confirmation, processed by the dedicated R1 chip with 12ms latency.[23]
- 2025: AI integration enhances object recognition, natural language, and tracking for intuitive controls.[8]
Core Concepts
Modern VR/AR input is built upon several fundamental concepts that define how users can move and interact within a virtual space.
Degrees of Freedom (DoF)
Degrees of freedom refers to the number of ways a rigid body can move in 3D space. This is a critical concept for understanding the capabilities of a VR system's tracking.[24]
- Three Degrees of Freedom (3DoF): This allows for the tracking of rotational movement only, tracking when a user looks up/down (pitch), left/right (yaw), and tilts their head side-to-side (roll). A 3DoF headset or controller can track these rotations but cannot track the user's physical movement through space. Early mobile VR headsets like the Google Cardboard and Samsung Gear VR were 3DoF systems.[25]
- Six Degrees of Freedom (6DoF): This tracks both rotational and translational movement. In addition to the three rotational axes, 6DoF systems track translation forward/backward (surge), left/right (sway), and up/down (heave). This allows a user to physically walk around, duck, and lean within the virtual environment, which is essential for true immersion and is the standard for modern VR systems like the Meta Quest 3 and Valve Index.[26]
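To make the distinction concrete, the sketch below (Python with NumPy and SciPy, illustrative values only) represents a 6DoF pose as a position plus an orientation and uses it to bring a controller-local point into room coordinates; a 3DoF device would supply only the rotation part.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# 6DoF pose = 3 translational DoF (position) + 3 rotational DoF (orientation).
position = np.array([0.1, 1.4, -0.3])                          # metres, room frame
orientation = R.from_euler("xyz", [10, 45, 0], degrees=True)   # pitch, yaw, roll

# Bring a point defined in the controller's local frame (the tip of a virtual
# pointer 20 cm in front of the grip) into room coordinates.
local_tip = np.array([0.0, 0.0, -0.2])
room_tip = orientation.apply(local_tip) + position

# A 3DoF device reports only `orientation`; `position` must be assumed fixed,
# so the user can look around but cannot physically move through the scene.
print(room_tip)
```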
Tracking Technologies
Positional tracking is the technology that enables 6DoF by determining the real-time position and orientation of the headset and controllers. There are two primary methods for achieving this.[27]
- Outside-In Tracking: This method uses external sensors (for example cameras or infrared emitters called "base stations" or "lighthouses") placed in the physical environment to track the position of the headset and controllers. These external sensors monitor markers (often infrared LEDs) on the tracked devices. Systems like the original HTC Vive, Oculus Rift CV1, and the Valve Index use outside-in tracking. This method can provide highly accurate and stable tracking but requires a more complex setup and a dedicated play space.[28][29]
- Inside-Out Tracking: This method places the tracking sensors, typically cameras, directly on the headset itself. These cameras observe the surrounding environment and use computer vision algorithms (such as simultaneous localization and mapping) to calculate the headset's position and orientation relative to fixed points in the room.[30] Controllers are tracked by these same headset cameras observing their infrared LEDs. This approach is used by all modern standalone headsets, such as the Meta Quest series and the Pico 4, as it eliminates the need for external hardware, making setup much simpler and allowing the system to be used in any location.[31]
- Self-Tracking (Inside-Out on Controller): A newer hybrid approach places cameras directly onto the controllers themselves, as seen with the Meta Quest Touch Pro controllers. Each controller has its own onboard cameras and a Qualcomm Snapdragon 662 processor per controller, allowing it to track its own position in 3D space independently of the headset's cameras. This provides more robust tracking, preventing loss of tracking when the controllers are outside the headset's field of view (for example behind the user's back).[32]
Input Modalities
VR and AR input combine multiple modalities for effective interaction, including:
- Gestures: Recognizing predefined or natural hand movements.
- Buttons and Triggers: Physical inputs on controllers for discrete actions.
- Haptics: Tactile feedback to simulate touch or resistance.
- Voice Commands: Speech recognition for command input.
- Gaze: Eye tracking for targeting and selection.
Types of Input Methods
Input methods in VR and AR are categorized based on sensors, vision, sound, mind, and multimodal combinations. These enable detection of human effectors like hands, eyes, voice, and brain signals.[2][3]
Motion-Tracked Controllers
Motion controllers are handheld devices that translate the user's hand and finger movements into the virtual environment.[33] They are the most common input method for 6DoF VR, typically featuring a combination of buttons, triggers, thumbsticks, and tracking sensors.[34]
| Feature | Meta Quest 2 Touch | Meta Quest Touch Plus | Meta Quest Touch Pro | Valve Index Controller | PlayStation VR2 Sense |
|---|---|---|---|---|---|
| Primary System(s) | Meta Quest 2 | Meta Quest 3 | Meta Quest Pro, Quest 2/3 | Valve Index, any SteamVR system | PlayStation VR2 with PlayStation 5 |
| Tracking Method | Inside-out (via headset)[35] | Inside-out (via headset)[36] | Self-contained inside-out[37] | Outside-in (Lighthouse)[20] | Inside-out (via headset)[38] |
| Weight (w/ battery) | ~150 g[39] | ~126 g[40] | ~164 g[41] | ~196 g[39] | ~168 g[39] |
| Haptics | Standard vibrotactile | TruTouch variable haptics[42] | Localized TruTouch haptics[37] | HD LRA haptics[20] | Advanced haptics, Adaptive Triggers[43] |
| Finger Sensing | Capacitive (thumb, index)[44] | Capacitive (thumb, index)[42] | Capacitive (thumb, index), Precision pinch[37] | Full 5-finger tracking, Grip force[20] | Capacitive (thumb, index, middle)[43] |
| Key Features | Tracking ring, ergonomic grip | Ringless design, improved haptics | Self-tracking, Stylus tip, Rechargeable | Hand strap for open-hand interaction, per-finger tracking | Adaptive triggers, advanced haptics |
| Power Source | 1x AA Battery (~30 hrs)[40] | 1x AA Battery (~30 hrs)[40] | Rechargeable (~8 hrs)[45] | Rechargeable (7+ hrs)[20] | Rechargeable (~4-5 hrs)[38] |
Meta Quest Touch Controllers
The Meta Quest ecosystem features multiple controller generations. Touch Plus controllers (2023) for Quest 3 eliminated the tracking ring, placing infrared LEDs directly on the controller face. Hybrid tracking combines optical LED detection when in camera view with IMU motion sensing and AI-enhanced hand tracking fusion when occluded. TruTouch variable haptics provide realistic sensations from subtle taps to heavy impacts.[46]
Touch Pro controllers (2022) for Quest Pro achieved self-tracking with onboard cameras and a Qualcomm Snapdragon 662 processor per controller. This eliminates dependence on headset line-of-sight, enabling reliable tracking when controllers are behind the user. The pressure sensor enables pinch detection and stylus tip capability for precision drawing.[47][22]
Valve Index Controllers
Valve Index Controllers demonstrate premium capability with 87 sensors including capacitive sensors detecting each finger's position, analog pressure sensing on grip measuring squeeze force from 0-100%, and force-sensitive triggers. The 1100mAh battery provides 7+ hours with USB-C fast charging.[20]
Lighthouse 2.0 tracking achieves submillimeter positional accuracy by detecting precise timing of laser sweeps from base stations. Each base station emits horizontal and vertical infrared laser planes at known rotation rates. When lasers hit controller sensors, the device calculates exact 3D position from sweep timing. Base stations support tracking volumes up to 33 feet by 33 feet.[21]
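The timing-to-angle relationship at the heart of this scheme can be sketched in a few lines of Python (the rotor rate and sync scheme below are illustrative, not Valve's exact parameters): the elapsed time between the sync event and the moment a photodiode sees the laser, multiplied by the rotor's angular velocity, gives the sweep angle, and the horizontal and vertical angles together define a ray from the base station to that sensor.

```python
import math

SWEEPS_PER_SECOND = 60.0                    # illustrative rotor rate
OMEGA = 2 * math.pi * SWEEPS_PER_SECOND     # angular velocity of the laser plane, rad/s

def sweep_angle(t_sync, t_hit):
    """Angle of the laser plane when it crossed a photodiode, measured from the
    sync event. One horizontal and one vertical angle per base station define a
    ray to the sensor; rays to several sensors of known geometry fix the pose."""
    return OMEGA * (t_hit - t_sync)

# A photodiode on a controller sees the sweep 2.5 ms after the sync event:
print(math.degrees(sweep_angle(0.0, 0.0025)))   # ~54 degrees with these numbers
```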
The adjustable hand strap allows users to completely open their hands during gameplay, enabling natural throwing, catching, and two-handed weapon handling. This makes Index Controllers preferred by VR enthusiasts despite $279 price and external base station requirement.[20]
PlayStation VR2 Sense Controllers
PlayStation VR2 Sense controllers adapted PlayStation 5 DualSense technology for VR with adaptive triggers featuring variable resistance. The R2 and L2 triggers simulate tension of drawing a bowstring, resistance of pulling a trigger, or pressure of squeezing a brake. Dedicated haptic actuators deliver tailored sensations including impact of raindrops, texture of surfaces, and recoil of weapons.[48]
Inside-out tracking via four cameras on the PSVR2 headset captures LED tracking rings, with 6-axis motion sensing providing continuous updates. Sony announced hand tracking support at SIGGRAPH 2024, positioning PSVR2 as the first PlayStation system offering controller-free gameplay.[38]
Other Controller Systems
HTC Vive controllers evolved through multiple generations. Original Vive wand controllers (2016) featured 24 sensors with circular trackpads tracked by Lighthouse 1.0. Vive Pro controllers (2018) added Lighthouse 2.0 compatibility for 10-meter tracking volumes. Cosmos controllers (2019) shifted to inside-out tracking with thumbsticks and face buttons.[49]
Windows Mixed Reality controllers (2017) established Microsoft's specification for OEM partners including Acer, HP, Lenovo, Samsung, Dell, and Asus. The design combined Vive-style circular touchpads with Touch-style thumbsticks, tracked by visible-light LEDs on circular rings.[49]
Hand and Finger Tracking
Controller-free hand tracking allows users to interact with virtual environments using only their natural hand movements, without holding any physical device.[50] This technology is primarily camera-based.
Camera-Based Vision Systems
Modern hand tracking relies on computer vision algorithms processing camera feeds in real-time. Meta Quest hand tracking uses headset cameras with machine learning models trained on millions of hand images to generate 26-point skeletal hand models at 30-90Hz. The Hands 2.2 update delivered 40% latency reduction through optimized neural networks.[51][52]
Ultraleap (formerly Leap Motion) uses two infrared cameras and infrared LEDs illuminating hands with near-infrared light. The computer vision pipeline employs a Single Shot Detector neural network for palm detection, then a regression model outputs 3D coordinates for 21 keypoints per hand. The system tracks fingers even when partially hidden through predictive modeling.[18][53]
Apple Vision Pro employs high-resolution cameras transmitting over one billion pixels per second processed by the R1 chip within 12ms. Multiple infrared flood illuminators with camera arrays track hands from various angles, enabling reliable detection when hands overlap. The privacy-first architecture requires apps to explicitly request hand structure permissions.[23]
Computer Vision Algorithms
MediaPipe Hands, Google's open-source solution, demonstrates state-of-the-art pose estimation. The two-stage pipeline runs lightweight palm detection followed by regression predicting 21 3D hand landmarks. The model achieves real-time performance on mobile devices using efficient MobileNet architectures.[54]
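The same two-stage pipeline can be exercised from Python through the legacy MediaPipe Solutions API; the sketch below runs it on a desktop webcam (a stand-in for headset cameras) and applies a naive pinch heuristic to the 21 returned landmarks. The threshold and webcam setup are illustrative only.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Palm detection followed by 21-landmark regression, as described above.
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    cap = cv2.VideoCapture(0)                      # desktop webcam stand-in
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for hand in results.multi_hand_landmarks or []:
            thumb = hand.landmark[mp_hands.HandLandmark.THUMB_TIP]
            index = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            # Naive pinch gesture: thumb and index tips close in normalized image space.
            if ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5 < 0.05:
                print("pinch")
    cap.release()
```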
Advanced approaches combine Tracking-by-Detection fusing Kernelized Correlation Filters for frame-to-frame tracking with Single Shot Detection for recovery from failures. Deep learning methods extract features using Convolutional Neural Networks, while classical techniques like skin color segmentation, optical flow, and depth sensing from Time-of-Flight sensors provide complementary information.[55]
Hand tracking is ideal for social VR, menu navigation, and applications where intuitive, simple gestures like pointing, pinching, and grabbing are sufficient. However, it currently has several limitations compared to physical controllers: it lacks the tactile feedback of a button press or trigger pull, making interactions feel less tangible; its precision can be lower, especially for fast movements; and tracking can be lost if hands are occluded from the camera's view or move outside the tracking zone.[56]
Haptic Gloves
HaptX Gloves G1 feature 135 microfluidic actuators providing true contact haptics with 0.9mm spatial resolution on fingers. The system delivers up to 40 pounds resistive force per hand through an integrated force feedback exoskeleton. Proprietary magnetic motion capture tracks all hand degrees of freedom. At $5,495 per pair, HaptX targets enterprise training applications.[57]
bHaptics TactGlove DK2 (2023) offers an affordable alternative at $269 per pair, with twelve HD linear resonant actuators at the fingertips and wrists. The soft elastic material achieves 90% of bare-hand tracking performance with Meta Quest 3.[58]
SenseGlove Nova 2 (2023) introduced Active Contact Feedback in palm, complementing force feedback on fingers. The $5,000-7,000 enterprise solution uses four sensors for finger tracking with external SteamVR trackers for hand position. The Royal Netherlands Army, NASA, Emirates, and Procter & Gamble employ Nova 2 for training.[59]
Carnegie Mellon University's Fluid Reality haptic gloves (2024) use electroosmotic pumps enabling 0.2kg weight versus 17kg for alternatives. Thirty-two independent pressure actuators per finger pad fit in penny-sized arrays. Estimated commercial pricing around "a few hundred dollars" could bring haptic gloves to consumer VR.[60]
Eye Tracking
Eye tracking in VR/AR employs infrared LEDs and cameras arranged between eyes and displays. Invisible infrared light projects patterns onto eyes, with cameras capturing pupil center and corneal reflections. Machine learning algorithms process images at 100-200Hz to calculate gaze direction, pupil size, and eye openness.[61]
Eye tracking serves several key functions as an input modality:
- Gaze-Based Interaction: It allows for a fast and intuitive way to target and select objects or UI elements. A user can simply look at a button and then perform a confirmation action (like a hand pinch or controller button press) to activate it, which can significantly speed up interaction compared to pointing with a controller (see the sketch after this list).[62]
- Foveated Rendering: This is a powerful optimization technique that leverages how human vision works. The human eye only sees a very small area (the fovea) in high detail at any given moment. With eye tracking, the VR system can render only the part of the scene the user is directly looking at in full resolution, while progressively lowering the resolution in the peripheral vision. This can lead to massive performance savings (up to 70%) without the user perceiving any loss in visual quality, allowing for more complex graphics on less powerful hardware.[63][64]
- Social Presence: By tracking eye movements, blinks, and pupil dilation, avatars in social VR can replicate a user's expressions much more realistically, leading to more natural and engaging social interactions.[62]
- Analytics: In training, research, and marketing, eye tracking provides invaluable data on user attention and behavior, showing what users look at, in what order, and for how long.
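A minimal version of gaze-based targeting can be written as a ray test: cast a ray from the eye along the reported gaze vector, pick the nearest object whose bounds it hits, and activate it when a separate confirmation event (a pinch or button press) arrives. The Python sketch below is a conceptual illustration with hypothetical object data, not any vendor's API.

```python
import numpy as np

def gaze_target(eye_origin, gaze_dir, objects):
    """Nearest object whose bounding sphere the gaze ray intersects.
    `objects` is a list of (name, centre, radius); `gaze_dir` must be unit length."""
    best, best_t = None, float("inf")
    for name, centre, radius in objects:
        to_centre = centre - eye_origin
        t = float(np.dot(to_centre, gaze_dir))        # distance to closest approach
        if t < 0:
            continue                                  # object is behind the user
        miss = np.linalg.norm(to_centre - t * gaze_dir)
        if miss <= radius and t < best_t:
            best, best_t = name, t
    return best

objects = [("play_button", np.array([0.0, 1.5, -2.0]), 0.10),
           ("settings",    np.array([0.4, 1.5, -2.0]), 0.10)]
gaze_dir = np.array([0.0, -0.05, -1.0])
gaze_dir /= np.linalg.norm(gaze_dir)
looked_at = gaze_target(np.array([0.0, 1.6, 0.0]), gaze_dir, objects)
pinch_detected = True                                 # stand-in for a hand-tracking event
if looked_at and pinch_detected:
    print(f"activate {looked_at}")                    # -> activate play_button
```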
Tobii dominates commercial VR eye tracking, providing technology for PlayStation VR2, HTC Vive Pro Eye, Pimax Crystal, and Varjo headsets. Integration enables foveated rendering, concentrating GPU resources on high-resolution foveal region while rendering periphery at lower detail. PlayStation VR2 achieves 3.6x faster GPU performance through foveated rendering.[65][66]
Apple Vision Pro's eye tracking serves as the primary targeting mechanism, functioning like a mouse cursor. High-performance infrared cameras and LEDs project patterns that are analyzed between display frames. Accuracy reaches 1.11 degrees in mixed reality mode and 0.93 degrees in VR mode within the central field of view. The "look and pinch" interaction model eliminates the need for physical pointing.[67][68]
HTC Vive Focus Vision (2024) integrated eye tracking as standard feature with 1-degree accuracy, using it for automatic interpupillary distance adjustment. Foveated rendering support and gaze input for UI complement hand tracking and controllers.[69]
Haptic Feedback Technology
Haptic technology provides the sense of touch, applying forces, vibrations, or motions to the user to simulate interactions with virtual objects.[70] It is a critical component for immersion, providing the physical confirmation that an action has occurred, such as feeling the impact of a virtual sword or the texture of a surface.
- Vibrotactile Feedback: This is the most common form of haptics, using small motors to create vibrations. Modern controllers use Linear Resonant Actuators (LRAs) or Voice Coil Actuators (VCAs) to produce more precise and varied vibrations than the older Eccentric Rotating Mass (ERM) motors found in older gamepads.[71]
- Kinesthetic (Force) Feedback: This type of feedback applies resistive forces to the user's body, simulating weight, inertia, and solidity. For example, a force-feedback joystick might resist being pushed, or a haptic glove might stop the user's fingers from closing when they grab a solid virtual object.[72] This is technologically complex and often requires large, grounded robotic arms or exoskeleton devices.
- Tactile Feedback: This aims to simulate more subtle sensations like surface texture, pressure, temperature, and slippage. This is an active area of research, with various emerging technologies:
- Microfluidics: Used in devices like HaptX Gloves, which have tiny inflatable pockets (actuators) that are rapidly filled with air or liquid to create pressure points on the skin, simulating the shape and texture of an object.[73]
- Electrotactile Stimulation: Applies small electrical currents to the skin to stimulate nerve endings, creating a variety of tactile sensations.[74]
- Ultrasonic Haptics: Uses arrays of ultrasonic transducers to focus sound waves in mid-air, creating pressure points that a user can feel on their bare skin without any wearable device.[75]
- Thermal Feedback: Uses Peltier elements to rapidly heat or cool a surface that is in contact with the user's skin, simulating touching hot or cold objects.[76]
Devices range from the integrated haptics in controllers to specialized haptic gloves, vests, and full-body haptic suits that provide more comprehensive feedback.[71]
Voice Input
Voice input relies on automatic speech recognition to convert spoken words into text, combined with natural language processing to infer user intent. Modern systems employ cloud-based or on-device processing and wake words such as "Hey Meta" on Meta Quest or Cortana on Microsoft HoloLens.[77]
Meta Quest voice commands enable over 100 commands including "Take a picture," "Start casting," and "Open [app name]." The Meta AI assistant introduced in 2024 extends capabilities to natural language queries.[78]
Microsoft HoloLens pioneered the "See It, Say It" model where voice-enabled buttons display tooltips when gazed at. Commands include hologram manipulation ("Bigger," "Face me"), device control ("Brightness up," "Volume down"), and queries ("What's my IP address?"). Dynamics 365 Remote Assist uses voice for hands-free field service.[79]
Cirrus Logic's SoundClear technology provides hardware foundation with low-power, always-on voice processors featuring multi-mic noise reduction and wake word recognition from one foot to across-room distances.[80]
Voice input leverages the built-in microphones in most VR headsets to allow for hands-free control and interaction. By using voice commands, users can navigate menus, search for content, dictate text, and control applications without using controllers.[81] The technology pipeline involves a speech recognition engine to transcribe spoken words into text, followed by a Natural Language Processing (NLP) model to interpret the user's intent from that text.[82]
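As a schematic of that pipeline's final stage, the sketch below assumes an ASR engine has already produced a transcript and maps it to an intent using a hypothetical wake word and a tiny regular-expression command grammar; shipping assistants use full NLP models rather than patterns like these.

```python
import re

WAKE_WORD = "hey headset"     # hypothetical wake word, for illustration only
COMMANDS = {
    r"take a (picture|photo)":        lambda m: "capture_screenshot",
    r"open (?P<app>[\w ]+)":          lambda m: f"launch:{m.group('app').strip()}",
    r"(volume|brightness) (up|down)": lambda m: f"adjust:{m.group(1)}:{m.group(2)}",
}

def handle_utterance(transcript: str):
    """Match an already-transcribed utterance against a small command grammar."""
    text = transcript.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None                               # ignore speech without the wake word
    text = text[len(WAKE_WORD):].strip(" ,")
    for pattern, action in COMMANDS.items():
        m = re.fullmatch(pattern, text)
        if m:
            return action(m)
    return "fallback:send_to_assistant"           # unmatched queries go to an assistant

print(handle_utterance("Hey headset, open browser"))   # -> launch:browser
```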
Body Tracking
Full-body tracking extends immersion beyond head and hands.
Full-Body Tracking
While most VR systems natively track the head and hands, full-body tracking aims to capture the movement of the entire body, including the torso, legs, and feet, for a more complete and expressive avatar representation.
- Marker-based Tracking: This is the traditional method used in motion capture for film and games. It involves the user wearing a suit covered in reflective markers, which are tracked by multiple external infrared cameras. While highly accurate, it is expensive and complex.[2]
- Accessory-based Tracking: HTC Vive Tracker 3.0 attaches to body parts via elastic straps, tracked by SteamVR Lighthouse 2.0 with submillimeter accuracy. At 33% smaller and 15% lighter with 7.5-hour battery life, the tracker enables 6DOF tracking of feet, waist, chest, elbows, or shoulders. VRChat supports up to 11 tracking points for full-body avatar representation.[83]
- Markerless Tracking: This emerging method uses computer vision and AI to estimate a user's body pose directly from camera data, without requiring any markers or additional trackers. This can be done with external depth-sensing cameras (like the Microsoft Kinect) or, increasingly, with the cameras already on the VR headset itself.[84][85]
Vive Ultimate Tracker (2024) eliminated base station requirement through self-tracking with onboard cameras. Two wide-angle cameras per tracker enable 6DOF inside-out tracking, with up to five trackers connecting wirelessly.[86]
SlimeVR pioneered affordable IMU-based full-body tracking using 9-axis sensors (accelerometer, gyroscope, magnetometer) sending rotation data via 2.4GHz WiFi. A 5-tracker lower-body set includes chest, two thighs, and two ankles for approximately $200 with 10-15 hour battery life. IMU tracking avoids occlusion issues but suffers from yaw drift requiring periodic recalibration.[87]
HaritoraX 2 (2024) improved IMU tracking with built-in LiDAR sensors in ankle trackers detecting foot position relative to floor, plus geomagnetic compensation reducing rotational drift. Ultra-compact sensors enable up to 50 hours battery life.[88]
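The drift problem and its correction can be sketched as a one-axis complementary filter (Python, illustrative gain): the gyroscope yaw rate is integrated for responsiveness, and each update nudges the estimate toward an absolute heading reference such as a magnetometer, keeping long-term drift bounded. Real trackers fuse all three axes and handle magnetic disturbance rejection.

```python
import math

class YawEstimator:
    """One-axis drift compensation: integrate the gyro yaw rate (fast, drifts),
    then nudge toward the magnetometer heading (slow, absolute) each update."""

    def __init__(self, alpha=0.02):
        self.yaw = 0.0        # radians
        self.alpha = alpha    # correction strength per update (illustrative)

    def update(self, gyro_yaw_rate, mag_heading, dt):
        self.yaw += gyro_yaw_rate * dt
        # Shortest angular difference between the magnetometer heading and our estimate
        error = math.atan2(math.sin(mag_heading - self.yaw),
                           math.cos(mag_heading - self.yaw))
        self.yaw += self.alpha * error
        return self.yaw

est = YawEstimator()
print(est.update(gyro_yaw_rate=0.1, mag_heading=0.0, dt=0.01))
```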
Research validates tracking accuracy. HTC Vive achieves approximately 2mm positional error and less than 1-degree orientation error. Oculus Quest 2 inside-out tracking shows 1.66mm ± 0.74mm translation accuracy and 0.34 ± 0.38 degrees rotation accuracy, comparable to external tracking systems.[89]
Locomotion
Locomotion refers to the methods used to move around within a virtual environment that is larger than the physical play space.
- Physical Locomotion:
- Room-scale: Users physically walk around a defined, tracked area. This is the most immersive method but is limited by the size of the physical room.
- Omnidirectional treadmills: These devices allow a user to walk, run, and jump in any direction while remaining in a fixed spot. They typically consist of a low-friction, concave platform where the user, wearing special shoes, can slide their feet to simulate walking. Sensors track the foot movement and translate it into in-game motion. Companies like KAT Walk and Virtuix are leading providers of consumer and arcade-level treadmills.[90][91]
- Artificial Locomotion:
- Teleportation: Users point a controller to a desired location and instantly appear there. This method is highly effective at preventing simulator sickness but can break the sense of presence and spatial awareness.[92]
- Smooth Locomotion: Users glide through the environment using the thumbstick on their controller, similar to a traditional first-person video game. While this provides a continuous sense of movement, it is a primary cause of simulator sickness for many users due to the disconnect between visual motion and the body's vestibular system.[92]
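Both artificial techniques reduce to small pieces of vector math, sketched below in Python (axis conventions and speeds are illustrative and vary by engine): smooth locomotion rotates the thumbstick vector by the user's head yaw so "forward" follows the facing direction, while a teleport pointer intersects the controller's aim ray with the floor plane.

```python
import numpy as np

def locomotion_delta(stick_x, stick_y, head_yaw, speed=2.0, dt=1 / 72):
    """Head-relative smooth locomotion step: ~2 m/s at a 72 Hz frame rate.
    The thumbstick vector is rotated by the head yaw so pushing forward always
    moves in the facing direction; vertical motion is left untouched."""
    forward = np.array([np.sin(head_yaw), 0.0, np.cos(head_yaw)])
    right = np.array([np.cos(head_yaw), 0.0, -np.sin(head_yaw)])
    return (right * stick_x + forward * stick_y) * speed * dt

def teleport_destination(aim_origin, aim_dir, floor_y=0.0):
    """Teleport pointer: intersect the controller's aim ray with a flat floor."""
    if aim_dir[1] >= 0:
        return None                                   # aiming level or upward: no hit
    t = (floor_y - aim_origin[1]) / aim_dir[1]
    return aim_origin + t * aim_dir

print(teleport_destination(np.array([0.0, 1.2, 0.0]), np.array([0.0, -0.5, -1.0])))
```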
Brain-Computer Interfaces
Brain-computer interfaces detect electrical signals from the brain or nervous system, translating neural activity into digital commands. Non-invasive BCIs use electroencephalography to measure brain waves from scalp electrodes, while invasive approaches implant electrodes in brain tissue. Electromyography offers a middle ground, measuring muscle activation signals from skin-surface sensors.[93]
Meta's EMG wristband (developed by acquired CTRL-labs) detects electrical signals from forearm muscles as motor neurons transmit movement commands. Signals are detected before fingers physically move, enabling negative latency. A July 2024 Nature paper demonstrated machine learning models working without user-specific calibration, the first generalizable neural interface.[94][95]
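As a toy illustration of how muscle signals become gestures, the sketch below computes per-channel root-mean-square features over a short EMG window and assigns the nearest gesture centroid; this is a deliberately simple stand-in for the deep, cross-user models described in the paper, and all data shapes are hypothetical.

```python
import numpy as np

def rms_features(window):
    """window: (n_samples, n_channels) raw EMG; returns one RMS value per electrode."""
    return np.sqrt(np.mean(np.square(window), axis=0))

class NearestCentroidGestures:
    """Toy classifier: average RMS feature vector per gesture, then pick the
    closest centroid at prediction time."""

    def fit(self, windows, labels):
        feats = np.array([rms_features(w) for w in windows])
        labels = np.array(labels)
        self.classes = sorted(set(labels))
        self.centroids = np.array([feats[labels == c].mean(axis=0) for c in self.classes])
        return self

    def predict(self, window):
        dists = np.linalg.norm(self.centroids - rms_features(window), axis=1)
        return self.classes[int(np.argmin(dists))]

# Hypothetical 200-sample, 16-channel windows labelled by gesture.
rng = np.random.default_rng(0)
train = [rng.normal(0, s, (200, 16)) for s in (0.2, 0.2, 1.0, 1.0)]
clf = NearestCentroidGestures().fit(train, ["rest", "rest", "pinch", "pinch"])
print(clf.predict(rng.normal(0, 1.0, (200, 16))))     # -> pinch
```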
Mark Zuckerberg stated neural wristbands will ship "in the next few years," with leaked roadmaps indicating 2025-2027 launch alongside third-generation Ray-Ban smart glasses. The wristband enables handwriting in air, typing on surfaces, and precise finger tracking in any lighting without cameras.[96]
Valve and OpenBCI collaborated on the Galea headset (beta 2022), integrating EEG, EMG, EOG, EDA, PPG, and Tobii eye tracking into a modified Valve Index. The open-source platform enables passive BCIs that monitor user state for adaptive VR experiences.[97]
EMOTIV offers consumer/professional headsets including EPOC X (14-channel EEG), Insight (5-channel), and MN8 (32-channel research cap). The EmotivBCI software enables direct brain-computer interfacing with real-time monitoring of attention, workload, emotions, and stress.[98]
Neuralink received FDA approval in 2023 and implanted its first human patient in January 2024; the patient controls a laptop cursor and plays video games by thought. Synchron takes a less invasive approach, with 2024 demonstrations showing compatibility with Apple Vision Pro for thought-controlled VR/AR.[99]
Specialized and Traditional Peripherals
For certain applications, especially simulations, specialized peripherals offer a level of immersion and control that general-purpose motion controllers cannot match.
- Flight Simulators: HOTAS (Hands On Throttle-And-Stick) systems, which replicate the joystick and throttle controls of an aircraft, are essential for flight simulation in VR. Popular models include the Thrustmaster HOTAS Warthog and T.Flight series.[100][101]
- Racing Simulators: A racing wheel and pedal set is crucial for a realistic driving experience. High-end models from companies like Thrustmaster and MOZA feature powerful force feedback motors that simulate the torque on the steering wheel and the feel of the road.[102][103] These are often mounted in dedicated racing cockpits for maximum stability and immersion.[104]
- Traditional Peripherals: Keyboard and mouse and traditional gamepads can still be used in VR, typically for seated experiences, ports of non-VR games, or for productivity tasks. Some platforms, like Meta Quest, have begun to integrate tracking for specific models of physical keyboards (for example the Logitech K830), allowing users to see a virtual representation of their keyboard and hands while typing, which greatly improves usability for work and text entry in VR.[105][106]
Applications of Input Across Industries
The diversity of input methods in XR has enabled a wide range of applications beyond gaming, transforming how professionals in various fields train, design, and interact with digital data.
Gaming and Entertainment
Gaming remains the primary driver of the consumer VR market, and input methods are integral to defining gameplay experiences. Motion controllers allow for direct, physical interaction, making games like Beat Saber (slashing blocks with virtual sabers) and Half-Life: Alyx (manipulating objects, reloading weapons, and solving puzzles with virtual hands) highly immersive.[107] Specialized peripherals cater to dedicated simulation genres; flight simulators like Microsoft Flight Simulator are best experienced with a HOTAS setup, while racing games like Assetto Corsa achieve maximum realism with a force-feedback racing wheel and pedals.[108]
Training and Simulation
VR provides a safe, cost-effective, and repeatable environment for training in high-stakes professions. The input method is chosen to best replicate the real-world task.
- Healthcare and Medical Training: Surgeons use VR simulations with advanced haptic devices to practice complex procedures. These systems can simulate the resistance and texture of different human tissues, allowing for realistic practice without risk to patients.[109][110] VR is also used for therapy, such as treating phobias or PTSD, by exposing patients to triggering stimuli in a controlled environment.[111]
- Aerospace and Military: Flight simulation is one of the oldest and most mature applications of VR. Pilots train in highly realistic virtual cockpits, often using exact replicas of the physical HOTAS controls and panels.[112] Similarly, military forces use VR for tactical training, combat simulations, and vehicle operation.[109]
- Industrial and Technical Training: VR allows workers to learn how to operate heavy machinery, perform maintenance on complex equipment, or practice assembly line tasks in a virtual factory. This hands-on learning in a risk-free environment improves skill retention and safety.[113][110]
Creative and Design Tools
VR is transforming digital content creation by moving it from 2D screens into an immersive 3D space.
- 3D Art and Sculpting: Applications like Adobe Substance 3D Modeler (successor to Oculus Medium) and Tilt Brush allow artists to use motion controllers to sculpt, paint, and create in three dimensions. This provides a more intuitive and physical connection to the creative process, akin to working with real-world materials.[114]
- Architecture and Industrial Design: Architects, engineers, and designers use VR to visualize their creations at a 1:1 scale. By "walking through" a virtual building or examining a digital prototype of a car, they can gain a much deeper understanding of space, scale, and ergonomics.[115] Using motion controllers, designers can directly manipulate elements of the model, enabling a rapid and iterative design process from within the virtual environment itself.[116]
Accessibility
The variety of input modalities in XR offers new avenues for accessibility. For users with mobility impairments who cannot use traditional controllers, alternative inputs like voice commands and eye tracking can provide full control over the virtual environment. Gaze-based selection can replace hand pointing, and voice commands can execute complex actions, making immersive experiences accessible to a wider audience.[81][117]
Challenges and Design Considerations
Despite rapid advancements, designing effective and comfortable input for VR and AR presents unique challenges that are not present in traditional 2D interface design. These challenges span human physiology, technical limitations, and new design paradigms.
Human Factors and User Comfort
- Simulator Sickness: Often cited as the biggest barrier to VR adoption, simulator sickness is a form of motion sickness that occurs when there is a conflict between the visual motion perceived by the eyes and the lack of physical motion detected by the body's vestibular system.[118] This is most commonly caused by artificial locomotion methods (like smooth locomotion with a thumbstick) and can be exacerbated by low frame rates or high latency. Input design principles to mitigate this include prioritizing teleportation over smooth locomotion, avoiding artificial camera acceleration, and ensuring the application maintains a consistently high frame rate (typically 90 Hz or higher).[119][120]
- Ergonomics and Physical Fatigue: Unlike using a mouse, VR input often requires physical movement of the arms, hands, and body. Heavy or poorly balanced controllers can lead to arm and wrist strain over long sessions.[34] Controller-free hand tracking can lead to "gorilla arm" syndrome, where users become fatigued from holding their arms up in the air to interact with interfaces. Good design practice involves placing frequently used UI elements within a comfortable, resting range of motion.[118]
Technical Hurdles
- Tracking Fidelity and Occlusion: While modern tracking systems are robust, they are not flawless. Inside-out systems can lose track of controllers when they are held outside the headset cameras' field of view (for example behind the back or too close to the face).[30] Hand tracking can be unreliable during fast movements, complex finger interactions, or when one hand occludes the other.[56] These tracking failures can break immersion and cause user frustration.
- Haptic Fidelity and Cost: The haptic feedback in most consumer VR controllers is limited to simple vibrations. Creating realistic tactile sensations, such as the texture of a surface, the weight of an object, or the precise feeling of pressure, is extremely challenging.[121] Advanced haptic devices like force-feedback exoskeletons or microfluidic gloves exist, but they are currently very expensive, bulky, and largely confined to research and enterprise applications.[122]
- Hardware Constraints: Standalone VR headsets operate under significant power and thermal constraints. The onboard processing power limits the complexity of the physics simulations, the number of tracked objects, and the sophistication of the rendering, which in turn affects the realism of interactions. Limited battery life also curtails the duration of untethered VR sessions.[119][123]
Interaction Design Paradigms
Designing a user interface (UI) for a 3D space requires a fundamental rethinking of principles from 2D design.
- Spatial UI: UI elements cannot be fixed to the screen; they must exist within the 3D world. Designers must consider the optimal placement of menus and information to be within the user's "comfort zone", typically a 94° horizontal and 32° vertical arc in front of the user, and at a comfortable viewing distance (generally between 0.5 meters and 10 meters) to avoid eye strain and maintain stereoscopic depth perception (see the sketch after this list).[118]
- Interaction Abstraction: A core challenge is deciding on the level of abstraction for an interaction. A "natural" interaction, like picking up an object with tracked hands, is intuitive but can be imprecise and lacks tactile feedback. An "abstract" interaction, like pressing a button to grab an object, is reliable and provides clear feedback but is less immersive.[124] Designers must constantly balance the trade-offs between intuitiveness, reliability, and user comfort for every interaction.
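The comfort-zone figures above translate directly into a placement check; the sketch below (Python, head-relative coordinates with -Z forward, thresholds taken from the numbers cited) tests whether a proposed UI anchor falls inside the recommended arcs and distance band.

```python
import math

def in_comfort_zone(x, y, z, h_arc_deg=94.0, v_arc_deg=32.0,
                    min_dist=0.5, max_dist=10.0):
    """Check whether a UI anchor point (head-relative metres, -Z forward) sits
    inside the comfort zone: within the horizontal/vertical arcs and at a
    comfortable stereoscopic viewing distance."""
    dist = math.sqrt(x * x + y * y + z * z)
    if not (min_dist <= dist <= max_dist) or z >= 0:
        return False
    yaw = math.degrees(math.atan2(x, -z))      # horizontal angle off the gaze axis
    pitch = math.degrees(math.atan2(y, -z))    # vertical angle off the gaze axis
    return abs(yaw) <= h_arc_deg / 2 and abs(pitch) <= v_arc_deg / 2

print(in_comfort_zone(0.3, -0.1, -1.5))   # slightly right and below, 1.5 m away -> True
```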
Technical Standards
OpenXR
The Khronos Group released OpenXR 1.0 in July 2019, providing the first truly cross-platform API for XR applications. OpenXR abstracts hardware differences behind a unified interface, enabling developers to write code once and deploy across Meta Quest, SteamVR, Windows Mixed Reality, HTC Vive, Varjo, Magic Leap, and most other major platforms except Apple's.[125][126]
Version 1.1 (April 2024) consolidated proven extensions into the core specification, with action-based input mapping letting runtimes translate abstract actions like "grab" into platform-specific button configurations. Major runtimes including Meta Quest OpenXR, SteamVR, Windows Mixed Reality, PICO, and Varjo are officially conformant.[125]
The extension system balances standardization with innovation. Core features work everywhere, while extensions like `XR_FB_foveated_rendering` for Meta's foveated rendering or `XR_FB_passthrough` for mixed reality enable platform-specific capabilities when available.[125]
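OpenXR itself is a C API, so the following Python sketch is only a conceptual illustration of action-based mapping: the application declares abstract actions and suggests bindings per interaction profile (the path strings approximate OpenXR's naming), and the runtime resolves whichever profile matches the hardware actually in use.

```python
# Conceptual illustration of OpenXR-style action mapping (not the real C API).
ACTIONS = {"grab": "float", "teleport": "boolean", "aim_pose": "pose"}

SUGGESTED_BINDINGS = {
    "/interaction_profiles/oculus/touch_controller": {
        "grab":     "/user/hand/right/input/squeeze/value",
        "teleport": "/user/hand/right/input/a/click",
        "aim_pose": "/user/hand/right/input/aim/pose",
    },
    "/interaction_profiles/valve/index_controller": {
        "grab":     "/user/hand/right/input/squeeze/value",
        "teleport": "/user/hand/right/input/a/click",
        "aim_pose": "/user/hand/right/input/aim/pose",
    },
}

def resolve(action, active_profile):
    """The runtime maps each abstract action to the binding of the profile in use."""
    return SUGGESTED_BINDINGS.get(active_profile, {}).get(action)

print(resolve("grab", "/interaction_profiles/valve/index_controller"))
# -> /user/hand/right/input/squeeze/value
```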
WebXR
The W3C Immersive Web Working Group developed the WebXR Device API as the successor to WebVR; it has reached Candidate Recommendation Draft status and is implemented in Chrome/Edge 79+, Opera 66+, Samsung Internet 12+, Oculus Browser, and Safari on visionOS. The JavaScript API provides browser-based VR/AR without requiring a native application installation.[127][128]
WebGL and WebGPU integration enables hardware-accelerated 3D rendering. Related specifications include WebXR Augmented Reality Module for hit testing, WebXR Layers API for performance optimization, WebXR Gamepads Module for controller input, and WebXR Hand Input Module for hand tracking access.[127]
Comparison of Input Methods
| Input Method | Accuracy | Latency | Advantages | Disadvantages |
|---|---|---|---|---|
| Motion Controllers | 1-2mm | <25ms | Highest precision, haptic feedback, reliable tracking | Learning curve, battery management, occludes hands |
| Hand Tracking | 5-10mm | 30-50ms | Natural interaction, no hardware needed, intuitive | Lower precision, occlusion issues, no haptic feedback |
| Eye Tracking | 0.5-1.0° | 5-10ms | Fast targeting, foveated rendering, natural selection | Calibration required, privacy concerns, vergence issues |
| Voice Input | N/A | 100-300ms | Hands-free, accessible | Environmental noise, privacy concerns, social awkwardness |
| EMG Wristband | Sub-mm | Negative | Works in dark, subtle input, negative latency | Requires tight fit, limited gestures, interference issues |
| Full Body Tracking | 2mm | <20ms | Complete avatar representation, immersive | Setup complexity, cost, space requirements |
Current State and Future Trends
The 2024-2025 period represents an inflection point for VR/AR input. Apple Vision Pro launched in February 2024 as the first major headset without bundled controllers, validating controller-free interaction. Meta Quest 3S (September 2024) brought high-quality hand tracking to a $300 price point. HTC Vive Focus Vision (September 2024) demonstrated enterprise commitment to multi-modal input, supporting controllers, hand tracking, and eye tracking simultaneously.[69]
Advanced Haptics
The next major leap in immersion will likely come from advancements in haptic feedback. The goal is to move beyond simple vibrations to provide rich, nuanced tactile information. Key areas of research and development include:
- High-Fidelity Tactile Displays: Devices that can simulate a wide range of surface textures, pressures, and temperatures. This may be achieved through haptic gloves using arrays of microfluidic actuators, electrostimulation, or piezoelectric materials that deform when a current is applied.[74][76]
- Kinesthetic and Force Feedback: Efforts are underway to create more compact, affordable, and ungrounded force-feedback devices. This includes exoskeletal gloves that can apply resistive force to individual fingers and wrist-mounted devices that use gyroscopic or other principles to simulate weight and inertia.
- Mid-Air Haptics: Technologies using phased arrays of ultrasonic transducers to project tactile sensations onto a user's bare hands in mid-air are maturing. This could allow users to "feel" virtual buttons and textures without wearing any device.
Neural Interfaces
EMG wristbands represent the most significant emerging input technology, with Meta planning a 2025-2027 launch alongside third-generation Ray-Ban smart glasses. The July 2024 Nature paper demonstrating generalizable models that work without user-specific calibration removes a major commercialization barrier.[95]
The long-term, paradigm-shifting future of input lies in brain-computer interfaces (BCIs), also known as neural interfaces. These technologies aim to establish a direct communication pathway between the brain and a computer, potentially allowing users to control virtual objects or navigate interfaces through thought alone.[2] Companies like Neuralink are developing invasive BCIs for medical applications, which involve surgically implanted electrodes to read neural signals with high fidelity.[129]
Convergence and Multi-Modal Input
Enterprise haptic gloves found viability at $5,000-7,000 price points for training applications, while Carnegie Mellon's Fluid Reality prototype promises consumer pricing around "a few hundred dollars" if manufacturing scales.[60]
Eye tracking transitions from premium to standard feature, with PlayStation VR2, Apple Vision Pro, and HTC Vive Focus Vision including it as core functionality rather than add-on. Tobii's licensing model enables rapid market expansion across platforms.[65]
The industry is converging on multi-modal input supporting simultaneous use of controllers, hand tracking, eye tracking, and voice commands. Users seamlessly switch between input methods depending on the task: controllers for gaming precision, hand tracking for social interaction, eye tracking for UI targeting, and voice for explicit commands. The most significant near-term trend is the fusion of these input streams into a single, cohesive interaction model: rather than relying on a single method, future systems will intelligently combine data from eye tracking, hand tracking, voice commands, and biometric sensors to build a more holistic, context-aware understanding of user intent.
The Apple Vision Pro's primary interaction model is a prominent example of this trend. It uses eye tracking to determine what a user is looking at (the "target") and hand tracking to detect a simple pinch gesture as the confirmation "click."[130] This fusion of two separate input modalities creates an interaction that is fast, intuitive, and requires minimal physical effort. Future systems will likely expand on this, using voice commands to modify properties of the object a user is looking at, or using biometric data to adapt a virtual environment based on a user's emotional state. This multi-modal approach promises to make interaction in XR feel less like operating a computer and more like a natural extension of the user's own body and mind.
See Also
- Input Devices
- Controllers
- Hand Tracking
- Eye Tracking
- Body Tracking
- Brain-Computer Interface
- OpenXR
- Positional Tracking
- 6DOF
- Haptic Technology
- Motion Controller
- Voice Command
- Immersion (virtual reality)
- Foveated Rendering
- Locomotion (virtual reality)
References
- ↑ Virtual Reality Guide. Meta for Work. https://forwork.meta.com/blog/virtual-reality-guide/
- ↑ 2.0 2.1 2.2 2.3 2.4 VR / AR Fundamentals - 4) Input & Interactivity. Michael Naimark, March 2, 2018. https://michaelnaimark.medium.com/vr-ar-fundamentals-4-input-interactivity-8d6d066c954e
- ↑ 3.0 3.1 Input Method and Interaction Design. The Design of Virtual and Augmented Reality. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design
- ↑ 4.0 4.1 Input methods and interaction paradigms in VR/AR. Fiveable. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi
- ↑ 5.0 5.1 The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC5439658/
- ↑ The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. Sage Journals. https://journals.sagepub.com/doi/full/10.1177/2041669517708205
- ↑ 7.0 7.1 7.2 7.3 History Of Virtual Reality. Virtual Reality Society. https://www.vrs.org.uk/virtual-reality/history.html
- ↑ 8.00 8.01 8.02 8.03 8.04 8.05 8.06 8.07 8.08 8.09 8.10 8.11 8.12 8.13 8.14 8.15 8.16 8.17 A Brief History of AR and VR: Virtual Reality Timeline. HQSoftware. https://hqsoftwarelab.com/blog/the-history-of-ar-and-vr-a-timeline-of-notable-milestones/
- ↑ History of Virtual Reality: From the 1800s to the 21st Century. Coursera, July 12, 2023. https://www.coursera.org/articles/history-of-virtual-reality
- ↑ The Tremendous VR and CG Systems-of the 1960s. IEEE Spectrum. https://spectrum.ieee.org/sketchpad
- ↑ The Ultimate Display. I. E. Sutherland, 1965. Proceedings of IFIP Congress 1965, Volume 2, pages 506-508.
- ↑ A head-mounted three dimensional display. I. E. Sutherland, 1968. Proceedings of the Fall Joint Computer Conference, Volume 33, pages 757-764.
- ↑ 13.0 13.1 13.2 13.3 13.4 13.5 13.6 Virtual reality. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality
- ↑ Augmented reality. Wikipedia. https://en.wikipedia.org/wiki/Augmented_reality
- ↑ VPL Research. Wikipedia. https://en.wikipedia.org/wiki/VPL_Research
- ↑ The History of Virtual Reality. Lumen & Forge. https://lumenandforge.com/the-history-of-virtual-reality
- ↑ Leap Motion. Wikipedia. https://en.wikipedia.org/wiki/Leap_Motion
- ↑ 18.0 18.1 How Does the Leap Motion Controller Work? Medium. https://medium.com/@LeapMotion/how-does-the-leap-motion-controller-work-9503124bfa04
- ↑ Oculus Touch Controllers Are A Lighter and Better Touch Than HTC Vive. Tom's Guide. https://www.tomsguide.com/us/oculus-touch-controllers,review-4072.html
- ↑ 20.0 20.1 20.2 20.3 20.4 20.5 20.6 Controllers - Valve Index® - Upgrade your experience. Valve Corporation. https://www.valvesoftware.com/en/index/controllers
- ↑ 21.0 21.1 Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index
- ↑ 22.0 22.1 Hand tracking technology & haptic feedback. Meta for Work. https://forwork.meta.com/blog/hand-tracking-technology-and-haptic-feedback-mr/
- ↑ 23.0 23.1 Introducing Apple Vision Pro: Apple's first spatial computer. Apple. https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/
- ↑ Degrees of freedom. Google VR. https://developers.google.com/vr/discover/degrees-of-freedom
- ↑ 6DoF vs 3DoF: Degrees of freedom in VR. Strivr. https://www.strivr.com/blog/6dof-vs-3dof-understanding-importance
- ↑ Degrees of freedom in VR/XR. Varjo. https://varjo.com/learning-hub/degrees-of-freedom-in-vr-xr/
- ↑ Virtual reality - Forms and methods. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality
- ↑ What is Inside-Out/Outside-In Tracking. Unity. https://unity.com/glossary/Insideout-outsidein-tracking
- ↑ Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index
- ↑ 30.0 30.1 Pose Tracking Methods: Outside-in VS Inside-out Tracking in VR. Pimax. https://pimax.com/blogs/blogs/pose-tracking-methods-outside-in-vs-inside-out-tracking-in-vr
- ↑ What types of tracking systems are used in VR (for example inside-out vs. outside-in)? Zilliz. https://zilliz.com/ai-faq/what-types-of-tracking-systems-are-used-in-vr-eg-insideout-vs-outsidein
- ↑ Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/help/quest/667591367977925/
- ↑ What role do motion controllers play in VR, and how do you support them? Milvus. https://milvus.io/ai-quick-reference/what-role-do-motion-controllers-play-in-vr-and-how-do-you-support-them
- ↑ 34.0 34.1 VR Controllers: A Comprehensive Review. SynergyXR. https://synergyxr.com/resources/learn/blogs/vr-controllers-a-comprehensive-review/
- ↑ Meta Quest 2. Wevolver. https://www.wevolver.com/specs/meta-quest-2
- ↑ Meta Quest 3. Wikipedia. https://en.wikipedia.org/wiki/Meta_Quest_3
- ↑ 37.0 37.1 37.2 Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/quest/accessories/quest-touch-pro-controllers-and-charging-dock/
- ↑ 38.0 38.1 38.2 PlayStation VR2. Wikipedia. https://en.wikipedia.org/wiki/PlayStation_VR2
- ↑ 39.0 39.1 39.2 Controller weights comparison... How do you have the stamina to swing fast? Reddit. https://www.reddit.com/r/beatsaber/comments/1b0zmo9/controller_weights_comparisonhow_do_you_have_the/
- ↑ 40.0 40.1 40.2 Meta Quest 3 Review. GSMArena. https://www.gsmarena.com/meta_quest_3_review-news-60375.php
- ↑ The Quest Touch Pro Controllers weigh ~164g each. Reddit. https://www.reddit.com/r/oculus/comments/y545yh/the_quest_touch_pro_controllers_weigh_164g_each/
- ↑ 42.0 42.1 Meta Quest Touch Plus Controller. Meta. https://www.meta.com/quest/accessories/quest-touch-plus-controller/
- ↑ 43.0 43.1 PlayStation VR2 tech specs. PlayStation. https://www.playstation.com/en-se/ps-vr2/ps-vr2-tech-specs/
- ↑ Meta Quest 2 Specifications. University of Giessen. https://www.uni-giessen.de/de/studium/lehre/projekte/nidit/goals/quest2/specifications_quest-2.pdf
- ↑ Meta Quest Touch Pro Controllers. Komete XR. https://komete-xr.com/en/products/meta-quest-touch-pro-controllers
- ↑ Meta Quest 3 - VR & AR Wiki. https://vrarwiki.com/wiki/Meta_Quest_3
- ↑ Oculus Touch. Wikipedia. https://en.wikipedia.org/wiki/Oculus_Touch
- ↑ PlayStation VR2 and PlayStation VR2 Sense controller. PlayStation Blog. https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/
- ↑ 49.0 49.1 HTC Vive. Wikipedia. https://en.wikipedia.org/wiki/HTC_Vive
- ↑ Meta Quest 3: Tech Specs. Meta. https://www.meta.com/quest/quest-3/
- ↑ All Hands on Deck: Crank up Hand Responsiveness. Meta for Developers. https://developers.meta.com/horizon/blog/hand-tracking-22-response-time-meta-quest-developers/
- ↑ A methodological framework to assess the accuracy of virtual reality hand-tracking systems. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10830632/
- ↑ Ultraleap Hand Tracking Overview. Ultraleap Documentation. https://docs.ultraleap.com/hand-tracking/
- ↑ Hand Detection Tracking in Python using OpenCV and MediaPipe. Medium. https://gautamaditee.medium.com/hand-recognition-using-opencv-a7b109941c88
- ↑ Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC8321080/
- ↑ I tested Quest 3's hand tracking with a complete VR novice. MIXED News. https://mixed-news.com/en/meta-quest-3-hand-tracking-experiment/
- ↑ HaptX. https://haptx.com/
- ↑ Buy next generation full body haptic suit - bHaptics TactSuit. bHaptics. https://www.bhaptics.com/en/tactsuit/tactglove-dk2/
- ↑ Find out about our New Nova 2 Glove. SenseGlove. https://www.senseglove.com/product/nova-2/
- ↑ Fluid Reality Haptic Gloves Bring Ultra-Sensitive Touch to VR. Carnegie Mellon University. https://www.cs.cmu.edu/news/2024/haptic-gloves
- ↑ What is VR Eye Tracking? iMotions. https://imotions.com/blog/learning/best-practice/vr-eye-tracking/
- ↑ Learn about Eye Tracking on Meta Quest Pro. Meta. https://www.meta.com/help/quest/8107387169303764/
- ↑ Eye Tracking in Virtual Reality: a Broad Review of Applications and Challenges. Frontiers in Virtual Reality, 2024. https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2024.1343773/full
- ↑ Eye tracking in VR – A vital component. Tobii, February 16, 2024. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component
- ↑ Eye tracking in VR – A vital component. Tobii. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component
- ↑ Eye Tracking on VR (Virtual Reality) headsets. Pimax. https://pimax.com/blogs/blogs/eye-tracking-on-vr-virtual-reality-headsets
- ↑ How You Control Apple Vision Pro With Your Eyes & Hands. UploadVR. https://www.uploadvr.com/apple-vision-pro-gesture-controls/
- ↑ Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10136368/
- ↑ VIVE Focus Vision - New Standalone PC VR Headset. VIVE United States. https://www.vive.com/us/product/vive-focus-vision/overview/
- ↑ Haptics. Meta Horizon OS Developers. https://developers.meta.com/horizon/design/haptics-overview/
- ↑ How does Haptic Feedback Work in VR? ARVR Hub. https://arvrhub.com/haptic-feedback/
- ↑ Haptic Feedback for Virtual Reality. G. C. Burdea, 1999. Proceedings of the International Workshop on Virtual Reality and Prototyping. https://www.researchgate.net/publication/2356993_Haptic_Feedback_for_Virtual_Reality
- ↑ HaptX Gloves G1 Review: Getting in Touch with VR. XR Today. https://www.xrtoday.com/reviews/haptx-gloves-g1-review-getting-in-touch-with-vr/
- ↑ Haptic Sensing and Feedback Techniques toward Virtual Reality. Advanced Intelligent Systems, 2024. https://onlinelibrary.wiley.com/doi/10.1002/aisy.202300645
- ↑ What Is Haptic Feedback? | Virtual Reality Medical Simulation. SimX. https://www.simxvr.com/glossary/haptic-feedback-definition/
- ↑ The Different Types of Haptic Feedback. SenseGlove, May 15, 2023. https://www.senseglove.com/what-are-the-different-types-of-haptic-feedback/
- ↑ Voice input - Mixed Reality. Microsoft Learn. https://learn.microsoft.com/en-us/windows/mixed-reality/design/voice-input
- ↑ Meta Quest Voice Commands: The Ultimate Guide. AR/VR Tips. https://arvrtips.com/meta-quest-voice-commands/
- ↑ Use your voice to operate HoloLens. Microsoft Learn. https://learn.microsoft.com/en-us/hololens/hololens-cortana
- ↑ AR/VR Headsets. Cirrus Logic. https://www.cirrus.com/applications/wearables/ar-vr-headsets/
- ↑ Voice-Augmented Virtual Reality Interface for Serious Games. University of Calgary, 2024. https://cspages.ucalgary.ca/~richard.zhao1/publications/2024cog-voice_augmented_VR_interface.pdf
- ↑ Voice SDK Overview. Meta for Developers. https://developers.meta.com/horizon/documentation/unity/voice-sdk-overview/
- ↑ VIVE Tracker (3.0). VIVE United States. https://www.vive.com/us/accessory/tracker3/
- ↑ Markerless Full Body Tracking: Depth-Sensing Technology within Virtual Environments. USC Institute for Creative Technologies. https://ict.usc.edu/pubs/Markerless%20Full%20Body%20Tracking-%20Depth-Sensing%20Technology%20within%20Virtual%20Environments.pdf
- ↑ How Markerless Mocap is Transforming Location-Based VR Experiences. The VR Collective. https://thevrcollective.com/how-markerless-mocap-is-transforming-location-based-vr-experiences/
- ↑ VIVE Ultimate Tracker - Full-Body Tracking. VIVE. https://www.vive.com/us/accessory/vive-ultimate-tracker/
- ↑ SlimeVR Full-Body Trackers. SlimeVR Official. https://slimevr.dev/
- ↑ HaritoraX 2 - Fully wireless full-body tracking device. Shiftall. https://en.shiftall.net/products/haritorax2
- ↑ Comparing the Accuracy and Precision of SteamVR Tracking 2.0 and Oculus Quest 2. ACM Digital Library. https://dl.acm.org/doi/fullHtml/10.1145/3463914.3463921
- ↑ Virtual Reality Input Devices. Virtual Reality Society. https://www.vrs.org.uk/virtual-reality-gear/input-devices.html
- ↑ Compare the Omni Directional Treadmills. Unbound XR. https://unboundxr.com/blogs/compare-vr-treadmills
- ↑ VR Locomotion in the New Era of VR: A Study of Techniques and Comparative Review. Multimodal Technologies and Interaction, 2019. https://www.mdpi.com/2414-4088/3/2/24
- ↑ Editorial: Brain-Computer Interfaces and Augmented/Virtual Reality. Frontiers in Human Neuroscience. https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2020.00144/full
- ↑ Facebook agrees to acquire brain-computing start-up CTRL-labs. CNBC. https://www.cnbc.com/2019/09/23/facebook-announces-acquisition-of-brain-computing-start-up-ctrl-labs.html
- ↑ Meta Details EMG Wristband Gestures. UploadVR. https://www.uploadvr.com/meta-semg-wristband-gestures-nature-paper/
- ↑ Zuckerberg: Neural Wristband To Ship In 'Next Few Years'. UploadVR. https://www.uploadvr.com/zuckerberg-neural-wristband-will-ship-in-the-next-few-years/
- ↑ Valve, OpenBCI & Tobii to Launch VR Brain-computer Interface 'Galea'. Road to VR. https://www.roadtovr.com/valve-openbci-immersive-vr-games/
- ↑ How BCI can elevate the AR/VR experience. EMOTIV. https://www.emotiv.com/blogs/news/bci-applications-for-vr-ar
- ↑ Brain Implant Hooked Up to Control VR Headset. Futurism. https://futurism.com/neoscope/synchron-brain-computer-interface-control-vr-headset
- ↑ T.FLIGHT HOTAS ONE. Thrustmaster. https://eshop.thrustmaster.com/en_us/t-flight-hotas-one.html
- ↑ Thrustmaster HOTAS Warthog Flight Stick and Throttle for PC, VR. Walmart. https://www.walmart.com/ip/Thrustmaster-HOTAS-Warthog-Flight-Stick-and-Throttle-for-PC-VR/15268503
- ↑ Racing. Thrustmaster. https://www.thrustmaster.com/en-us/universe/racing/
- ↑ MOZA Racing Global. MOZA Racing. https://mozaracing.com/
- ↑ Racing Simulator Cockpits. Next Level Racing. https://nextlevelracing.com/racing-cockpits/
- ↑ The Logitech K830 Keyboard And Typing In VR. Medium, August 25, 2021. https://medium.com/xrlo-extended-reality-lowdown/the-logitech-k830-keyboard-and-typing-in-vr-556e2740c48d
- ↑ Can I use mouse and keyboard with the vr headset on pc for vr games instead of controllers? Reddit. https://www.reddit.com/r/oculus/comments/10946c5/can_i_use_mouse_and_keyboard_with_the_vr_headset/
- ↑ The Evolution of VR and AR in Gaming: A Historical Perspective. Cavendish Professionals. https://www.cavendishprofessionals.com/the-evolution-of-vr-and-ar-in-gaming-a-historical-perspective/
- ↑ How to play in VR with Mouse and keyboard? Steam Community. https://steamcommunity.com/app/359320/discussions/0/5311389137862908260/?l=tchinese
- ↑ A Comprehensive Review of Augmented Reality and Virtual Reality. International Journal of Research and Presentations, 2023. https://ijrpr.com/uploads/V4ISSUE4/IJRPR12239.pdf
- ↑ The Role of Virtual Reality in Technical Training. THORS. https://thors.com/the-role-of-virtual-reality-in-technical-training/
- ↑ The History and Applications of Virtual Reality Headsets. Ryan. https://funding.ryan.com/blog/business-strategy/applications-and-history-of-vr-headsets/
- ↑ Applications of Virtual Reality Simulations and Machine. MDPI. https://www.mdpi.com/2673-4591/100/1/19
- ↑ VR Simulations in Education: Transforming Learning. SynergyXR. https://synergyxr.com/resources/learn/blogs/vr-simulations-in-education/
- ↑ VR Art and Creativity: Unleashing the Power of Virtual Reality. SearchMyExpert. https://www.searchmyexpert.com/resources/ar-vr-development/vr-art-creativity
- ↑ Virtual reality applications. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality_applications
- ↑ The Use of Immersive Virtual Reality as a Design Input Tool. Digital Landscape Architecture, 2017. https://gispoint.de/fileadmin/user_upload/paper_gis_open/DLA_2017/537629026.pdf
- ↑ Why is Haptic Feedback important for VR Education? iXR Labs. https://www.ixrlabs.com/blog/why-haptic-feedback-important-for-vr-education/
- ↑ VR Design Principles. ViroReact. https://viro-community.readme.io/docs/vr-design-principles
- ↑ The Biggest Challenges in AR/VR Design and How to Overcome Them. Medium. https://medium.com/cva-design/the-biggest-challenges-in-ar-vr-design-and-how-to-overcome-them-25210d435a79
- ↑ The Limitations of Virtual Reality. Appy Pie. https://www.appypie.com/blog/virtual-reality-limitations
- ↑ Disadvantages of Haptic Technology. Flatirons. https://flatirons.com/blog/disadvantages-of-haptic-technology/
- ↑ How Bad Haptic Feedback Can Ruin the User Experience. Boreas Technologies. https://pages.boreas.ca/blog/how-bad-haptic-feedback-can-ruin-the-user-experience
- ↑ Exploring the Challenges and Limitations of Virtual Reality. VCD Social Club. https://vcdsocialclub.co.uk/exploring-the-challenges-and-limitations-of-virtual-reality
- ↑ VR in UX Design: Basic Guidelines. Ramotion. https://www.ramotion.com/blog/vr-in-ux-design/
- ↑ OpenXR - High-performance access to AR and VR. Khronos Group. https://www.khronos.org/openxr/
- ↑ OpenXR. Wikipedia. https://en.wikipedia.org/wiki/OpenXR
- ↑ WebXR Device API. W3C. https://www.w3.org/TR/webxr/
- ↑ WebXR Device API - Web APIs. MDN Web Docs. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API
- ↑ Neuralink - Pioneering Brain Computer Interfaces. Neuralink. https://neuralink.com/
- ↑ Ranking Every VR Controller Ever Made. YouTube. https://www.youtube.com/watch?v=uk1oqcEAm6o