{{See also|Input Devices}}

'''Input''' in [[virtual reality]] ([[VR]]) and [[augmented reality]] ([[AR]]) refers to the various methods and technologies that allow a user to interact with, control, and provide data to a computer-generated environment.<ref name="forwork_meta_guide">Virtual Reality Guide. Meta for Work. https://forwork.meta.com/blog/virtual-reality-guide/</ref> Unlike traditional computing that primarily relies on a [[keyboard and mouse]], [[extended reality]] (XR) input encompasses a wide spectrum of devices and techniques designed to create a sense of [[immersion]] and presence by translating a user's physical actions into digital ones.<ref name="naimark_io">VR / AR Fundamentals - 4) Input & Interactivity. Michael Naimark, March 2, 2018. https://michaelnaimark.medium.com/vr-ar-fundamentals-4-input-interactivity-8d6d066c954e</ref>

Input methods range from traditional devices like [[gamepad]]s to sophisticated [[motion controller]]s that track hand movements, and increasingly, to more natural interfaces such as controller-free [[hand tracking]], [[eye tracking]], and [[voice command]]s.<ref name="gitbook">Input Method and Interaction Design. The Design of Virtual and Augmented Reality. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design</ref> Modern VR/AR systems typically support multiple input modalities simultaneously, allowing users to seamlessly switch between controllers, hand gestures, gaze-based selection, and voice commands depending on the task and context.<ref name="fiveable">Input methods and interaction paradigms in VR/AR. Fiveable. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi</ref>
The direct lineage of modern VR input began in the 1960s:
* '''1961''': Philco Corporation engineers developed the '''Headsight''', the first [[head-mounted display]] (HMD), which featured a magnetic motion tracking system where head movements would control a remote camera, allowing for intuitive remote viewing.<ref name="vrshistory"/>
* '''1963''': [[Ivan Sutherland]] created the first interactive computer graphics input system with [[Sketchpad]], using a light pen for real-time line drawings on a TX-2 computer at [[MIT]].<ref name="ieeespectrum">The Tremendous VR and CG Systems—of the 1960s. IEEE Spectrum. https://spectrum.ieee.org/sketchpad</ref>
* '''1965''': Sutherland conceptualized the "[[Ultimate Display]]," a theoretical room that could simulate reality so perfectly that a user could not differentiate it from the real world, including not just visual and auditory simulation but also [[haptic technology|haptic feedback]] and interaction with virtual objects.<ref name="sutherland_ultimate_display">The Ultimate Display. I. E. Sutherland, 1965. Proceedings of IFIP Congress 1965, Volume 2, pages 506-508.</ref>
* '''1968''': Sutherland and his student Bob Sproull built the first actual VR/AR HMD, nicknamed "[[Sword of Damocles]]." The device was connected to a computer and used a mechanical or ultrasonic head-tracking system to update the user's perspective in real-time as they moved their head, marking the first instance of interactive, computer-generated immersive graphics.<ref name="sutherland_hmd_paper">A head-mounted three dimensional display. I. E. Sutherland, 1968. Proceedings of the Fall Joint Computer Conference, Volume 33, pages 757-764.</ref>
[[Degrees of freedom]] refers to the number of ways a rigid body can move in 3D space. This is a critical concept for understanding the capabilities of a VR system's tracking.<ref name="google_dof">Degrees of freedom. Google VR. https://developers.google.com/vr/discover/degrees-of-freedom</ref>

* '''Three Degrees of Freedom (3DoF):''' This allows for the tracking of rotational movement only: when a user looks up/down (pitch), turns left/right (yaw), or tilts their head side-to-side (roll). A 3DoF headset or controller can track these rotations but cannot track the user's physical movement through space. Early mobile VR headsets like the [[Google Cardboard]] and [[Samsung Gear VR]] were 3DoF systems.<ref name="strivr_dof">6DoF vs 3DoF: Degrees of freedom in VR. Strivr. https://www.strivr.com/blog/6dof-vs-3dof-understanding-importance</ref>
* '''Six Degrees of Freedom (6DoF):''' This tracks both rotational and translational movement. In addition to the three rotational axes, 6DoF systems can track movement forward/backward (surging), left/right (strafing), and up/down (elevating). This allows a user to physically walk around, duck, and lean within the virtual environment, which is essential for true immersion and is the standard for modern VR systems like the [[Meta Quest 3]] and [[Valve Index]].<ref name="varjo_dof">Degrees of freedom in VR/XR. Varjo. https://varjo.com/learning-hub/degrees-of-freedom-in-vr-xr/</ref>
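
The distinction is easy to make concrete in code. The following minimal sketch (TypeScript, with types invented here purely for illustration) shows that a 3DoF pose carries only an orientation, while a 6DoF pose adds a measured position:

<syntaxhighlight lang="typescript">
// Hypothetical types for illustration; not taken from any specific SDK.

/** Orientation as a unit quaternion: rotation only (pitch, yaw, roll). */
interface Quaternion { x: number; y: number; z: number; w: number; }

/** Position in meters within the tracking space. */
interface Vector3 { x: number; y: number; z: number; }

/** A 3DoF pose: the device knows which way it faces, not where it is. */
interface Pose3DoF {
  orientation: Quaternion;
}

/** A 6DoF pose adds translation, so walking, ducking, and leaning are tracked. */
interface Pose6DoF {
  orientation: Quaternion;
  position: Vector3;
}

// A 3DoF system must approximate translation (here, a fixed eye height);
// a 6DoF system measures it directly.
function toViewPose(p: Pose3DoF | Pose6DoF): Pose6DoF {
  return "position" in p
    ? p
    : { orientation: p.orientation, position: { x: 0, y: 1.6, z: 0 } };
}
</syntaxhighlight>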

Positional tracking is the technology that enables 6DoF by determining the real-time position and orientation of the headset and controllers. There are two primary methods for achieving this.<ref name="wikipedia_vr_methods">Virtual reality - Forms and methods. Wikipedia. https://en.wikipedia.org/wiki/Virtual_reality</ref>

* '''Outside-In Tracking:''' This method uses external sensors (for example cameras or infrared emitters called "base stations" or "lighthouses") placed in the physical environment to track the position of the headset and controllers. These external sensors monitor markers (often infrared LEDs) on the tracked devices. Systems like the original [[HTC Vive]], [[Oculus Rift CV1]], and the [[Valve Index]] use outside-in tracking. This method can provide highly accurate and stable tracking but requires a more complex setup and a dedicated play space.<ref name="unity_tracking">What is Inside-Out/Outside-In Tracking. Unity. https://unity.com/glossary/Insideout-outsidein-tracking</ref><ref name="wikipedia_valve_index">Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref>

* '''Inside-Out Tracking:''' This method places the tracking sensors, typically cameras, directly on the headset itself. These cameras observe the surrounding environment and use computer vision algorithms (such as [[SLAM|simultaneous localization and mapping]]) to calculate the headset's position and orientation relative to fixed points in the room.<ref name="pimax_tracking">Pose Tracking Methods: Outside-in VS Inside-out Tracking in VR. Pimax. https://pimax.com/blogs/blogs/pose-tracking-methods-outside-in-vs-inside-out-tracking-in-vr</ref> Controllers are tracked by these same headset cameras observing their infrared LEDs. This approach is used by all modern standalone headsets, such as the [[Meta Quest]] series and the [[Pico 4]], as it eliminates the need for external hardware, making setup much simpler and allowing the system to be used in any location.<ref name="zilliz_tracking">What types of tracking systems are used in VR (for example inside-out vs. outside-in)? Zilliz. https://zilliz.com/ai-faq/what-types-of-tracking-systems-are-used-in-vr-eg-insideout-vs-outsidein</ref>

* '''Self-Tracking (Inside-Out on Controller):''' A newer hybrid approach places cameras directly onto the controllers themselves, as seen with the [[Meta Quest Touch Pro]] controllers. Each controller has its own onboard cameras and a [[Qualcomm Snapdragon]] 662 processor, allowing it to track its own position in 3D space independently of the headset's cameras. This provides more robust tracking, preventing loss of tracking when the controllers are outside the headset's field of view (for example behind the user's back).<ref name="meta_controllers_pro">Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/help/quest/667591367977925/</ref>
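
Whichever tracking method the hardware uses, applications usually consume the result through a common API. The sketch below uses the [[WebXR Device API]] (assuming WebXR type definitions such as @types/webxr are available; error handling omitted) to read the tracked 6DoF pose each frame; the pose's emulatedPosition flag reveals when the system is only estimating position, in effect falling back to 3DoF:

<syntaxhighlight lang="typescript">
// Minimal WebXR sketch: request a session, then read the viewer's
// tracked pose once per frame.
async function startTracking(): Promise<void> {
  const session = await navigator.xr!.requestSession("immersive-vr");

  // "local-floor" puts the tracking origin at floor level of the play space.
  const refSpace = await session.requestReferenceSpace("local-floor");

  session.requestAnimationFrame(function onFrame(_time: number, frame: XRFrame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      const { position, orientation } = pose.transform;
      console.log(`position (m): ${position.x}, ${position.y}, ${position.z}`);
      console.log(`orientation: ${orientation.x}, ${orientation.y}, ${orientation.z}, ${orientation.w}`);
      // True when translation is being estimated rather than tracked,
      // i.e. the device is effectively operating in 3DoF.
      console.log(`position emulated: ${pose.emulatedPosition}`);
    }
    frame.session.requestAnimationFrame(onFrame);
  });
}
</syntaxhighlight>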

===Input Modalities===
* '''Analytics:''' In training, research, and marketing, eye tracking provides invaluable data on user attention and behavior, showing what users look at, in what order, and for how long.

[[Tobii]] dominates commercial VR eye tracking, providing technology for [[PlayStation VR2]], [[HTC Vive Pro Eye]], [[Pimax Crystal]], and [[Varjo]] headsets. Integration enables [[foveated rendering]], which concentrates GPU resources on the high-resolution foveal region while rendering the periphery at lower detail. PlayStation VR2 achieves 3.6x faster GPU performance through foveated rendering.<ref name="tobii">Eye tracking in VR – A vital component. Tobii. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component</ref><ref name="pimax">Eye Tracking on VR (Virtual Reality) headsets. Pimax. https://pimax.com/blogs/blogs/eye-tracking-on-vr-virtual-reality-headsets</ref>

[[Apple Vision Pro]]'s eye tracking serves as the primary targeting mechanism, functioning like a mouse cursor. High-performance infrared cameras and LEDs project patterns that are analyzed between display frames. Accuracy reaches 1.11 degrees in mixed reality mode and 0.93 degrees in VR mode within the central field of view. The "look and pinch" interaction model eliminates the need for pointing.<ref name="applevpeye">How You Control Apple Vision Pro With Your Eyes & Hands. UploadVR. https://www.uploadvr.com/apple-vision-pro-gesture-controls/</ref><ref name="pubmedeye">Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy. PubMed Central. https://pmc.ncbi.nlm.nih.gov/articles/PMC10136368/</ref>
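
These angular figures translate directly into targeting precision on a virtual surface: the linear offset is the viewing distance multiplied by the tangent of the angular error. A small worked example in TypeScript, using the accuracy numbers quoted above:

<syntaxhighlight lang="typescript">
// Convert angular gaze-tracking error into linear error at a given distance:
// offset = distance * tan(error). Illustrative arithmetic only.
function gazeErrorMeters(errorDegrees: number, distanceMeters: number): number {
  return distanceMeters * Math.tan((errorDegrees * Math.PI) / 180);
}

// ~0.93 degree VR-mode accuracy at a UI panel 1 m away:
console.log(gazeErrorMeters(0.93, 1.0)); // ~0.016 m, i.e. about 1.6 cm
// The same angular error at 3 m:
console.log(gazeErrorMeters(0.93, 3.0)); // ~0.049 m, i.e. about 4.9 cm
</syntaxhighlight>

This is why gaze-targeted UI elements benefit from generous hit areas that grow with viewing distance.
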
====Full-Body Tracking====

While most VR systems natively track the head and hands, full-body tracking aims to capture the movement of the entire body, including the torso, legs, and feet, for a more complete and expressive avatar representation.

* '''Marker-based Tracking:''' This is the traditional method used in [[motion capture]] for film and games. It involves the user wearing a suit covered in reflective markers, which are tracked by multiple external infrared cameras. While highly accurate, it is expensive and complex.<ref name="naimark_io"/>

[[HaritoraX]] 2 (2024) improved IMU tracking with built-in [[LiDAR]] sensors in the ankle trackers, which detect foot position relative to the floor, plus geomagnetic compensation that reduces rotational drift. Ultra-compact sensors enable up to 50 hours of battery life.<ref name="haritorax">HaritoraX 2 - Fully wireless full-body tracking device. Shiftall. https://en.shiftall.net/products/haritorax2</ref>
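
The rotational drift these trackers compensate for accumulates when gyroscope readings are integrated over time. A standard mitigation, shown below as a generic sketch rather than Shiftall's actual algorithm, is a complementary filter: integrate the responsive but drifting gyroscope, then pull the estimate gently toward the noisy but drift-free tilt computed from the accelerometer's gravity vector.

<syntaxhighlight lang="typescript">
// Generic complementary filter for one tilt axis (e.g. pitch); illustrative
// only, not any specific product's algorithm.
// prevAngle:  previous estimate in degrees.
// gyroRate:   angular velocity in deg/s from the gyroscope.
// accelAngle: absolute tilt in degrees derived from the gravity vector.
// dt: time step in seconds. alpha: trust placed in the gyro (close to 1).
function complementaryFilter(
  prevAngle: number,
  gyroRate: number,
  accelAngle: number,
  dt: number,
  alpha = 0.98
): number {
  return alpha * (prevAngle + gyroRate * dt) + (1 - alpha) * accelAngle;
}

// Run once per IMU sample; ax, ay, az are accelerometer readings.
let pitch = 0;
function onImuSample(gyroPitchRate: number, ax: number, ay: number, az: number, dt: number): void {
  const accelPitch = (Math.atan2(-ax, Math.hypot(ay, az)) * 180) / Math.PI;
  pitch = complementaryFilter(pitch, gyroPitchRate, accelPitch, dt);
}
</syntaxhighlight>

Gravity carries no information about heading, which is why yaw drift needs the separate geomagnetic (magnetometer-based) compensation mentioned above.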

Research validates tracking accuracy. HTC Vive achieves approximately 2mm positional error and less than 1-degree orientation error. [[Oculus Quest]] 2 inside-out tracking shows 1.66mm ± 0.74mm translation accuracy and 0.34 ± 0.38 degrees rotation accuracy, comparable to external tracking systems.<ref name="acmtracking">Comparing the Accuracy and Precision of SteamVR Tracking 2.0 and Oculus Quest 2. ACM Digital Library. https://dl.acm.org/doi/fullHtml/10.1145/3463914.3463921</ref>

====Locomotion====

[[Brain-computer interface|Brain-computer interfaces]] detect electrical signals from the brain or nervous system, translating neural activity into digital commands. Non-invasive BCIs use [[electroencephalography]] to measure brain waves via scalp electrodes, while invasive approaches implant electrodes in brain tissue. [[Electromyography]] offers a middle ground, measuring muscle activation signals from skin-surface sensors.<ref name="frontiersbci">Editorial: Brain-Computer Interfaces and Augmented/Virtual Reality. Frontiers in Human Neuroscience. https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2020.00144/full</ref>

[[Meta]]'s [[EMG wristband]] (developed by [[CTRL-labs]], which Meta acquired in 2019) detects electrical signals from forearm muscles as motor neurons transmit movement commands. Signals are detected before the fingers physically move, enabling effectively negative latency. A July 2024 Nature paper demonstrated machine learning models working without user-specific calibration, the first generalizable neural interface.<ref name="ctrlabs">Facebook agrees to acquire brain-computing start-up CTRL-labs. CNBC. https://www.cnbc.com/2019/09/23/facebook-announces-acquisition-of-brain-computing-start-up-ctrl-labs.html</ref><ref name="metaemg">Meta Details EMG Wristband Gestures. UploadVR. https://www.uploadvr.com/meta-semg-wristband-gestures-nature-paper/</ref>
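
Production sEMG decoding relies on machine-learning models, but the front end of such a pipeline can be sketched simply. The toy example below (illustrative only, not Meta's method) rectifies one EMG channel, computes a sliding RMS envelope, and reports the rising edge where muscle activation begins:

<syntaxhighlight lang="typescript">
// Toy sEMG front end: sliding RMS envelope plus a threshold detector.
// Real systems replace the threshold with a learned classifier over
// many channels; the window size and threshold here are arbitrary.
class EmgOnsetDetector {
  private window: number[] = [];
  private active = false;

  constructor(
    private windowSize = 64,   // samples per RMS window
    private threshold = 0.2    // activation level, normalized units
  ) {}

  /** Feed one raw sample; returns true on the sample where activation starts. */
  push(sample: number): boolean {
    this.window.push(sample * sample);                 // rectify by squaring
    if (this.window.length > this.windowSize) this.window.shift();
    const meanSquare =
      this.window.reduce((sum, s) => sum + s, 0) / this.window.length;
    const rms = Math.sqrt(meanSquare);

    const wasActive = this.active;
    this.active = rms > this.threshold;
    return this.active && !wasActive;                  // rising edge = onset
  }
}
</syntaxhighlight>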

Mark Zuckerberg stated neural wristbands will ship "in the next few years," with leaked roadmaps indicating a 2025-2027 launch alongside third-generation [[Ray-Ban]] smart glasses. The wristband enables handwriting in the air, typing on surfaces, and precise finger tracking in any lighting, without cameras.<ref name="zuckerbergwristband">Zuckerberg: Neural Wristband To Ship In 'Next Few Years'. UploadVR. https://www.uploadvr.com/zuckerberg-neural-wristband-will-ship-in-the-next-few-years/</ref>
* '''Flight Simulators:''' [[HOTAS]] (Hands On Throttle-And-Stick) systems, which replicate the joystick and throttle controls of an aircraft, are essential for flight simulation in VR. Popular models include the [[Thrustmaster]] HOTAS Warthog and T.Flight series.<ref name="thrustmaster_hotas_one">T.FLIGHT HOTAS ONE. Thrustmaster. https://eshop.thrustmaster.com/en_us/t-flight-hotas-one.html</ref><ref name="walmart_warthog">Thrustmaster HOTAS Warthog Flight Stick and Throttle for PC, VR. Walmart. https://www.walmart.com/ip/Thrustmaster-HOTAS-Warthog-Flight-Stick-and-Throttle-for-PC-VR/15268503</ref>
* '''Racing Simulators:''' A [[racing wheel]] and pedal set is crucial for a realistic driving experience. High-end models from companies like Thrustmaster and [[MOZA]] feature powerful force feedback motors that simulate the torque on the steering wheel and the feel of the road.<ref name="thrustmaster_racing">Racing. Thrustmaster. https://www.thrustmaster.com/en-us/universe/racing/</ref><ref name="moza_racing">MOZA Racing Global. MOZA Racing. https://mozaracing.com/</ref> These are often mounted in dedicated racing cockpits for maximum stability and immersion.<ref name="nextlevel_racing">Racing Simulator Cockpits. Next Level Racing. https://nextlevelracing.com/racing-cockpits/</ref>
* '''Traditional Peripherals:''' [[Keyboard and mouse]] and traditional [[gamepad]]s can still be used in VR, typically for seated experiences, ports of non-VR games, or for productivity tasks. Some platforms, like Meta Quest, have begun to integrate tracking for specific models of physical keyboards (for example the [[Logitech K830]]), allowing users to see a virtual representation of their keyboard and hands while typing, which greatly improves usability for work and text entry in VR.<ref name="medium_k830">The Logitech K830 Keyboard And Typing In VR. Medium, August 25, 2021. https://medium.com/xrlo-extended-reality-lowdown/the-logitech-k830-keyboard-and-typing-in-vr-556e2740c48d</ref><ref name="reddit_kb_mouse_vr">Can I use mouse and keyboard with the vr headset on pc for vr games instead of controllers? Reddit. https://www.reddit.com/r/oculus/comments/10946c5/can_i_use_mouse_and_keyboard_with_the_vr_headset/</ref>
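
In applications, VR controllers and many conventional peripherals surface through standard input APIs. The sketch below uses the WebXR Gamepads Module, whose "xr-standard" mapping places the primary trigger at button index 0 and the thumbstick on axes 2 and 3; non-XR peripherals such as wheels and HOTAS sticks instead appear through the plain navigator.getGamepads() API:

<syntaxhighlight lang="typescript">
// Poll WebXR controller state once per frame.
function pollControllers(session: XRSession): void {
  for (const source of session.inputSources) {
    const pad = source.gamepad;
    if (!pad || pad.mapping !== "xr-standard") continue;

    const trigger = pad.buttons[0];        // primary trigger in xr-standard
    if (trigger.pressed) {
      console.log(`${source.handedness} trigger at ${trigger.value.toFixed(2)}`);
    }
    if (pad.axes.length >= 4) {            // axes 2/3 are the thumbstick
      console.log(`thumbstick: ${pad.axes[2]}, ${pad.axes[3]}`);
    }
  }
}
</syntaxhighlight>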

==Applications of Input Across Industries==
===Technical Hurdles===

* '''Tracking Fidelity and Occlusion:''' While modern tracking systems are robust, they are not flawless. Inside-out systems can lose track of controllers when they are held outside the headset cameras' field of view (for example behind the back or too close to the face).<ref name="pimax_tracking"/> Hand tracking can be unreliable during fast movements, complex finger interactions, or when one hand occludes the other.<ref name="mixed_news_hand_tracking"/> These tracking failures can break immersion and cause user frustration.
* '''Haptic Fidelity and Cost:''' The haptic feedback in most consumer VR controllers is limited to simple vibrations. Creating realistic tactile sensations, such as the texture of a surface, the weight of an object, or the precise feeling of pressure, is extremely challenging.<ref name="flatirons_haptics_disadvantages">Disadvantages of Haptic Technology. Flatirons. https://flatirons.com/blog/disadvantages-of-haptic-technology/</ref> Advanced haptic devices like force-feedback exoskeletons or microfluidic gloves exist, but they are currently very expensive, bulky, and largely confined to research and enterprise applications.<ref name="boreas_bad_haptics">How Bad Haptic Feedback Can Ruin the User Experience. Boreas Technologies. https://pages.boreas.ca/blog/how-bad-haptic-feedback-can-ruin-the-user-experience</ref>
* '''Hardware Constraints:''' Standalone VR headsets operate under significant power and thermal constraints. The onboard processing power limits the complexity of the physics simulations, the number of tracked objects, and the sophistication of the rendering, which in turn affects the realism of interactions. Limited battery life also curtails the duration of untethered VR sessions.<ref name="medium_vr_challenges"/><ref name="vcd_vr_challenges">Exploring the Challenges and Limitations of Virtual Reality. VCD Social Club. https://vcdsocialclub.co.uk/exploring-the-challenges-and-limitations-of-virtual-reality</ref>

Designing a [[user interface]] (UI) for a 3D space requires a fundamental rethinking of principles from 2D design.

* '''Spatial UI:''' UI elements cannot be fixed to the screen; they must exist within the 3D world. Designers must consider the optimal placement of menus and information to be within the user's "comfort zone", typically a 94° horizontal and 32° vertical arc in front of the user, and at a comfortable viewing distance (generally between 0.5 meters and 10 meters) to avoid eye strain and maintain stereoscopic depth perception (see the sketch after this list).<ref name="viroreact_design_principles"/>
* '''Interaction Abstraction:''' A core challenge is deciding on the level of abstraction for an interaction. A "natural" interaction, like picking up an object with tracked hands, is intuitive but can be imprecise and lacks tactile feedback. An "abstract" interaction, like pressing a button to grab an object, is reliable and provides clear feedback but is less immersive.<ref name="ramotion_vr_ux">VR in UX Design: Basic Guidelines. Ramotion. https://www.ramotion.com/blog/vr-in-ux-design/</ref> Designers must constantly balance the trade-offs between intuitiveness, reliability, and user comfort for every interaction.
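
Those comfort-zone thresholds are straightforward to enforce programmatically. The sketch below tests whether a proposed UI anchor, expressed in head-relative coordinates (assuming the WebXR-style convention of negative Z pointing straight ahead), falls inside the quoted viewing volume:

<syntaxhighlight lang="typescript">
// Check a head-relative point against the comfort zone cited above:
// a 94° horizontal by 32° vertical arc, between 0.5 m and 10 m away.
interface Vec3 { x: number; y: number; z: number; }

function inComfortZone(p: Vec3): boolean {
  const distance = Math.hypot(p.x, p.y, p.z);
  if (distance < 0.5 || distance > 10) return false;   // too close or too far
  if (p.z >= 0) return false;                          // beside or behind the user

  const deg = (rad: number) => (rad * 180) / Math.PI;
  // Angle off the forward axis in the horizontal plane, and elevation.
  const horizontal = deg(Math.atan2(Math.abs(p.x), -p.z));
  const vertical = deg(Math.atan2(Math.abs(p.y), Math.hypot(p.x, p.z)));

  // A 94° total arc is ±47° from center; 32° total is ±16°.
  return horizontal <= 47 && vertical <= 16;
}

console.log(inComfortZone({ x: 0, y: -0.1, z: -2 }));  // true: ahead, slightly low
console.log(inComfortZone({ x: 0, y: 0, z: -0.3 }));   // false: too close
</syntaxhighlight>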

[[EMG wristband|EMG wristbands]] represent the most significant emerging input technology, with Meta planning a 2025-2027 launch alongside third-generation Ray-Ban glasses. The July 2024 Nature paper demonstrating generalizable models that work without user calibration removes a major commercialization barrier.<ref name="metaemg"/>

The long-term, paradigm-shifting future of input lies in [[brain-computer interface]]s (BCIs), also known as neural interfaces. These technologies aim to establish a direct communication pathway between the brain and a computer, potentially allowing users to control virtual objects or navigate interfaces through thought alone.<ref name="naimark_io"/> Companies like [[Neuralink]] are developing invasive BCIs for medical applications, which involve surgically implanted electrodes to read neural signals with high fidelity.<ref name="neuralink_homepage">Neuralink - Pioneering Brain Computer Interfaces. Neuralink. https://neuralink.com/</ref>

===Convergence and Multi-Modal Input===

Eye tracking is transitioning from a premium feature to a standard one, with [[PlayStation VR2]], [[Apple Vision Pro]], and [[HTC Vive Focus Vision]] including it as core functionality rather than as an add-on. [[Tobii]]'s licensing model enables rapid market expansion across platforms.<ref name="tobii"/>

The industry is converging on multi-modal input supporting simultaneous use of controllers, hand tracking, eye tracking, and voice commands. Users seamlessly switch between input methods depending on the task: controllers for gaming precision, hand tracking for social interaction, eye tracking for UI targeting, and voice for explicit commands. In the more immediate future, the most significant trend is the convergence of multiple input streams into a single, cohesive interaction model. Instead of relying on a single input method, future systems will intelligently combine data from eye tracking, hand tracking, voice commands, and biometric sensors to gain a more holistic and context-aware understanding of user intent.

The [[Apple Vision Pro]]'s primary interaction model is a prominent example of this trend. It uses eye tracking to determine what a user is looking at (the "target") and hand tracking to detect a simple pinch gesture as the confirmation "click."<ref name="youtube_controller_tierlist">Ranking Every VR Controller Ever Made. YouTube. https://www.youtube.com/watch?v=uk1oqcEAm6o</ref> This fusion of two separate input modalities creates an interaction that is fast, intuitive, and requires minimal physical effort. Future systems will likely expand on this, using voice commands to modify properties of the object a user is looking at, or using biometric data to adapt a virtual environment based on a user's emotional state. This multi-modal approach promises to make interaction in XR feel less like operating a computer and more like a natural extension of the user's own body and mind.
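
The look-and-pinch pattern reduces to a small piece of per-frame state: track the gaze target continuously, and commit whatever is being looked at on the rising edge of a pinch. In the generic sketch below, GazeSource and PinchSource are hypothetical stand-ins for whatever eye- and hand-tracking APIs a given platform exposes:

<syntaxhighlight lang="typescript">
// Generic gaze-and-pinch fusion, modeled on the interaction described above.
interface GazeSource { currentTargetId(): string | null; }
interface PinchSource { isPinching(): boolean; }

class LookAndPinch {
  private wasPinching = false;

  constructor(
    private gaze: GazeSource,
    private pinch: PinchSource,
    private onSelect: (targetId: string) => void
  ) {}

  /** Call once per frame: eye tracking aims, the pinch's rising edge clicks. */
  update(): void {
    const pinching = this.pinch.isPinching();
    if (pinching && !this.wasPinching) {
      const target = this.gaze.currentTargetId();
      if (target !== null) this.onSelect(target);  // commit the gazed target
    }
    this.wasPinching = pinching;
  }
}
</syntaxhighlight>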

<references>
<ref name="forwork_meta_guide">Virtual Reality Guide. Meta for Work. https://forwork.meta.com/blog/virtual-reality-guide/</ref>
<ref name="naimark_io">VR / AR Fundamentals - 4) Input & Interactivity. Michael Naimark, March 2, 2018. https://michaelnaimark.medium.com/vr-ar-fundamentals-4-input-interactivity-8d6d066c954e</ref>
<ref name="gitbook">Input Method and Interaction Design. The Design of Virtual and Augmented Reality. https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/an-introduction-to-spatial-design/input-method-and-interaction-design</ref>
<ref name="fiveable">Input methods and interaction paradigms in VR/AR. Fiveable. https://fiveable.me/immersive-and-virtual-reality-art/unit-6/input-methods-interaction-paradigms-vrar/study-guide/Fi52EZ1Qr1nuisEi</ref>
<ref name="hqsoftware_history">A Brief History of AR and VR: Virtual Reality Timeline. HQSoftware. https://hqsoftwarelab.com/blog/the-history-of-ar-and-vr-a-timeline-of-notable-milestones/</ref>
<ref name="coursera_history_vr">History of Virtual Reality: From the 1800s to the 21st Century. Coursera, July 12, 2023. https://www.coursera.org/articles/history-of-virtual-reality</ref>
<ref name="ieeespectrum">The Tremendous VR and CG Systems—of the 1960s. IEEE Spectrum. https://spectrum.ieee.org/sketchpad</ref>
<ref name="sutherland_ultimate_display">The Ultimate Display. I. E. Sutherland, 1965. Proceedings of IFIP Congress 1965, Volume 2, pages 506-508.</ref>
<ref name="sutherland_hmd_paper">A head-mounted three dimensional display. I. E. Sutherland, 1968. Proceedings of the Fall Joint Computer Conference, Volume 33, pages 757-764.</ref>
<ref name="wikipedia_valve_index">Valve Index. Wikipedia. https://en.wikipedia.org/wiki/Valve_Index</ref>
<ref name="pimax_tracking">Pose Tracking Methods: Outside-in VS Inside-out Tracking in VR. Pimax. https://pimax.com/blogs/blogs/pose-tracking-methods-outside-in-vs-inside-out-tracking-in-vr</ref>
<ref name="zilliz_tracking">What types of tracking systems are used in VR (for example inside-out vs. outside-in)? Zilliz. https://zilliz.com/ai-faq/what-types-of-tracking-systems-are-used-in-vr-eg-insideout-vs-outsidein</ref>
<ref name="meta_controllers_pro">Meta Quest Touch Pro Controllers. Meta. https://www.meta.com/help/quest/667591367977925/</ref>
<ref name="milvus_motion_controllers">What role do motion controllers play in VR, and how do you support them? Milvus. https://milvus.io/ai-quick-reference/what-role-do-motion-controllers-play-in-vr-and-how-do-you-support-them</ref>
<ref name="webxr">WebXR Device API. W3C. https://www.w3.org/TR/webxr/</ref>
<ref name="mdnwebxr">WebXR Device API - Web APIs | MDN. MDN Web Docs. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API</ref>
<ref name="neuralink_homepage">Neuralink - Pioneering Brain Computer Interfaces. Neuralink. https://neuralink.com/</ref>
<ref name="youtube_controller_tierlist">Ranking Every VR Controller Ever Made. YouTube. https://www.youtube.com/watch?v=uk1oqcEAm6o</ref>
</references>