Haptics

From VR & AR Wiki
Revision as of 08:35, 29 April 2025 by Xinreality


Haptics (from Ancient Greek ἁπτικός haptikós, “pertaining to touch”) is the science and technology of creating and interpreting tactile and kinesthetic sensations by means of specialised hardware and software. Contemporary scholars treat haptics as a superset of cutaneous (skin-based) and kinesthetic (muscle- and joint-based) feedback rather than a synonym for mere vibration or “rumble”.

Immersive VR and AR systems commonly combine real-time graphics with haptic devices so that users not only see and hear virtual objects but also feel their weight, surface texture, or temperature, leading to measurably higher presence scores and task-performance gains in controlled studies.

Physiology of touch

Human skin contains four principal mechanoreceptor types—Merkel, Meissner, Ruffini and Pacinian endings—each tuned to different spatial and temporal patterns of deformation. Signals from these receptors are integrated with proprioceptive input from muscle spindles and joint afferents to form a multimodal “haptic channel.”

History

  • 1948–54 – telemanipulation: Raymond Goertz’s master–slave manipulators at Argonne National Laboratory introduced bilateral force-feedback for handling radioactive materials.
  • 1970s – academic force displays: the University of North Carolina and MIT produced early computer-linked force-feedback arms for 3-D design and molecular docking.
  • 1997 – consumer vibrotactile feedback: Nintendo’s Rumble Pak popularised vibration in mainstream gaming controllers.
  • 2016 – mass-market VR controllers: Oculus Touch launched on 6 Dec 2016 with six-DOF tracking and per-hand LRAs for programmable haptics.

Major feedback classes

Tactile (cutaneous)

Vibrotactile

Eccentric-rotating-mass and linear-resonant actuators deliver vibration in the 50–250 Hz range and underpin smartphones, gamepads and the Valve Index controller.
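As a minimal sketch of how such an actuator is commonly driven (the function name, resonant frequency and sample rate below are illustrative assumptions, not any vendor's API), a "click" is typically a short sine burst at the LRA's resonant frequency, shaped by an envelope so the actuator starts and stops cleanly:

```python
import math

def lra_click(resonant_hz=170.0, duration_s=0.03, sample_rate=8000):
    """Hypothetical LRA drive signal: a short sine burst at the
    actuator's resonant frequency, shaped by a Hann envelope to
    reduce audible ringing at onset and offset."""
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        envelope = 0.5 * (1 - math.cos(2 * math.pi * i / (n - 1)))  # Hann window
        samples.append(envelope * math.sin(2 * math.pi * resonant_hz * t))
    return samples
```

Driving the actuator at its resonance maximises displacement per unit of drive power, which is why LRAs feel "crisper" than eccentric-rotating-mass motors at the same energy budget.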

Electrotactile

Devices such as the full-body Teslasuit stimulate afferent nerves through short current pulses to reproduce contact, impact and temperature cues.

Thermal

Peltier-based modules in research prototypes vary skin temperature ±8 °C within 1–2 s to reinforce material identity or environmental effects.

Ultrasonic mid-air

Ultraleap arrays focus ultrasound to 1–5 mm “points” on a bare hand, enabling mid-air buttons and sliders with no worn hardware.
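The focusing principle can be illustrated with a small calculation: each transducer is delayed so that all wavefronts arrive at the focal point at the same instant, producing a localised pressure maximum. The sketch below is a hypothetical illustration (function name and array layout are assumptions, not Ultraleap's API):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def focus_delays(transducer_xy, focal_point):
    """Per-transducer emission delays (seconds) so that all
    wavefronts arrive at the focal point simultaneously.
    transducer_xy: list of (x, y) positions in the array plane z = 0.
    focal_point: (x, y, z) target above the array."""
    dists = [math.dist((x, y, 0.0), focal_point) for x, y in transducer_xy]
    d_max = max(dists)
    # The farthest transducer fires first (zero delay); nearer ones wait.
    return [(d_max - d) / SPEED_OF_SOUND for d in dists]
```

Steering the focal point in real time is then just recomputing these delays each frame, which is what lets such arrays render moving mid-air buttons and sliders.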

Kinesthetic

Grounded force displays

Commercial systems such as the Geomagic Touch render up to 7 N continuous force for CAD, surgical rehearsal and scientific visualisation.

Ungrounded force illusions

Hand-held devices exploit asymmetric vibration to create directional pull or push sensations—e.g. Buru-Navi3 and Traxion.
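The underlying trick can be sketched numerically: a brief, strong acceleration pulse in one direction is balanced by a longer, weaker return stroke. The net impulse over a cycle is zero, but only the strong pulse exceeds the perceptual threshold, so the user feels a directional tug. A hypothetical illustration (names and parameters are assumptions):

```python
def asymmetric_wave(freq_hz=40.0, sample_rate=4000, skew=0.2):
    """One period of an asymmetric acceleration waveform: a short
    strong 'pull' phase followed by a long weak return stroke whose
    amplitude is chosen so the total impulse over the cycle is zero."""
    period = int(sample_rate / freq_hz)
    n_fast = int(period * skew)              # short, strong phase
    n_slow = period - n_fast                 # long, weak phase
    fast_amp = 1.0
    slow_amp = -fast_amp * n_fast / n_slow   # balances the impulse
    return [fast_amp] * n_fast + [slow_amp] * n_slow
```

Because the slow return stays below the detection threshold, repeating this cycle (around 40 Hz in Buru-Navi3) yields a sustained illusory force even though the actuator never leaves the hand.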

Wearable resistive gloves

The 2023 HaptX Gloves G1 use 135 microfluidic actuators per glove plus tendon-locking brakes to render pressure and rigidity.

Haptic rendering

Real-time haptic loops typically run at 500–1,000 Hz, sampling user motion, computing contact forces, and updating actuators. Advances in penalty-based and constraint-based algorithms now support multi-point hand contact and soft-body interaction in OpenXR 1.1 runtimes.
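A minimal sketch of one penalty-based servo tick, assuming a spherical probe pressed against a horizontal plane (the function name, stiffness value and geometry are illustrative, not any particular device's API):

```python
def penalty_force(probe_pos, probe_radius, plane_height=0.0, stiffness=800.0):
    """Penalty-based contact: when the probe sphere penetrates the
    plane y = plane_height, push back with a spring force proportional
    to penetration depth (Hooke's law). In a real device this runs
    every servo tick, typically at 500-1,000 Hz.
    probe_pos: (x, y, z) centre of the probe sphere, in metres.
    Returns the (fx, fy, fz) force in newtons to command."""
    penetration = (plane_height + probe_radius) - probe_pos[1]
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)                   # no contact
    return (0.0, stiffness * penetration, 0.0)   # normal force along +y
```

Constraint-based methods refine this idea by keeping a "god object" on the surface and solving for the force that prevents interpenetration, which avoids the springiness artefacts of a pure penalty term at low update rates.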

Devices and platforms

  • VR/AR controllers – Meta Quest 3, Valve Index, and PlayStation VR2 Sense controllers combine high-bandwidth LRAs with trigger-level force modulation.
  • Haptic vests & suits – bHaptics TactSuit Pro (32 motors, wireless, Quest native) delivers full-torso impact and environmental cues.
  • Mid-air kiosks – automotive and public-display prototypes integrate Ultraleap arrays for touch-free UI control.
  • Accessibility wearables – Wayband uses vibrotactile cues on the wrist to guide blind runners through GPS waypoints.
  • Research prototypes – Microsoft Research’s CLAW handheld controller adds index-finger force and texture rendering.

Standards

  • ISO 9241-910/-920 define terminology, interaction models and ergonomic guidance for tactile/haptic systems.
  • OpenXR 1.1 (Khronos, 2024) folds common amplitude/frequency control extensions into the core spec, enabling cross-platform haptic playback.

Applications

Domain | Representative systems | Typical benefits
Gaming & entertainment | PS VR2 Sense, bHaptics TactSuit | Higher presence, weapon recoil, texture cues
Medical & surgical training | Geomagic Touch simulators | Objective skill metrics, reduced error rates
Industrial design & prototyping | HaptX Gloves, Ultraleap kiosks | Rapid ergonomic evaluation without physical mock-ups
Accessibility & navigation | Wayband wristband | Eyes-free GPS guidance for visually-impaired users
Teleoperation & robotics | HaptX–Shadow Robot telepresence hand | Sub-millimetre manipulation with remote tactile feedback

Force-illusion demonstrators

At SIGGRAPH 2014, Tomohiro Amemiya and Hiroaki Gomi exhibited the pocket-sized Buru-Navi3 (40 Hz asymmetric vibration) alongside Jun Rekimoto’s Traxion device, both capable of guiding users along a path while the actuator remains stationary in the hand.
