{{short description|Overview of haptic technology}}
'''Haptics''' (from the Greek ''haptikos'', "able to touch or grasp"), also known as '''[[tactile feedback]]''', is [[technology]] that produces the sense of touch through physical stimulation. Haptics can significantly improve the user's [[immersion]] in a [[VR]] world by letting users feel the physical consequences of their actions in a [[virtual environment]]: when a user picks up a cup in the virtual world, they should feel realistic sensations of a cup in their hand, even though no cup is present in the real world. This bidirectional exchange of sensory information creates a deeper connection between users and digital worlds, making haptics a critical component of believable and effective VR and [[AR]] experiences.<ref name="srivastava2019">Srivastava, K., Kukreja, S. L., & Shinghal, K. (2019). Haptic Technology: A Comprehensive Review of its Applications and Future Potential. ''Journal of Mechatronics, Electrical Power, and Vehicular Technology'', 10(2), 99-112.</ref>


In traditional [[video game]] controllers, "[[rumble]]" is often used to produce tactile feedback. Modern [[haptic systems]] in AR and VR environments, however, offer far more sophisticated and nuanced feedback: haptics encompasses both '''cutaneous''' (skin-based) and '''kinesthetic''' (muscle- and joint-based) feedback, rather than being a synonym for mere vibration.

Immersive VR and AR systems commonly combine real-time graphics with haptic devices so that users not only see and hear virtual objects but also feel their weight, surface texture, or temperature, which has been shown to raise presence scores and task performance in controlled studies.


== Physiology of Touch ==


Human skin contains four main classes of [[mechanoreceptor]]s: Merkel cells (pressure), Meissner corpuscles (low-frequency vibration), Ruffini endings (skin stretch), and Pacinian corpuscles (high-frequency vibration). These receptors are tuned to different frequencies and deformations, allowing us to perceive a wide range of tactile sensations.<ref>Lederman, S. J., & Klatzky, R. L. (2009). Haptic perception: A tutorial. Attention, Perception, & Psychophysics, 71(7), 1439-1459.</ref>


Cutaneous cues (pressure, vibration, stretch) combine with [[kinesthetic sense|kinesthetic]] cues from muscles and joints to form a multimodal "haptic channel" that informs us about object properties and our interactions with the environment.<ref>{{cite journal|title=Haptics is comprised of kinesthetic and cutaneous feedback|journal=Applied Sciences|year=2024|doi=10.3390/app14146020}}</ref>


== History of Haptics ==


The study of haptics has origins in the late 1940s and 1950s, when engineers such as Raymond Goertz at [[Argonne National Laboratory]] developed bilateral force-reflecting master-slave manipulators for handling radioactive materials.<ref>Hannaford, B., & Okamura, A. M. (2016). Haptics. In Springer handbook of robotics (pp. 1063-1084). Springer, Cham.</ref> The term "haptics" entered the field of [[human-computer interaction]] in the early 1990s.


Early haptic interfaces for computing appeared in the 1970s with the development of [[force-feedback]] systems at research institutions like the [[University of North Carolina]] and [[MIT]], where computer-linked force-feedback arms were applied to 3-D design and molecular docking.<ref>Salisbury, K., Conti, F., & Barbagli, F. (2004). Haptic rendering: introductory concepts. IEEE computer graphics and applications, 24(2), 24-32.</ref>


In 1997, the release of the [[Nintendo 64 Rumble Pak]] marked one of the first mainstream haptic interfaces in consumer electronics, introducing gamers to basic vibrotactile feedback.<ref>Biggs, S. J., & Srinivasan, M. A. (2002). Haptic interfaces. Handbook of virtual environments, 93-116.</ref>


The evolution of haptics in VR/AR contexts accelerated in the 2010s with the resurgence of consumer virtual reality technology. [[Oculus]] (later acquired by [[Facebook]]/[[Meta]]) began implementing haptic controllers with their [[Oculus Touch]] controllers in 2016, and [[HTC]] included similar capabilities in their [[Vive]] controllers.<ref>Burdea, G. C. (2019). Haptic feedback for virtual reality. Virtual reality and augmented reality, 17-30.</ref>
== Types of Haptic Feedback ==
 
Haptic feedback can be broadly categorized based on the type of sensory information it provides:
 
=== Tactile Feedback ===
 
[[Tactile feedback]] engages the mechanoreceptors in the skin to simulate sensations like pressure, vibration, stretch, texture, and temperature.<ref name="culbertson2018">Culbertson, H., Schorr, S. B., & Okamura, A. M. (2018). Haptics: The Technology of Touch. ''Annual Review of Control, Robotics, and Autonomous Systems'', 1, 385-409.</ref>
 
==== Vibrotactile Feedback ====
 
[[Vibrotactile feedback]] uses vibration to create tactile sensations and is the most common form of haptic feedback in consumer devices. It typically employs [[eccentric rotating mass]] (ERM) motors or [[linear resonant actuators]] (LRA), which generally operate at frequencies between roughly 50 and 250 Hz.<ref>Choi, S., & Kuchenbecker, K. J. (2013). Vibrotactile display: Perception, technology, and applications. Proceedings of the IEEE, 101(9), 2093-2104.</ref>
 
Modern VR controllers like the [[Meta Quest 2]] controllers and [[Valve Index]] controllers use vibrotactile feedback to simulate interactions with virtual objects.<ref>Benko, H., Holz, C., Sinclair, M., & Ofek, E. (2016, October). Normaltouch and texturetouch: High-fidelity 3d haptic shape rendering on handheld virtual reality controllers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (pp. 717-728).</ref>
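At the application level, such pulses are typically triggered through a runtime API. The following minimal sketch uses the core [[OpenXR]] haptics calls to fire a short vibration pulse on one controller; the session, haptic output action, and hand path are assumed to have been created during normal application setup.

<syntaxhighlight lang="cpp">
#include <openxr/openxr.h>

// Fire a 20 ms vibrotactile pulse on the left controller. `session`,
// `hapticAction` (an XR_ACTION_TYPE_VIBRATION_OUTPUT action), and
// `leftHandPath` (/user/hand/left) are assumed to exist from app setup.
void pulse_left_hand(XrSession session, XrAction hapticAction, XrPath leftHandPath) {
    XrHapticVibration vibration{XR_TYPE_HAPTIC_VIBRATION};
    vibration.amplitude = 0.8f;                      // 0.0–1.0 of device maximum
    vibration.duration  = 20000000;                  // XrDuration in nanoseconds
    vibration.frequency = XR_FREQUENCY_UNSPECIFIED;  // let the runtime choose

    XrHapticActionInfo info{XR_TYPE_HAPTIC_ACTION_INFO};
    info.action        = hapticAction;
    info.subactionPath = leftHandPath;
    xrApplyHapticFeedback(session, &info,
                          reinterpret_cast<const XrHapticBaseHeader*>(&vibration));
}
</syntaxhighlight>

Runtimes map the requested amplitude and frequency onto whatever the underlying ERM or LRA hardware can actually reproduce.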
 
==== Electrotactile Stimulation ====
 
[[Electrotactile]] or [[electrocutaneous stimulation]] delivers small electrical currents to stimulate nerves in the skin, creating various tactile sensations. Companies like [[Teslasuit]] have incorporated this technology into full-body haptic suits for VR training and gaming.<ref>Kaczmarek, K. A., Webster, J. G., Bach-y-Rita, P., & Tompkins, W. J. (1991). Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Transactions on Biomedical Engineering, 38(1), 1-16.</ref>
 
==== Thermal Feedback ====
 
[[Thermal feedback]] systems use [[Peltier element]]s or similar technologies to create sensations of heat or cold; research prototypes can shift skin temperature by several degrees Celsius within one to two seconds. These can enhance immersion by simulating temperature changes in virtual environments.<ref>Wilson, G., Halvey, M., Brewster, S. A., & Hughes, S. A. (2011, May). Some like it hot: thermal feedback for mobile devices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2555-2564).</ref>
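As a simple illustration of how such a module might be driven, the sketch below applies proportional control toward a target contact temperature. The hardware bindings are hypothetical stand-ins for a vendor driver, and a real device would add over-temperature safety limits and thermal modelling.

<syntaxhighlight lang="cpp">
#include <algorithm>

// Hypothetical hardware bindings, standing in for a vendor driver.
double read_temperature_c();            // skin-contact plate temperature, °C
void   set_drive_current(double amps);  // signed: + heats, − cools the skin side

// One proportional control step toward a target contact temperature.
double thermal_step(double target_c, double kp, double max_amps) {
    double error = target_c - read_temperature_c();
    double amps  = std::clamp(kp * error, -max_amps, max_amps);
    set_drive_current(amps);            // sign flips Peltier heat-flow direction
    return error;                       // caller can monitor convergence
}
</syntaxhighlight>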
 
==== Texture Simulation ====
 
Texture simulation creates the sensation of surface roughness or patterns, often using high-frequency vibration or electrostatic effects to reproduce different textures when touching virtual surfaces.<ref name="culbertson2018"/>
 
=== Kinesthetic Feedback ===
 
[[Kinesthetic feedback]] provides information about limb position and movement by applying forces to the user's body, engaging muscles and joints. This simulates weight, inertia, resistance, and large-scale impacts.<ref name="culbertson2018"/>
 
==== Force Feedback ====
 
[[Force feedback]] systems provide resistance or force to the user, simulating the physical properties of virtual objects. These can include:
 
* [[Grounded force feedback]] - Systems physically connected to a stationary base (like robotic arms or exoskeletons)
* [[Ungrounded force feedback]] - Systems that create the illusion of force without being physically anchored
 
Commercial examples include the [[PHANTOM]] haptic device (now part of [[3D Systems]]) and [[Haption's Virtuose]] systems, which have been used for medical training, industrial design, and scientific visualization.<ref>Laycock, S. D., & Day, A. M. (2003). Recent developments and applications of haptic devices. Computer Graphics Forum, 22(2), 117-132.</ref>
 
==== Motion Simulation ====
 
[[Motion simulation]] uses platforms or actuated chairs to simulate large-scale movements like vehicle acceleration, flight G-forces, or walking sensations. These systems are often used in training simulators and advanced entertainment applications.<ref>Salisbury, J. K., Conti, F., & Barbagli, F. (2004). Haptic rendering: introductory concepts. ''IEEE Computer Graphics and Applications'', 24(2), 24-32.</ref>
 
=== Ultrasonic Haptics ===
 
[[Ultrasonic haptics]] use focused ultrasound waves to create tactile sensations in mid-air without requiring users to wear or hold any devices. Companies like [[Ultraleap]] (formerly [[Ultrahaptics]]) have developed phased arrays that focus ultrasound into millimetre-scale pressure points on a bare hand, allowing users to "feel" virtual objects without physical contact.<ref>Carter, T., Seah, S. A., Long, B., Drinkwater, B., & Subramanian, S. (2013, October). UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In Proceedings of the 26th annual ACM symposium on User interface software and technology (pp. 505-514).</ref>
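The principle behind such arrays is beamforming: each transducer is driven with a phase offset that compensates for its distance to the focal point, so all wavefronts arrive in phase and their radiation pressure concentrates there. The sketch below computes those offsets for an arbitrary array layout; the 40 kHz carrier and the geometry are illustrative assumptions.

<syntaxhighlight lang="cpp">
#include <cmath>
#include <vector>

struct Point3 { double x, y, z; };       // positions in metres

// Phase offset (radians) for each transducer so that all wavefronts arrive
// at `focus` in phase; longer paths receive a phase advance (earlier firing).
std::vector<double> focus_phases(const std::vector<Point3>& transducers,
                                 const Point3& focus) {
    const double kPi         = 3.141592653589793;
    const double kSpeed      = 343.0;    // speed of sound in air, m/s
    const double kFrequency  = 40000.0;  // typical 40 kHz carrier
    const double kWavelength = kSpeed / kFrequency;   // ≈ 8.6 mm

    std::vector<double> phases(transducers.size());
    for (std::size_t i = 0; i < transducers.size(); ++i) {
        const double dx = focus.x - transducers[i].x;
        const double dy = focus.y - transducers[i].y;
        const double dz = focus.z - transducers[i].z;
        const double dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        phases[i] = std::fmod(-2.0 * kPi * dist / kWavelength, 2.0 * kPi);
    }
    return phases;
}
</syntaxhighlight>

Modulating the focal point's position or intensity at tactile frequencies is what makes the focused pressure perceptible to the skin.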
 
=== Pneumatic and Hydraulic Systems ===
 
[[Pneumatic]] and [[hydraulic systems]] use air or fluid pressure to create force feedback. These can be used in gloves, suits, or other wearable devices to simulate touch and pressure.<ref>Burdea, G., Zhuang, J., Roskos, E., Silver, D., & Langrana, N. (1992, April). A portable dextrous master with force feedback. In Proceedings of IEEE Virtual Reality Annual International Symposium (pp. 55-62).</ref>
 
=== Mechanical Constraints ===
 
Systems using [[mechanical constraints]] physically limit user movement to simulate walls, surfaces, or object boundaries. Examples include the [[CLAW]] controller by Microsoft Research and [[EXIII]] haptic devices.<ref>Choi, I., Hawkes, E. W., Christensen, D. L., Ploch, C. J., & Follmer, S. (2016, October). Wolverine: A wearable haptic interface for grasping in virtual reality. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 986-993).</ref>
 
== Actuator Technologies ==


{| class="wikitable"
! Actuator !! Principle !! Typical use
|-
| [[Vibrotactile actuator|ERMs & LRAs]] || Rotating or linear mass vibration || Gamepads, phones, VR controllers
|-
| Piezoelectric stacks || Crystal deformation || High-fidelity mobile haptics
|-
| Electro-/magnetorheological brakes || Variable resistance || Kinesthetic exoskeletons
|-
| Focused ultrasound arrays || Acoustic radiation pressure || Mid-air buttons & sliders
|-
| EMS/TENS electrodes || Electrical stimulation of nerves/muscles || Full-body suits, rehabilitation
|-
| Microfluidic actuators || Controlled fluid pressure || High-density tactile arrays
|}


== Haptic Rendering ==
[[Haptic rendering]] is the process of calculating appropriate forces to display to users based on their interactions with virtual objects. Interactive VR applications typically run a 500–1,000 Hz haptic control loop that:
# samples user motion,
# computes contact forces using a physics engine, and
# drives the actuators via device SDKs (e.g., the OpenXR 1.1 haptics extension), as sketched below.
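A minimal single-point version of such a servo loop, using a spring-damper penalty force against a virtual floor, might look like the following sketch; the device read/write functions are hypothetical stand-ins for a vendor SDK.

<syntaxhighlight lang="cpp">
#include <algorithm>
#include <chrono>
#include <thread>

struct Vec3 { float x, y, z; };

// Hypothetical device bindings, standing in for a vendor SDK.
Vec3 device_read_position();            // tool-tip position in metres
void device_write_force(const Vec3&);   // commanded force in newtons

int main() {
    constexpr float kStiffness = 800.0f;   // contact spring, N/m
    constexpr float kDamping   = 2.0f;     // contact damper, N·s/m
    constexpr float kFloorY    = 0.0f;     // virtual floor plane at y = 0
    const auto period = std::chrono::microseconds(1000);  // 1 kHz servo rate

    Vec3 prev = device_read_position();
    while (true) {
        const auto tick = std::chrono::steady_clock::now();
        const Vec3 p = device_read_position();
        const float vy = (p.y - prev.y) * 1000.0f;   // finite-difference velocity
        Vec3 force{0.0f, 0.0f, 0.0f};
        if (p.y < kFloorY) {                         // tool penetrates the floor
            const float depth = kFloorY - p.y;
            // Penalty force: spring pushes out, damper removes contact energy.
            force.y = std::max(kStiffness * depth - kDamping * vy, 0.0f);
        }
        device_write_force(force);
        prev = p;
        std::this_thread::sleep_until(tick + period);
    }
}
</syntaxhighlight>

Stiffness and damping must be tuned to the device's force limits and update rate, since excessive gains at a given servo rate make the rendered contact unstable.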
 
Advances in [[physics simulation]] and [[collision detection]], including penalty-based and constraint-based rendering algorithms, are continuously improving the realism of haptic interactions.<ref>Otaduy, M. A., & Lin, M. C. (2005). Introduction to haptic rendering. In ACM SIGGRAPH 2005 Courses (pp. 3-es).</ref>
 
Research into [[multi-point haptic rendering]] addresses limitations of traditional single-point interfaces, allowing users to feel virtual objects with their entire hand or body.<ref>Prattichizzo, D., Chinello, F., Pacchierotti, C., & Malvezzi, M. (2013). Towards wearability in fingertip haptics: a 3-dof wearable device for cutaneous force feedback. IEEE Transactions on Haptics, 6(4), 506-516.</ref>
 
== Force Display Devices ==
 
At [[SIGGRAPH]] 2014, Japanese scientists [[Tomohiro Amemiya]] and [[Hiroaki Gomi]] demonstrated two [[force display]] devices, [[Traxion]] and [[Buru-Navi3]]. When held, these devices produce push and pull sensations while vibrating in place, and the illusory force they generate is strong enough to guide a blind person.<ref>http://www.technologyreview.com/news/528886/could-force-illusions-help-wearables-catch-on/</ref>

[[Buru-Navi3]] is a wine-cork-sized device containing a 40 Hz electromagnetic actuator. When held between two fingers, it creates a force illusion directed toward or away from the user.

[[Traxion]], a similar device developed by [[Jun Rekimoto]], creates a virtual force in the same way, by vibrating its actuator asymmetrically.
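The underlying technique is an asymmetric drive waveform: a brief, strong stroke in one direction followed by a longer, weaker return stroke. The net impulse per cycle is zero, but the skin's nonlinear sensitivity makes the strong stroke dominate, producing a perceived directional tug. The waveform below is an illustrative assumption rather than the published Buru-Navi3 profile.

<syntaxhighlight lang="cpp">
#include <vector>

// One cycle of an asymmetric actuator drive signal (normalised acceleration):
// a short, strong "pull" stroke followed by a long, weak return stroke.
// At a 40 Hz cycle rate, one cycle lasts 25 ms.
std::vector<float> asymmetric_cycle(int samples_per_cycle) {
    std::vector<float> accel(samples_per_cycle);
    const int pulse = samples_per_cycle / 4;        // 25% duty pull phase
    const float returnLevel =                       // chosen so the cycle
        -static_cast<float>(pulse) /                // integrates to zero
         static_cast<float>(samples_per_cycle - pulse);
    for (int i = 0; i < samples_per_cycle; ++i)
        accel[i] = (i < pulse) ? 1.0f : returnLevel;
    return accel;
}
</syntaxhighlight>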
 
== Haptic Devices for VR/AR ==
 
=== Handheld Controllers ===
 
Standard [[VR controller]]s (e.g., Meta Quest controllers, Valve Index Controllers, PlayStation VR2 Sense controllers) typically include basic [[vibrotactile feedback]] (ERM or LRA). Some advanced controllers incorporate more nuanced effects, like the adaptive triggers and detailed haptics in the PS VR2 Sense controllers.<ref>Sony Interactive Entertainment. (n.d.). ''PlayStation VR2 Sense controller''. Retrieved April 29, 2025, from https://www.playstation.com/en-us/ps-vr2/controllers/</ref>
 
The [[PlayStation 5]]'s [[DualSense]] controller represents one of the most advanced mainstream haptic controllers, using adaptive triggers and high-fidelity vibrotactile feedback to simulate different surfaces and resistances.<ref>Colgan, A. (2021). The PlayStation 5 DualSense Controller: A New Era for Haptics in Gaming. IEEE Consumer Electronics Magazine, 10(3), 6-8.</ref>
 
=== Haptic Gloves ===
 
[[Haptic glove]]s aim to provide high-fidelity feedback to the hands and fingers. They often combine finger tracking with various feedback mechanisms:
* Vibrotactile arrays: Multiple small actuators across the palm and fingers for localized sensations
* Force feedback: Systems (using cables, pneumatics, or exoskeletons) that apply resistance to finger movement, simulating the shape and rigidity of virtual objects
 
Examples include the [[HaptX Gloves G1]], whose microfluidic actuators (135 per glove) render true-contact pressure, as well as [[SenseGlove]] and [[Manus VR]].<ref>HaptX Inc. (n.d.). ''HaptX Gloves''. Retrieved April 29, 2025, from https://haptx.com/</ref>
 
=== Haptic Vests and Suits ===
 
[[Haptic suit]]s or vests extend tactile feedback to the torso and sometimes limbs. They typically use an array of vibrotactile actuators to simulate impacts, environmental effects (like rain or wind direction), or proximity alerts across the body. Examples include the [[bHaptics]] TactSuit range and [[TESLASUIT]], which integrates electrical muscle stimulation (EMS), motion capture, and biometry.<ref>bHaptics Inc. (n.d.). ''TactSuit''. Retrieved April 29, 2025, from https://www.bhaptics.com/</ref><ref>TESLASUIT. (n.d.). ''TESLASUIT XR Edition''. Retrieved April 29, 2025, from https://teslasuit.io/products/teslasuit-4/</ref>
 
== Applications in VR and AR ==
 
=== Gaming and Entertainment ===
 
[[Haptic gaming]] provides immersive experiences by allowing players to feel virtual environments and objects. Advanced systems like the [[Teslasuit]], [[bHaptics TactSuit]], and [[Dexmo]] exoskeleton gloves enable users to feel impacts, textures, and resistance in games.<ref>Pacchierotti, C., Sinclair, S., Solazzi, M., Frisoli, A., Hayward, V., & Prattichizzo, D. (2017). Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE transactions on haptics, 10(4), 580-600.</ref>
 
Next-gen consoles and XR headsets use localized haptics to convey weapon recoil, surface textures, and locomotion cues. Game-specific haptic tracks (e.g., ''Astro Bot'', ''Returnal'') significantly raise presence and immersion.<ref>{{cite news|title=Astro Bot showcases DualSense haptics|publisher=Polygon|date=Sep 2024}}</ref>
 
=== Medical Training and Simulation ===
 
[[Haptic medical simulators]] allow healthcare professionals to practice procedures without risk to real patients. Systems like [[3D Systems' Touch]] (formerly [[Sensable Phantom]]) and [[FundamentalVR's Fundamental Surgery]] provide force feedback for surgical training.<ref>Coles, T. R., Meglan, D., & John, N. W. (2011). The role of haptics in medical training simulators: A survey of the state of the art. IEEE Transactions on haptics, 4(1), 51-66.</ref>
 
[[Haptic-enabled AR]] systems allow surgeons to "feel" pre-operative medical images during surgical planning, significantly improving spatial understanding.<ref>Sutherland, C., Hashtrudi-Zaad, K., Sellens, R., Abolmaesumi, P., & Mousavi, P. (2019). An augmented reality haptic training simulator for spinal needle procedures. IEEE Transactions on Biomedical Engineering, 66(11), 3094-3104.</ref>
 
Haptics improves psychomotor skill transfer in medical simulators, with systematic reviews showing enhanced accuracy and reduced task time in surgical training.<ref>{{cite journal|title=Haptic technology in healthcare: a systematic review|journal=JMIR|year=2024}}</ref>
 
=== Education ===
 
Haptics in VR and AR can be transformative in education, particularly in science, technology, engineering, and mathematics (STEM) fields. It enables multi-sensory learning by integrating visual, auditory, kinesthetic, and tactile feedback, essential for hands-on experiences.
 
For example, VR with haptics can simulate laboratory experiments, allowing students to feel and manipulate virtual scientific equipment. AR applications with haptic feedback facilitate interactive exploration of complex systems like anatomy or molecular structures, improving understanding and long-term retention.<ref>{{cite journal|title=Haptic feedback in VR education: A systematic review and meta-analysis|journal=Computers & Education|year=2023|volume=189}}</ref>
 
=== Industrial Training and Design ===
 
[[Haptic industrial training]] allows workers to practice complex or dangerous tasks in virtual environments before performing them in reality. Companies like [[EON Reality]] and [[Serious Labs]] develop haptic VR training solutions for industries like construction, manufacturing, and oil and gas.<ref>Wang, Z. R., Wang, P., Xing, L., Mei, L. P., Zhao, J., & Zhang, T. (2019). Haptic rendering for dental training system. IEEE Access, 7, 68275-68282.</ref>
 
[[Automotive design]] companies like [[BMW]] and [[Ford]] use haptic systems for virtual prototyping, allowing designers to "feel" car interiors and controls before physical prototypes are built.<ref>Bordegoni, M., Cugini, U., Caruso, G., & Polistina, S. (2009). Mixed prototyping for product assessment: a reference framework. International Journal on Interactive Design and Manufacturing (IJIDeM), 3(3), 177-187.</ref>
 
=== Accessibility ===
 
[[Haptic accessibility devices]] help people with visual impairments navigate environments through tactile feedback. Systems like [[Wayband]] by [[WearWorks]] provide navigation assistance through patterns of vibration.<ref>Van Erp, J. B., Van Veen, H. A., Jansen, C., & Dobbins, T. (2005). Waypoint navigation with a vibrotactile waist belt. ACM Transactions on Applied Perception (TAP), 2(2), 106-117.</ref>
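A simplified version of such guidance can be expressed as a mapping from the user's heading error to a vibration intensity, as in the sketch below. This mapping is an illustrative assumption, not WearWorks' actual algorithm.

<syntaxhighlight lang="cpp">
#include <algorithm>
#include <cmath>

// Map the user's heading error against the bearing to the next waypoint onto
// a vibration intensity in [0, 1]: silent inside a ±15° corridor, ramping up
// as the user drifts off course.
float navigation_intensity(float user_heading_deg, float bearing_deg) {
    // Signed error wrapped into [-180, 180).
    float error = std::fmod(bearing_deg - user_heading_deg + 540.0f, 360.0f) - 180.0f;
    float off_course = std::fabs(error);
    constexpr float kDeadZone = 15.0f;   // no feedback while roughly on course
    if (off_course < kDeadZone) return 0.0f;
    return std::min((off_course - kDeadZone) / (180.0f - kDeadZone), 1.0f);
}
</syntaxhighlight>

The sign of the wrapped error can additionally select which side of the wrist vibrates, indicating whether to turn left or right.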
 
[[Tactile communication systems]] allow deaf-blind individuals to receive communication through haptic patterns, often through gloves or wearable devices on the body.<ref>Baumann, R., Jung, J., & Rogers, S. (2020). Supporting the deaf and hard of hearing in virtual reality with an enhanced user interface. 2020 IEEE Virtual Reality and 3D User Interfaces (VR), 273-282.</ref>
 
Combining haptic displays with VR/AR can create powerful accessibility tools, allowing alternative sensory channels to compensate for vision or hearing impairments.<ref>{{cite news|title=UCL synthetic touch technology could transform healthcare|publisher=Financial Times|date=11 Oct 2024}}</ref>
 
=== Rehabilitation ===
 
Combining haptic exoskeletons with VR accelerates stroke recovery by increasing engagement and repetitions. The gamification of rehabilitation exercises through VR with haptic feedback has shown significant improvements in patient motivation and outcomes.<ref>{{cite journal|title=Efficacy of VR-based rehabilitation in stroke|journal=Annals of Medicine|year=2023|volume=55|issue=2}}</ref>
 
=== Telepresence and Teleoperation ===
 
[[Haptic telepresence]] allows users to remotely "feel" environments through robotic systems. Applications include remote surgical systems, space exploration, and hazardous environment inspection.<ref>Pacchierotti, C., Meli, L., Chinello, F., Malvezzi, M., & Prattichizzo, D. (2015). Cutaneous haptic feedback to ensure the stability of robotic teleoperation systems. The International Journal of Robotics Research, 34(14), 1773-1787.</ref>


[[Haptic teleoperation]] enables precise control of robots in delicate or complex tasks by providing operators with tactile feedback from the robot's interactions.<ref>Son, H. I., Franchi, A., Chuang, L. L., Kim, J., Bulthoff, H. H., & Giordano, P. R. (2013). Human-centered design and evaluation of haptic cueing for teleoperation of multiple mobile robots. IEEE Transactions on Cybernetics, 43(2), 597-609.</ref>
== Standards and Interoperability ==
 
* '''[[ISO 9241-910]]/920''' provide terminology and design guidance for tactile/gestural interfaces.<ref>{{cite web|title=ISO 9241-910:2011 – Ergonomics of human-system interaction – Framework for tactile/haptic interaction|url=https://www.iso.org/standard/51097.html|access-date=29 April 2025}}</ref> 
* '''[[IEEE VR]]''' and '''[[SIGGRAPH]]''' regularly host demonstrations where new haptic devices debut.
* '''[[OpenXR 1.1]]''' (Khronos) unifies API calls for amplitude-/frequency-controlled haptic output across headsets.
 
The haptics industry faces challenges in [[haptic standardization]], with different devices using proprietary formats and protocols. Initiatives like the [[Haptics Industry Forum]] are working to establish standards for haptic content creation and playback across platforms.<ref>ISO/TC 159/SC 4 Ergonomics of human-system interaction. (2022). ISO 9241-910:2022 Ergonomics of human-system interaction — Part 910: Framework for tactile and haptic interaction.</ref>
 
[[Haptic codecs]] like [[MPEG-V]] and [[MPEG-H]] include provisions for standardized haptic data, though adoption remains limited compared to audio and video standards.<ref>Eid, M., Orozco, M., & El Saddik, A. (2007, June). A guided tour in haptic audio visual environments and applications. In 2007 IEEE International Conference on Multimedia and Expo (pp. 1449-1452). IEEE.</ref>
 
== Current Research and Challenges ==
 
=== Miniaturization and Wearability ===
 
Research into [[microfluidic tactile displays]] and [[smart materials]] aims to create thinner, lighter haptic devices that can be comfortably worn for extended periods.<ref>Wang, D., Ohnishi, K., & Xu, W. (2020). Multimodal haptic display for virtual reality: A survey. IEEE Transactions on Industrial Electronics, 67(1), 610-623.</ref>
 
[[Stretchable electronics]] and [[e-textiles]] are enabling the development of haptic systems integrated directly into clothing or applied to the skin like temporary tattoos.<ref>Yao, S., & Zhu, Y. (2015). Nanomaterial-enabled stretchable conductors: strategies, materials and devices. Advanced Materials, 27(9), 1480-1511.</ref>
 
=== Surface Haptics ===
 
[[Surface haptics]] research focuses on creating tactile sensations on touchscreens and flat surfaces. Technologies like [[electroadhesion]], [[ultrasonic friction modulation]], and [[microelectromechanical systems]] (MEMS) are enabling touchscreens that can simulate textures and buttons.<ref>Meyer, D. J., Peshkin, M. A., & Colgate, J. E. (2013, April). Fingertip friction modulation due to electrostatic attraction. In 2013 world haptics conference (WHC) (pp. 43-48). IEEE.</ref>
 
Companies like [[Tanvas]] and [[Bosch]] are developing commercial applications of surface haptics for automotive interfaces, mobile devices, and kiosks.<ref>Mullenbach, J., Shultz, C., Colgate, J. E., & Piper, A. M. (2014, April). Surface haptic interactions with a TPad tablet. In Proceedings of the adjunct publication of the 27th annual ACM symposium on User interface software and technology (pp. 7-8).</ref>
 
=== Neural Interfaces ===
 
Research into [[direct neural stimulation]] aims to bypass mechanical interfaces entirely, potentially allowing users to feel virtual sensations through direct interaction with the nervous system.<ref>Tyler, D. J. (2016). Restoring the human touch: Prosthetics imbued with haptics give their wearers fine motor control and a sense of connection. IEEE Spectrum, 53(5), 28-33.</ref>
 
[[Haptic brain-computer interfaces]] (BCIs) could potentially create fully immersive tactile experiences without physical haptic hardware, though this research remains in early stages.<ref>Cincotti, F., Mattia, D., Aloise, F., Bufalari, S., Schalk, G., Oriolo, G., ... & Marciani, M. G. (2008). Non-invasive brain–computer interface system: towards its application as assistive technology. Brain research bulletin, 75(6), 796-803.</ref>
 
=== Latency and Synchronization ===
 
[[Haptic latency]] must be minimized and synchronized precisely with visual and auditory cues; delays can break immersion and cause discomfort. Current research focuses on reducing end-to-end latency in haptic systems to below perceptible thresholds.<ref name="srivastava2019"/>
 
=== Power and Cost Limitations ===
 
Many advanced haptic technologies require significant power and can be expensive to produce, limiting their adoption in consumer devices. Research into energy-efficient actuators and more cost-effective manufacturing methods is ongoing.<ref>{{cite journal|title=Is modularity the future of haptics in XR? A systematic literature review|journal=Virtual Reality|year=2025}}</ref>
 
== Future Directions ==
 
=== Full-Body Haptic Systems ===
 
[[Full-body haptic systems]] aim to provide comprehensive tactile feedback across the entire body. Companies like [[Teslasuit]], [[bHaptics]], and [[Axon VR]] (now [[HaptX]]) are developing suits with hundreds of haptic actuators.<ref>Schorr, S. B., & Okamura, A. M. (2017). Three-dimensional skin deformation as force substitution: Wearable device design and performance during haptic exploration of virtual environments. IEEE transactions on haptics, 10(3), 418-430.</ref>
 
Research into [[distributed haptic interfaces]] seeks to optimize the placement and types of actuators to maximize feedback while minimizing cost and weight.<ref>Jones, L. A., & Sarter, N. B. (2008). Tactile displays: Guidance for their design and application. Human factors, 50(1), 90-111.</ref>
 
=== Environmental Haptics ===
 
[[Environmental haptics]] extends beyond wearable devices to create haptic sensations through the physical environment. Technologies include [[acoustic radiation pressure]], [[mid-air ultrasonic arrays]], and [[room-scale haptics]].<ref>Iwamoto, T., Tatezono, M., & Shinoda, H. (2008, August). Non-contact method for producing tactile sensation using airborne ultrasound. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 504-513). Springer, Berlin, Heidelberg.</ref>
 
[[Haptic projectors]] like those developed by [[Ultraleap]] allow multiple users to experience mid-air haptic sensations without wearable devices.<ref>Long, B., Seah, S. A., Carter, T., & Subramanian, S. (2014, April). Rendering volumetric haptic shapes in mid-air using ultrasound. In ACM Transactions on Graphics (TOG) (Vol. 33, No. 6, pp. 1-10).</ref>
 
=== Haptic Content Creation ===
 
The development of [[haptic authoring tools]] aims to make haptic content creation more accessible to designers without specialized technical knowledge. Platforms like [[Unity's XR Interaction Toolkit]] and [[Unreal Engine's haptic plugins]] provide frameworks for implementing haptic feedback in VR/AR applications.<ref>Danieau, F., Fleureau, J., Guillotel, P., Mollet, N., Christie, M., & Lécuyer, A. (2014). HapSeat: producing motion sensation with multiple force-feedback devices embedded in a seat. In Proceedings of the 18th ACM symposium on Virtual reality software and technology (pp. 69-76).</ref>
 
[[Haptic recording]] technologies allow the capture of real-world tactile experiences for playback in virtual environments, similar to how audio and video are recorded.<ref>Kuchenbecker, K. J., Romano, J., & McMahan, W. (2011). Haptography: Capturing and recreating the rich feel of real surfaces. In Robotics research (pp. 245-260). Springer, Berlin, Heidelberg.</ref>
 
=== Multimodal Integration ===
 
Research into [[cross-modal perception]] examines how haptic feedback interacts with visual and auditory cues, enabling more efficient and convincing multisensory experiences.<ref>Lederman, S. J., & Klatzky, R. L. (2009). Haptic perception: A tutorial. Attention, Perception, & Psychophysics, 71(7), 1439-1459.</ref>
 
[[Context-aware haptics]] adjusts tactile feedback based on environmental factors, user state, and application context to provide more relevant and effective haptic experiences.<ref>MacLean, K. E. (2008). Haptic interaction design for everyday interfaces. Reviews of Human Factors and Ergonomics, 4(1), 149-194.</ref>
 
=== Self-Powered Haptic Systems ===
 
Recent research has produced breakthroughs like self-powered [[electrotactile glove]]s that use triboelectric textiles to generate their own stimulation current, eliminating the need for external power sources and reducing weight.<ref>{{cite journal|title=Self-powered electrotactile textile haptic glove|journal=Science Advances|year=2025|doi=10.1126/sciadv.adt0318}}</ref>
 
== See Also ==
* [[Force feedback]]
* [[Immersion (virtual reality)]]
* [[Tactile sensors]]
* [[Vibrotactile feedback]]
* [[Haptic technology]]
* [[Virtual reality]]
* [[Augmented reality]]
* [[Kinesthetic sense]]
* [[Teleoperation]]
* [[Sensory substitution]]


== References ==
<references />
 
[[Category:Terms]]
[[Category:Hardware]]
[[Category:Feedback systems]]
[[Category:Haptics]]
[[Category:Human–computer interaction]]
