'''Haptics''' (from the Greek ''haptikos'', "able to touch or grasp") or '''[[Tactile feedback]]''' is a [[technology]] that produces the sense of touch through physical stimulation. Haptics can significantly improve the user's [[immersion]] in a [[VR]] world by allowing users to experience physical sensations caused by their actions in a [[virtual environment]]. When a user picks up a cup in the virtual world, the user should feel the realistic sensations of a cup in their hand, even though the cup is not present in the real world. This bidirectional exchange of sensory information creates a deeper connection between users and digital worlds, making haptics a critical component in creating believable and effective VR and [[AR]] experiences.<ref name="srivastava2019">Srivastava, K., Kukreja, S. L., & Shinghal, K. (2019). Haptic Technology: A Comprehensive Review of its Applications and Future Potential. ''Journal of Mechatronics, Electrical Power, and Vehicular Technology'', 10(2), 99-112.</ref>


In traditional [[video game]] controllers, "[[rumble]]" is often used to produce tactile feedback. However, modern [[haptic systems]] in AR and VR environments offer much more sophisticated and nuanced feedback mechanisms.
 
== Physiology of Touch ==
 
Human skin contains four main classes of [[mechanoreceptor]]s: Merkel cells (pressure), Meissner corpuscles (low-frequency vibration), Ruffini endings (skin stretch), and Pacinian corpuscles (high-frequency vibration). These receptors are tuned to different frequencies and deformations, allowing us to perceive a wide range of tactile sensations.<ref>Lederman, S. J., & Klatzky, R. L. (2009). Haptic perception: A tutorial. Attention, Perception, & Psychophysics, 71(7), 1439-1459.</ref>
 
Cutaneous cues (pressure, vibration, stretch) combine with [[kinesthetic sense|kinesthetic]] cues from muscles and joints to form a multimodal "haptic channel" that informs us about object properties and our interactions with the environment.<ref>{{cite journal|title=Haptics is comprised of kinesthetic and cutaneous feedback|journal=Applied Sciences|year=2024|doi=10.3390/app14146020}}</ref>


== History of Haptics ==
The evolution of haptics in VR/AR contexts accelerated in the 2010s with the resurgence of consumer virtual reality technology. [[Oculus]] (later acquired by [[Facebook]]/[[Meta]]) began implementing haptic controllers with their [[Oculus Touch]] controllers in 2016, and [[HTC]] included similar capabilities in their [[Vive]] controllers.<ref>Burdea, G. C. (2019). Haptic feedback for virtual reality. Virtual reality and augmented reality, 17-30.</ref>


== Types of Haptic Feedback ==
 
Haptic feedback can be broadly categorized based on the type of sensory information it provides:
 
=== Tactile Feedback ===
 
[[Tactile feedback]] engages the mechanoreceptors in the skin to simulate sensations like pressure, vibration, stretch, texture, and temperature.<ref name="culbertson2018">Culbertson, H., Schorr, S. B., & Okamura, A. M. (2018). Haptics: The Technology of Touch. *Annual Review of Control, Robotics, and Autonomous Systems*, *1*, 385-409.</ref>


==== Vibrotactile Feedback ====


[[Vibrotactile feedback]] uses vibration to create tactile sensations and is the most common form of haptic feedback in consumer devices. It typically employs [[eccentric rotating mass]] (ERM) motors or [[linear resonant actuators]] (LRAs).<ref>Choi, S., & Kuchenbecker, K. J. (2013). Vibrotactile display: Perception, technology, and applications. Proceedings of the IEEE, 101(9), 2093-2104.</ref>
Modern VR controllers like the [[Meta Quest 2]] controllers and [[Valve Index]] controllers use vibrotactile feedback to simulate interactions with virtual objects.<ref>Benko, H., Holz, C., Sinclair, M., & Ofek, E. (2016, October). Normaltouch and texturetouch: High-fidelity 3d haptic shape rendering on handheld virtual reality controllers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (pp. 717-728).</ref>
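The practical difference between the two actuator types can be illustrated with a short sketch. This is a conceptual model only: the function names, the 170 Hz resonant frequency, the 5 ms attack ramp, and the sample rate are illustrative assumptions, not any vendor's API.

```python
import math

def lra_burst(amplitude, duration_s, f_res=170.0, rate=8000):
    """Drive samples for a linear resonant actuator (LRA): a sine at the
    actuator's resonant frequency (170 Hz assumed here), with an
    amplitude envelope that can be shaped independently of frequency."""
    n = int(duration_s * rate)
    samples = []
    for i in range(n):
        t = i / rate
        env = min(1.0, t / 0.005)  # 5 ms attack ramp softens the onset
        samples.append(amplitude * env * math.sin(2 * math.pi * f_res * t))
    return samples

def erm_level(voltage, v_max=3.0):
    """An eccentric rotating mass (ERM) motor couples the two parameters:
    drive voltage sets motor speed, and the centripetal force of the
    offset mass grows roughly with speed squared, so frequency and
    amplitude cannot be set separately (simplified model)."""
    duty = max(0.0, min(1.0, voltage / v_max))
    return {"rel_frequency": duty, "rel_amplitude": duty ** 2}
```

The sketch shows why LRAs dominate modern controllers: an LRA can ramp amplitude while holding its resonant frequency, whereas an ERM's amplitude and frequency both follow motor speed.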


==== Electrotactile Stimulation ====
 
[[Electrotactile]] or [[electrocutaneous stimulation]] delivers small electrical currents to stimulate nerves in the skin, creating various tactile sensations. Companies like [[Teslasuit]] have incorporated this technology into full-body haptic suits for VR training and gaming.<ref>Kaczmarek, K. A., Webster, J. G., Bach-y-Rita, P., & Tompkins, W. J. (1991). Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Transactions on Biomedical Engineering, 38(1), 1-16.</ref>
 
==== Thermal Feedback ====
 
[[Thermal feedback]] systems use [[Peltier element]]s or similar technologies to create sensations of heat or cold. These can enhance immersion by simulating temperature changes in virtual environments.<ref>Wilson, G., Halvey, M., Brewster, S. A., & Hughes, S. A. (2011, May). Some like it hot: thermal feedback for mobile devices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2555-2564).</ref>
 
==== Texture Simulation ====
 
Texture simulation creates the sensation of surface roughness or patterns, often using high-frequency vibrations or electrostatic effects to simulate different textures when touching virtual surfaces.<ref name="culbertson2018"/>
 
=== Kinesthetic Feedback ===
 
[[Kinesthetic feedback]] provides information about limb position and movement by applying forces to the user's body, engaging muscles and joints. This simulates weight, inertia, resistance, and large-scale impacts.<ref name="culbertson2018"/>
 
==== Force Feedback ====


[[Force feedback]] systems provide resistance or force to the user, simulating the physical properties of virtual objects.
Commercial examples include the [[PHANTOM]] haptic device (now part of [[3D Systems]]) and [[Haption's Virtuose]] systems, which have been used for medical training, industrial design, and scientific visualization.<ref>Laycock, S. D., & Day, A. M. (2003). Recent developments and applications of haptic devices. Computer Graphics Forum, 22(2), 117-132.</ref>


==== Motion Simulation ====


Motion simulation uses platforms or actuated chairs to simulate large-scale movements like vehicle acceleration, flight G-forces, or walking sensations. These systems are often used in training simulators and advanced entertainment applications.<ref>Salisbury, J. K., Conti, F., & Barbagli, F. (2004). Haptic rendering: introductory concepts. IEEE Computer Graphics and Applications, 24(2), 24-32.</ref>
 


=== Ultrasonic Haptics ===


Systems using [[mechanical constraints]] physically limit user movement to simulate walls, surfaces, or object boundaries. Examples include the [[CLAW]] controller by Microsoft Research and [[EXIII]] haptic devices.<ref>Choi, I., Hawkes, E. W., Christensen, D. L., Ploch, C. J., & Follmer, S. (2016, October). Wolverine: A wearable haptic interface for grasping in virtual reality. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 986-993).</ref>
== Actuator Technologies ==
{| class="wikitable"
! Actuator !! Principle !! Typical use
|-
| [[Vibrotactile actuator|ERMs & LRAs]] || Rotating or linear mass vibration || Gamepads, phones, VR controllers
|-
| Piezoelectric stacks || Crystal deformation || High-fidelity mobile haptics
|-
| Electro-/magnetorheological brakes || Variable resistance || Kinesthetic exoskeletons
|-
| Focused ultrasound arrays || Acoustic radiation pressure || Mid-air buttons & sliders
|-
| EMS/TENS electrodes || Electrical stimulation of nerves/muscles || Full-body suits, rehabilitation
|-
| Microfluidic actuators || Controlled fluid pressure || High-density tactile arrays
|}
== Haptic Rendering ==
[[Haptic rendering]] is the process of calculating appropriate forces to display to users based on their interactions with virtual objects. Interactive VR applications typically run a 500–1,000 Hz haptic control loop that:
# Samples user motion
# Computes contact forces using a physics engine
# Drives actuators via device SDKs (e.g., the OpenXR 1.1 haptics extension)
Advances in [[physics simulation]] and [[collision detection]] are continuously improving the realism of haptic interactions.<ref>Otaduy, M. A., & Lin, M. C. (2005). Introduction to haptic rendering. In ACM SIGGRAPH 2005 Courses (pp. 3-es).</ref>
Research into [[multi-point haptic rendering]] addresses limitations of traditional single-point interfaces, allowing users to feel virtual objects with their entire hand or body.<ref>Prattichizzo, D., Chinello, F., Pacchierotti, C., & Malvezzi, M. (2013). Towards wearability in fingertip haptics: a 3-dof wearable device for cutaneous force feedback. IEEE Transactions on Haptics, 6(4), 506-516.</ref>
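The loop above can be sketched as a penalty-based (spring-damper) virtual wall, the classic introductory haptic-rendering example. The gains, units, and function names here are illustrative assumptions, not tied to any particular device SDK.

```python
def wall_force(x, v, wall_x=0.0, k=500.0, b=2.0):
    """Penalty-based contact force for a virtual wall at wall_x.
    A spring (stiffness k) pushes the user's proxy out of penetration
    and a damper (b) dissipates energy to keep the high-rate loop stable."""
    penetration = wall_x - x           # positive when inside the wall
    if penetration <= 0.0:
        return 0.0                     # free space: no force
    return k * penetration - b * v     # Hooke spring minus damping

def haptic_step(read_pose, send_force):
    """One iteration of the ~1 kHz loop: sample, compute, actuate."""
    x, v = read_pose()                 # 1. sample user motion
    f = wall_force(x, v)               # 2. compute contact force
    send_force(f)                      # 3. drive the actuator
    return f
```

In free space the output is zero; 1 cm of penetration with k = 500 N/m yields a 5 N restoring force. A real renderer must also clamp the output to the device's force limits and convert to device-specific units.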


== Force Display Devices ==


[[Traxion]] is a similar device developed by [[Jun Rekimoto]]. The device also creates a virtual force by asymmetrically vibrating the actuator.
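The asymmetric-vibration principle can be sketched numerically. The waveform below is a hypothetical command signal, not Traxion's actual drive signal: a brief strong phase and a longer weak phase carry equal and opposite impulse, yet because the skin is less sensitive to the weak phase, the strong phase dominates perception and produces an illusory net pull in one direction.

```python
def asymmetric_cycle(n=100, peak=1.0):
    """One cycle of an asymmetric vibration command: a short, strong
    pull in one direction followed by a longer, weaker return stroke.
    The total impulse is zero, so the device does not drift, but the
    strong phase dominates tactile perception."""
    fast = n // 5                          # 20% of the cycle: strong phase
    slow = n - fast                        # 80% of the cycle: weak return
    cycle = [peak] * fast                  # strong pull
    cycle += [-peak * fast / slow] * slow  # weak push with equal impulse
    return cycle

c = asymmetric_cycle()
print(round(sum(c), 6))  # prints 0.0 - zero net impulse per cycle
```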
== Haptic Devices for VR/AR ==
=== Handheld Controllers ===
Standard [[VR controller]]s (e.g., Meta Quest controllers, Valve Index Controllers, PlayStation VR2 Sense controllers) typically include basic [[vibrotactile feedback]] (ERM or LRA). Some advanced controllers incorporate more nuanced effects, like the adaptive triggers and detailed haptics in the PS VR2 Sense controllers.<ref>Sony Interactive Entertainment. (n.d.). *PlayStation VR2 Sense controller*. Retrieved April 29, 2025, from https://www.playstation.com/en-us/ps-vr2/controllers/</ref>
The [[PlayStation 5]]'s [[DualSense]] controller represents one of the most advanced mainstream haptic controllers, using adaptive triggers and high-fidelity vibrotactile feedback to simulate different surfaces and resistances.<ref>Colgan, A. (2021). The PlayStation 5 DualSense Controller: A New Era for Haptics in Gaming. IEEE Consumer Electronics Magazine, 10(3), 6-8.</ref>
=== Haptic Gloves ===
[[Haptic glove]]s aim to provide high-fidelity feedback to the hands and fingers. They often combine finger tracking with various feedback mechanisms:
* Vibrotactile arrays: Multiple small actuators across the palm and fingers for localized sensations
* Force feedback: Systems (using cables, pneumatics, or exoskeletons) that apply resistance to finger movement, simulating the shape and rigidity of virtual objects
Examples include [[HaptX Gloves G1]] with micro-fluidic actuators for true-contact pressure, [[SenseGlove]], and [[Manus VR]].<ref>HaptX Inc. (n.d.). *HaptX Gloves*. Retrieved April 29, 2025, from https://haptx.com/</ref>
=== Haptic Vests and Suits ===
[[Haptic suit]]s or vests extend tactile feedback to the torso and sometimes limbs. They typically use an array of vibrotactile actuators to simulate impacts, environmental effects (like rain or wind direction), or proximity alerts across the body. Examples include the [[bHaptics]] TactSuit range and [[TESLASUIT]], which integrates electrical muscle stimulation (EMS), motion capture, and biometry.<ref>bHaptics Inc. (n.d.). *TactSuit*. Retrieved April 29, 2025, from https://www.bhaptics.com/</ref><ref>TESLASUIT. (n.d.). *TESLASUIT XR Edition*. Retrieved April 29, 2025, from https://teslasuit.io/products/teslasuit-4/</ref>


== Applications in VR and AR ==
[[Haptic gaming]] provides immersive experiences by allowing players to feel virtual environments and objects. Advanced systems like the [[Teslasuit]], [[bHaptics TactSuit]], and [[Dexmo]] exoskeleton gloves enable users to feel impacts, textures, and resistance in games.<ref>Pacchierotti, C., Sinclair, S., Solazzi, M., Frisoli, A., Hayward, V., & Prattichizzo, D. (2017). Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE transactions on haptics, 10(4), 580-600.</ref>


Next-gen consoles and XR headsets use localized haptics to convey weapon recoil, surface textures, and locomotion cues. Game-specific haptic tracks (e.g., ''Astro Bot'', ''Returnal'') significantly raise presence and immersion.<ref>{{cite news|title=Astro Bot showcases DualSense haptics|publisher=Polygon|date=Sep 2024}}</ref>


=== Medical Training and Simulation ===


[[Haptic-enabled AR]] systems allow surgeons to "feel" pre-operative medical images during surgical planning, significantly improving spatial understanding.<ref>Sutherland, C., Hashtrudi-Zaad, K., Sellens, R., Abolmaesumi, P., & Mousavi, P. (2019). An augmented reality haptic training simulator for spinal needle procedures. IEEE Transactions on Biomedical Engineering, 66(11), 3094-3104.</ref>
Haptics improves psychomotor skill transfer in medical simulators, with systematic reviews showing enhanced accuracy and reduced task time in surgical training.<ref>{{cite journal|title=Haptic technology in healthcare: a systematic review|journal=JMIR|year=2024}}</ref>
=== Education ===
Haptics in VR and AR is transformative in education, particularly in science, technology, engineering, and mathematics (STEM) fields. It enables multi-sensory learning by integrating visual, auditory, kinesthetic, and tactile feedback, essential for hands-on experiences.
For example, VR with haptics can simulate laboratory experiments, allowing students to feel and manipulate virtual scientific equipment. AR applications with haptic feedback facilitate interactive exploration of complex systems like anatomy or molecular structures, improving understanding and long-term retention.<ref>{{cite journal|title=Haptic feedback in VR education: A systematic review and meta-analysis|journal=Computers & Education|year=2023|volume=189}}</ref>


=== Industrial Training and Design ===


[[Tactile communication systems]] allow deaf-blind individuals to receive communication through haptic patterns, often through gloves or wearable devices on the body.<ref>Baumann, R., Jung, J., & Rogers, S. (2020). Supporting the deaf and hard of hearing in virtual reality with an enhanced user interface. 2020 IEEE Virtual Reality and 3D User Interfaces (VR), 273-282.</ref>
Combining haptic displays with VR/AR can create powerful accessibility tools, allowing alternative sensory channels to compensate for vision or hearing impairments.<ref>{{cite news|title=UCL synthetic touch technology could transform healthcare|publisher=Financial Times|date=11 Oct 2024}}</ref>
=== Rehabilitation ===
Combining haptic exoskeletons with VR accelerates stroke recovery by increasing engagement and repetitions. The gamification of rehabilitation exercises through VR with haptic feedback has shown significant improvements in patient motivation and outcomes.<ref>{{cite journal|title=Efficacy of VR-based rehabilitation in stroke|journal=Annals of Medicine|year=2023|volume=55|issue=2}}</ref>


=== Telepresence and Teleoperation ===


[[Haptic teleoperation]] enables precise control of robots in delicate or complex tasks by providing operators with tactile feedback from the robot's interactions.<ref>Son, H. I., Franchi, A., Chuang, L. L., Kim, J., Bulthoff, H. H., & Giordano, P. R. (2013). Human-centered design and evaluation of haptic cueing for teleoperation of multiple mobile robots. IEEE Transactions on Cybernetics, 43(2), 597-609.</ref>
== Standards and Interoperability ==
* '''[[ISO 9241-910]]/920''' provide terminology and design guidance for tactile/gestural interfaces.<ref>{{cite web|title=ISO 9241-910:2011 – Ergonomics of human-system interaction – Framework for tactile/haptic interaction|url=https://www.iso.org/standard/51097.html|access-date=29 April 2025}}</ref> 
* '''[[IEEE VR]]''' and '''[[SIGGRAPH]]''' host annual Haptics symposia where new devices debut. 
* '''[[OpenXR 1.1]]''' (Khronos) unifies API calls for amplitude-/frequency-controlled haptic output across headsets.
The haptics industry faces challenges in [[haptic standardization]], with different devices using proprietary formats and protocols. Initiatives like the [[Haptics Industry Forum]] are working to establish standards for haptic content creation and playback across platforms.<ref>ISO/TC 159/SC 4 Ergonomics of human-system interaction. (2022). ISO 9241-910:2022 Ergonomics of human-system interaction — Part 910: Framework for tactile and haptic interaction.</ref>
[[Haptic codecs]] like [[MPEG-V]] and [[MPEG-H]] include provisions for standardized haptic data, though adoption remains limited compared to audio and video standards.<ref>Eid, M., Orozco, M., & El Saddik, A. (2007, June). A guided tour in haptic audio visual environments and applications. In 2007 IEEE International Conference on Multimedia and Expo (pp. 1449-1452). IEEE.</ref>


== Current Research and Challenges ==


[[Stretchable electronics]] and [[e-textiles]] are enabling the development of haptic systems integrated directly into clothing or applied to the skin like temporary tattoos.<ref>Yao, S., & Zhu, Y. (2015). Nanomaterial-enabled stretchable conductors: strategies, materials and devices. Advanced Materials, 27(9), 1480-1511.</ref>


=== Surface Haptics ===
[[Haptic brain-computer interfaces]] (BCIs) could potentially create fully immersive tactile experiences without physical haptic hardware, though this research remains in early stages.<ref>Cincotti, F., Mattia, D., Aloise, F., Bufalari, S., Schalk, G., Oriolo, G., ... & Marciani, M. G. (2008). Non-invasive brain–computer interface system: towards its application as assistive technology. Brain research bulletin, 75(6), 796-803.</ref>


=== Latency and Synchronization ===
 
[[Haptic latency]] must be minimized and synchronized precisely with visual and auditory cues; delays can break immersion and cause discomfort. Current research focuses on reducing end-to-end latency in haptic systems to below perceptible thresholds.<ref name="srivastava2019"/>


=== Power and Cost Limitations ===


Many advanced haptic technologies require significant power and can be expensive to produce, limiting their adoption in consumer devices. Research into energy-efficient actuators and more cost-effective manufacturing methods is ongoing.<ref>{{cite journal|title=Is modularity the future of haptics in XR? A systematic literature review|journal=Virtual Reality|year=2025}}</ref>


== Future Directions ==


[[Context-aware haptics]] adjusts tactile feedback based on environmental factors, user state, and application context to provide more relevant and effective haptic experiences.<ref>MacLean, K. E. (2008). Haptic interaction design for everyday interfaces. Reviews of Human Factors and Ergonomics, 4(1), 149-194.</ref>
=== Self-Powered Haptic Systems ===
Recent research has produced breakthroughs like self-powered [[electrotactile glove]]s that use triboelectric textiles to generate their own stimulation current, eliminating the need for external power sources and reducing weight.<ref>{{cite journal|title=Self-powered electrotactile textile haptic glove|journal=Science Advances|year=2025|doi=10.1126/sciadv.adt0318}}</ref>


== See Also ==
* [[Virtual reality]]
* [[Augmented reality]]
* [[Kinesthetic sense]]
* [[Teleoperation]]
* [[Sensory substitution]]


== References ==
<references />
[[Category:Feedback systems]]
[[Category:Haptics]]
[[Category:Human–computer interaction]]