'''Haptics''' or '''tactile feedback''' is a [[technology]] that produces the sense of touch through physical stimulation. The term comes from the Greek word "haptikos," meaning "able to touch or grasp." Haptics can improve the user's [[immersion]] in a [[VR]] world by allowing users to experience physical sensations caused by their actions in a [[virtual environment]]. When a user picks up a cup in the virtual world, they should feel the realistic sensation of a cup in their hand, even though the cup is not present in the real world.
In traditional [[video game]] controllers, "[[rumble]]" is often used to produce tactile feedback. However, modern [[haptic systems]] in [[AR]] and VR environments offer much more sophisticated and nuanced feedback mechanisms.
== History of Haptics ==
The study of haptics has origins dating back to the 1950s, when engineers began researching mechanical manipulators for handling hazardous materials.<ref>Hannaford, B., & Okamura, A. M. (2016). Haptics. In Springer handbook of robotics (pp. 1063-1084). Springer, Cham.</ref> The term "haptics" was adopted in the field of [[human-computer interaction]] in the early 1990s.
Early haptic interfaces for computing appeared in the 1970s with the development of [[force-feedback]] systems at research institutions like the [[University of North Carolina]] and [[MIT]].<ref>Salisbury, K., Conti, F., & Barbagli, F. (2004). Haptic rendering: introductory concepts. IEEE computer graphics and applications, 24(2), 24-32.</ref>
In 1997, the release of the [[Nintendo 64 Rumble Pak]] marked one of the first mainstream haptic interfaces in consumer electronics, introducing gamers to basic vibrotactile feedback.<ref>Biggs, S. J., & Srinivasan, M. A. (2002). Haptic interfaces. Handbook of virtual environments, 93-116.</ref>
The evolution of haptics in VR/AR contexts accelerated in the 2010s with the resurgence of consumer virtual reality technology. [[Oculus]] (later acquired by [[Facebook]]/[[Meta]]) began implementing haptic controllers with their [[Oculus Touch]] controllers in 2016, and [[HTC]] included similar capabilities in their [[Vive]] controllers.<ref>Burdea, G. C. (2019). Haptic feedback for virtual reality. Virtual reality and augmented reality, 17-30.</ref>
== Types of Haptic Technology ==
=== Vibrotactile Feedback ===
[[Vibrotactile feedback]] uses vibration to create tactile sensations and is the most common form of haptic feedback in consumer devices. It typically employs [[eccentric rotating mass]] (ERM) motors or [[linear resonant actuators]] (LRA).<ref>Choi, S., & Kuchenbecker, K. J. (2013). Vibrotactile display: Perception, technology, and applications. Proceedings of the IEEE, 101(9), 2093-2104.</ref>
Modern VR controllers like the [[Meta Quest 2]] controllers and [[Valve Index]] controllers use vibrotactile feedback to simulate interactions with virtual objects.<ref>Benko, H., Holz, C., Sinclair, M., & Ofek, E. (2016, October). Normaltouch and texturetouch: High-fidelity 3d haptic shape rendering on handheld virtual reality controllers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (pp. 717-728).</ref>
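As a rough illustration of how an application drives such actuators, the sketch below fires a short pulse through the non-standard Gamepad <code>hapticActuators</code> extension exposed by WebXR controllers; the intensity and duration values are arbitrary examples, and support for this interface varies across browsers and headsets.
<syntaxhighlight lang="typescript">
// Minimal sketch: trigger a brief vibrotactile pulse on a VR controller via
// the (non-standard) Gamepad hapticActuators extension, e.g. the gamepad of
// a WebXR input source. Intensity/duration are example values only.
function pulseController(gamepad: Gamepad | null | undefined,
                         intensity = 0.8,     // 0.0–1.0 drive strength
                         durationMs = 50): void {
  // hapticActuators is not in the standard TypeScript DOM typings, so cast.
  const actuator = (gamepad as any)?.hapticActuators?.[0];
  if (!actuator) return;                      // this controller has no haptics
  actuator.pulse(intensity, durationMs);      // drive the ERM/LRA briefly
}
</syntaxhighlight>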
=== Force Feedback ===
[[Force feedback]] systems provide resistance or force to the user, simulating the physical properties of virtual objects. These can include:
* [[Grounded force feedback]] - Systems physically connected to a stationary base (like robotic arms or exoskeletons)
* [[Ungrounded force feedback]] - Systems that create the illusion of force without being physically anchored
Commercial examples include the [[PHANTOM]] haptic device (now part of [[3D Systems]]) and [[Haption's Virtuose]] systems, which have been used for medical training, industrial design, and scientific visualization.<ref>Laycock, S. D., & Day, A. M. (2003). Recent developments and applications of haptic devices. Computer Graphics Forum, 22(2), 117-132.</ref>
=== Electrotactile Stimulation ===
[[Electrotactile]] or [[electrocutaneous stimulation]] delivers small electrical currents to stimulate nerves in the skin, creating various tactile sensations. Companies like [[Teslasuit]] have incorporated this technology into full-body haptic suits for VR training and gaming.<ref>Kaczmarek, K. A., Webster, J. G., Bach-y-Rita, P., & Tompkins, W. J. (1991). Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Transactions on Biomedical Engineering, 38(1), 1-16.</ref>
=== Thermal Feedback ===
[[Thermal feedback]] systems use [[Peltier elements]] or similar technologies to create sensations of heat or cold. These can enhance immersion by simulating temperature changes in virtual environments.<ref>Wilson, G., Halvey, M., Brewster, S. A., & Hughes, S. A. (2011, May). Some like it hot: thermal feedback for mobile devices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2555-2564).</ref>
=== Ultrasonic Haptics ===
[[Ultrasonic haptics]] use focused ultrasound waves to create tactile sensations in mid-air without requiring users to wear or hold any devices. Companies like [[Ultraleap]] (formerly [[Ultrahaptics]]) have developed systems that allow users to "feel" virtual objects without physical contact.<ref>Carter, T., Seah, S. A., Long, B., Drinkwater, B., & Subramanian, S. (2013, October). UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In Proceedings of the 26th annual ACM symposium on User interface software and technology (pp. 505-514).</ref>
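The core trick is acoustic focusing: if each transducer in the array emits its ultrasonic carrier with a phase offset proportional to its distance from a chosen focal point, the waves arrive there in phase and form a localized pressure spot that the skin can feel (in practice the focus is also modulated at a low frequency to make it perceptible). The sketch below computes such phase offsets from first principles; the geometry and constants are illustrative and not taken from any particular commercial device.
<syntaxhighlight lang="typescript">
// Sketch: per-transducer phase offsets that focus a 40 kHz array at a point.
// Constants and geometry are illustrative; real systems add amplitude
// control, safety limits, and low-frequency modulation of the focal point.
interface Vec3 { x: number; y: number; z: number; }

const SPEED_OF_SOUND = 343;     // m/s in air at room temperature
const CARRIER_HZ = 40_000;      // typical ultrasonic carrier frequency

function focusPhases(transducers: Vec3[], focus: Vec3): number[] {
  const wavelength = SPEED_OF_SOUND / CARRIER_HZ;   // ≈ 8.6 mm
  return transducers.map(t => {
    const d = Math.hypot(focus.x - t.x, focus.y - t.y, focus.z - t.z); // metres
    // Advance each emitter's phase by its travel distance so all waves arrive
    // at the focal point in phase, producing a concentrated pressure spot.
    return -((2 * Math.PI * d) / wavelength) % (2 * Math.PI);
  });
}
</syntaxhighlight>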
=== Pneumatic and Hydraulic Systems ===
[[Pneumatic]] and [[hydraulic systems]] use air or fluid pressure to create force feedback. These can be used in gloves, suits, or other wearable devices to simulate touch and pressure.<ref>Burdea, G., Zhuang, J., Roskos, E., Silver, D., & Langrana, N. (1992, April). A portable dextrous master with force feedback. In Proceedings of IEEE Virtual Reality Annual International Symposium (pp. 55-62).</ref>
=== Mechanical Constraints ===
Systems using [[mechanical constraints]] physically limit user movement to simulate walls, surfaces, or object boundaries. Examples include the [[CLAW]] controller by Microsoft Research and [[EXIII]] haptic devices.<ref>Choi, I., Hawkes, E. W., Christensen, D. L., Ploch, C. J., & Follmer, S. (2016, October). Wolverine: A wearable haptic interface for grasping in virtual reality. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 986-993).</ref>
== Force Display Devices ==
At the SIGGRAPH conference, Japanese researchers [[Tomohiro Amemiya]] and [[Hiroaki Gomi]] demonstrated two [[force display]] devices, [[Traxion]] and [[Buru-Navi3]]. When held, these devices produce push and pull sensations while merely vibrating in place, and the illusory force they generate is strong enough to guide a blind person.<ref>http://www.technologyreview.com/news/528886/could-force-illusions-help-wearables-catch-on/</ref>
[[Buru-Navi3]] is a wine-cork-sized device containing a 40-hertz electromagnetic actuator. When pinched between two fingers, it creates the illusion of a force pulling it toward or pushing it away from the user.
[[Traxion]], a similar device developed by [[Jun Rekimoto]], creates a virtual force sensation in the same way, by vibrating its actuator asymmetrically.
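Both devices exploit the same asymmetry: a brief, strong acceleration in one direction followed by a longer, weaker return adds up to zero net motion, yet the sharp phase dominates what the skin perceives, so the vibration is felt as a steady pull or push. The toy sketch below generates one cycle of such a waveform; the shape and numbers are illustrative and are not the published drive signals of Traxion or Buru-Navi3.
<syntaxhighlight lang="typescript">
// Toy sketch of one cycle of an asymmetric vibration waveform (~40 Hz).
// The short strong segment and the long weak segment cancel on average, so
// the actuator stays in place, but the sharp phase is felt as a directional
// force. Parameters are illustrative only.
function asymmetricCycle(samples = 100): number[] {
  const sharpLen = Math.round(samples * 0.25);      // 25% of cycle: strong push
  const weakLen = samples - sharpLen;               // 75% of cycle: gentle return
  const sharpAmp = 1.0;
  const weakAmp = (sharpAmp * sharpLen) / weakLen;  // makes the cycle sum to zero
  return [
    ...Array<number>(sharpLen).fill(+sharpAmp),
    ...Array<number>(weakLen).fill(-weakAmp),
  ];
}
</syntaxhighlight>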
== Applications in VR and AR ==
=== Gaming and Entertainment ===
[[Haptic gaming]] provides immersive experiences by allowing players to feel virtual environments and objects. Advanced systems like the [[Teslasuit]], [[bHaptics TactSuit]], and [[Dexmo]] exoskeleton gloves enable users to feel impacts, textures, and resistance in games.<ref>Pacchierotti, C., Sinclair, S., Solazzi, M., Frisoli, A., Hayward, V., & Prattichizzo, D. (2017). Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE transactions on haptics, 10(4), 580-600.</ref>
The [[PlayStation 5's DualSense]] controller represents one of the most advanced mainstream haptic controllers, using adaptive triggers and high-fidelity vibrotactile feedback to simulate different surfaces and resistances.<ref>Colgan, A. (2021). The PlayStation 5 DualSense Controller: A New Era for Haptics in Gaming. IEEE Consumer Electronics Magazine, 10(3), 6-8.</ref>
=== Medical Training and Simulation ===
[[Haptic medical simulators]] allow healthcare professionals to practice procedures without risk to real patients. Systems like [[3D Systems' Touch]] (formerly [[Sensable Phantom]]) and [[FundamentalVR's Fundamental Surgery]] provide force feedback for surgical training.<ref>Coles, T. R., Meglan, D., & John, N. W. (2011). The role of haptics in medical training simulators: A survey of the state of the art. IEEE Transactions on haptics, 4(1), 51-66.</ref>
[[Haptic-enabled AR]] systems allow surgeons to "feel" pre-operative medical images during surgical planning, significantly improving spatial understanding.<ref>Sutherland, C., Hashtrudi-Zaad, K., Sellens, R., Abolmaesumi, P., & Mousavi, P. (2019). An augmented reality haptic training simulator for spinal needle procedures. IEEE Transactions on Biomedical Engineering, 66(11), 3094-3104.</ref>
=== Industrial Training and Design ===
[[Haptic industrial training]] allows workers to practice complex or dangerous tasks in virtual environments before performing them in reality. Companies like [[EON Reality]] and [[Serious Labs]] develop haptic VR training solutions for industries like construction, manufacturing, and oil and gas.<ref>Wang, Z. R., Wang, P., Xing, L., Mei, L. P., Zhao, J., & Zhang, T. (2019). Haptic rendering for dental training system. IEEE Access, 7, 68275-68282.</ref>
[[Automotive design]] companies like [[BMW]] and [[Ford]] use haptic systems for virtual prototyping, allowing designers to "feel" car interiors and controls before physical prototypes are built.<ref>Bordegoni, M., Cugini, U., Caruso, G., & Polistina, S. (2009). Mixed prototyping for product assessment: a reference framework. International Journal on Interactive Design and Manufacturing (IJIDeM), 3(3), 177-187.</ref>
=== Accessibility ===
[[Haptic accessibility devices]] help people with visual impairments navigate environments through tactile feedback. Systems like [[Wayband]] by [[WearWorks]] provide navigation assistance through patterns of vibration.<ref>Van Erp, J. B., Van Veen, H. A., Jansen, C., & Dobbins, T. (2005). Waypoint navigation with a vibrotactile waist belt. ACM Transactions on Applied Perception (TAP), 2(2), 106-117.</ref>
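A common encoding in such systems is to map the bearing toward the next waypoint onto which actuator vibrates and how strongly. The sketch below shows that mapping for a simple two-actuator (left/right) layout; the layout and scaling are chosen purely for illustration, not taken from any specific product.
<syntaxhighlight lang="typescript">
// Sketch: turn a heading error into left/right vibration intensities, the
// basic idea behind vibrotactile waypoint navigation. The two-actuator
// layout and the ±90° scaling are hypothetical illustration values.
function steeringCue(headingErrorDeg: number): { left: number; right: number } {
  // Positive error means the waypoint is to the user's right.
  const magnitude = Math.min(Math.abs(headingErrorDeg), 90) / 90;  // 0..1
  return headingErrorDeg >= 0
    ? { left: 0, right: magnitude }    // buzz the right side: "turn right"
    : { left: magnitude, right: 0 };   // buzz the left side: "turn left"
}
</syntaxhighlight>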
[[Tactile communication systems]] allow deaf-blind individuals to receive communication through haptic patterns, often through gloves or wearable devices on the body.<ref>Baumann, R., Jung, J., & Rogers, S. (2020). Supporting the deaf and hard of hearing in virtual reality with an enhanced user interface. 2020 IEEE Virtual Reality and 3D User Interfaces (VR), 273-282.</ref>
=== Telepresence and Teleoperation ===
[[Haptic telepresence]] allows users to remotely "feel" environments through robotic systems. Applications include remote surgical systems, space exploration, and hazardous environment inspection.<ref>Pacchierotti, C., Meli, L., Chinello, F., Malvezzi, M., & Prattichizzo, D. (2015). Cutaneous haptic feedback to ensure the stability of robotic teleoperation systems. The International Journal of Robotics Research, 34(14), 1773-1787.</ref>
[[Haptic teleoperation]] enables precise control of robots in delicate or complex tasks by providing operators with tactile feedback from the robot's interactions.<ref>Son, H. I., Franchi, A., Chuang, L. L., Kim, J., Bulthoff, H. H., & Giordano, P. R. (2013). Human-centered design and evaluation of haptic cueing for teleoperation of multiple mobile robots. IEEE Transactions on Cybernetics, 43(2), 597-609.</ref>
== Current Research and Challenges ==
=== Miniaturization and Wearability ===
Research into [[microfluidic tactile displays]] and [[smart materials]] aims to create thinner, lighter haptic devices that can be comfortably worn for extended periods.<ref>Wang, D., Ohnishi, K., & Xu, W. (2020). Multimodal haptic display for virtual reality: A survey. IEEE Transactions on Industrial Electronics, 67(1), 610-623.</ref>
[[Stretchable electronics]] and [[e-textiles]] are enabling the development of haptic systems integrated directly into clothing or applied to the skin like temporary tattoos.<ref>Yao, S., & Zhu, Y. (2015). Nanomaterial-enabled stretchable conductors: strategies, materials and devices. Advanced Materials, 27(9), 1480-1511.</ref>
=== Haptic Rendering and Algorithms ===
[[Haptic rendering]] is the process of calculating appropriate forces to display to users based on their interactions with virtual objects. Advances in [[physics simulation]] and [[collision detection]] are improving the realism of haptic interactions.<ref>Otaduy, M. A., & Lin, M. C. (2005). Introduction to haptic rendering. In ACM SIGGRAPH 2005 Courses (pp. 3-es).</ref>
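A classic starting point is penalty-based rendering: once the device's interaction point penetrates a virtual surface, the renderer commands a restoring force proportional to the penetration depth, like a virtual spring, usually with some damping and updated at roughly 1 kHz. The sketch below shows this for an infinite plane; the stiffness and damping gains are illustrative, and practical renderers add friction models and proxy (god-object) tracking.
<syntaxhighlight lang="typescript">
// Sketch of penalty-based haptic rendering against a plane (n·p = offset):
// push the device tip back along the surface normal with a spring-damper
// force once it penetrates. Gains are illustrative; real haptic loops run
// at ~1 kHz and add friction and proxy tracking.
interface Vec3 { x: number; y: number; z: number; }
const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;

function planeContactForce(tipPos: Vec3, tipVel: Vec3,
                           normal: Vec3,          // unit normal of the plane
                           offset: number,        // plane equation: n·p = offset
                           stiffness = 500,       // N/m
                           damping = 2): Vec3 {   // N·s/m
  const penetration = offset - dot(normal, tipPos);
  if (penetration <= 0) return { x: 0, y: 0, z: 0 };   // no contact
  const normalSpeed = dot(normal, tipVel);
  const magnitude = Math.max(stiffness * penetration - damping * normalSpeed, 0);
  return { x: normal.x * magnitude,
           y: normal.y * magnitude,
           z: normal.z * magnitude };              // force sent to the device
}
</syntaxhighlight>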
Research into [[multi-point haptic rendering]] addresses limitations of traditional single-point interfaces, allowing users to feel virtual objects with their entire hand or body.<ref>Prattichizzo, D., Chinello, F., Pacchierotti, C., & Malvezzi, M. (2013). Towards wearability in fingertip haptics: a 3-dof wearable device for cutaneous force feedback. IEEE Transactions on Haptics, 6(4), 506-516.</ref>
=== Surface Haptics ===
[[Surface haptics]] research focuses on creating tactile sensations on touchscreens and flat surfaces. Technologies like [[electroadhesion]], [[ultrasonic friction modulation]], and [[microelectromechanical systems]] (MEMS) are enabling touchscreens that can simulate textures and buttons.<ref>Meyer, D. J., Peshkin, M. A., & Colgate, J. E. (2013, April). Fingertip friction modulation due to electrostatic attraction. In 2013 world haptics conference (WHC) (pp. 43-48). IEEE.</ref>
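In both the electroadhesive and ultrasonic approaches, a virtual texture reduces to a function from finger position to actuation level: electroadhesion raises friction as the drive amplitude rises, while ultrasonic friction modulation lowers it. The sketch below samples a simple sinusoidal grating; the period and mapping are illustrative rather than taken from any shipping device.
<syntaxhighlight lang="typescript">
// Sketch: position-dependent actuation for a virtual grating on a touchscreen.
// The controller samples the sliding finger's position and sets the drive
// level accordingly; whether high drive means more friction (electroadhesion)
// or less (ultrasonic modulation) depends on the technology. Values are
// illustrative.
function gratingDriveLevel(fingerXmm: number,
                           periodMm = 2,        // grating spatial period
                           maxLevel = 1): number {
  const phase = (2 * Math.PI * fingerXmm) / periodMm;
  return maxLevel * (0.5 + 0.5 * Math.sin(phase));   // smooth 0..maxLevel bands
}
</syntaxhighlight>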
Companies like [[Tanvas]] and [[Bosch]] are developing commercial applications of surface haptics for automotive interfaces, mobile devices, and kiosks.<ref>Mullenbach, J., Shultz, C., Colgate, J. E., & Piper, A. M. (2014, April). Surface haptic interactions with a TPad tablet. In Proceedings of the adjunct publication of the 27th annual ACM symposium on User interface software and technology (pp. 7-8).</ref>
=== Neural Interfaces ===
Research into [[direct neural stimulation]] aims to bypass mechanical interfaces entirely, potentially allowing users to feel virtual sensations through direct interaction with the nervous system.<ref>Tyler, D. J. (2016). Restoring the human touch: Prosthetics imbued with haptics give their wearers fine motor control and a sense of connection. IEEE Spectrum, 53(5), 28-33.</ref>
[[Haptic brain-computer interfaces]] (BCIs) could potentially create fully immersive tactile experiences without physical haptic hardware, though this research remains in early stages.<ref>Cincotti, F., Mattia, D., Aloise, F., Bufalari, S., Schalk, G., Oriolo, G., ... & Marciani, M. G. (2008). Non-invasive brain–computer interface system: towards its application as assistive technology. Brain research bulletin, 75(6), 796-803.</ref>
=== Standardization and Interoperability ===
The haptics industry faces challenges in [[haptic standardization]], with different devices using proprietary formats and protocols. Initiatives like the [[Haptics Industry Forum]] are working to establish standards for haptic content creation and playback across platforms.<ref>ISO/TC 159/SC 4 Ergonomics of human-system interaction. (2022). ISO 9241-910:2022 Ergonomics of human-system interaction — Part 910: Framework for tactile and haptic interaction.</ref>
[[Haptic codecs]] like [[MPEG-V]] and [[MPEG-H]] include provisions for standardized haptic data, though adoption remains limited compared to audio and video standards.<ref>Eid, M., Orozco, M., & El Saddik, A. (2007, June). A guided tour in haptic audio visual environments and applications. In 2007 IEEE International Conference on Multimedia and Expo (pp. 1449-1452). IEEE.</ref>
== Future Directions ==
=== Full-Body Haptic Systems ===
[[Full-body haptic systems]] aim to provide comprehensive tactile feedback across the entire body. Companies like [[Teslasuit]], [[bHaptics]], and [[Axon VR]] (now [[HaptX]]) are developing suits with hundreds of haptic actuators.<ref>Schorr, S. B., & Okamura, A. M. (2017). Three-dimensional skin deformation as force substitution: Wearable device design and performance during haptic exploration of virtual environments. IEEE transactions on haptics, 10(3), 418-430.</ref>
Research into [[distributed haptic interfaces]] seeks to optimize the placement and types of actuators to maximize feedback while minimizing cost and weight.<ref>Jones, L. A., & Sarter, N. B. (2008). Tactile displays: Guidance for their design and application. Human factors, 50(1), 90-111.</ref>
=== Environmental Haptics ===
[[Environmental haptics]] extends beyond wearable devices to create haptic sensations through the physical environment. Technologies include [[acoustic radiation pressure]], [[mid-air ultrasonic arrays]], and [[room-scale haptics]].<ref>Iwamoto, T., Tatezono, M., & Shinoda, H. (2008, August). Non-contact method for producing tactile sensation using airborne ultrasound. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 504-513). Springer, Berlin, Heidelberg.</ref>
[[Haptic projectors]] like those developed by [[Ultraleap]] allow multiple users to experience mid-air haptic sensations without wearable devices.<ref>Long, B., Seah, S. A., Carter, T., & Subramanian, S. (2014, April). Rendering volumetric haptic shapes in mid-air using ultrasound. In ACM Transactions on Graphics (TOG) (Vol. 33, No. 6, pp. 1-10).</ref>
=== Haptic Content Creation ===
The development of [[haptic authoring tools]] aims to make haptic content creation more accessible to designers without specialized technical knowledge. Platforms like [[Unity's XR Interaction Toolkit]] and [[Unreal Engine's haptic plugins]] provide frameworks for implementing haptic feedback in VR/AR applications.<ref>Danieau, F., Fleureau, J., Guillotel, P., Mollet, N., Christie, M., & Lécuyer, A. (2014). HapSeat: producing motion sensation with multiple force-feedback devices embedded in a seat. In Proceedings of the 18th ACM symposium on Virtual reality software and technology (pp. 69-76).</ref>
[[Haptic recording]] technologies allow the capture of real-world tactile experiences for playback in virtual environments, similar to how audio and video are recorded.<ref>Kuchenbecker, K. J., Romano, J., & McMahan, W. (2011). Haptography: Capturing and recreating the rich feel of real surfaces. In Robotics research (pp. 245-260). Springer, Berlin, Heidelberg.</ref>
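A simplified way to picture this pipeline: record a high-rate vibration trace from a sensor while a tool strokes a real surface, then replay it, scaled by the current contact conditions, whenever the virtual tool touches the matching virtual material. The toy class below assumes a pre-recorded sample buffer and a simple force-ratio scaling; published haptography systems instead fit parametric texture models rather than replaying raw samples.
<syntaxhighlight lang="typescript">
// Toy sketch of record-and-replay haptic texture playback. "samples" is
// assumed to be a pre-recorded vibration trace (e.g. accelerometer output at
// 10 kHz); the force-ratio scaling is a simplification for illustration.
class RecordedTexture {
  constructor(private samples: number[],
              private sampleRate = 10_000) {}     // samples per second

  // Vibration command at time t since contact began, scaled by how hard the
  // user is currently pressing relative to the recording pressure.
  playback(tSeconds: number, contactForceN: number, recordedForceN = 1): number {
    const index = Math.floor(tSeconds * this.sampleRate) % this.samples.length;
    return this.samples[index] * (contactForceN / recordedForceN);
  }
}
</syntaxhighlight>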
=== Multimodal Integration ===
Research into [[cross-modal perception]] examines how haptic feedback interacts with visual and auditory cues, enabling more efficient and convincing multisensory experiences.<ref>Lederman, S. J., & Klatzky, R. L. (2009). Haptic perception: A tutorial. Attention, Perception, & Psychophysics, 71(7), 1439-1459.</ref>
[[Context-aware haptics]] adjusts tactile feedback based on environmental factors, user state, and application context to provide more relevant and effective haptic experiences.<ref>MacLean, K. E. (2008). Haptic interaction design for everyday interfaces. Reviews of Human Factors and Ergonomics, 4(1), 149-194.</ref>
== See Also ==
* [[Force feedback]]
* [[Immersion (virtual reality)]]
* [[Tactile sensors]]
* [[Vibrotactile feedback]]
* [[Haptic technology]]
* [[Virtual reality]]
* [[Augmented reality]]
== References ==
<references />
[[Category:Terms]]
[[Category:Hardware]]
[[Category:Feedback systems]]
[[Category:Haptics]]