Eye tracking

* '''Automatic [[Interpupillary distance|IPD]] Adjustment''': Some headsets use the eye tracking cameras to automatically measure the user's interpupillary distance (the distance between the centers of the pupils) and mechanically adjust the lens spacing for optimal visual clarity, stereo depth perception, and user comfort.
* '''[[Accessibility]]''': Eye tracking offers a powerful hands-free input modality for users with limited physical mobility, enabling them to navigate interfaces, communicate (e.g., gaze typing), and control applications within VR/AR.
* '''Adaptive Optics / [[Varifocal display]]s''': Eye tracking is essential for dynamic varifocal displays, which adjust their focal plane based on where the user is looking in virtual depth. This helps address the [[vergence-accommodation conflict]], potentially reducing eye strain and improving visual realism.<ref name="Akeley2004">Akeley, K., Watt, S.J., Girshick, A.R., & Banks, M.S. (2004). A stereo display prototype with multiple focal distances. *ACM Transactions on Graphics, 23*(3), 804–813.</ref>
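The depth estimate a varifocal display needs can be derived from vergence with standard trigonometry: for a symmetric fixation, the fixation distance follows from the interpupillary distance and the angle between the two eyes' gaze rays. A minimal sketch (not any vendor's algorithm; the function name is illustrative):

```python
import math

def fixation_depth_from_vergence(ipd_m: float, vergence_deg: float) -> float:
    """Estimate fixation distance (meters) from the vergence angle
    between the two eyes' gaze rays, assuming symmetric fixation."""
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# Example: a 63 mm IPD and 3.6 degrees of vergence place the
# fixation point roughly 1 m away; smaller angles mean greater depth.
depth = fixation_depth_from_vergence(0.063, 3.6)
```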
<ref name="Duchowski2018">Duchowski, A. T., Krejtz, K., Krejtz, I., Biele, C., Niedzielska, A., Kiefer, P., Raubal, M., & Giannopoulos, I. (2018). The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation. *CHI Conference on Human Factors in Computing Systems*, 1-13.</ref>

==Software Development and APIs==

Integrating eye tracking into applications requires specific software support:

===Development Frameworks and APIs===
* '''[[Unity (game engine)|Unity]]''': Provides APIs through its XR Interaction Toolkit and vendor-specific SDKs (e.g., the Tobii XR SDK).
* '''[[Unreal Engine]]''': Offers native interfaces and plugins for accessing eye tracking data.
* '''[[OpenXR]]''': The cross-platform standard includes an <code>XR_EXT_eye_gaze_interaction</code> extension, allowing developers to write more portable code.
* '''Vendor SDKs''': Companies like [[Tobii]] provide dedicated Software Development Kits offering fine-grained control and optimized features for their hardware.<ref name="TobiiSDK">Tobii Technology. (2023). *Tobii XR SDK Documentation*. Tobii.com.</ref>
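Whichever framework is used, gaze data is typically delivered as an origin point plus a direction vector, and the application intersects that ray with scene geometry to find the gaze target. A minimal, framework-agnostic sketch (the function name and tuple-based vectors are illustrative assumptions, not any SDK's API):

```python
def gaze_hit_on_plane(origin, direction, plane_point, plane_normal):
    """Intersect a gaze ray with a plane (e.g., a virtual UI panel).
    Returns the hit point as a 3-tuple, or None if the ray is parallel
    to the plane or the plane lies behind the gaze origin."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane
    t = sum((p - o) * n for p, o, n in zip(plane_point, origin, plane_normal)) / denom
    if t < 0:
        return None  # plane is behind the gaze origin
    return tuple(o + t * d for o, d in zip(origin, direction))

# Gaze from the head origin straight ahead (-Z), panel 2 m in front:
hit = gaze_hit_on_plane((0, 0, 0), (0, 0, -1), (0, 0, -2), (0, 0, 1))
# hit == (0.0, 0.0, -2.0)
```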


===Data Access Levels===
Developers typically access eye tracking data at different levels of abstraction:
* '''Raw Data''': Direct coordinates of pupil centers, glints, eye openness, etc. Requires significant processing by the application.
* '''Gaze Data''': Processed output providing calibrated gaze origin and direction vectors, or intersection points on virtual surfaces.
* '''Eye Movement Events''': Classified data identifying fixations, saccades, and blinks.
* '''Semantic Data''': Higher-level interpretations, such as the specific object being looked at (gaze target) or estimated attention levels.
<ref name="Kumar2016">Kumar, D., Dutta, A., Das, A., & Lahiri, U. (2016). SmartEye: Developing a novel eye tracking system for quantitative assessment of oculomotor abnormalities. *IEEE Transactions on Neural Systems and Rehabilitation Engineering, 24*(10), 1051-1059.</ref>
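The step from gaze data to eye movement events is usually a classification algorithm; the velocity-threshold method (I-VT) is a common choice. A simplified sketch, assuming one-dimensional gaze angles for clarity (real trackers supply 2-D or 3-D gaze, but the principle is identical):

```python
def classify_ivt(angles_deg, timestamps_s, threshold_deg_per_s=30.0):
    """Label each inter-sample interval as part of a fixation or a
    saccade using a simple velocity threshold (I-VT): angular
    velocities above the threshold are treated as saccadic motion."""
    labels = []
    for i in range(1, len(angles_deg)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        velocity = abs(angles_deg[i] - angles_deg[i - 1]) / dt
        labels.append("saccade" if velocity > threshold_deg_per_s else "fixation")
    return labels

# 120 Hz samples: steady gaze, a rapid 5-degree jump, then steady again.
labels = classify_ivt([0.0, 0.1, 5.1, 5.2], [0.0, 1/120, 2/120, 3/120])
# labels == ["fixation", "saccade", "fixation"]
```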


==Ethical and Privacy Considerations==

The collection and use of eye tracking data necessitate careful ethical consideration:


* '''Biometric Data Privacy''': Gaze patterns can act as unique identifiers. This data requires strong [[data security]] measures and compliance with regulations like [[GDPR]], which may classify it as sensitive biometric data.
* '''Inference and Profiling''': The potential to infer sensitive information (health, emotions, interests) without explicit consent raises ethical questions.
* '''Attention Monitoring''': Attention analytics raise concerns about workplace surveillance and manipulative advertising.
* '''Informed Consent''': Users must be clearly informed about what eye data is collected; how it is processed, stored, and shared; and for what purposes. Opt-in mechanisms are crucial.
* '''Data Minimization and Anonymization''': Collect only necessary data, anonymize or aggregate it whenever possible, and define clear data retention policies.
<ref name="Kroger2020"/>
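Data minimization can be made concrete: rather than storing raw scanpaths, an application might retain only coarse aggregates. An illustrative sketch (grid size and data format are assumptions, not a prescribed standard):

```python
from collections import Counter

def aggregate_gaze_heatmap(gaze_points, grid=(8, 8)):
    """Reduce raw normalized gaze coordinates (x, y in [0, 1]) to
    coarse per-cell counts. The aggregate preserves attention
    statistics while discarding the fine-grained scanpath that could
    help identify an individual user."""
    counts = Counter()
    for x, y in gaze_points:
        col = min(int(x * grid[0]), grid[0] - 1)
        row = min(int(y * grid[1]), grid[1] - 1)
        counts[(row, col)] += 1
    return counts

# Two samples fall in the top-left cell, one in the bottom-right cell.
heat = aggregate_gaze_heatmap([(0.1, 0.1), (0.12, 0.11), (0.9, 0.9)])
```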


==Standards and Regulations==

Efforts are underway to standardize aspects of eye tracking technology and address its implications:
* '''[[Khronos Group]] / [[OpenXR]]''': Defines standard APIs for accessing eye tracking data (e.g., <code>XR_EXT_eye_gaze_interaction</code>).
* '''[[IEEE Standards Association|IEEE]]''': Working groups such as IEEE P2048.5 focus on standards for immersive learning, potentially including eye tracking metrics.
* '''[[Virtual Reality Industry Forum]] (VRIF)''': Develops guidelines for VR implementation, potentially covering eye tracking best practices.
* '''Data Protection Regulations''': [[GDPR]] (Europe), [[CCPA]]/[[CPRA]] (California), and similar laws globally impose requirements on handling personal and biometric data, including eye tracking data.
* '''[[ISO]]/[[IEC]] JTC 1/SC 24''': Committee working on international standards for computer graphics, image processing, and environmental data representation, relevant to VR/AR interfaces.<ref name="ISO_SC24">International Organization for Standardization. (2023). *ISO/IEC JTC 1/SC 24 - Computer graphics, image processing and environmental data representation*. ISO.org.</ref>


==See Also==
* [[Augmented reality]]
* [[AR headset]]
* [[VR headset]]


==References==
<references />