Eye tracking

See also: Terms and Technical Terms

Eye tracking is a sensor technology that measures eye positions and eye movement. In the context of virtual reality (VR) and augmented reality (AR), it refers to the integration of sensors within VR headsets or AR headsets to determine precisely where the user is looking (gaze) in real time inside a virtual environment or across an overlaid digital interface. By accurately monitoring characteristics such as pupil position, corneal reflections, and eye movements (for example saccades and fixations), the technology enables more immersive, efficient, and intuitive user experiences. Eye tracking has become a critical feature of modern head-mounted displays (HMDs), driving advances in rendering, interaction, analytics, and social presence.[1]

History

Systematic study of eye movements began in the 19th century. Louis Émile Javal observed in 1879 that reading involves discrete fixations and rapid saccades.[2] Early instruments included Edmund Huey’s contact-lens tracker (≈1908),[3] Guy T. Buswell’s film trackers in the 1930s, and Alfred L. Yarbus’s seminal work on task-dependent gaze in the 1960s.[4] Video-based systems emerged in the 1970s, while compact infrared trackers suitable for HMDs appeared in the 2010s.

Technical Principles

Tracking methods

Most contemporary HMDs use one of four foundations:

  • Pupil-centre/corneal-reflection (PCCR): Infrared (IR) LEDs illuminate the eye; IR cameras capture images; computer-vision algorithms locate the pupil centre and one or more corneal “glints”; after calibration, the pupil–glint vector yields the gaze direction (a simplified per-frame sketch follows this list).[5][6]
  • Video-feature tracking: Cameras analyse iris texture, pupil outline, or eye-surface vessels; PCCR can be treated as a subset.[7]
  • Electrooculography (EOG): Electrodes around the eye sense corneo-retinal potential changes; useful for coarse movements or eyelid activity when cameras are unsuitable.[8]
  • Scleral search coil: A wire coil embedded in a contact lens induces current in a surrounding magnetic field, giving sub-arc-minute precision—reserved for laboratory research.[9]
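
The PCCR step above reduces each video frame to a pupil–glint vector. The C++ sketch below illustrates the idea under deliberately simple assumptions (the pupil is the darkest blob, the glint the brightest spot, and the fixed 8-bit thresholds of 40 and 240 are illustrative); shipping trackers use far more robust detectors such as ellipse fitting or learned pupil models.

```cpp
#include <cstddef>
#include <cstdint>

struct Point2f { float x, y; };

// Centroid of all pixels satisfying a predicate; returns {-1,-1} if none are found.
template <typename Pred>
static Point2f centroid(const uint8_t* img, int width, int height, Pred keep) {
    double sx = 0, sy = 0; std::size_t n = 0;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            if (keep(img[y * width + x])) { sx += x; sy += y; ++n; }
    if (n == 0) return {-1.f, -1.f};
    return {static_cast<float>(sx / n), static_cast<float>(sy / n)};
}

// One PCCR frame: dark pupil blob vs. bright corneal glint (thresholds are assumptions).
Point2f pupilGlintVector(const uint8_t* irImage, int width, int height) {
    const Point2f pupil = centroid(irImage, width, height,
                                   [](uint8_t v) { return v < 40; });   // dark pixels
    const Point2f glint = centroid(irImage, width, height,
                                   [](uint8_t v) { return v > 240; });  // specular highlight
    // The pupil-glint difference vector is what calibration maps to a gaze angle.
    return {pupil.x - glint.x, pupil.y - glint.y};
}
```

Because the pupil and the glint shift together when the headset moves slightly, their difference vector is more stable than the raw pupil position, which is one reason PCCR dominates consumer HMDs.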

Key components (PCCR)

  • Infrared illuminators
  • High-speed IR cameras
  • Real-time processing (onboard system-on-chip or discrete DSP)
  • Per-user calibration routines[10]

Eye-movement metrics

Systems measure fixations, saccades, smooth pursuit, vergence, pupil diameter, and blink events, each informing attention, cognitive load, or depth cues.[11]
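
Fixations and saccades are commonly separated by a velocity threshold (the I-VT scheme). The sketch below is a simplified, illustrative classifier over unit gaze vectors; the 120 Hz sample rate and 30°/s threshold are assumptions, not values from any particular headset.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// One gaze sample: unit gaze direction in headset space.
struct GazeSample { float x, y, z; };

enum class EyeEvent { Fixation, Saccade };

// Velocity-threshold (I-VT) classification: a sample whose angular velocity relative
// to the previous sample exceeds the threshold is labelled saccadic, otherwise it is
// treated as part of a fixation. Sample rate and threshold are illustrative defaults.
std::vector<EyeEvent> classifyIVT(const std::vector<GazeSample>& samples,
                                  float sampleRateHz = 120.0f,
                                  float saccadeThresholdDegPerSec = 30.0f) {
    std::vector<EyeEvent> events(samples.size(), EyeEvent::Fixation);
    for (std::size_t i = 1; i < samples.size(); ++i) {
        const GazeSample& a = samples[i - 1];
        const GazeSample& b = samples[i];
        // Angle between consecutive unit gaze vectors, in degrees.
        float dot = a.x * b.x + a.y * b.y + a.z * b.z;
        dot = std::fmax(-1.0f, std::fmin(1.0f, dot));
        const float angleDeg = std::acos(dot) * 180.0f / 3.14159265f;
        const float velocityDegPerSec = angleDeg * sampleRateHz;
        if (velocityDegPerSec > saccadeThresholdDegPerSec)
            events[i] = EyeEvent::Saccade;
    }
    return events;
}
```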

Applications in VR and AR

  • Foveated rendering: Rendering full detail only in the small foveal region around the gaze point and shading the periphery at lower resolution cuts GPU load with little perceptible loss.[12][13][14]
  • Natural gaze interaction (point-and-confirm, dwell, or gesture) streamlines UI control; a dwell-selection sketch follows this list.[15]
  • Social presence improves when trackers drive avatar eyes and facial expressions.[16]
  • Analytics & research: Heat-mapping, UX testing, and training assessment benefit from quantified gaze.[17]
  • Varifocal and multifocal displays can use tracked vergence to set focal depth, easing the vergence–accommodation conflict.[18]
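
As a concrete example of the dwell style mentioned above, the following sketch keeps a per-target timer and confirms a selection once gaze has rested on the same element for a threshold time. The 800 ms default and the integer target identifiers are illustrative assumptions; real interfaces add visual feedback, hysteresis, and cancellation.

```cpp
#include <cstdint>

// Minimal dwell-to-select tracker. targetId identifies the UI element currently under
// the gaze ray (-1 for none); dwellMs is an assumed 800 ms threshold.
class DwellSelector {
public:
    explicit DwellSelector(uint32_t dwellMs = 800) : dwellMs_(dwellMs) {}

    // Call once per frame. Returns the id of a newly selected target, or -1.
    int update(int targetId, uint32_t frameTimeMs) {
        if (targetId != currentTarget_) {          // gaze moved to a new target
            currentTarget_ = targetId;
            gazeStartMs_ = frameTimeMs;
            fired_ = false;
            return -1;
        }
        if (currentTarget_ >= 0 && !fired_ &&
            frameTimeMs - gazeStartMs_ >= dwellMs_) {
            fired_ = true;                         // fire once per dwell
            return currentTarget_;
        }
        return -1;
    }

private:
    uint32_t dwellMs_;
    int      currentTarget_ = -1;
    uint32_t gazeStartMs_   = 0;
    bool     fired_         = false;
};
```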

Current Implementations

VR headsets with eye tracking

  • Meta Quest Pro[16]
  • Apple Vision Pro[19]
  • PlayStation VR2[20]
  • HTC VIVE Pro Eye[21]
  • Varjo Aero[22]
  • Varjo XR-3[23]
  • Pimax Crystal[24]
  • Pico Neo 3 Pro Eye[25]
  • HP Reverb G2 Omnicept Edition[26]

AR headsets

  • Microsoft HoloLens 2[27]
  • Magic Leap 2[28]

Technical specifications and performance metrics

  • Accuracy: Mean angular error between reported and true gaze; consumer HMDs achieve roughly 0.5°–1.5° (a computation sketch follows this list).
  • Precision: RMS sample-to-sample variation, commonly 0.1°–0.5°.
  • Sampling rate: 30 Hz–120 Hz (consumer) to >1000 Hz (research).
  • Latency: End-to-end delay, ideally < 20 ms for foveated rendering; values of 45–80 ms have been reported in some systems.[29]
  • Robustness: Percentage of valid samples under motion, with glasses, etc.
  • Head box / eye box: Spatial region within which tracking is maintained.[30]
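
Accuracy and precision as defined above can be estimated from a validation pass in which the user fixates a known target. The sketch below computes the two figures from angular samples already expressed in degrees; treating gaze as a single angular coordinate is a simplification for brevity.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Accuracy: mean angular offset (degrees) between reported gaze and a known target.
double meanAngularError(const std::vector<double>& errorsDeg) {
    double sum = 0.0;
    for (double e : errorsDeg) sum += e;
    return errorsDeg.empty() ? 0.0 : sum / errorsDeg.size();
}

// Precision: RMS of sample-to-sample angular differences (degrees) during a fixation.
double rmsSampleToSample(const std::vector<double>& gazeAnglesDeg) {
    if (gazeAnglesDeg.size() < 2) return 0.0;
    double sumSq = 0.0;
    for (std::size_t i = 1; i < gazeAnglesDeg.size(); ++i) {
        const double d = gazeAnglesDeg[i] - gazeAnglesDeg[i - 1];
        sumSq += d * d;
    }
    return std::sqrt(sumSq / (gazeAnglesDeg.size() - 1));
}
```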

Calibration methods

  • Point-by-point (static targets)
  • Smooth-pursuit (moving target)
  • Implicit / online refinement (during normal use)

Calibration can drift with headset slippage; repeated calibration or automatic re-calibration compensates.[31]
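
A minimal sketch of the point-by-point approach: record raw tracker output while the user fixates targets at known angles, then fit a mapping by least squares. For brevity this fits an independent linear model per axis; real systems use higher-order polynomial or model-based fits and re-run the fit when slippage is detected.

```cpp
#include <vector>

// One calibration observation: raw tracker coordinate vs. known target angle (degrees).
struct CalibPoint { double raw; double targetDeg; };

struct LinearMap { double a; double b; };   // gazeDeg = a * raw + b

// Ordinary least-squares fit of a single-axis linear calibration (illustrative model).
LinearMap fitLinearCalibration(const std::vector<CalibPoint>& pts) {
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    const double n = static_cast<double>(pts.size());
    for (const CalibPoint& p : pts) {
        sx  += p.raw;
        sy  += p.targetDeg;
        sxx += p.raw * p.raw;
        sxy += p.raw * p.targetDeg;
    }
    const double denom = n * sxx - sx * sx;
    const double a = (denom != 0.0) ? (n * sxy - sx * sy) / denom : 0.0;
    const double b = (n != 0.0) ? (sy - a * sx) / n : 0.0;
    return {a, b};
}
```

Separate fits for the horizontal and vertical axes, refreshed after a re-calibration pass, are one way to compensate for the slippage-induced drift noted above.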

Challenges and Limitations

Technical

  • Inter-user anatomical variation
  • Glasses or contact-lens reflections
  • Processing load and power budget
  • Accuracy–latency trade-offs[32]

User-experience

  • Calibration burden
  • Motion discomfort if latency is high
  • Privacy: gaze reveals identity, intent, and health.[33][34]

Accessibility

  • Eye conditions (strabismus, nystagmus) may defeat tracking
  • Cosmetic products can occlude IR glints[35]

Future Developments

  • Low-power, smaller sensors
  • Deep-learning-enhanced robustness (for example CNN pupil detectors)[36]
  • Predictive gaze and perceptual super-sampling
  • Emotion and cognitive-state inference[37]

Software Development and APIs

Major toolkits include Unity XR Interaction Toolkit, Unreal Engine Eye Tracker Interface, and OpenXR (`XR_EXT_eye_gaze_interaction`). Vendors such as Tobii supply dedicated SDKs.[38] Depending on the API layer, developers can access raw eye vectors, classified events (fixation/saccade), or semantic object-gaze hits.[39]
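
As an illustration of the OpenXR route, the sketch below creates a pose action bound to the `XR_EXT_eye_gaze_interaction` profile and locates the gaze pose each frame. It assumes the XrInstance was created with that extension enabled and that the application handles session setup, xrAttachSessionActionSets, and per-frame xrSyncActions elsewhere; error handling is omitted.

```cpp
#include <openxr/openxr.h>
#include <cstring>

// Create a gaze pose action and bind it to the eye-gaze interaction profile.
XrSpace createEyeGazeSpace(XrInstance instance, XrSession session, XrActionSet actionSet) {
    XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
    std::strcpy(actionInfo.actionName, "eye_gaze");
    std::strcpy(actionInfo.localizedActionName, "Eye Gaze");
    actionInfo.actionType = XR_ACTION_TYPE_POSE_INPUT;
    XrAction gazeAction = XR_NULL_HANDLE;
    xrCreateAction(actionSet, &actionInfo, &gazeAction);

    XrPath gazePosePath, profilePath;
    xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose", &gazePosePath);
    xrStringToPath(instance, "/interaction_profiles/ext/eye_gaze_interaction", &profilePath);

    XrActionSuggestedBinding binding{gazeAction, gazePosePath};
    XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = profilePath;
    suggested.countSuggestedBindings = 1;
    suggested.suggestedBindings = &binding;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    // The action set must still be attached to the session (xrAttachSessionActionSets)
    // and synced each frame (xrSyncActions) by the application.
    XrActionSpaceCreateInfo spaceInfo{XR_TYPE_ACTION_SPACE_CREATE_INFO};
    spaceInfo.action = gazeAction;
    spaceInfo.poseInActionSpace.orientation.w = 1.0f;  // identity pose
    XrSpace gazeSpace = XR_NULL_HANDLE;
    xrCreateActionSpace(session, &spaceInfo, &gazeSpace);
    return gazeSpace;
}

// Per frame (after xrSyncActions): locate the gaze pose in the application's base space.
bool locateGaze(XrSpace gazeSpace, XrSpace baseSpace, XrTime time, XrPosef* outPose) {
    XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
    xrLocateSpace(gazeSpace, baseSpace, time, &location);
    if (location.locationFlags & XR_SPACE_LOCATION_ORIENTATION_VALID_BIT) {
        *outPose = location.pose;  // orientation gives the gaze direction
        return true;
    }
    return false;
}
```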

Ethical and Privacy Considerations

Gaze data are treated as sensitive biometric information under GDPR, CCPA/CPRA, and similar regulations. Best practice requires informed consent, minimal data retention, encryption in transit and at rest, and transparency around secondary uses.[33]

Standards and Regulations

  • Khronos Group – OpenXR extensions
  • IEEE – draft standard P2048.5 for XR learning metrics
  • VRIF (Virtual Reality Industry Forum) – implementation guidelines
  • ISO/IEC JTC 1/SC 24 – graphics and XR data standards[40]

See also

References

  1. Duchowski, A. T. (2017). Eye Tracking Methodology: Theory and Practice. Springer.
  2. Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372–422.
  3. Huey, E. B. (1908). The Psychology and Pedagogy of Reading. Macmillan.
  4. Yarbus, A. L. (1967). Eye Movements and Vision. Plenum Press.
  5. Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press.
  6. Lang, B. (2023, May 2). Eye-tracking Is a Game-Changer for XR That Goes Far Beyond Foveated Rendering. Road to VR. https://www.roadtovr.com/why-eye-tracking-is-a-game-changer-for-vr-headsets-virtual-reality/
  7. Hansen, D. W., & Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478–500.
  8. Bulling, A., Ward, J. A., Gellersen, H., & Tröster, G. (2011). Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(4), 741–753.
  9. Robinson, D. A. (1963). A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Transactions on Biomedical Engineering, BME-10(4), 137–145.
  10. Kar, A., & Corcoran, P. (2017). A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms. IEEE Access, 5, 16495–16519.
  11. Leigh, R. J., & Zee, D. S. (2015). The Neurology of Eye Movements. Oxford University Press.
  12. Patney, A. et al. (2016). Towards foveated rendering for gaze-tracked VR. ACM Transactions on Graphics, 35(6), 179.
  13. Tobii Technology. (2024, February 16). Eye Tracking in VR — A Vital Component. Tobii Blog.
  14. NVIDIA Corp. (n.d.). Maximize VR Performance with Foveated Rendering. https://developer.nvidia.com/vrworks
  15. Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017). Exploring natural eye-gaze-based interaction for immersive VR. IEEE Symposium on 3D User Interfaces, 36–39.
  16. Meta Platforms, Inc. (2022, Oct 11). Meta Quest Pro: A New Way to Work, Create and Collaborate. https://www.meta.com/blog/quest/meta-quest-pro-vr-headset-features-price-release-date/
  17. Clay, V., König, P., & König, S. (2019). Eye tracking in virtual reality. Journal of Eye Movement Research, 12(1).
  18. Akeley, K., Watt, S. J., Girshick, A. R., & Banks, M. S. (2004). A stereo display prototype with multiple focal distances. ACM Transactions on Graphics, 23(3), 804–813.
  19. Apple Inc. (n.d.). Apple Vision Pro. https://www.apple.com/apple-vision-pro/
  20. Sony Interactive Entertainment. (2023). PS VR2 Features. https://www.playstation.com/en-us/ps-vr2/ps-vr2-features/
  21. HTC Corp. (n.d.). VIVE Pro Eye. https://www.vive.com/us/product/vive-pro-eye/overview/
  22. Varjo Technologies Oy. (n.d.). Varjo Aero. https://varjo.com/products/aero/
  23. Varjo Technologies. (2021). Varjo XR-3 Technical Specifications. https://varjo.com/products/varjo-xr-3/
  24. Pimax Technology (Shanghai) Co. Ltd. (n.d.). Pimax Crystal. https://pimax.com/crystal/
  25. Pico Interactive. (2021). Pico Neo 3 Pro Eye Specifications. https://pico-interactive.com/en/products/neo3-pro-eye
  26. HP Dev. Co. L.P. (n.d.). HP Reverb G2 Omnicept Edition VR Headset. https://www.hp.com/us-en/vr/reverb-g2-vr-headset-omnicept-edition.html
  27. Microsoft Corp. (2019). HoloLens 2 Hardware Details. https://learn.microsoft.com/hololens/hololens2-hardware
  28. Magic Leap, Inc. (2022). Magic Leap 2 Technical Overview. https://www.magicleap.com/magic-leap-2
  29. Mack, S., et al. (2023). Eye tracking in virtual reality: a broad review of applications and challenges. Virtual Reality, 27, 1481–1505. https://link.springer.com/article/10.1007/s10055-022-00738-z
  30. Blignaut, P. (2018). Using eye tracking to assess user experience: A case of a mobile banking application. In ACM International Conference Proceeding Series, 219–228.
  31. Santini, T., Fuhl, W., & Kasneci, E. (2017). CalibMe: Fast and unsupervised eye-tracker calibration for gaze-based pervasive HCI. CHI Conference on Human Factors in Computing Systems, 2594–2605.
  32. Majaranta, P., & Bulling, A. (2014). Eye tracking and eye-based human-computer interaction. In Advances in Physiological Computing, 39–65. Springer.
  33. Kröger, J. L., Lutz, O. H. M., & Müller, F. (2020). What does your gaze reveal about you? On the privacy implications of eye tracking. In Privacy and Identity Management, 226–241.
  34. Crockford, K. (2020, Nov 19). The Privacy Bird Isn’t Real: Your VR/AR Data Is. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2020/11/privacy-bird-isnt-real-your-vrar-data
  35. Titz, J., Scholz, A., & Sedlmeier, P. (2018). Comparing eye trackers by correlating their eye-metric data. Behavior Research Methods, 50(5), 1853–1863.
  36. Fuhl, W., Santini, T., Kasneci, G., & Kasneci, E. (2016). PupilNet: Convolutional neural networks for robust pupil detection. CoRR, abs/1601.04902.
  37. Duchowski, A. T., et al. (2018). The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation. CHI Conference on Human Factors in Computing Systems, 1–13.
  38. Tobii Technology. (2023). Tobii XR SDK Documentation. https://vr.tobii.com/sdk/
  39. Kumar, D., Dutta, A., Das, A., & Lahiri, U. (2016). SmartEye: Developing a novel eye-tracking system for quantitative assessment of oculomotor abnormalities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 24(10), 1051–1059.
  40. International Organization for Standardization. (2023). ISO/IEC JTC 1/SC 24 – Computer graphics, image processing and environmental data representation. ISO.org.