Eye tracking

Eye tracking is a sensor technology that measures eye positions and eye movements. In the context of virtual reality (VR) and augmented reality (AR), it refers to integrating sensors into a VR headset or AR headset to pinpoint the user’s real-time gaze within a virtual environment or digital overlay. By monitoring pupil position, corneal reflections, and movements such as saccades and fixations, eye tracking enables more immersive, efficient, and intuitive interaction. It is now a core feature of modern head-mounted displays (HMDs), powering advanced rendering, natural input, analytics, and social presence.[1]

History

Systematic study of eye movements began in the 19th century. Louis Émile Javal (1879) observed that reading occurs through fixations and saccades.[2] Early instruments include Huey’s contact-lens tracker (1908)[3] and Guy Thomas Buswell’s film systems (1930s). Alfred L. Yarbus showed in 1967 that gaze depends on viewing task.[4] Video-based and headset-integrated trackers emerged in the 1990s and now dominate consumer VR/AR markets.

Technical Principles

Tracking methods

  • Pupil Center Corneal Reflection (PCCR) – the standard approach in HMDs. IR LEDs illuminate the eye; IR cameras capture the pupil and corneal glints; algorithms compute gaze; per-user calibration aligns the geometry (a feature-extraction sketch follows this list).[5][6]
  • Video-feature tracking (shape/appearance models) analyses full-eye images without relying exclusively on glints.[7]
  • Electrooculography (EOG) – surface electrodes sense eye-rotation potentials; useful for coarse detection or medical research.[8]
  • Scleral search coil – laboratory gold-standard with sub-arc-minute precision but invasive.[9]
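
The first stage of a PCCR pipeline, referenced above, extracts a pupil-glint vector from each infrared eye image. The following Python sketch illustrates the idea with OpenCV on a synthetic dark-pupil image; the thresholds, image size, and function name are illustrative assumptions, not a production implementation.

```python
# Minimal sketch of PCCR feature extraction from a single infrared eye image.
# Assumes a dark-pupil setup: the pupil appears dark and the corneal glint
# appears as a small bright spot. Threshold values are illustrative, not tuned.
import cv2
import numpy as np

def pupil_glint_vector(eye_gray):
    """Return the 2D vector from the brightest glint to the pupil centre (pixels)."""
    # Pupil: darkest large blob in the image.
    _, pupil_mask = cv2.threshold(eye_gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(pupil_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    pupil_c = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

    # Glint: small bright spot produced by an IR LED reflecting off the cornea.
    _, glint_mask = cv2.threshold(eye_gray, 220, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(glint_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    glint_c = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

    # The pupil-glint vector is roughly invariant to small headset slip and is
    # what the calibration stage maps to a gaze direction.
    return pupil_c - glint_c

# Synthetic test image: dark pupil and a bright glint on a mid-grey background.
img = np.full((120, 160), 128, dtype=np.uint8)
cv2.circle(img, (80, 60), 20, 0, -1)      # pupil
cv2.circle(img, (70, 52), 3, 255, -1)     # corneal glint
print(pupil_glint_vector(img))            # roughly (10, 8)
```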

Key components (PCCR)

A PCCR system combines infrared illuminators, high-speed IR cameras, real-time image processing on a SoC or dedicated ASIC, and per-user calibration software.[10]

Eye-movement metrics

Fixations, saccades, smooth pursuit, vergence, pupil dilation, and blink rate are captured and quantified.[10]
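
Fixations and saccades are commonly separated with a velocity-threshold (I-VT) classifier: inter-sample angular velocity above a threshold marks a saccade. The sketch below assumes timestamps in seconds and gaze angles in degrees; the 30 deg/s threshold is an illustrative choice.

```python
# Illustrative velocity-threshold (I-VT) classifier separating fixations from
# saccades in a gaze recording.
import numpy as np

def classify_ivt(t, gaze_deg, threshold_deg_s=30.0):
    """t: (N,) timestamps in s; gaze_deg: (N, 2) horizontal/vertical gaze in degrees.
    Returns (N-1,) labels, 'fixation' or 'saccade', one per inter-sample interval."""
    dt = np.diff(t)                                            # seconds between samples
    step = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)   # angular step, degrees
    velocity = step / dt                                       # degrees per second
    return np.where(velocity > threshold_deg_s, "saccade", "fixation")

# Example: 120 Hz recording containing one step from one target to another.
t = np.arange(0, 1, 1 / 120)
gaze = np.zeros((len(t), 2))
gaze[len(t) // 2:, 0] = 10.0                                   # 10 degree horizontal saccade
print(np.unique(classify_ivt(t, gaze), return_counts=True))
```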

Applications in VR and AR

  • Foveated rendering – renders full resolution only at the fovea, cutting GPU load by roughly 30–70% (a tile-based sketch follows this list).[11][12]
  • Natural interaction – gaze selection, dwell activation, gaze-steered locomotion, intent prediction.[13]
  • Social presence – driving realistic avatar eyes and facial cues (e.g., Meta Quest Pro).[14]
  • User analytics & research – heat-mapping, attention mapping, cognitive-load inference, training assessment.[15]
  • Accessibility – gaze typing enables hands-free input (about 10 words per minute).[16]
  • Adaptive optics / Varifocal displays – eye-driven focus surfaces mitigate the vergence-accommodation conflict.[17]
  • Dynamic distortion compensation – real-time lens-distortion correction based on pupil position.[18]
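
As referenced under foveated rendering above, an engine can map eye-tracker output to a coarse shading-rate map each frame: tiles near the gaze point render at full resolution while peripheral tiles render at reduced rates. The tile size, eccentricity bands, and rate values in this Python sketch are illustrative assumptions, not any vendor's actual settings.

```python
# Sketch of choosing a per-tile shading rate for foveated rendering from the
# current gaze point. 1 = full resolution, 2 = half, 4 = quarter.
import numpy as np

def shading_rate_map(gaze_px, width, height, tile=32, px_per_degree=20.0):
    cols, rows = width // tile, height // tile
    # Tile-centre coordinates in pixels.
    xs = (np.arange(cols) + 0.5) * tile
    ys = (np.arange(rows) + 0.5) * tile
    cx, cy = np.meshgrid(xs, ys)
    # Approximate eccentricity of each tile from the gaze point, in degrees.
    ecc = np.hypot(cx - gaze_px[0], cy - gaze_px[1]) / px_per_degree
    rates = np.full((rows, cols), 4)      # periphery: quarter resolution
    rates[ecc < 10.0] = 2                 # mid region: half resolution
    rates[ecc < 3.0] = 1                  # fovea: full resolution
    return rates

rates = shading_rate_map(gaze_px=(960, 540), width=1920, height=1080)
full = np.mean(rates == 1)
print(f"{full:.1%} of tiles shaded at full rate")  # the small foveal fraction drives the savings
```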

Current Implementations

VR headsets

Consumer and enterprise VR headsets with integrated eye tracking include:

  • Apple Vision Pro – eye tracking serves as the primary pointing input.[19][20]
  • PlayStation VR2 – Tobii-based eye tracking used for foveated rendering.[21]
  • Meta Quest Pro – eye and face tracking for avatars and foveated rendering.[14]
  • HTC VIVE Pro Eye – integrated Tobii eye tracking.[22]
  • Varjo XR-3 – high-speed eye tracking supporting foveated rendering.[23]
  • Pimax Crystal – eye tracking for dynamic foveated rendering.[24]
  • Pico 4 Enterprise – integrated eye tracking.[25]

AR headsets

  • Microsoft HoloLens 2 – eye tracking for gaze input and automatic per-user display calibration.[26]
  • Magic Leap 2 – integrated eye tracking.[27]

Technical specifications & performance

Typical consumer ranges: accuracy 0.5–1.5°, precision 0.1–0.5°, sampling rate 30–120 Hz (research systems reach 1000 Hz), and latency ideally ≤ 20 ms; some systems report 45–81 ms end-to-end.[28][29]
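
Accuracy and precision are usually measured from a validation recording in which the user fixates a known target. A minimal sketch, assuming gaze samples already expressed in degrees of visual angle: accuracy is the mean angular offset from the target, and precision is the RMS of sample-to-sample angular distances.

```python
# Hedged example of computing accuracy and precision from a validation recording.
import numpy as np

def accuracy_deg(gaze_deg, target_deg):
    """Accuracy: mean angular offset between gaze samples and the known target."""
    return float(np.mean(np.linalg.norm(gaze_deg - target_deg, axis=1)))

def precision_rms_deg(gaze_deg):
    """Precision: RMS of sample-to-sample angular distances (spread, not offset)."""
    steps = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)
    return float(np.sqrt(np.mean(steps ** 2)))

# Synthetic fixation on a target at (5, 0) degrees with a 0.8 degree bias and noise.
rng = np.random.default_rng(0)
samples = np.array([5.8, 0.0]) + rng.normal(scale=0.15, size=(120, 2))
print(accuracy_deg(samples, np.array([5.0, 0.0])))   # ~0.8 deg accuracy
print(precision_rms_deg(samples))                    # ~0.3 deg RMS precision
```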

Calibration options include multi-point, smooth-pursuit, and implicit schemes; accuracy can degrade by around 30% within minutes if the headset slips on the head.[30]
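
A classic multi-point calibration fits a low-order polynomial that maps raw pupil-glint vectors to known on-screen target positions by least squares. The sketch below uses a second-order polynomial and a nine-point grid; the raw feature values and the grid layout are made up for the example.

```python
# Sketch of polynomial calibration: pupil-glint vectors -> normalised gaze coordinates.
import numpy as np

def poly_features(v):
    """Second-order polynomial expansion of a (N, 2) array of pupil-glint vectors."""
    x, y = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_calibration(pg_vectors, targets):
    """Least-squares fit: returns a (6, 2) coefficient matrix mapping features to gaze."""
    coeffs, *_ = np.linalg.lstsq(poly_features(pg_vectors), targets, rcond=None)
    return coeffs

def apply_calibration(coeffs, pg_vectors):
    return poly_features(pg_vectors) @ coeffs

# Nine-point calibration grid (normalised screen coordinates) and fake raw vectors.
targets = np.array([[x, y] for y in (0.1, 0.5, 0.9) for x in (0.1, 0.5, 0.9)])
raw = targets * 40.0 + np.random.default_rng(1).normal(scale=0.5, size=targets.shape)
coeffs = fit_calibration(raw, targets)
print(np.abs(apply_calibration(coeffs, raw) - targets).max())  # residual calibration error
```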

Challenges and limitations

Individual differences in eye physiology, interference from glasses and contact lenses, tight power budgets, and privacy concerns remain significant hurdles.[31] Gaze data are sensitive biometrics, triggering the GDPR and other data-protection laws.[32][33]

Future developments

Lower-power sensors, machine-learning-based robustness (e.g., PupilNet)[34], and cross-modal fusion with electroencephalography are active research fronts. Predictive gaze models aim to mask residual latency (a toy example follows), and standardisation via OpenXR and IEEE P2048.5 is ongoing.[35]
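
A toy illustration of latency masking by prediction: extrapolate the latest gaze sample forward by the pipeline latency using the most recent velocity. Real predictors model saccade dynamics and landing points; the constant-velocity assumption here is only for illustration.

```python
# Constant-velocity gaze extrapolation over the pipeline latency.
import numpy as np

def predict_gaze(t, gaze_deg, latency_s):
    """Extrapolate the latest gaze sample forward by latency_s seconds."""
    velocity = (gaze_deg[-1] - gaze_deg[-2]) / (t[-1] - t[-2])   # deg/s
    return gaze_deg[-1] + velocity * latency_s

# Gaze sweeping rightwards at 100 deg/s, sampled at 120 Hz, with 20 ms latency.
t = np.arange(0, 0.1, 1 / 120)
gaze = np.column_stack([100.0 * t, np.zeros_like(t)])
print(predict_gaze(t, gaze, latency_s=0.020))  # ~2 degrees ahead of the last sample
```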

Standards and regulations

  • Khronos Group – `XR_EXT_eye_gaze_interaction` in OpenXR.
  • IEEE P2048.5 (eye tracking for immersive learning).
  • VRIF guidelines.
  • GDPR, CCPA, CPRA for data privacy.
  • ISO/IEC JTC 1/SC 24 – graphics & XR interface standards.

See also

  • Augmented reality
  • Attention mapping
  • Avatar
  • Computer vision
  • Hand tracking
  • Privacy
  • Varifocal display

References

  1. Duchowski, A. T. (2017). Eye Tracking Methodology: Theory and Practice. Springer.
  2. Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372–422.
  3. Huey, E. B. (1908). The Psychology and Pedagogy of Reading. Macmillan.
  4. Yarbus, A. L. (1967). Eye Movements and Vision. Plenum Press.
  5. Holmqvist, K. et al. (2011). Eye Tracking: A Comprehensive Guide to Methods and Measures. OUP.
  6. Kar, A., & Corcoran, P. (2017). Review of gaze-estimation systems in consumer platforms. IEEE Access, 5, 16495-16519.
  7. Hansen, D. W., & Ji, Q. (2010). Models for eyes and gaze. IEEE TPAMI, 32(3), 478-500.
  8. Bulling, A. et al. (2011). Eye-movement analysis via EOG. IEEE TPAMI, 33(4), 741-753.
  9. Robinson, D. A. (1963). Scleral search coil method. IEEE TBME 10(4), 137-145.
  10. Leigh, R. J., & Zee, D. S. (2015). The Neurology of Eye Movements. OUP.
  11. Patney, A. et al. (2016). Towards foveated rendering for gaze-tracked VR. ACM TOG, 35(6), 179.
  12. NVIDIA Developer. Maximize VR Performance with Foveated Rendering. Retrieved 2025-04-24.
  13. Piumsomboon, T. et al. (2017). Exploring natural eye-gaze interaction in VR. In IEEE 3DUI, 36-39.
  14. Meta Platforms Inc. (2022). Quest Pro Blog. Retrieved 2025-04-24.
  15. Clay, V. et al. (2019). Eye tracking in VR. Journal of Eye Movement Research, 12(1).
  16. Rajanna, R., & Hansen, J. P. (2018). Gaze typing in VR. In CHI Extended Abstracts, 10 pp.
  17. Akeley, K. et al. (2004). Stereo display with multiple focal distances. ACM TOG, 23(3), 804-813.
  18. Tobii (2022). Dynamic Distortion Compensation White-paper. Retrieved 2025-04-24.
  19. Apple Inc. (2025). Vision Pro Technical Specifications. apple.com.
  20. Apple Inc. (2025). Responsive, Precision Eye Tracking. apple.com.
  21. Lang, B. (2023). PS VR2 specs. Road to VR. Retrieved 2025-04-24.
  22. HTC Corp. (2025). VIVE Pro Eye User Guide. vive.com.
  23. Varjo Technologies. (2025). XR-3 Product Page. varjo.com.
  24. Pimax Tech. (2024). Crystal Specs. pimax.com.
  25. Hayden, S. (2023). Pico 4 Enterprise eye tracking. Road to VR. Retrieved 2025-04-24.
  26. Microsoft Corp. (2024). HoloLens 2 Hardware Details. microsoft.com.
  27. Magic Leap Inc. (2024). Technical Overview. magicleap.com.
  28. Mack, S. et al. (2022). Survey on eye tracking in VR. Virtual Reality, 27, 1597–1625.
  29. Blignaut, P. (2018). Assessing UX with eye tracking. In ACM Proc. 219-228.
  30. Santini, T. et al. (2018). CalibMe fast unsupervised calibration. In CHI. 1-6.
  31. Majaranta, P., & Bulling, A. (2014). Eye-based HCI. In Advances in Physiological Computing, 39-65.
  32. Kröger, J. L. et al. (2020). Privacy implications of eye tracking. In Privacy & Identity Management, 226-241.
  33. Electronic Frontier Foundation (2020). Your VR/AR Data Is. eff.org.
  34. Fuhl, W. et al. (2016). PupilNet CNN for robust pupil detection. arXiv:1601.04902.
  35. ISO/IEC JTC 1/SC 24 (2023). Committee Scope. iso.org.