Eye tracking
- See also: Terms and Technical Terms
Eye tracking is a sensor technology that measures eye positions and eye movement. In the context of virtual reality (VR) and augmented reality (AR), it refers to the integration of sensors within VR headsets or AR headsets to determine precisely where the user is looking (gaze) in real time inside a virtual environment or across an overlaid digital interface. By accurately monitoring characteristics such as pupil position, corneal reflections, and eye movements (for example saccades and fixations), the technology enables more immersive, efficient, and intuitive user experiences. Eye tracking has become a critical feature of modern head-mounted displays (HMDs), driving advances in rendering, interaction, analytics, and social presence.[1]
History
Systematic study of eye movements began in the 19th century. Louis Émile Javal observed in 1879 that reading involves discrete fixations and rapid saccades.[2] Early instruments included Edmund Huey’s contact-lens tracker (≈1908),[3] Guy T. Buswell’s film trackers in the 1930s, and Alfred L. Yarbus’s seminal work on task-dependent gaze in the 1960s.[4] Video-based systems emerged in the 1970s, while compact infrared trackers suitable for HMDs appeared in the 2010s.
Technical Principles
Tracking methods
Most contemporary HMDs use one of four foundations:
- Pupil-centre/corneal-reflection (PCCR): Infrared (IR) LEDs illuminate the eye; IR cameras capture images; computer-vision algorithms locate the pupil centre and one or more corneal “glints”; the vector between them yields 3-D gaze (see the sketch after this list).[5][6]
- Video-feature tracking: Cameras analyse iris texture, pupil outline, or eye-surface vessels; PCCR can be treated as a subset.[7]
- Electrooculography (EOG): Electrodes around the eye sense corneo-retinal potential changes; useful for coarse movements or eyelid activity when cameras are unsuitable.[8]
- Scleral search coil: A wire coil embedded in a contact lens induces current in a surrounding magnetic field, giving sub-arc-minute precision—reserved for laboratory research.[9]
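A minimal sketch of the PCCR idea above: the pupil-centre-to-glint vector is extracted in image coordinates and mapped to gaze angles with a per-user polynomial. The second-order basis and the coefficient values are illustrative assumptions, not data from any real calibration.

```python
import numpy as np

def pccr_feature(pupil_center, glint):
    """Pupil-centre/corneal-reflection vector in image coordinates."""
    return np.asarray(pupil_center, float) - np.asarray(glint, float)

def gaze_from_feature(v, coeffs_x, coeffs_y):
    """Map the PCCR vector to gaze angles (degrees) with a 2nd-order polynomial.

    The six coefficients per axis come from a per-user calibration;
    the values used below are placeholders, not real calibration data.
    """
    vx, vy = v
    basis = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return float(basis @ coeffs_x), float(basis @ coeffs_y)

# Illustrative use with made-up numbers.
v = pccr_feature(pupil_center=(312.4, 208.1), glint=(305.9, 214.6))
cx = np.array([0.1, 2.1, 0.05, 0.01, 0.002, 0.001])   # hypothetical coefficients
cy = np.array([-0.2, 0.04, 2.3, 0.02, 0.001, 0.003])  # hypothetical coefficients
yaw_deg, pitch_deg = gaze_from_feature(v, cx, cy)
print(f"gaze ~ ({yaw_deg:.2f} deg, {pitch_deg:.2f} deg)")
```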
Key components (PCCR)
- Infrared illuminators
- High-speed IR cameras
- Real-time processing (onboard system-on-chip or discrete DSP)
- Per-user calibration routines[10]
Eye-movement metrics
Systems measure fixations, saccades, smooth pursuit, vergence, pupil diameter, and blink events, each informing attention, cognitive load, or depth cues.[11]
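As an illustration of how fixation and saccade events are commonly separated from raw samples, the following sketch applies a simple velocity-threshold (I-VT) classifier. The 30 deg/s threshold and the 120 Hz toy trace are assumptions for the example, not values mandated by any standard.

```python
import numpy as np

def classify_ivt(gaze_deg, timestamps_s, saccade_threshold_deg_s=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade'.

    gaze_deg: (N, 2) array of gaze angles in degrees (yaw, pitch).
    timestamps_s: (N,) sample times in seconds.
    The velocity threshold is an illustrative default.
    """
    gaze = np.asarray(gaze_deg, float)
    t = np.asarray(timestamps_s, float)
    dt = np.diff(t)
    speed = np.linalg.norm(np.diff(gaze, axis=0), axis=1) / dt  # angular speed in deg/s
    return np.where(speed > saccade_threshold_deg_s, "saccade", "fixation")

# 120 Hz toy trace: a short fixation followed by a rapid gaze shift.
t = np.arange(6) / 120.0
g = np.array([[0.0, 0.0], [0.05, 0.0], [0.1, 0.02], [3.0, 1.0], [6.0, 2.0], [6.05, 2.0]])
print(classify_ivt(g, t))
```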
Applications in VR and AR
- Foveated rendering reduces GPU load by rendering full resolution only at the gaze locus (see the sketch after this list).[12][13][14]
- Natural gaze interaction (point-and-confirm, dwell, or gesture) streamlines UI control.[15]
- Social presence improves when trackers drive avatar eyes and facial expression.[16]
- Analytics & research: Heat-mapping, UX testing, and training assessment benefit from quantified gaze.[17]
- Varifocal / adaptive-optics displays resolve the vergence-accommodation conflict.[18]
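The foveated-rendering item above can be illustrated with a simple eccentricity-based quality falloff: regions near the gaze point keep full resolution, peripheral regions are shaded at a reduced rate. The ring radii and relative rates below are assumptions for the example, not figures from any shipping headset.

```python
import math

def shading_rate(region_deg, gaze_deg,
                 rings=((5.0, 1.0), (15.0, 0.5), (float("inf"), 0.25))):
    """Pick a relative shading rate from angular distance to the gaze point.

    region_deg, gaze_deg: (yaw, pitch) in degrees for the shaded region and the gaze locus.
    rings: (max_eccentricity_deg, relative_resolution) pairs, innermost first.
    All numeric values are illustrative.
    """
    ecc = math.hypot(region_deg[0] - gaze_deg[0], region_deg[1] - gaze_deg[1])
    for max_ecc, rate in rings:
        if ecc <= max_ecc:
            return rate
    return rings[-1][1]

gaze = (2.0, -1.0)
for region in [(2.5, -0.5), (10.0, 3.0), (30.0, 15.0)]:
    print(region, "->", shading_rate(region, gaze))
```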
Current Implementations
VR headsets with eye tracking
- Apple Vision Pro – primary input modality and foveated rendering.[19]
- Meta Quest Pro – inward sensors for rendering and avatar expression.[16]
- PlayStation VR2 – integrates Tobii tracking for games and foveation.[20]
- HTC VIVE Pro Eye – enterprise headset with 120 Hz Tobii tracking.[21]
- Varjo XR-4 / XR-3 / VR-3 / Aero – 200 Hz research-grade tracking.[22][23]
- Pimax Crystal – consumer 12K headset with eye tracking for IPD and foveation.[24]
- Pico 4 Enterprise (formerly Neo 3 Pro Eye) – Tobii-enabled enterprise unit.[25]
- HP Reverb G2 Omnicept Edition – eye, lip, and heart-rate sensors for training/analytics.[26]
AR headsets
- Microsoft HoloLens 2 – gaze-based targeting and automatic calibration.[27]
- Magic Leap 2 – eye tracking for input, analytics, and segmented-display foveation.[28]
Technical specifications and performance metrics
- Accuracy: Mean angular error; consumer HMDs achieve 0.5°–1.5°.
- Precision: RMS sample-to-sample variation, commonly 0.1°–0.5°.
- Sampling rate: 30 Hz–120 Hz (consumer) to >1000 Hz (research).
- Latency: End-to-end delay, ideally < 20 ms for foveated rendering; reported values 45–80 ms in some systems.[29]
- Robustness: Percentage of valid samples under motion, glasses, etc.
- Head box / eye box: Spatial region within which tracking is maintained.[30]
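A small sketch of how the accuracy and precision figures above are typically derived from a validation recording, assuming gaze samples and the known target are both expressed as angles in degrees; the noisy fixation data are synthetic.

```python
import numpy as np

def accuracy_deg(gaze_deg, target_deg):
    """Mean angular error between gaze samples and a known target (degrees)."""
    err = np.linalg.norm(np.asarray(gaze_deg, float) - np.asarray(target_deg, float), axis=1)
    return float(err.mean())

def precision_rms_deg(gaze_deg):
    """RMS of sample-to-sample angular differences (degrees)."""
    steps = np.diff(np.asarray(gaze_deg, float), axis=0)
    return float(np.sqrt((np.linalg.norm(steps, axis=1) ** 2).mean()))

# Toy validation data: fixating a target at (5 deg, 0 deg) with small Gaussian noise.
rng = np.random.default_rng(0)
samples = np.array([5.0, 0.0]) + rng.normal(scale=0.15, size=(200, 2))
print(f"accuracy  ~ {accuracy_deg(samples, (5.0, 0.0)):.2f} deg")
print(f"precision ~ {precision_rms_deg(samples):.2f} deg RMS")
```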
Calibration methods
- Point-by-point (static targets; see the sketch below)
- Smooth-pursuit (moving target)
- Implicit / online refinement (during normal use)
Calibration can drift with headset slippage; repeated calibration or automatic re-calibration compensates.[31]
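As a sketch of the point-by-point approach listed above, a least-squares polynomial mapping can be fitted from raw PCCR feature vectors to the known target angles. The nine-point grid and the roughly proportional synthetic features are assumptions made for the example.

```python
import numpy as np

def poly_basis(v):
    """Second-order polynomial basis of a PCCR feature vector (vx, vy)."""
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])

def calibrate(features, targets_deg):
    """Least-squares fit of per-axis mapping coefficients from calibration points.

    features: (N, 2) raw PCCR vectors; targets_deg: (N, 2) known target angles.
    Returns (coeffs_x, coeffs_y), each of length 6.
    """
    A = np.array([poly_basis(f) for f in features])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(targets_deg, float), rcond=None)
    return coeffs[:, 0], coeffs[:, 1]

# Synthetic 9-point grid: targets at -10..10 degrees, features roughly proportional plus noise.
targets = np.array([(x, y) for x in (-10, 0, 10) for y in (-10, 0, 10)], float)
features = targets / 2.5 + np.random.default_rng(1).normal(scale=0.05, size=targets.shape)
cx, cy = calibrate(features, targets)
print(np.round([poly_basis(features[0]) @ cx, poly_basis(features[0]) @ cy], 2))
```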
Challenges and Limitations
Technical
- Inter-user anatomical variation
- Glasses or contact-lens reflections
- Processing load and power budget
- Accuracy–latency trade-offs[32]
User experience
- Calibration burden
- Motion discomfort if latency is high
- Privacy: gaze reveals identity, intent, and health.[33][34]
Accessibility
- Eye conditions (strabismus, nystagmus) may defeat tracking
- Cosmetic products can occlude IR glints[35]
Future Developments
- Low-power, smaller sensors
- Deep-learning-enhanced robustness (for example CNN pupil detectors; a sketch follows this list)[36]
- Predictive gaze and perceptual super-sampling
- Emotion and cognitive-state inference[37]
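A minimal sketch, assuming PyTorch, of the kind of compact convolutional regressor used for pupil-centre detection in work such as PupilNet. The layer sizes, the 96x96 input, and the normalised (x, y) output are illustrative choices, not the published network architecture.

```python
import torch
import torch.nn as nn

class PupilCenterNet(nn.Module):
    """Tiny CNN that regresses a pupil-centre (x, y) from a 96x96 IR eye image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 96 -> 48
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 48 -> 24
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(64 * 12 * 12, 128), nn.ReLU(), nn.Linear(128, 2)
        )

    def forward(self, x):
        return self.head(self.features(x))

# Forward pass on a dummy batch of IR eye crops.
model = PupilCenterNet()
print(model(torch.rand(4, 1, 96, 96)).shape)  # torch.Size([4, 2])
```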
Software Development and APIs
Major toolkits include Unity XR Interaction Toolkit, Unreal Engine Eye Tracker Interface, and OpenXR (`XR_EXT_eye_gaze_interaction`). Vendors such as Tobii supply dedicated SDKs.[38] Depending on the API layer, developers can access raw eye vectors, classified events (fixation/saccade), or semantic object-gaze hits.[39]
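To illustrate the "semantic object-gaze hit" level mentioned above, the following engine-agnostic sketch intersects a gaze ray with labelled spheres and returns the nearest hit. The scene data and function names are hypothetical and do not correspond to any vendor SDK call.

```python
import numpy as np

def gaze_hit(origin, direction, objects):
    """Return the label of the nearest sphere hit by the gaze ray, or None.

    objects: list of (label, centre, radius) tuples. Purely illustrative scene data.
    """
    o = np.asarray(origin, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    best_label, best_t = None, np.inf
    for label, centre, radius in objects:
        oc = o - np.asarray(centre, float)
        # Ray-sphere intersection: t^2 + 2(d.oc)t + |oc|^2 - r^2 = 0 with unit d.
        b = 2.0 * d.dot(oc)
        c = oc.dot(oc) - radius**2
        disc = b * b - 4.0 * c
        if disc < 0:
            continue
        t = (-b - np.sqrt(disc)) / 2.0
        if 0 < t < best_t:
            best_label, best_t = label, t
    return best_label

scene = [("menu_button", (0.0, 0.0, 2.0), 0.1), ("avatar_head", (0.5, 0.2, 3.0), 0.15)]
print(gaze_hit(origin=(0, 0, 0), direction=(0, 0, 1), objects=scene))  # -> menu_button
```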
Ethical and Privacy Considerations
Gaze data are treated as sensitive biometric information under GDPR, CCPA/CPRA, and similar regulations. Best practice requires informed consent, minimal data retention, encryption in transit and at rest, and transparency around secondary uses.[33]
Standards and Regulations
- Khronos Group – OpenXR extensions
- IEEE – draft standard P2048.5 for XR learning metrics
- VRIF – implementation guidelines
- ISO/IEC JTC 1/SC 24 – graphics and XR data standards[40]
See also
- Augmented reality
- AR headset
- Attention mapping
- Avatar
- Computer vision
- Foveated rendering
- Gaze
- Hand tracking
- Head-mounted display
- Infrared
- Interpupillary distance
- Latency
- Privacy in VR/AR
- Saccade
- Social presence
- Varifocal display
- Vergence-accommodation conflict
- Virtual reality
- VR headset
References
- ↑ Duchowski, A. T. (2017). Eye Tracking Methodology: Theory and Practice. Springer.
- ↑ Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372–422.
- ↑ Huey, E. B. (1908). The Psychology and Pedagogy of Reading. Macmillan.
- ↑ Yarbus, A. L. (1967). Eye Movements and Vision. Plenum Press.
- ↑ Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press.
- ↑ Lang, B. (2023, May 2). Eye-tracking Is a Game-Changer for XR That Goes Far Beyond Foveated Rendering. Road to VR. https://www.roadtovr.com/why-eye-tracking-is-a-game-changer-for-vr-headsets-virtual-reality/
- ↑ Hansen, D. W., & Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478–500.
- ↑ Bulling, A., Ward, J. A., Gellersen, H., & Tröster, G. (2011). Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(4), 741–753.
- ↑ Robinson, D. A. (1963). A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Transactions on Biomedical Engineering, BME-10(4), 137–145.
- ↑ Kar, A., & Corcoran, P. (2017). A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms. IEEE Access, 5, 16495–16519.
- ↑ Leigh, R. J., & Zee, D. S. (2015). The Neurology of Eye Movements. Oxford University Press.
- ↑ Patney, A. et al. (2016). Towards foveated rendering for gaze-tracked VR. ACM Transactions on Graphics, 35(6), 179.
- ↑ Tobii Blog: Eye Tracking in VR — A Vital Component. 2024-02-16. https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component
- ↑ NVIDIA Corp. (n.d.). Maximize VR Performance with Foveated Rendering. https://developer.nvidia.com/vrworks
- ↑ Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017). Exploring natural eye-gaze-based interaction for immersive VR. IEEE Symposium on 3D User Interfaces, 36–39.
- ↑ Meta Platforms, Inc. (2022, Oct 11). Meta Quest Pro: A New Way to Work, Create and Collaborate. https://www.meta.com/blog/quest/meta-quest-pro-vr-headset-features-price-release-date/
- ↑ Clay, V., König, P., & König, S. (2019). Eye tracking in virtual reality. Journal of Eye Movement Research, 12(1).
- ↑ Akeley, K., Watt, S. J., Girshick, A. R., & Banks, M. S. (2004). A stereo display prototype with multiple focal distances. ACM Transactions on Graphics, 23(3), 804–813.
- ↑ Apple Inc. (n.d.). Apple Vision Pro. https://www.apple.com/apple-vision-pro/
- ↑ Sony Interactive Entertainment. (2023). PS VR2 Features. https://www.playstation.com/en-us/ps-vr2/ps-vr2-features/
- ↑ HTC Corp. (n.d.). VIVE Pro Eye. https://www.vive.com/us/product/vive-pro-eye/overview/
- ↑ Varjo Technologies Oy. (n.d.). Varjo Aero. https://varjo.com/products/aero/
- ↑ Varjo Technologies. (2021). Varjo XR-3 Technical Specifications. https://varjo.com/products/varjo-xr-3/
- ↑ Pimax Technology (Shanghai) Co. Ltd. (n.d.). Pimax Crystal. https://pimax.com/crystal/
- ↑ Pico Interactive. (2021). Pico Neo 3 Pro Eye Specifications. https://pico-interactive.com/en/products/neo3-pro-eye
- ↑ HP Dev. Co. L.P. (n.d.). HP Reverb G2 Omnicept Edition VR Headset. https://www.hp.com/us-en/vr/reverb-g2-vr-headset-omnicept-edition.html
- ↑ Microsoft Corp. (2019). HoloLens 2 Hardware Details. https://learn.microsoft.com/hololens/hololens2-hardware
- ↑ Magic Leap, Inc. (2022). Magic Leap 2 Technical Overview. https://www.magicleap.com/magic-leap-2
- ↑ Mack, S., et al. (2023). Eye tracking in virtual reality: a broad review of applications and challenges. Virtual Reality, 27, 1481–1505. https://link.springer.com/article/10.1007/s10055-022-00738-z
- ↑ Blignaut, P. (2018). Using eye tracking to assess user experience: A case of a mobile banking application. In ACM International Conference Proceeding Series, 219–228.
- ↑ Santini, T., Fuhl, W., & Kasneci, E. (2017). CalibMe: Fast and unsupervised eye-tracker calibration for gaze-based pervasive HCI. CHI Conference on Human Factors in Computing Systems, 2594–2605.
- ↑ Majaranta, P., & Bulling, A. (2014). Eye tracking and eye-based human-computer interaction. In Advances in Physiological Computing, 39–65. Springer.
- ↑ Kröger, J. L., Lutz, O. H. M., & Müller, F. (2020). What does your gaze reveal about you? On the privacy implications of eye tracking. In Privacy and Identity Management, 226–241.
- ↑ Crockford, K. (2020, Nov 19). The Privacy Bird Isn’t Real: Your VR/AR Data Is. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2020/11/privacy-bird-isnt-real-your-vrar-data
- ↑ Titz, J., Scholz, A., & Sedlmeier, P. (2018). Comparing eye trackers by correlating their eye-metric data. Behavior Research Methods, 50(5), 1853–1863.
- ↑ Fuhl, W., Santini, T., Kasneci, G., & Kasneci, E. (2016). PupilNet: Convolutional neural networks for robust pupil detection. CoRR, abs/1601.04902.
- ↑ Duchowski, A. T., et al. (2018). The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation. CHI Conference on Human Factors in Computing Systems, 1–13.
- ↑ Tobii Technology. (2023). Tobii XR SDK Documentation. https://vr.tobii.com/sdk/
- ↑ Kumar, D., Dutta, A., Das, A., & Lahiri, U. (2016). SmartEye: Developing a novel eye-tracking system for quantitative assessment of oculomotor abnormalities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 24(10), 1051–1059.
- ↑ International Organization for Standardization. (2023). ISO/IEC JTC 1/SC 24 – Computer graphics, image processing and environmental data representation. ISO.org.