*'''Blinks''': Detection of eyelid closures.
==Applications in VR and AR==
Eye tracking enables a wide range of capabilities that enhance VR and AR experiences:
*'''[[Foveated rendering]]''': Arguably the most impactful application for performance. By knowing precisely where the user is looking, the system renders the scene at maximum [[resolution]] only in the small, central area of the user's gaze (corresponding to the eye's high-acuity fovea), while rendering the peripheral areas at progressively lower resolutions. This mimics human vision and can drastically reduce the computational load on the [[graphics processing unit]] (GPU), potentially by 30% to over 70%, allowing for higher fidelity graphics, increased frame rates, reduced [[latency]], or lower power consumption, all without a noticeable loss in perceived visual quality (see the first code sketch after this list).<ref name="Patney2016">Patney, A., Salvi, M., Kim, J., Kaplanyan, A., Wyman, C., Benty, N., Luebke, D., & Lefohn, A. (2016). Towards foveated rendering for gaze-tracked virtual reality. ''ACM Transactions on Graphics, 35''(6), 179.</ref><ref name="TobiiFoveated">[https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component Tobii Blog: Eye Tracking in VR — A Vital Component]. Retrieved November 17, 2023.</ref><ref name="NvidiaFoveated">NVIDIA Corporation. (n.d.). ''Maximize VR Performance with Foveated Rendering''. NVIDIA Developer. Retrieved November 16, 2023, from https://developer.nvidia.com/vrworks/graphics/foveatedrendering</ref>
*'''Natural Interaction / Gaze-Based Interaction''': Eye tracking enables more intuitive control schemes:
**'''Gaze Selection/Pointing''': Allows users to select objects, menu items, or targets simply by looking at them. This is often combined with a confirmation action such as a button press on a [[controller (computing)|controller]], a [[hand tracking]] gesture (e.g., a pinch), or a short dwell time (sketched in code after this list).<ref name="Piumsomboon2017">Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017). Exploring natural eye-gaze-based interaction for immersive virtual reality. ''IEEE Symposium on 3D User Interfaces'', 36-39.</ref>
**'''Intent Prediction''': Systems can anticipate user actions or needs based on gaze patterns (e.g., highlighting an object the user looks at intently).
**'''Gaze-Directed Locomotion''': Steering movement through the virtual world based on gaze direction.
*'''Enhanced Social Presence / [[Avatar]] Realism''': Realistic eye movements, including subtle saccades, blinks, and responsive gaze shifts, can be mirrored onto a user's avatar in social VR applications. This significantly enhances [[non-verbal communication]] and the feeling of [[social presence]] and connection when interacting with others (see the avatar gaze sketch after this list).<ref name="MetaAvatarsEyeTracking">Meta Platforms, Inc. (2022, October 11). ''Meta Quest Pro: A New Way to Work, Create and Collaborate''. Meta Quest Blog. https://www.meta.com/blog/quest/meta-quest-pro-vr-headset-features-price-release-date/</ref>
*'''User Analytics and Research''': Eye tracking provides invaluable objective data for:
**'''[[Usability testing]] & User Experience (UX) Research''': Understanding how users visually explore and interact with interfaces or environments.
**'''[[Attention mapping]]''': Creating heatmaps and gaze plots to visualize areas of interest and attention duration (a heatmap sketch follows this list).
**'''Cognitive Load Assessment''': Measuring mental workload through metrics such as pupil dilation, blink rate, and fixation patterns (also sketched below).<ref name="Clay2019">Clay, V., König, P., & König, S. (2019). Eye tracking in virtual reality. ''Journal of Eye Movement Research, 12''(1).</ref>
**'''Training and Simulation Analysis''': Assessing trainee attention, situational awareness, and decision-making processes in professional simulations (e.g., medical, aviation).
*'''Automatic [[Interpupillary distance|IPD]] Adjustment''': Some headsets use the eye tracking cameras to automatically measure the user's interpupillary distance (the distance between the centers of the pupils) and mechanically adjust the lens spacing for optimal visual clarity, stereo depth perception, and user comfort (see the IPD sketch after this list).
*'''[[Accessibility]]''': Eye tracking offers a powerful hands-free input modality for users with limited physical mobility, enabling them to navigate interfaces, communicate (e.g., gaze typing), and control applications within VR/AR.
*'''Adaptive Optics / [[Varifocal display]]s''': Eye tracking is essential for dynamic varifocal displays, which adjust their focal plane based on where the user is looking in virtual depth. This helps address the [[vergence-accommodation conflict]], potentially reducing eye strain and improving visual realism (a vergence-depth sketch follows this list).<ref name="Akeley2004">Akeley, K., Watt, S. J., Girshick, A. R., & Banks, M. S. (2004). A stereo display prototype with multiple focal distances. ''ACM Transactions on Graphics, 23''(3), 804–813.</ref>
*'''[[Dynamic Distortion Compensation]]''': Real-time adjustment of lens distortion correction based on the precise eye position relative to the lens center can improve perceived sharpness across the field of view.<ref name="TobiiFoveated"/>
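===Illustrative code sketches===
As a minimal sketch of the gaze-contingent resolution logic behind [[foveated rendering]], the following Python function assigns a resolution scale to a screen tile based on its angular distance from the gaze point. The coordinate convention, the 5°/15° eccentricity bands, and the 1.0/0.5/0.25 scales are illustrative assumptions, not values from any particular headset or API.
<syntaxhighlight lang="python">
import math

# Illustrative eccentricity thresholds (degrees); real systems tune
# these per headset, per lens, and often per application.
FOVEA_DEG, MID_DEG = 5.0, 15.0

def shading_scale(tile_center, gaze, half_fov_deg=55.0):
    """Pick a resolution scale for a screen tile.

    tile_center and gaze are normalized screen coordinates in [-1, 1],
    where 1.0 spans roughly half_fov_deg of visual angle.
    """
    dx, dy = tile_center[0] - gaze[0], tile_center[1] - gaze[1]
    ecc_deg = math.hypot(dx, dy) * half_fov_deg  # small-angle approximation
    if ecc_deg <= FOVEA_DEG:
        return 1.0   # full resolution at the fovea
    if ecc_deg <= MID_DEG:
        return 0.5   # half resolution in the near periphery
    return 0.25      # quarter resolution in the far periphery
</syntaxhighlight>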
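The dwell-time confirmation used in gaze selection can be sketched as a small per-frame state machine. The DwellSelector name, the 0.8-second default, and the re-arming behavior are hypothetical choices for illustration; a real system would also debounce brief tracking dropouts.
<syntaxhighlight lang="python">
class DwellSelector:
    """Select whatever the gaze ray has rested on for dwell_s seconds."""

    def __init__(self, dwell_s=0.8):
        self.dwell_s = dwell_s   # illustrative dwell threshold
        self.current = None      # target currently under the gaze
        self.elapsed = 0.0       # time spent on that target

    def update(self, gazed_target, dt):
        """Call once per frame with the gaze ray-cast hit (or None).

        Returns the target when the dwell completes, else None.
        """
        if gazed_target != self.current:
            self.current, self.elapsed = gazed_target, 0.0
            return None
        if self.current is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_s:
            self.elapsed = 0.0   # re-arm so the same target can be reselected
            return self.current
        return None
</syntaxhighlight>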
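Mirroring gaze onto an avatar largely reduces to converting the tracked gaze direction into eye-bone rotations. A minimal sketch, assuming a unit direction vector in an x-right, y-up, z-forward frame and illustrative human eye rotation limits:
<syntaxhighlight lang="python">
import math

def gaze_to_eye_angles(direction):
    """Convert a unit gaze direction into (yaw, pitch) in degrees
    for an avatar's eye bones; the clamp limits are illustrative."""
    x, y, z = direction
    yaw = math.degrees(math.atan2(x, z))                     # left/right
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, y))))  # up/down
    yaw = max(-35.0, min(35.0, yaw))
    pitch = max(-25.0, min(25.0, pitch))
    return yaw, pitch
</syntaxhighlight>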
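An attention heatmap can be approximated by binning normalized gaze samples into a 2D histogram; the grid size and normalization here are arbitrary choices for illustration.
<syntaxhighlight lang="python">
import numpy as np

def gaze_heatmap(samples, width=64, height=64):
    """Accumulate gaze samples (u, v in [0, 1]) into a 2D histogram;
    brighter cells were looked at more often or for longer."""
    heat = np.zeros((height, width))
    for u, v in samples:
        col = min(int(u * width), width - 1)
        row = min(int(v * height), height - 1)
        heat[row, col] += 1.0
    peak = heat.max()
    return heat / peak if peak > 0 else heat
</syntaxhighlight>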
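Two of the workload metrics named above, baseline-relative pupil dilation and blink rate, reduce to simple arithmetic over the tracker's sample stream. This sketch assumes pupil diameters in millimetres and blink events as timestamps; interpreting the resulting numbers is the hard, study-specific part.
<syntaxhighlight lang="python">
def pupil_dilation_index(diameters_mm, baseline_mm):
    """Mean pupil diameter change relative to a resting baseline;
    sustained positive values are one common workload indicator."""
    mean_d = sum(diameters_mm) / len(diameters_mm)
    return (mean_d - baseline_mm) / baseline_mm

def blink_rate_per_min(blink_timestamps_s, window_s):
    """Blinks per minute over an observation window of window_s seconds."""
    return 60.0 * len(blink_timestamps_s) / window_s
</syntaxhighlight>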
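Automatic IPD adjustment starts from a conceptually simple measurement: the distance between the tracked 3D pupil centers, averaged over several frames to smooth out noise. The per-frame pupil-center pairs (in millimetres) are assumed to come from the eye tracker.
<syntaxhighlight lang="python">
import math

def estimate_ipd_mm(pupil_pairs_mm):
    """Average the left/right pupil-center distance over tracked frames.

    pupil_pairs_mm: iterable of (left, right) 3D points in millimetres.
    """
    dists = [math.dist(left, right) for left, right in pupil_pairs_mm]
    return sum(dists) / len(dists)
</syntaxhighlight>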
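For varifocal displays, one way to choose the focal plane is to find where the two gaze rays converge. Measured rays rarely intersect exactly, so this sketch returns the midpoint of their closest approach; the ray origins and directions are assumed to come from the eye tracker in a common coordinate frame.
<syntaxhighlight lang="python">
import numpy as np

def vergence_point(origin_l, dir_l, origin_r, dir_r):
    """Midpoint of closest approach of the two gaze rays, or None if
    the rays are near-parallel (gaze at infinity)."""
    p1, p2 = np.asarray(origin_l, float), np.asarray(origin_r, float)
    d1, d2 = np.asarray(dir_l, float), np.asarray(dir_r, float)
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for ray parameters minimizing |(p1 + t1*d1) - (p2 + t2*d2)|.
    b = d1 @ d2
    w = p1 - p2
    denom = 1.0 - b * b            # directions are unit vectors
    if abs(denom) < 1e-9:
        return None
    d, e = d1 @ w, d2 @ w
    t1 = (b * e - d) / denom
    t2 = (e - b * d) / denom
    return (p1 + t1 * d1 + p2 + t2 * d2) / 2.0
</syntaxhighlight>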
== Current Implementations ==