<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://vrarwiki.com/index.php?action=history&amp;feed=atom&amp;title=Eye_tracking</id>
	<title>Eye tracking - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://vrarwiki.com/index.php?action=history&amp;feed=atom&amp;title=Eye_tracking"/>
	<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;action=history"/>
	<updated>2026-04-18T16:36:24Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.0</generator>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=35426&amp;oldid=prev</id>
		<title>Xinreality at 21:21, 7 May 2025</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=35426&amp;oldid=prev"/>
		<updated>2025-05-07T21:21:43Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 21:21, 7 May 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l40&quot;&gt;Line 40:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 40:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Eye tracking unlocks numerous capabilities enhancing VR and AR experiences:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Eye tracking unlocks numerous capabilities enhancing VR and AR experiences:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Foveated rendering]]&#039;&#039;&#039;: Arguably the most impactful application for performance. By knowing precisely where the user is looking, the system renders the scene at maximum [[resolution]] only in the small, central area of the user&#039;s gaze (corresponding to the eye&#039;s high-acuity fovea), while rendering the peripheral areas at progressively lower resolutions. This mimics human vision and can drastically reduce the computational load on the [[graphics processing unit]] (GPU) – potentially by 30% to over 70% – allowing for higher fidelity graphics, increased frame rates, reduced [[latency]], or lower power consumption, all without a noticeable loss in perceived visual quality.&amp;lt;ref name=&quot;Patney2016&quot;&amp;gt;Patney, A., Salvi, M., Kim, J., Kaplanyan, A., Wyman, C., Benty, N., Luebke, D., &amp;amp; Lefohn, A. (2016). Towards foveated rendering for gaze-tracked virtual reality. *ACM Transactions on Graphics, 35*(6), 179.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;TobiiFoveated&quot;&amp;gt;[https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component Tobii Blog: Eye Tracking in VR &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;— &lt;/del&gt;A Vital Component]. Retrieved Nov 17, 2023.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;NvidiaFoveated&quot;&amp;gt;NVIDIA Corporation. (n.d.). *Maximize VR Performance with Foveated Rendering*. NVIDIA Developer. Retrieved November 16, 2023, from https://developer.nvidia.com/vrworks/graphics/foveatedrendering&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Foveated rendering]]&#039;&#039;&#039;: Arguably the most impactful application for performance. By knowing precisely where the user is looking, the system renders the scene at maximum [[resolution]] only in the small, central area of the user&#039;s gaze (corresponding to the eye&#039;s high-acuity fovea), while rendering the peripheral areas at progressively lower resolutions. This mimics human vision and can drastically reduce the computational load on the [[graphics processing unit]] (GPU) – potentially by 30% to over 70% – allowing for higher fidelity graphics, increased frame rates, reduced [[latency]], or lower power consumption, all without a noticeable loss in perceived visual quality.&amp;lt;ref name=&quot;Patney2016&quot;&amp;gt;Patney, A., Salvi, M., Kim, J., Kaplanyan, A., Wyman, C., Benty, N., Luebke, D., &amp;amp; Lefohn, A. (2016). Towards foveated rendering for gaze-tracked virtual reality. *ACM Transactions on Graphics, 35*(6), 179.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;TobiiFoveated&quot;&amp;gt;[https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component Tobii Blog: Eye Tracking in VR &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;- &lt;/ins&gt;A Vital Component]. Retrieved Nov 17, 2023.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;NvidiaFoveated&quot;&amp;gt;NVIDIA Corporation. (n.d.). *Maximize VR Performance with Foveated Rendering*. NVIDIA Developer. Retrieved November 16, 2023, from https://developer.nvidia.com/vrworks/graphics/foveatedrendering&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;Natural Interaction / Gaze-Based Interaction&amp;#039;&amp;#039;&amp;#039;: Eye tracking enables more intuitive control schemes:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;Natural Interaction / Gaze-Based Interaction&amp;#039;&amp;#039;&amp;#039;: Eye tracking enables more intuitive control schemes:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
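	<!--
	The revision above describes foveated rendering: full resolution only in the small region around
	the tracked gaze point, progressively coarser toward the periphery. A minimal C sketch of that
	core idea follows, mapping angular distance (eccentricity) from the gaze point to a shading-rate
	divisor. The 5/15/30 degree bands and the divisor values are illustrative assumptions, not
	figures taken from the cited sources or any headset SDK.

	/* Illustrative only: map angular distance from the gaze point to a
	   coarse shading-rate level, as a foveated renderer might. */
	#include <math.h>
	#include <stdio.h>

	/* Returns 1 for full rate; larger values mean coarser shading. */
	static int shading_rate_divisor(float gaze_deg_x, float gaze_deg_y,
	                                float pixel_deg_x, float pixel_deg_y)
	{
	    float dx  = pixel_deg_x - gaze_deg_x;
	    float dy  = pixel_deg_y - gaze_deg_y;
	    float ecc = sqrtf(dx * dx + dy * dy);  /* eccentricity in degrees */

	    if (ecc < 5.0f)  return 1;  /* fovea: full resolution       */
	    if (ecc < 15.0f) return 2;  /* near periphery: half rate    */
	    if (ecc < 30.0f) return 4;  /* mid periphery: quarter rate  */
	    return 8;                   /* far periphery: coarsest rate */
	}

	int main(void)
	{
	    /* Gaze at (0, 0); sample a few directions across the view. */
	    for (float x = 0.0f; x <= 40.0f; x += 10.0f)
	        printf("%2.0f deg from gaze: 1/%d rate\n",
	               x, shading_rate_divisor(0.0f, 0.0f, x, 0.0f));
	    return 0;
	}
	-->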
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34817&amp;oldid=prev</id>
		<title>Xinreality: Undo revision 34816 by Xinreality (talk)</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34817&amp;oldid=prev"/>
		<updated>2025-05-01T16:21:38Z</updated>

		<summary type="html">&lt;p&gt;Undo revision &lt;a href=&quot;/wiki/Special:Diff/34816&quot; title=&quot;Special:Diff/34816&quot;&gt;34816&lt;/a&gt; by &lt;a href=&quot;/wiki/Special:Contributions/Xinreality&quot; title=&quot;Special:Contributions/Xinreality&quot;&gt;Xinreality&lt;/a&gt; (&lt;a href=&quot;/wiki/User_talk:Xinreality&quot; title=&quot;User talk:Xinreality&quot;&gt;talk&lt;/a&gt;)&lt;/p&gt;
&lt;a href=&quot;https://vrarwiki.com/index.php?title=Eye_tracking&amp;amp;diff=34817&amp;amp;oldid=34816&quot;&gt;Show changes&lt;/a&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34816&amp;oldid=prev</id>
		<title>Xinreality at 16:21, 1 May 2025</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34816&amp;oldid=prev"/>
		<updated>2025-05-01T16:21:15Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;a href=&quot;https://vrarwiki.com/index.php?title=Eye_tracking&amp;amp;diff=34816&amp;amp;oldid=34650&quot;&gt;Show changes&lt;/a&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34650&amp;oldid=prev</id>
		<title>Xinreality: Text replacement - &quot;e.g.,&quot; to &quot;for example&quot;</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34650&amp;oldid=prev"/>
		<updated>2025-04-29T04:23:11Z</updated>

		<summary type="html">&lt;p&gt;Text replacement - &amp;quot;e.g.,&amp;quot; to &amp;quot;for example&amp;quot;&lt;/p&gt;
&lt;a href=&quot;https://vrarwiki.com/index.php?title=Eye_tracking&amp;amp;diff=34650&amp;amp;oldid=34465&quot;&gt;Show changes&lt;/a&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34465&amp;oldid=prev</id>
		<title>Xinreality: /* Standards and Regulations */</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34465&amp;oldid=prev"/>
		<updated>2025-04-24T16:47:37Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Standards and Regulations&lt;/span&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 16:47, 24 April 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l167&quot;&gt;Line 167:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 167:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Standards and Regulations==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Standards and Regulations==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Efforts are underway to standardize aspects of eye tracking technology and address its implications:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Efforts are underway to standardize aspects of eye tracking technology and address its implications:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Khronos Group]] / [[OpenXR]]&#039;&#039;&#039;: Defines standard APIs for accessing eye tracking data (e.g., &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;`&lt;/del&gt;XR_EXT_eye_gaze_interaction&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;`&lt;/del&gt;).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Khronos Group]] / [[OpenXR]]&#039;&#039;&#039;: Defines standard APIs for accessing eye tracking data (e.g., &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;XR_EXT_eye_gaze_interaction&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]]&lt;/ins&gt;).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;[[IEEE Standards Association|IEEE]]&amp;#039;&amp;#039;&amp;#039;: Working groups like IEEE P2048.5 focus on standards for immersive learning, potentially including eye tracking metrics.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;[[IEEE Standards Association|IEEE]]&amp;#039;&amp;#039;&amp;#039;: Working groups like IEEE P2048.5 focus on standards for immersive learning, potentially including eye tracking metrics.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;[[Virtual Reality Industry Forum]] (VRIF)&amp;#039;&amp;#039;&amp;#039;: Develops guidelines for VR implementation, potentially covering eye tracking best practices.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;[[Virtual Reality Industry Forum]] (VRIF)&amp;#039;&amp;#039;&amp;#039;: Develops guidelines for VR implementation, potentially covering eye tracking best practices.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
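	<!--
	The diff above names the OpenXR extension XR_EXT_eye_gaze_interaction. A hedged C sketch of how
	an application might bind a pose action to that extension's gaze input; it assumes an XrInstance
	and XrSession already created with the extension enabled and an XrActionSet not yet attached to
	the session, and it omits error handling plus the later xrAttachSessionActionSets, xrSyncActions,
	and per-frame xrLocateSpace calls. An outline under those assumptions, not a complete program.

	#include <string.h>
	#include <openxr/openxr.h>

	XrSpace create_eye_gaze_space(XrInstance instance, XrSession session,
	                              XrActionSet action_set)
	{
	    /* Pose action that will carry the combined eye gaze. */
	    XrActionCreateInfo ai = { XR_TYPE_ACTION_CREATE_INFO };
	    strcpy(ai.actionName, "eye_gaze");
	    strcpy(ai.localizedActionName, "Eye Gaze");
	    ai.actionType = XR_ACTION_TYPE_POSE_INPUT;
	    XrAction gaze_action;
	    xrCreateAction(action_set, &ai, &gaze_action);

	    /* Suggest a binding on the eye gaze interaction profile. */
	    XrPath profile, gaze_pose;
	    xrStringToPath(instance,
	        "/interaction_profiles/ext/eye_gaze_interaction", &profile);
	    xrStringToPath(instance,
	        "/user/eyes_ext/input/gaze_ext/pose", &gaze_pose);
	    XrActionSuggestedBinding binding = { gaze_action, gaze_pose };
	    XrInteractionProfileSuggestedBinding sb =
	        { XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDINGS };
	    sb.interactionProfile = profile;
	    sb.countSuggestedBindings = 1;
	    sb.suggestedBindings = &binding;
	    xrSuggestInteractionProfileBindings(instance, &sb);

	    /* Space whose pose is the user's gaze; locate it each frame
	       with xrLocateSpace to get a gaze origin and direction. */
	    XrActionSpaceCreateInfo si = { XR_TYPE_ACTION_SPACE_CREATE_INFO };
	    si.action = gaze_action;
	    si.poseInActionSpace.orientation.w = 1.0f;
	    XrSpace gaze_space;
	    xrCreateActionSpace(session, &si, &gaze_space);
	    return gaze_space;
	}
	-->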
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34464&amp;oldid=prev</id>
		<title>Xinreality at 16:46, 24 April 2025</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34464&amp;oldid=prev"/>
		<updated>2025-04-24T16:46:28Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;a href=&quot;https://vrarwiki.com/index.php?title=Eye_tracking&amp;amp;diff=34464&amp;amp;oldid=34463&quot;&gt;Show changes&lt;/a&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34463&amp;oldid=prev</id>
		<title>Xinreality: /* Applications in VR and AR */</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34463&amp;oldid=prev"/>
		<updated>2025-04-24T16:43:21Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Applications in VR and AR&lt;/span&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 16:43, 24 April 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l37&quot;&gt;Line 37:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 37:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;Blinks&amp;#039;&amp;#039;&amp;#039;: Detection of eyelid closures.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&amp;#039;&amp;#039;&amp;#039;Blinks&amp;#039;&amp;#039;&amp;#039;: Detection of eyelid closures.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Applications in VR and AR ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Applications in VR and AR==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Eye tracking unlocks numerous capabilities enhancing VR and AR experiences:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Eye tracking unlocks numerous capabilities enhancing VR and AR experiences:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Foveated rendering]]&#039;&#039;&#039;: Arguably the most impactful application for performance. By knowing precisely where the user is looking, the system renders the scene at maximum [[resolution]] only in the small, central area of the user&#039;s gaze (corresponding to the eye&#039;s high-acuity fovea), while rendering the peripheral areas at progressively lower resolutions. This mimics human vision and can drastically reduce the computational load on the [[graphics processing unit]] (GPU) – potentially by 30% to over 70% – allowing for higher fidelity graphics, increased frame rates, reduced [[latency]], or lower power consumption, all without a noticeable loss in perceived visual quality.&amp;lt;ref name=&quot;Patney2016&quot;&amp;gt;Patney, A., Salvi, M., Kim, J., Kaplanyan, A., Wyman, C., Benty, N., Luebke, D., &amp;amp; Lefohn, A. (2016). Towards foveated rendering for gaze-tracked virtual reality. *ACM Transactions on Graphics, 35*(6), 179.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;TobiiFoveated&quot;&amp;gt;[https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component Tobii Blog: Eye Tracking in VR — A Vital Component]. Retrieved Nov 17, 2023.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;NvidiaFoveated&quot;&amp;gt;NVIDIA Corporation. (n.d.). *Maximize VR Performance with Foveated Rendering*. NVIDIA Developer. Retrieved November 16, 2023, from https://developer.nvidia.com/vrworks/graphics/foveatedrendering&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Foveated rendering]]&#039;&#039;&#039;: Arguably the most impactful application for performance. By knowing precisely where the user is looking, the system renders the scene at maximum [[resolution]] only in the small, central area of the user&#039;s gaze (corresponding to the eye&#039;s high-acuity fovea), while rendering the peripheral areas at progressively lower resolutions. This mimics human vision and can drastically reduce the computational load on the [[graphics processing unit]] (GPU) – potentially by 30% to over 70% – allowing for higher fidelity graphics, increased frame rates, reduced [[latency]], or lower power consumption, all without a noticeable loss in perceived visual quality.&amp;lt;ref name=&quot;Patney2016&quot;&amp;gt;Patney, A., Salvi, M., Kim, J., Kaplanyan, A., Wyman, C., Benty, N., Luebke, D., &amp;amp; Lefohn, A. (2016). Towards foveated rendering for gaze-tracked virtual reality. *ACM Transactions on Graphics, 35*(6), 179.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;TobiiFoveated&quot;&amp;gt;[https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component Tobii Blog: Eye Tracking in VR — A Vital Component]. Retrieved Nov 17, 2023.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;NvidiaFoveated&quot;&amp;gt;NVIDIA Corporation. (n.d.). *Maximize VR Performance with Foveated Rendering*. NVIDIA Developer. Retrieved November 16, 2023, from https://developer.nvidia.com/vrworks/graphics/foveatedrendering&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Natural Interaction / Gaze-Based Interaction&#039;&#039;&#039;: Eye tracking enables more intuitive control schemes:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Natural Interaction / Gaze-Based Interaction&#039;&#039;&#039;: Eye tracking enables more intuitive control schemes:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Gaze Selection/Pointing&#039;&#039;&#039;: Allows users to select objects, menu items, or targets simply by looking at them. This is often combined with a confirmation action like a button press on a [[controller (computing)|controller]], a [[hand tracking]] gesture (e.g., pinch), or a short dwell time.&amp;lt;ref name=&quot;Piumsomboon2017&quot;&amp;gt;Piumsomboon, T., Lee, G., Lindeman, R. W., &amp;amp; Billinghurst, M. (2017). Exploring natural eye-gaze-based interaction for immersive virtual reality. *IEEE Symposium on 3D User Interfaces*, 36-39.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;&#039;&#039;&#039;Gaze Selection/Pointing&#039;&#039;&#039;: Allows users to select objects, menu items, or targets simply by looking at them. This is often combined with a confirmation action like a button press on a [[controller (computing)|controller]], a [[hand tracking]] gesture (e.g., pinch), or a short dwell time.&amp;lt;ref name=&quot;Piumsomboon2017&quot;&amp;gt;Piumsomboon, T., Lee, G., Lindeman, R. W., &amp;amp; Billinghurst, M. (2017). Exploring natural eye-gaze-based interaction for immersive virtual reality. *IEEE Symposium on 3D User Interfaces*, 36-39.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Intent Prediction&#039;&#039;&#039;: Systems can anticipate user actions or needs based on gaze patterns (e.g., highlighting an object the user looks at intently).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;&#039;&#039;&#039;Intent Prediction&#039;&#039;&#039;: Systems can anticipate user actions or needs based on gaze patterns (e.g., highlighting an object the user looks at intently).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Gaze-directed Locomotion&#039;&#039;&#039;: Steering movement within the virtual world based on gaze direction.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;&#039;&#039;&#039;Gaze-directed Locomotion&#039;&#039;&#039;: Steering movement within the virtual world based on gaze direction.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Enhanced Social Presence / [[Avatar]] Realism&#039;&#039;&#039;: Realistic eye movements, including subtle saccades, blinks, and responsive gaze shifts, can be mirrored onto a user&#039;s avatar in social VR applications. This significantly enhances [[non-verbal communication]] and the feeling of [[social presence]] and connection when interacting with others.&amp;lt;ref name=&quot;MetaAvatarsEyeTracking&quot;&amp;gt;Meta Platforms, Inc. (2022, October 11). *Meta Quest Pro: A New Way to Work, Create and Collaborate*. Meta Quest Blog. https://www.meta.com/blog/quest/meta-quest-pro-vr-headset-features-price-release-date/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Enhanced Social Presence / [[Avatar]] Realism&#039;&#039;&#039;: Realistic eye movements, including subtle saccades, blinks, and responsive gaze shifts, can be mirrored onto a user&#039;s avatar in social VR applications. This significantly enhances [[non-verbal communication]] and the feeling of [[social presence]] and connection when interacting with others.&amp;lt;ref name=&quot;MetaAvatarsEyeTracking&quot;&amp;gt;Meta Platforms, Inc. (2022, October 11). *Meta Quest Pro: A New Way to Work, Create and Collaborate*. Meta Quest Blog. https://www.meta.com/blog/quest/meta-quest-pro-vr-headset-features-price-release-date/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;User Analytics and Research&#039;&#039;&#039;: Eye tracking provides invaluable objective data for:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;User Analytics and Research&#039;&#039;&#039;: Eye tracking provides invaluable objective data for:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Usability testing]] &amp;amp; User Experience (UX) Research&#039;&#039;&#039;: Understanding how users visually explore and interact with interfaces or environments.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;&#039;&#039;&#039;[[Usability testing]] &amp;amp; User Experience (UX) Research&#039;&#039;&#039;: Understanding how users visually explore and interact with interfaces or environments.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Attention mapping]]&#039;&#039;&#039;: Creating heatmaps and gaze plots to visualize areas of interest and attention duration.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;&#039;&#039;&#039;[[Attention mapping]]&#039;&#039;&#039;: Creating heatmaps and gaze plots to visualize areas of interest and attention duration.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Cognitive Load Assessment&#039;&#039;&#039;: Measuring mental workload through metrics like pupil dilation, blink rate, and fixation patterns.&amp;lt;ref name=&quot;Clay2019&quot;&amp;gt;Clay, V., König, P., &amp;amp; König, S. (2019). Eye tracking in virtual reality. *Journal of Eye Movement Research, 12*(1).&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;&#039;&#039;&#039;Cognitive Load Assessment&#039;&#039;&#039;: Measuring mental workload through metrics like pupil dilation, blink rate, and fixation patterns.&amp;lt;ref name=&quot;Clay2019&quot;&amp;gt;Clay, V., König, P., &amp;amp; König, S. (2019). Eye tracking in virtual reality. *Journal of Eye Movement Research, 12*(1).&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Training and Simulation Analysis&#039;&#039;&#039;: Assessing trainee attention, situational awareness, and decision-making processes in professional simulations (e.g., medical, aviation).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;&#039;&#039;&#039;Training and Simulation Analysis&#039;&#039;&#039;: Assessing trainee attention, situational awareness, and decision-making processes in professional simulations (e.g., medical, aviation).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Automatic [[Interpupillary distance|IPD]] Adjustment&#039;&#039;&#039;: Some headsets utilize the eye tracking cameras to automatically measure the user&#039;s interpupillary distance (the distance between the centers of the pupils) and mechanically adjust the lens spacing for optimal visual clarity, stereo depth perception, and user comfort.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Automatic [[Interpupillary distance|IPD]] Adjustment&#039;&#039;&#039;: Some headsets utilize the eye tracking cameras to automatically measure the user&#039;s interpupillary distance (the distance between the centers of the pupils) and mechanically adjust the lens spacing for optimal visual clarity, stereo depth perception, and user comfort.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Accessibility]]&#039;&#039;&#039;: Eye tracking offers a powerful hands-free input modality for users with limited physical mobility, enabling them to navigate interfaces, communicate (e.g., gaze typing), and control applications within VR/AR.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Accessibility]]&#039;&#039;&#039;: Eye tracking offers a powerful hands-free input modality for users with limited physical mobility, enabling them to navigate interfaces, communicate (e.g., gaze typing), and control applications within VR/AR.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Adaptive Optics / [[Varifocal display]]s&#039;&#039;&#039;: Eye tracking is essential for dynamic varifocal displays, which adjust their focal plane based on where the user is looking in virtual depth. This helps address the [[vergence-accommodation conflict]], potentially reducing eye strain and improving visual realism.&amp;lt;ref name=&quot;Akeley2004&quot;&amp;gt;Akeley, K., Watt, S.J., Girshick, A.R., &amp;amp; Banks, M.S. (2004). A stereo display prototype with multiple focal distances. *ACM Transactions on Graphics, 23*(3), 804–813.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Adaptive Optics / [[Varifocal display]]s&#039;&#039;&#039;: Eye tracking is essential for dynamic varifocal displays, which adjust their focal plane based on where the user is looking in virtual depth. This helps address the [[vergence-accommodation conflict]], potentially reducing eye strain and improving visual realism.&amp;lt;ref name=&quot;Akeley2004&quot;&amp;gt;Akeley, K., Watt, S.J., Girshick, A.R., &amp;amp; Banks, M.S. (2004). A stereo display prototype with multiple focal distances. *ACM Transactions on Graphics, 23*(3), 804–813.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Dynamic Distortion Compensation]]&#039;&#039;&#039;: Real-time adjustments to lens distortion correction based on precise eye position relative to the lens center can improve perceived sharpness across the field of view.&amp;lt;ref name=&quot;TobiiFoveated&quot;/&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Dynamic Distortion Compensation]]&#039;&#039;&#039;: Real-time adjustments to lens distortion correction based on precise eye position relative to the lens center can improve perceived sharpness across the field of view.&amp;lt;ref name=&quot;TobiiFoveated&quot;/&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Current Implementations ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Current Implementations ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
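	<!--
	The entry above describes gaze selection confirmed by a short dwell time (looking at a target
	until it activates). A small self-contained C sketch of that logic follows; the 800 ms threshold,
	the frame-based update function, and the integer target IDs are assumptions for illustration,
	not taken from any particular SDK or from the cited Piumsomboon et al. study.

	/* Minimal dwell-time selection sketch: a target is "selected" once
	   the gaze has rested on it continuously for DWELL_MS. */
	#include <stdio.h>

	#define DWELL_MS 800.0f  /* assumed example threshold */

	typedef struct {
	    int   gazed_target;  /* target id under the gaze, or -1 for none */
	    float dwell_ms;      /* time accumulated on that target */
	} DwellState;

	/* Call once per frame; returns the selected target id, or -1. */
	static int update_dwell(DwellState *s, int target_under_gaze, float dt_ms)
	{
	    if (target_under_gaze != s->gazed_target) {
	        s->gazed_target = target_under_gaze;  /* gaze moved: reset */
	        s->dwell_ms = 0.0f;
	        return -1;
	    }
	    if (s->gazed_target < 0)
	        return -1;
	    s->dwell_ms += dt_ms;
	    if (s->dwell_ms >= DWELL_MS) {
	        s->dwell_ms = 0.0f;  /* fire once, then re-arm */
	        return s->gazed_target;
	    }
	    return -1;
	}

	int main(void)
	{
	    DwellState st = { -1, 0.0f };
	    /* Simulate ~1 s of 11 ms frames with the gaze on target 7. */
	    for (int frame = 0; frame < 90; frame++) {
	        int sel = update_dwell(&st, 7, 11.0f);
	        if (sel >= 0)
	            printf("frame %d: target %d selected by dwell\n", frame, sel);
	    }
	    return 0;
	}
	-->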
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34462&amp;oldid=prev</id>
		<title>Xinreality: /* Technical Principles */</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34462&amp;oldid=prev"/>
		<updated>2025-04-24T16:42:04Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Technical Principles&lt;/span&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 16:42, 24 April 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l4&quot;&gt;Line 4:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 4:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While its integration into consumer VR/AR is relatively recent, the systematic study of eye movements dates back to the 19th century. [[Louis Émile Javal]] noted in 1879 that reading involved discrete stops (fixations) and rapid movements (saccades).&amp;lt;ref name=&amp;quot;Rayner1998&amp;quot;&amp;gt;Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. *Psychological Bulletin, 124*(3), 372–422.&amp;lt;/ref&amp;gt; Early tracking devices included [[Edmund Huey]]&amp;#039;s contact lens-based tracker (~1908)&amp;lt;ref name=&amp;quot;Huey1908&amp;quot;&amp;gt;Huey, E.B. (1908). *The Psychology and Pedagogy of Reading*. Macmillan.&amp;lt;/ref&amp;gt; and [[Guy Thomas Buswell]]&amp;#039;s film-based systems in the 1930s. [[Alfred L. Yarbus]]&amp;#039;s work in the 1960s highlighted how viewing patterns are task-dependent.&amp;lt;ref name=&amp;quot;Yarbus1967&amp;quot;&amp;gt;Yarbus, A.L. (1967). *Eye Movements and Vision*. Plenum Press.&amp;lt;/ref&amp;gt; These foundational efforts paved the way for modern video-based and integrated tracking systems.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While its integration into consumer VR/AR is relatively recent, the systematic study of eye movements dates back to the 19th century. [[Louis Émile Javal]] noted in 1879 that reading involved discrete stops (fixations) and rapid movements (saccades).&amp;lt;ref name=&amp;quot;Rayner1998&amp;quot;&amp;gt;Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. *Psychological Bulletin, 124*(3), 372–422.&amp;lt;/ref&amp;gt; Early tracking devices included [[Edmund Huey]]&amp;#039;s contact lens-based tracker (~1908)&amp;lt;ref name=&amp;quot;Huey1908&amp;quot;&amp;gt;Huey, E.B. (1908). *The Psychology and Pedagogy of Reading*. Macmillan.&amp;lt;/ref&amp;gt; and [[Guy Thomas Buswell]]&amp;#039;s film-based systems in the 1930s. [[Alfred L. Yarbus]]&amp;#039;s work in the 1960s highlighted how viewing patterns are task-dependent.&amp;lt;ref name=&amp;quot;Yarbus1967&amp;quot;&amp;gt;Yarbus, A.L. (1967). *Eye Movements and Vision*. Plenum Press.&amp;lt;/ref&amp;gt; These foundational efforts paved the way for modern video-based and integrated tracking systems.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Technical Principles ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Technical Principles==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;=== Tracking &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Methods &lt;/del&gt;===&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Tracking &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Methods&lt;/ins&gt;===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Most eye tracking systems in contemporary VR and AR headsets employ one of several core technologies:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Most eye tracking systems in contemporary VR and AR headsets employ one of several core technologies:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Pupil Center Corneal Reflection]] (PCCR)&#039;&#039;&#039;: This is the predominant method. It involves:&amp;lt;ref name=&quot;Holmqvist2011&quot;&amp;gt;Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., &amp;amp; Van de Weijer, J. (2011). *Eye tracking: A comprehensive guide to methods and measures*. Oxford University Press.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;RoadToVREyeTracking&quot;&amp;gt;Lang, B. (2021, July 21). *Casual Explainer: What is Eye-tracking &amp;amp; How Does it Work?* Road to VR. https://www.roadtovr.com/casual-explainer-what-is-eye-tracking-how-does-it-work/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Pupil Center Corneal Reflection]] (PCCR)&#039;&#039;&#039;: This is the predominant method. It involves:&amp;lt;ref name=&quot;Holmqvist2011&quot;&amp;gt;Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., &amp;amp; Van de Weijer, J. (2011). *Eye tracking: A comprehensive guide to methods and measures*. Oxford University Press.&amp;lt;/ref&amp;gt;&amp;lt;ref name=&quot;RoadToVREyeTracking&quot;&amp;gt;Lang, B. (2021, July 21). *Casual Explainer: What is Eye-tracking &amp;amp; How Does it Work?* Road to VR. https://www.roadtovr.com/casual-explainer-what-is-eye-tracking-how-does-it-work/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Illumination:&#039;&#039;&#039; [[Infrared]] (IR) light-emitting diodes ([[LED]]s) safely illuminate the eye. IR light is used because it is invisible to the human eye, preventing distraction, and provides high contrast for [[camera]]s.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Illumination:&#039;&#039;&#039; [[Infrared]] (IR) light-emitting diodes ([[LED]]s) safely illuminate the eye. IR light is used because it is invisible to the human eye, preventing distraction, and provides high contrast for [[camera]]s.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Imaging:&#039;&#039;&#039; Small, high-[[frame rate]] infrared cameras capture images of the eye, specifically tracking the center of the [[pupil]] and the reflection(s) of the IR light off the surface of the [[cornea]] (known as glints).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Imaging:&#039;&#039;&#039; Small, high-[[frame rate]] infrared cameras capture images of the eye, specifically tracking the center of the [[pupil]] and the reflection(s) of the IR light off the surface of the [[cornea]] (known as glints).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Algorithm|Algorithmic Processing]]:&#039;&#039;&#039; Sophisticated [[computer vision]] and [[image processing]] algorithms analyze the captured images. By calculating the vector between the pupil center and the corneal reflection(s), the system determines the eye&#039;s orientation and calculates the user&#039;s gaze point with high accuracy.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Algorithm|Algorithmic Processing]]:&#039;&#039;&#039; Sophisticated [[computer vision]] and [[image processing]] algorithms analyze the captured images. By calculating the vector between the pupil center and the corneal reflection(s), the system determines the eye&#039;s orientation and calculates the user&#039;s gaze point with high accuracy.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;    &lt;/del&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Calibration]]:&#039;&#039;&#039; A per-user calibration process is usually required upon first use, and sometimes periodically, to account for individual differences in eye physiology (e.g., corneal shape, pupil size range) and the precise fit of the headset. This typically involves the user looking at specific points displayed within the headset.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Calibration]]:&#039;&#039;&#039; A per-user calibration process is usually required upon first use, and sometimes periodically, to account for individual differences in eye physiology (e.g., corneal shape, pupil size range) and the precise fit of the headset. This typically involves the user looking at specific points displayed within the headset.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
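The pupil-glint relationship described above is, in its simplest 2D form, a per-user regression fitted during calibration: the vector from corneal glint to pupil center is paired with a known gaze point for each calibration target, and the fitted mapping is then applied to live measurements. A minimal sketch of that idea (illustrative only; the second-order feature set and function names are assumptions, and production trackers typically use 3D eye models with multiple glints):

```python
# Minimal PCCR-style gaze mapping sketch (illustrative, not from any SDK).
# The pupil-glint vector (pupil center minus corneal glint, in image pixels)
# is mapped to a 2D gaze point via a per-user second-order polynomial whose
# coefficients are fitted during the calibration routine.
import numpy as np

def poly_features(v):
    """Second-order polynomial features of a pupil-glint vector (x, y)."""
    x, y = v
    return np.array([1.0, x, y, x * y, x * x, y * y])

def calibrate(pupil_glint_vectors, target_points):
    """Fit the mapping from measured vectors to known calibration targets."""
    A = np.array([poly_features(v) for v in pupil_glint_vectors])   # (n, 6)
    B = np.asarray(target_points, dtype=float)                      # (n, 2)
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)                  # (6, 2)
    return coeffs

def estimate_gaze(coeffs, pupil_center, glint):
    """Map one eye-image measurement to an estimated 2D gaze point."""
    v = np.asarray(pupil_center, dtype=float) - np.asarray(glint, dtype=float)
    return poly_features(v) @ coeffs
```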
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Video-based eye tracking (Shape/Feature Tracking)&#039;&#039;&#039;: Uses cameras aimed at the eyes to capture images, which are then analyzed using computer vision algorithms to identify eye features (pupil outline, iris texture, blood vessels) to determine gaze direction without necessarily relying on corneal reflections.&amp;lt;ref name=&quot;Hansen2010&quot;&amp;gt;Hansen, D. W., &amp;amp; Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. *IEEE Transactions on Pattern Analysis and Machine Intelligence, 32*(3), 478-500.&amp;lt;/ref&amp;gt; PCCR is often considered a subset of this broader category.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Video-based eye tracking (Shape/Feature Tracking)&#039;&#039;&#039;: Uses cameras aimed at the eyes to capture images, which are then analyzed using computer vision algorithms to identify eye features (pupil outline, iris texture, blood vessels) to determine gaze direction without necessarily relying on corneal reflections.&amp;lt;ref name=&quot;Hansen2010&quot;&amp;gt;Hansen, D. W., &amp;amp; Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. *IEEE Transactions on Pattern Analysis and Machine Intelligence, 32*(3), 478-500.&amp;lt;/ref&amp;gt; PCCR is often considered a subset of this broader category.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Electrooculography]] (EOG)&#039;&#039;&#039;: Measures the electrical potential difference between electrodes placed on the skin around the eyes. This potential changes predictably as the eye rotates. While less common for high-accuracy gaze pointing in consumer VR/AR due to lower precision and susceptibility to muscle noise, it can be used for detecting larger eye movements or in specialized applications.&amp;lt;ref name=&quot;Bulling2011&quot;&amp;gt;Bulling, A., Ward, J. A., Gellersen, H., &amp;amp; Tröster, G. (2011). Eye movement analysis for activity recognition using electrooculography. *IEEE Transactions on Pattern Analysis and Machine Intelligence, 33*(4), 741-753.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Electrooculography]] (EOG)&#039;&#039;&#039;: Measures the electrical potential difference between electrodes placed on the skin around the eyes. This potential changes predictably as the eye rotates. While less common for high-accuracy gaze pointing in consumer VR/AR due to lower precision and susceptibility to muscle noise, it can be used for detecting larger eye movements or in specialized applications.&amp;lt;ref name=&quot;Bulling2011&quot;&amp;gt;Bulling, A., Ward, J. A., Gellersen, H., &amp;amp; Tröster, G. (2011). Eye movement analysis for activity recognition using electrooculography. *IEEE Transactions on Pattern Analysis and Machine Intelligence, 33*(4), 741-753.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Scleral search coil]]&#039;&#039;&#039;: Involves wearing a special contact lens containing a wire coil. The user sits within a magnetic field, and eye movements induce currents in the coil, providing extremely precise measurements. This is highly invasive and primarily used in laboratory research, not consumer VR/AR.&amp;lt;ref name=&quot;Robinson1963&quot;&amp;gt;Robinson, D.A. (1963). A method of measuring eye movement using a scleral search coil in a magnetic field. *IEEE Transactions on Biomedical Engineering, BME-10*(4), 137–145.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Scleral search coil]]&#039;&#039;&#039;: Involves wearing a special contact lens containing a wire coil. The user sits within a magnetic field, and eye movements induce currents in the coil, providing extremely precise measurements. This is highly invasive and primarily used in laboratory research, not consumer VR/AR.&amp;lt;ref name=&quot;Robinson1963&quot;&amp;gt;Robinson, D.A. (1963). A method of measuring eye movement using a scleral search coil in a magnetic field. *IEEE Transactions on Biomedical Engineering, BME-10*(4), 137–145.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;=== Key Components (PCCR-based) ===&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Key Components (PCCR-based)===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A typical PCCR eye tracking system integrated into a VR/AR headset consists of:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A typical PCCR eye tracking system integrated into a VR/AR headset consists of:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Illuminators&#039;&#039;&#039;: Infrared LEDs providing consistent, non-visible lighting.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Illuminators&#039;&#039;&#039;: Infrared LEDs providing consistent, non-visible lighting.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Cameras&#039;&#039;&#039;: Specialized high-speed infrared cameras positioned to capture clear images of both eyes.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Cameras&#039;&#039;&#039;: Specialized high-speed infrared cameras positioned to capture clear images of both eyes.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Processing Unit&#039;&#039;&#039;: Either onboard [[System on a Chip|SoC]] resources or dedicated hardware to run the detection and gaze calculation algorithms in real-time.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Processing Unit&#039;&#039;&#039;: Either onboard [[System on a Chip|SoC]] resources or dedicated hardware to run the detection and gaze calculation algorithms in real-time.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Calibration Software&#039;&#039;&#039;: [[Software]] routines guiding the user through calibration and storing individual profiles.&amp;lt;ref name=&quot;Kar2017&quot;&amp;gt;Kar, A., &amp;amp; Corcoran, P. (2017). A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms. *IEEE Access, 5*, 16495-16519.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Calibration Software&#039;&#039;&#039;: [[Software]] routines guiding the user through calibration and storing individual profiles.&amp;lt;ref name=&quot;Kar2017&quot;&amp;gt;Kar, A., &amp;amp; Corcoran, P. (2017). A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms. *IEEE Access, 5*, 16495-16519.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;=== Eye Movement Types Measured ===&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Eye Movement Types Measured===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Eye tracking systems in VR/AR can detect and analyze various types of eye movements and states:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Eye tracking systems in VR/AR can detect and analyze various types of eye movements and states:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Fixations]]&#039;&#039;&#039;: Periods when the gaze remains relatively stable on a specific area (typically &amp;gt; 100-200 ms), indicating visual attention.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Fixations]]&#039;&#039;&#039;: Periods when the gaze remains relatively stable on a specific area (typically &amp;gt; 100-200 ms), indicating visual attention.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Saccades]]&#039;&#039;&#039;: Rapid, ballistic movements shifting the gaze between fixation points (typically 30-120 ms).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Saccades]]&#039;&#039;&#039;: Rapid, ballistic movements shifting the gaze between fixation points (typically 30-120 ms).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Smooth pursuit]]&#039;&#039;&#039;: Movements allowing the eyes to smoothly follow a moving object.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Smooth pursuit]]&#039;&#039;&#039;: Movements allowing the eyes to smoothly follow a moving object.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Vergence]]&#039;&#039;&#039;: The simultaneous movement of both eyes in opposite directions to obtain or maintain single binocular vision, critical for focusing on objects at different depths.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Vergence]]&#039;&#039;&#039;: The simultaneous movement of both eyes in opposite directions to obtain or maintain single binocular vision, critical for focusing on objects at different depths.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Pupil dilation|Pupil Size / Dilation]]&#039;&#039;&#039;: Changes in pupil diameter (pupillometry), which can correlate with changes in light levels, cognitive load, emotional arousal, or interest.&amp;lt;ref name=&quot;Leigh2015&quot;&amp;gt;Leigh, R. J., &amp;amp; Zee, D. S. (2015). *The neurology of eye movements*. Oxford University Press.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Pupil dilation|Pupil Size / Dilation]]&#039;&#039;&#039;: Changes in pupil diameter (pupillometry), which can correlate with changes in light levels, cognitive load, emotional arousal, or interest.&amp;lt;ref name=&quot;Leigh2015&quot;&amp;gt;Leigh, R. J., &amp;amp; Zee, D. S. (2015). *The neurology of eye movements*. Oxford University Press.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Blinks&#039;&#039;&#039;: Detection of eyelid closures.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Blinks&#039;&#039;&#039;: Detection of eyelid closures.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
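A common way to turn raw gaze samples into the fixation and saccade events listed above is a velocity-threshold (I-VT) classifier: the angular velocity between consecutive samples is computed, and samples above a threshold are labeled saccadic. A minimal sketch, with an illustrative threshold of 30 degrees per second (real systems tune this and add duration and noise filtering):

```python
# Illustrative I-VT (velocity-threshold) event classifier: label each gaze
# sample as part of a fixation or a saccade from its angular velocity.
import numpy as np

def classify_ivt(timestamps, gaze_dirs, threshold_deg_s=30.0):
    """timestamps: (n,) seconds; gaze_dirs: (n, 3) gaze direction vectors.
    Returns one 'fixation'/'saccade' label per sample."""
    if len(timestamps) < 2:
        return ['fixation'] * len(timestamps)
    d = np.asarray(gaze_dirs, dtype=float)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)      # ensure unit length
    dots = np.clip(np.sum(d[:-1] * d[1:], axis=1), -1.0, 1.0)
    angles = np.degrees(np.arccos(dots))                  # deg between samples
    velocity = angles / np.diff(np.asarray(timestamps, dtype=float))  # deg/s
    labels = ['saccade' if v > threshold_deg_s else 'fixation' for v in velocity]
    return labels + [labels[-1]]                          # pad to n samples
```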
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Applications in VR and AR ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Applications in VR and AR ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34461&amp;oldid=prev</id>
		<title>Xinreality at 16:40, 24 April 2025</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34461&amp;oldid=prev"/>
		<updated>2025-04-24T16:40:48Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 16:40, 24 April 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l57&quot;&gt;Line 57:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 57:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   &amp;#039;&amp;#039;&amp;#039;Automatic [[Interpupillary distance|IPD]] Adjustment&amp;#039;&amp;#039;&amp;#039;: Some headsets utilize the eye tracking cameras to automatically measure the user&amp;#039;s interpupillary distance (the distance between the centers of the pupils) and mechanically adjust the lens spacing for optimal visual clarity, stereo depth perception, and user comfort.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   &amp;#039;&amp;#039;&amp;#039;Automatic [[Interpupillary distance|IPD]] Adjustment&amp;#039;&amp;#039;&amp;#039;: Some headsets utilize the eye tracking cameras to automatically measure the user&amp;#039;s interpupillary distance (the distance between the centers of the pupils) and mechanically adjust the lens spacing for optimal visual clarity, stereo depth perception, and user comfort.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   &#039;&#039;&#039;[[Accessibility]]&#039;&#039;&#039;: Eye tracking offers a powerful hands-free input modality for users with limited physical mobility, enabling them to navigate interfaces, communicate (e.g., gaze typing), and control applications within VR/AR.&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&amp;lt;ref name=&quot;Yuan2020&quot;&amp;gt;Yuan, Z., Bi, T., Muntean, G. M., &amp;amp; Ghinea, G. (2020). Perceived synchronization of mulsemedia services. *IEEE Transactions on Multimedia, 17*(7), 957-966. Note: While this ref discusses synchronization, accessibility is a widely cited benefit of eye tracking.&amp;lt;/ref&amp;gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   &#039;&#039;&#039;[[Accessibility]]&#039;&#039;&#039;: Eye tracking offers a powerful hands-free input modality for users with limited physical mobility, enabling them to navigate interfaces, communicate (e.g., gaze typing), and control applications within VR/AR.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   &amp;#039;&amp;#039;&amp;#039;Adaptive Optics / [[Varifocal display]]s&amp;#039;&amp;#039;&amp;#039;: Eye tracking is essential for dynamic varifocal displays, which adjust their focal plane based on where the user is looking in virtual depth. This helps address the [[vergence-accommodation conflict]], potentially reducing eye strain and improving visual realism.&amp;lt;ref name=&amp;quot;Akeley2004&amp;quot;&amp;gt;Akeley, K., Watt, S.J., Girshick, A.R., &amp;amp; Banks, M.S. (2004). A stereo display prototype with multiple focal distances. *ACM Transactions on Graphics, 23*(3), 804–813.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   &amp;#039;&amp;#039;&amp;#039;Adaptive Optics / [[Varifocal display]]s&amp;#039;&amp;#039;&amp;#039;: Eye tracking is essential for dynamic varifocal displays, which adjust their focal plane based on where the user is looking in virtual depth. This helps address the [[vergence-accommodation conflict]], potentially reducing eye strain and improving visual realism.&amp;lt;ref name=&amp;quot;Akeley2004&amp;quot;&amp;gt;Akeley, K., Watt, S.J., Girshick, A.R., &amp;amp; Banks, M.S. (2004). A stereo display prototype with multiple focal distances. *ACM Transactions on Graphics, 23*(3), 804–813.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
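The focal-plane signal for such a varifocal display can be estimated from vergence: for a symmetric fixation, the fixation distance d, vergence angle theta, and interpupillary distance satisfy d = IPD / (2 * tan(theta / 2)). A minimal sketch of this estimate (illustrative; assumes unit gaze vectors and uses a roughly average 63 mm IPD as the default):

```python
# Illustrative depth-from-vergence estimate for driving a varifocal display.
import numpy as np

def fixation_distance(left_dir, right_dir, ipd_m=0.063):
    """left_dir, right_dir: unit gaze direction vectors for each eye;
    ipd_m: interpupillary distance in meters. Returns distance in meters."""
    cos_theta = np.clip(np.dot(left_dir, right_dir), -1.0, 1.0)
    theta = np.arccos(cos_theta)           # vergence angle in radians
    if theta < 1e-6:
        return float('inf')                # near-parallel rays: distant gaze
    return ipd_m / (2.0 * np.tan(theta / 2.0))
```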
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l142&quot;&gt;Line 142:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 142:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;ref name=&amp;quot;Duchowski2018&amp;quot;&amp;gt;Duchowski, A. T., Krejtz, K., Krejtz, I., Biele, C., Niedzielska, A., Kiefer, P., Raubal, M., &amp;amp; Giannopoulos, I. (2018). The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation. *CHI Conference on Human Factors in Computing Systems*, 1-13.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;ref name=&amp;quot;Duchowski2018&amp;quot;&amp;gt;Duchowski, A. T., Krejtz, K., Krejtz, I., Biele, C., Niedzielska, A., Kiefer, P., Raubal, M., &amp;amp; Giannopoulos, I. (2018). The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation. *CHI Conference on Human Factors in Computing Systems*, 1-13.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Software Development and APIs ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Software Development and APIs==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Integrating eye tracking into applications requires specific software support:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Integrating eye tracking into applications requires specific software support:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;### &lt;/del&gt;Development Frameworks and APIs&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===&lt;/ins&gt;Development Frameworks and APIs&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Unity (game engine)|Unity]]&#039;&#039;&#039;: Provides APIs through its XR Interaction Toolkit and potential vendor-specific SDKs (e.g., Tobii XR SDK).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Unity (game engine)|Unity]]&#039;&#039;&#039;: Provides APIs through its XR Interaction Toolkit and potential vendor-specific SDKs (e.g., Tobii XR SDK).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Unreal Engine]]&#039;&#039;&#039;: Offers native interfaces and plugins for accessing eye tracking data.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Unreal Engine]]&#039;&#039;&#039;: Offers native interfaces and plugins for accessing eye tracking data.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[OpenXR]]&#039;&#039;&#039;: The cross-platform standard includes an `XR_EXT_eye_gaze_interaction` extension, allowing developers to write more portable code.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[OpenXR]]&#039;&#039;&#039;: The cross-platform standard includes an `XR_EXT_eye_gaze_interaction` extension, allowing developers to write more portable code.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Vendor SDKs&#039;&#039;&#039;: Companies like [[Tobii]] provide dedicated Software Development Kits offering fine-grained control and optimized features for their hardware.&amp;lt;ref name=&quot;TobiiSDK&quot;&amp;gt;Tobii Technology. (2023). *Tobii XR SDK Documentation*. Tobii.com.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Vendor SDKs&#039;&#039;&#039;: Companies like [[Tobii]] provide dedicated Software Development Kits offering fine-grained control and optimized features for their hardware.&amp;lt;ref name=&quot;TobiiSDK&quot;&amp;gt;Tobii Technology. (2023). *Tobii XR SDK Documentation*. Tobii.com.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;### &lt;/del&gt;Data Access Levels&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===&lt;/ins&gt;Data Access Levels&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Developers typically access eye tracking data at different levels of abstraction:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Developers typically access eye tracking data at different levels of abstraction:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Raw Data&#039;&#039;&#039;: Direct coordinates of pupil centers, glints, eye openness, etc. Requires significant processing by the application.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Raw Data&#039;&#039;&#039;: Direct coordinates of pupil centers, glints, eye openness, etc. Requires significant processing by the application.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Gaze Data&#039;&#039;&#039;: Processed output providing calibrated gaze origin and direction vectors, or intersection points on virtual surfaces (see the ray-intersection sketch below).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Gaze Data&#039;&#039;&#039;: Processed output providing calibrated gaze origin and direction vectors, or intersection points on virtual surfaces (see the ray-intersection sketch below).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Eye Movement Events&#039;&#039;&#039;: Classified data identifying fixations, saccades, and blinks.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Eye Movement Events&#039;&#039;&#039;: Classified data identifying fixations, saccades, and blinks.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Semantic Data&#039;&#039;&#039;: Higher-level interpretations, such as the specific object being looked at (gaze target) or estimated attention levels.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Semantic Data&#039;&#039;&#039;: Higher-level interpretations, such as the specific object being looked at (gaze target) or estimated attention levels.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;ref name=&amp;quot;Kumar2016&amp;quot;&amp;gt;Kumar, D., Dutta, A., Das, A., &amp;amp; Lahiri, U. (2016). SmartEye: Developing a novel eye tracking system for quantitative assessment of oculomotor abnormalities. *IEEE Transactions on Neural Systems and Rehabilitation Engineering, 24*(10), 1051-1059.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;ref name=&amp;quot;Kumar2016&amp;quot;&amp;gt;Kumar, D., Dutta, A., Das, A., &amp;amp; Lahiri, U. (2016). SmartEye: Developing a novel eye tracking system for quantitative assessment of oculomotor abnormalities. *IEEE Transactions on Neural Systems and Rehabilitation Engineering, 24*(10), 1051-1059.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
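As a concrete illustration of the "Gaze Data" level above, the short C sketch below intersects a calibrated gaze ray (origin plus direction) with a virtual panel to obtain a gaze point on that surface. The vector type, helper names, and sample values are illustrative assumptions for this example, not part of any vendor SDK.

```c
/* Minimal sketch: turning calibrated "gaze data" (origin + direction)
 * into an intersection point on a virtual surface. Types and sample
 * values are illustrative, not taken from any particular SDK. */
#include <stdio.h>
#include <math.h>

typedef struct { float x, y, z; } Vec3;

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Vec3 add_scaled(Vec3 a, Vec3 b, float t) {
    Vec3 r = { a.x + t*b.x, a.y + t*b.y, a.z + t*b.z };
    return r;
}

/* Returns 1 and writes *hit if the gaze ray meets the plane in front
 * of the eye; returns 0 for rays parallel to or pointing away from it. */
static int gaze_hit_plane(Vec3 origin, Vec3 dir,
                          Vec3 plane_point, Vec3 plane_normal, Vec3 *hit)
{
    float denom = dot(plane_normal, dir);
    if (fabsf(denom) < 1e-6f) return 0;            /* parallel to plane */
    Vec3 diff = { plane_point.x - origin.x,
                  plane_point.y - origin.y,
                  plane_point.z - origin.z };
    float t = dot(plane_normal, diff) / denom;
    if (t < 0.0f) return 0;                        /* behind the eye */
    *hit = add_scaled(origin, dir, t);
    return 1;
}

int main(void)
{
    /* Hypothetical sample: eye at head height, looking slightly down
     * at a virtual panel 2 m ahead whose normal faces the user. */
    Vec3 origin = { 0.0f, 1.6f, 0.0f };
    Vec3 dir    = { 0.0f, -0.05f, -0.9987f };      /* ~unit length */
    Vec3 panel_point  = { 0.0f, 1.5f, -2.0f };
    Vec3 panel_normal = { 0.0f, 0.0f, 1.0f };

    Vec3 hit;
    if (gaze_hit_plane(origin, dir, panel_point, panel_normal, &hit))
        printf("gaze hits panel at (%.3f, %.3f, %.3f)\n",
               hit.x, hit.y, hit.z);
    return 0;
}
```

In practice an engine would raycast against scene geometry rather than a single plane, but the origin-plus-direction form of the data is the same.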
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Ethical and Privacy Considerations ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Ethical and Privacy Considerations==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The collection and use of eye tracking data necessitate careful ethical consideration:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The collection and use of eye tracking data necessitate careful ethical consideration:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Biometric Data Privacy&#039;&#039;&#039;: Gaze patterns can be unique identifiers. This data requires strong [[data security]] measures and compliance with regulations like [[GDPR]] (which may classify it as sensitive biometric data).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Biometric Data Privacy&#039;&#039;&#039;: Gaze patterns can be unique identifiers. This data requires strong [[data security]] measures and compliance with regulations like [[GDPR]] (which may classify it as sensitive biometric data).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Inference and Profiling&#039;&#039;&#039;: The potential to infer sensitive information (health, emotions, interests) without explicit consent raises ethical questions.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Inference and Profiling&#039;&#039;&#039;: The potential to infer sensitive information (health, emotions, interests) without explicit consent raises ethical questions.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Attention Monitoring&#039;&#039;&#039;: Concerns about workplace surveillance or manipulative advertising based on attention analytics.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Attention Monitoring&#039;&#039;&#039;: Concerns about workplace surveillance or manipulative advertising based on attention analytics.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Informed Consent&#039;&#039;&#039;: Users must be clearly informed about what eye data is collected, how it is processed, stored, and shared, and for what purposes. Opt-in mechanisms are crucial.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Informed Consent&#039;&#039;&#039;: Users must be clearly informed about what eye data is collected, how it is processed, stored, and shared, and for what purposes. Opt-in mechanisms are crucial.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Data Minimization and Anonymization&#039;&#039;&#039;: Collect only necessary data, anonymize or aggregate whenever possible, and define clear data retention policies (see the aggregation sketch below).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Data Minimization and Anonymization&#039;&#039;&#039;: Collect only necessary data, anonymize or aggregate whenever possible, and define clear data retention policies (see the aggregation sketch below).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;ref name=&amp;quot;Kroger2020&amp;quot;/&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;ref name=&amp;quot;Kroger2020&amp;quot;/&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
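The C sketch below illustrates the aggregation idea from the "Data Minimization and Anonymization" item: individual gaze samples are folded into a coarse count grid (a heatmap) and then discarded, so no per-user scanpath needs to be retained. The grid resolution and sample data are arbitrary assumptions chosen for the example.

```c
/* Illustrative data-minimization sketch: raw gaze samples are reduced
 * to aggregate cell counts; the samples themselves can then be dropped. */
#include <stdio.h>

#define GRID_W 4
#define GRID_H 3

typedef struct { float x, y; } GazeSample;  /* normalized [0,1) coords */

static void accumulate(unsigned grid[GRID_H][GRID_W],
                       const GazeSample *samples, int n)
{
    for (int i = 0; i < n; i++) {
        int gx = (int)(samples[i].x * GRID_W);
        int gy = (int)(samples[i].y * GRID_H);
        if (gx >= 0 && gx < GRID_W && gy >= 0 && gy < GRID_H)
            grid[gy][gx]++;                 /* keep counts, not samples */
    }
}

int main(void)
{
    GazeSample session[] = { {0.10f, 0.20f}, {0.12f, 0.22f},
                             {0.55f, 0.50f}, {0.90f, 0.80f} };
    unsigned heat[GRID_H][GRID_W] = {0};

    accumulate(heat, session, 4);
    /* Only the aggregate leaves this scope; raw samples are discarded. */
    for (int y = 0; y < GRID_H; y++) {
        for (int x = 0; x < GRID_W; x++)
            printf("%3u", heat[y][x]);
        printf("\n");
    }
    return 0;
}
```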
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Standards and Regulations ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Standards and Regulations==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Efforts are underway to standardize aspects of eye tracking technology and address its implications:&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Efforts are underway to standardize aspects of eye tracking technology and address its implications:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Khronos Group]] / [[OpenXR]]&#039;&#039;&#039;: Defines standard APIs for accessing eye tracking data (e.g., `XR_EXT_eye_gaze_interaction`; see the setup sketch after this list).&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Khronos Group]] / [[OpenXR]]&#039;&#039;&#039;: Defines standard APIs for accessing eye tracking data (e.g., `XR_EXT_eye_gaze_interaction`; see the setup sketch after this list).&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[IEEE Standards Association|IEEE]]&#039;&#039;&#039;: Working groups like IEEE P2048.5 focus on standards for immersive learning, potentially including eye tracking metrics.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[IEEE Standards Association|IEEE]]&#039;&#039;&#039;: Working groups like IEEE P2048.5 focus on standards for immersive learning, potentially including eye tracking metrics.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[Virtual Reality Industry Forum]] (VRIF)&#039;&#039;&#039;: Develops guidelines for VR implementation, potentially covering eye tracking best practices.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[Virtual Reality Industry Forum]] (VRIF)&#039;&#039;&#039;: Develops guidelines for VR implementation, potentially covering eye tracking best practices.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;Data Protection Regulations&#039;&#039;&#039;: [[GDPR]] (Europe), [[CCPA]]/[[CPRA]] (California), and similar laws globally impose requirements on handling personal and biometric data, including eye tracking data.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;Data Protection Regulations&#039;&#039;&#039;: [[GDPR]] (Europe), [[CCPA]]/[[CPRA]] (California), and similar laws globally impose requirements on handling personal and biometric data, including eye tracking data.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;  &lt;/del&gt;&#039;&#039;&#039;[[ISO]]/[[IEC]] JTC 1/SC 24&#039;&#039;&#039;: Committee working on international standards for computer graphics, image processing, and environmental data representation, relevant to VR/AR interfaces.&amp;lt;ref name=&quot;ISO_SC24&quot;&amp;gt;International Organization for Standardization. (2023). *ISO/IEC JTC 1/SC 24 - Computer graphics, image processing and environmental data representation*. ISO.org.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*&#039;&#039;&#039;[[ISO]]/[[IEC]] JTC 1/SC 24&#039;&#039;&#039;: Committee working on international standards for computer graphics, image processing, and environmental data representation, relevant to VR/AR interfaces.&amp;lt;ref name=&quot;ISO_SC24&quot;&amp;gt;International Organization for Standardization. (2023). *ISO/IEC JTC 1/SC 24 - Computer graphics, image processing and environmental data representation*. ISO.org.&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
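For the OpenXR route mentioned above, the sketch below shows the usual wiring for `XR_EXT_eye_gaze_interaction`: a pose action is bound to the `/user/eyes_ext/input/gaze_ext/pose` path under the eye gaze interaction profile and exposed as an `XrSpace` that the application locates each frame. It assumes an `XrInstance`, `XrSession`, and `XrActionSet` were already created elsewhere, that the extension was requested at instance creation, and it elides all error handling; treat it as a sketch, not a complete program.

```c
/* Sketch of wiring up XR_EXT_eye_gaze_interaction. Assumes pre-created
 * handles and an instance created with the extension enabled; the
 * XrResult of each call should be checked in real code. */
#include <openxr/openxr.h>
#include <string.h>

XrSpace create_eye_gaze_space(XrInstance instance, XrSession session,
                              XrActionSet actionSet)
{
    /* Pose action that the runtime feeds from the eye tracker. */
    XrActionCreateInfo actionInfo = { XR_TYPE_ACTION_CREATE_INFO };
    strcpy(actionInfo.actionName, "gaze");
    strcpy(actionInfo.localizedActionName, "Eye Gaze");
    actionInfo.actionType = XR_ACTION_TYPE_POSE_INPUT;
    XrAction gazeAction;
    xrCreateAction(actionSet, &actionInfo, &gazeAction);

    /* Bind it under the eye gaze interaction profile. */
    XrPath profilePath, gazePath;
    xrStringToPath(instance,
                   "/interaction_profiles/ext/eye_gaze_interaction",
                   &profilePath);
    xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose",
                   &gazePath);
    XrActionSuggestedBinding binding = { gazeAction, gazePath };
    XrInteractionProfileSuggestedBinding suggested =
        { XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
    suggested.interactionProfile = profilePath;
    suggested.countSuggestedBindings = 1;
    suggested.suggestedBindings = &binding;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    /* A space that tracks the gaze pose; call xrLocateSpace on it each
     * frame to obtain gaze origin and orientation in app coordinates. */
    XrActionSpaceCreateInfo spaceInfo = { XR_TYPE_ACTION_SPACE_CREATE_INFO };
    spaceInfo.action = gazeAction;
    spaceInfo.poseInActionSpace.orientation.w = 1.0f; /* identity pose */
    XrSpace gazeSpace;
    xrCreateActionSpace(session, &spaceInfo, &gazeSpace);
    return gazeSpace;
}
```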
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== See Also ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==See Also==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   [[Augmented reality]]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   [[Augmented reality]]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   [[AR headset]]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   [[AR headset]]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l205&quot;&gt;Line 205:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 202:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   [[VR headset]]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;*   [[VR headset]]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== References ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==References==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34460&amp;oldid=prev</id>
		<title>Xinreality: Undo revision 34459 by Xinreality (talk)</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Eye_tracking&amp;diff=34460&amp;oldid=prev"/>
		<updated>2025-04-24T16:21:55Z</updated>

		<summary type="html">&lt;p&gt;Undo revision &lt;a href=&quot;/wiki/Special:Diff/34459&quot; title=&quot;Special:Diff/34459&quot;&gt;34459&lt;/a&gt; by &lt;a href=&quot;/wiki/Special:Contributions/Xinreality&quot; title=&quot;Special:Contributions/Xinreality&quot;&gt;Xinreality&lt;/a&gt; (&lt;a href=&quot;/wiki/User_talk:Xinreality&quot; title=&quot;User talk:Xinreality&quot;&gt;talk&lt;/a&gt;)&lt;/p&gt;
&lt;a href=&quot;https://vrarwiki.com/index.php?title=Eye_tracking&amp;amp;diff=34460&amp;amp;oldid=34459&quot;&gt;Show changes&lt;/a&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
</feed>