Foveated rendering


{{Infobox technology
| name         = Foveated rendering
| image        = [[File:Foveated_rendering_illustration.svg|250px]]
| caption      = Illustration of foveated rendering, showing high resolution in the foveal region and reduced resolution in the periphery
| type         = [[Computer graphics]] technique
| used_in      = [[Virtual reality]], [[Augmented reality]], [[Extended reality]]
| application  = Performance optimization, human vision simulation
| hardware     = [[Graphics processing unit|GPU]]s, [[Head-mounted display]]s with [[eye tracking]]
| inception    = 1990 (research), 2016 (commercial demonstrations), 2019 (consumer deployment)
| manufacturer = [[Meta]], [[Sony]], [[Apple]], [[HTC]], [[Varjo]], [[Pico]]
| available    = Yes
| related      = [[Eye tracking]], [[Variable rate shading]], [[Level of detail]]
}}


By rendering the area of the image that falls on the user's fovea at the highest [[resolution]] and progressively reducing the quality of the image in the periphery, foveated rendering can achieve significant performance gains with little to no perceptible loss in visual quality.<ref name="IntegrativeView">{{cite web |url=https://www.researchgate.net/publication/355503409_An_integrative_view_of_foveated_rendering |title=An integrative view of foveated rendering}}</ref><ref name="VarjoWhatIs">{{cite web |url=https://support.varjo.com/hc/en-us/what-is-foveated-rendering |title=What is foveated rendering? | Varjo Support}}</ref> This makes it a critical enabling technology for [[virtual reality]] (VR) and [[augmented reality]] (AR) [[head-mounted display]]s (HMDs), which must render high-resolution, stereoscopic images at very high [[frame rate]]s to provide a comfortable and immersive experience.<ref name="HVS_VR_Context">{{cite web |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/ |title=Eye tracking in virtual reality: a comprehensive overview of the human visual system, eye movement types, and technical considerations}}</ref>


Implementations of foveated rendering are broadly categorized into two types: '''fixed foveated rendering''' (FFR), which assumes the user is always looking at the center of the screen, and '''dynamic (or eye-tracked) foveated rendering''' (ETFR or DFR), which uses integrated [[eye tracking]] hardware to update the high-quality region in real-time to match the user's gaze.<ref name="MetaETFRvsFFR">{{cite web |url=https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/ |title=Save GPU with Eye Tracked Foveated Rendering}}</ref>


== Biological Foundation: The Human Visual System ==
The efficacy of foveated rendering is entirely dependent on the unique, non-uniform characteristics of the human visual system (HVS). The design of the human [[retina]] is the biological blueprint that computer graphics engineers seek to mimic for performance optimization.


=== Foveal vs. Peripheral Vision ===
The retina is not a uniform sensor. It contains a small, specialized central region called the '''[[fovea centralis|fovea]]''', which is responsible for sharp, detailed, and color-rich central vision (also known as foveal vision).<ref name="GazeContingentPipeline">{{cite web |url=https://graphics.tu-bs.de/upload/publications/stengel2016adaptsampling.pdf |title=Gaze-Contingent Rendering for Deferred Shading}}</ref> This region is densely packed with [[cone cell]]s, the photoreceptors responsible for high-acuity and color perception. The fovea covers only about 1-2 degrees of the visual field (approximately 2.6-3.6° in total span), yet it consumes approximately 50% of the neural resources in the [[visual cortex]].<ref name="StateOfArtSurvey">{{cite web |url=https://www.researchgate.net/publication/366842988_Foveated_rendering_A_state-of-the-art_survey |title=Foveated rendering: A state-of-the-art survey}}</ref><ref name="tobii2023">{{cite web |url=https://www.tobii.com/blog/what-is-foveated-rendering |title=What is foveated rendering? |publisher=Tobii |date=2023-03-15}}</ref>

As one moves away from the fovea into the '''[[peripheral vision]]''', the density of cone cells decreases rapidly, while the density of [[rod cell]]s, which are more sensitive to light and motion but not to color or fine detail, increases.<ref name="TOYF_Paper">{{cite web |url=https://research.manchester.ac.uk/files/296585058/toyf.pdf |title=Type of Movement and Attentional Task Affect the Efficacy of a Foveated Rendering Method in Virtual Reality}}</ref> This anatomical arrangement means that our ability to perceive detail, color, and stereoscopic depth diminishes significantly with increasing [[eccentricity]] (the angular distance from the point of gaze).<ref name="EyeTrackingVRReview">{{cite web |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/ |title=Eye tracking in virtual reality: a comprehensive overview of the human visual system, eye movement types, and technical considerations}}</ref>

Human visual acuity falls off rapidly with eccentricity: it is highest within 5 degrees of the gaze point and drops to about 10% of peak at 30 degrees.<ref name="survey">{{cite arxiv |eprint=2211.07969 |title=Foveated rendering: A state-of-the-art survey |date=2022-11-15 |last1=Wang |first1=Lili |last2=Shi |first2=Xuehuai |last3=Liu |first3=Yi |url=https://arxiv.org/abs/2211.07969}}</ref> Because the minimum resolvable angle increases approximately linearly with eccentricity, described by the equation ω(e) = me + ω₀ (where e is the eccentricity in degrees, ω₀ is the smallest resolvable angle at the fovea, and m is the slope of the falloff), acuity itself follows a hyperbolic decay away from the gaze point.<ref name="nvidia2017">{{cite web |url=https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf |title=Latency Requirements for Foveated Rendering in Virtual Reality |publisher=NVIDIA Research |year=2017}}</ref>

However, peripheral vision remains highly attuned to detecting motion and flicker.<ref name="HVS_VR_Context_2">{{cite web |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/ |title=Eye tracking in virtual reality: a comprehensive overview}}</ref> [[Contrast sensitivity]] also varies with eccentricity: peak sensitivity occurs at 3-5 cycles per degree in the fovea but shifts below 1 cycle per degree at 30° eccentricity.<ref name="ieee2023">{{cite web |url=https://link.springer.com/article/10.1007/s41095-022-0306-4 |title=Foveated rendering: A state-of-the-art survey |publisher=Computational Visual Media |year=2023}}</ref>
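A minimal numerical sketch of this falloff model follows; the slope <code>m</code> and foveal limit <code>ω₀</code> used here are assumed example values chosen to roughly match the "about 10% of peak at 30 degrees" figure above, not constants taken from the cited studies.

<syntaxhighlight lang="cpp">
#include <cstdio>

// Linear model of the minimum angle of resolution (MAR), in degrees:
//   omega(e) = m * e + omega0
// Acuity (the reciprocal of MAR) therefore decays hyperbolically with
// eccentricity e. Both constants below are illustrative assumptions.
constexpr double kOmega0 = 1.0 / 60.0;  // assumed foveal MAR: 1 arcminute
constexpr double kSlopeM = 0.005;       // assumed slope, deg MAR per deg eccentricity

double minimumAngleOfResolution(double eccentricityDeg) {
    return kSlopeM * eccentricityDeg + kOmega0;
}

// Relative acuity: 1.0 at the fovea, falling toward ~0.1 at 30 degrees with
// the constants above. A renderer can use this directly as a relative
// sampling-rate (or shading-rate) budget for each region.
double relativeAcuity(double eccentricityDeg) {
    return kOmega0 / minimumAngleOfResolution(eccentricityDeg);
}

int main() {
    const double eccentricities[] = {0.0, 5.0, 10.0, 20.0, 30.0};
    for (double e : eccentricities) {
        std::printf("e = %4.1f deg   MAR = %.4f deg   relative acuity = %.2f\n",
                    e, minimumAngleOfResolution(e), relativeAcuity(e));
    }
}
</syntaxhighlight>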


Foveated rendering exploits this exact trade-off. It allocates the bulk of the GPU's rendering budget to the small foveal region of the image where the user's eye can actually perceive high detail, and saves resources by rendering the much larger peripheral areas at a lower quality. The subjective experience of a uniformly high-resolution world is maintained because the brain naturally integrates the high-resolution "snapshots" from the fovea as the eyes rapidly scan the environment through quick movements called [[saccade]]s.<ref name="EyeTrackingVRReview" />


=== Perceptual Phenomena: Saccadic Masking and Visual Attention ===
Two key perceptual phenomena make foveated rendering even more effective and are critical for its implementation.


The first is '''[[saccadic masking]]''' (also known as saccadic suppression), a mechanism where the brain selectively blocks visual processing during a saccade.<ref name="FoveatedRenderingExplainedReddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/afj50w/eye_tracking_foveated_rendering_explained_what_it/ |title=Eye Tracking & Foveated Rendering Explained}}</ref> This prevents the perception of motion blur as the eyes sweep across the visual field, effectively creating a brief window of functional blindness. This period of suppressed sensitivity begins about 50 ms before a saccade and lasts until about 100 ms after it begins.<ref name="LatencyRequirements">{{cite web |url=https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf |title=Latency Requirements for Eye-Tracked Foveated Rendering}}</ref> Human eyes can perform saccades at angular velocities of up to 900-1000 degrees per second.<ref name="roadtovr2016">{{cite web |url=https://www.roadtovr.com/a-pocket-guide-to-foveated-rendering-from-smi/ |title=A Quick-Start Guide to Foveated Rendering |publisher=Road to VR |date=2016-02-16}}</ref> This window is crucial for dynamic foveated rendering: if the system's total [[latency]], from detecting the eye movement to displaying the updated foveated image, is shorter than the saccadic masking window, the high-resolution region can be repositioned without the user ever perceiving the intermediate low-resolution state at their new point of gaze. This direct link between a biological phenomenon and a hardware specification makes low-latency eye tracking not just a desirable feature but a hard requirement for effective ETFR.
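The arithmetic behind that latency budget is straightforward; in the sketch below only the 100 ms suppression figure comes from the text above, while the component latencies are hypothetical examples rather than measurements of any specific headset.

<syntaxhighlight lang="cpp">
#include <cstdio>

// All durations in milliseconds. Per the text above, suppression lasts until
// roughly 100 ms after saccade onset; the component latencies below are
// hypothetical examples of an eye-to-photon pipeline.
int main() {
    const double maskingWindowMs  = 100.0; // after saccade onset
    const double trackerSampleMs  = 8.3;   // e.g. a 120 Hz eye tracker
    const double gazeProcessingMs = 2.0;   // pupil detection + gaze vector
    const double renderFrameMs    = 11.1;  // one frame at 90 Hz
    const double displayLatencyMs = 5.0;   // compositor + panel scan-out

    const double eyeToPhotonMs = trackerSampleMs + gazeProcessingMs +
                                 renderFrameMs + displayLatencyMs;

    std::printf("eye-to-photon latency: %.1f ms (window: %.0f ms)\n",
                eyeToPhotonMs, maskingWindowMs);
    std::printf("%s\n", eyeToPhotonMs < maskingWindowMs
        ? "foveal region can be repositioned within the masking window"
        : "update may become visible after the saccade lands");
}
</syntaxhighlight>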


The second phenomenon is '''[[visual attention]]'''. Research has shown that the HVS's capabilities are not static but are modulated by cognitive factors. When a user is concentrating on a visually demanding task at their point of gaze, their [[contrast sensitivity]] in the periphery drops significantly.<ref name="AttentionAware">{{cite web |url=https://www.computationalimaging.org/publications/attention-aware/ |title=Towards Attention-Aware Foveated Rendering}}</ref><ref name="AttentionAwareSIGGRAPH">{{cite web |url=https://history.siggraph.org/learning/towards-attention-aware-foveated-rendering-by-krajancich-kellnhofer-and-wetzstein/ |title="Towards Attention–Aware Foveated Rendering" by Krajancich, Kellnhofer and Wetzstein}}</ref>


== Core Principles and Technical Methodologies ==
Transitioning from the biological "why" to the technical "how," foveated rendering is implemented through a combination of gaze-tracking paradigms and specific GPU-level rendering techniques.


=== The Gaze-Contingent Paradigm ===
At its core, dynamic foveated rendering is an application of the '''[[gaze-contingency paradigm]]''', a concept in [[human-computer interaction]] where a system's display changes in real-time based on where the user is looking.<ref name="WikipediaFR" /><ref name="GazeContingentMultiresolution">{{cite web |url=https://vgl.cs.usfca.edu/assets/Foveated_Visualization___VDA_2020.pdf |title=Gaze-Contingent Multiresolution Visualization for Large-Scale Vector and Volume Data}}</ref> The typical rendering pipeline for a gaze-contingent foveated system operates on a per-frame basis:<ref name="GazeContingentPipeline" />
# '''Gaze Capture:''' An integrated eye tracker, typically using infrared cameras, captures images of the user's eyes.
# '''Gaze Vector Calculation:''' Image processing algorithms determine the orientation of each eye to calculate a precise gaze vector.
# '''Region Definition:''' The system defines concentric regions of varying quality around the fixation point. These typically include a high-resolution foveal region, a medium-resolution parafoveal or transition region, and a low-resolution peripheral region.
# '''Instruction to GPU:''' The [[Graphics pipeline|graphics pipeline]] is instructed to render each of these regions at its designated quality level using one of the methods described below.
# '''Display Update:''' The final, composited multi-resolution image is presented to the user.

This entire loop must be completed within the frame budget (e.g., under 11.1 ms for a 90 Hz display) to ensure a smooth experience.
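The loop can be summarised in code; the types and functions in this sketch are placeholders standing in for whatever eye-tracking SDK and renderer an engine actually uses, not a real API.

<syntaxhighlight lang="cpp">
#include <cstdio>

// Placeholder types standing in for an eye-tracking SDK and a renderer.
struct GazeSample { float x = 0.5f, y = 0.5f; bool valid = false; };   // normalized display coords
struct FoveationRegions { float cx, cy, fovealRadius, midRadius; };

GazeSample captureGaze() { return {0.52f, 0.48f, true}; }              // steps 1-2 (stubbed)
void renderFoveated(const FoveationRegions&) { /* step 4: per-region quality */ }
void presentFrame() { /* step 5: composite and display */ }

void runFrame() {
    GazeSample gaze = captureGaze();

    // Step 3: concentric quality regions around the fixation point.
    FoveationRegions regions;
    regions.cx = gaze.valid ? gaze.x : 0.5f;   // fall back to screen centre (FFR-like behaviour)
    regions.cy = gaze.valid ? gaze.y : 0.5f;
    regions.fovealRadius = 0.15f;              // full-quality inner region (illustrative)
    regions.midRadius    = 0.35f;              // transition region (illustrative)

    renderFoveated(regions);
    presentFrame();
}

int main() {
    const double frameBudgetMs = 1000.0 / 90.0;   // 11.1 ms at 90 Hz
    runFrame();                                    // everything above must fit in this budget
    std::printf("frame budget: %.1f ms\n", frameBudgetMs);
}
</syntaxhighlight>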


=== Methods of Quality Reduction ===
The term "reducing quality" encompasses several distinct techniques that can be applied to the peripheral regions to save computational power. These methods can be used individually or in combination:<ref name="IntegrativeView" />
* '''Resolution Scaling / Subsampling:''' This is the most common and intuitive method. The peripheral regions are rendered into a smaller off-screen buffer (e.g., at half or quarter resolution) and then upscaled to fit the final display. This directly reduces the number of pixels that need to be processed and shaded (see the sketch following this list).<ref name="GazeContingent2D">{{cite web |url=http://stanford.edu/class/ee367/Winter2017/mehra_sankar_ee367_win17_report.pdf |title=Gaze Contingent Foveated Rendering for 2D Displays}}</ref>
* '''Shading Rate Reduction:''' This method focuses on reducing the workload of the [[pixel shader]] (also known as a fragment shader). Instead of executing a complex shading program for every single pixel in the periphery, a single shader result can be applied to a block of multiple pixels. This is the core mechanism behind [[Variable Rate Shading]] (VRS).<ref name="TOYF_Paper" /><ref name="AutoVRSE">{{cite web |url=https://www.autovrse.com/foveated-rendering |title=What is Foveated Rendering? - autovrse}}</ref>
* '''Geometric Simplification:''' The geometric complexity of the scene can be reduced in the periphery. This involves using lower-polygon [[level of detail]] models for objects that are outside the user's direct gaze.
* '''Other Methods:''' More advanced or experimental techniques include chromatic degradation (reducing color precision, since the periphery is less sensitive to color), simplifying lighting and shadow calculations, and spatio-temporal deterioration, which reduces quality across both space and time.
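For the resolution-scaling method, the savings follow directly from the area of the reduced peripheral buffer. The sketch below shows the bookkeeping with example numbers; the per-eye resolution, foveal share, and scale factor are all assumptions for illustration.

<syntaxhighlight lang="cpp">
#include <cstdio>

struct Extent { int w, h; };

// Periphery rendered at a fraction of full resolution per axis, then upscaled.
Extent scaled(Extent full, float scale) {
    return { static_cast<int>(full.w * scale), static_cast<int>(full.h * scale) };
}

int main() {
    const Extent eyeBuffer      = {2048, 2048};   // example per-eye render target
    const float  fovealFraction = 0.20f;          // share of the image kept at full quality (example)
    const float  peripheryScale = 0.5f;           // half resolution per axis -> quarter area

    const double fullPixels = static_cast<double>(eyeBuffer.w) * eyeBuffer.h;
    const Extent periph     = scaled(eyeBuffer, peripheryScale);
    const double shaded     = fullPixels * fovealFraction +
                              static_cast<double>(periph.w) * periph.h * (1.0 - fovealFraction);

    std::printf("shaded %.0f of %.0f pixels (%.0f%% saved)\n",
                shaded, fullPixels, 100.0 * (1.0 - shaded / fullPixels));
}
</syntaxhighlight>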


{| class="wikitable"
|+ Comparison of peripheral quality-reduction techniques
|-
! Technique !! Description !! Reported Performance Gain !! Example Implementation
|-
| Multi-spatial Resolution Sampling || High-res fovea, low-res periphery || 2–4× || Guenter et al. (2012)<ref name="Guenter2012">{{Cite journal |last1=Guenter |first1=Brian |last2=Finch |first2=Mark |last3=Drucker |first3=Steven |last4=Tan |first4=Desney |last5=Snyder |first5=John |title=Foveated 3D graphics |journal=ACM Transactions on Graphics |volume=31 |issue=6 |year=2012}}</ref>
|-
| Coarse Pixel Shading || Reduced shading rate in tiles || 3× || Vaidyanathan et al. (2014)
|-
| Neural Reconstruction || AI-based upsampling || 5–8× || NVIDIA DeepFovea<ref name="DeepFovea">{{Cite journal |last=Kaplanyan |first=Anton |display-authors=etal |title=DeepFovea: Neural reconstruction for foveated rendering and video compression using learned statistics of natural videos |journal=ACM Transactions on Graphics |volume=38 |issue=6 |pages=1–13 |year=2019 |doi=10.1145/3355089.3356559}}</ref>
|-
| [[Variable rate shading|Variable Rate Shading]] || GPU-accelerated shading rate control || 2.5× || [[DirectX]] 12 VRS
|}

=== Key Implementation Technologies ===
Modern GPUs and graphics APIs provide specialized features that make implementing foveated rendering highly efficient.


==== Variable Rate Shading (VRS) ====
[[Variable Rate Shading]] (VRS) is a hardware feature available on modern GPUs (e.g., [[NVIDIA]] Turing architecture and newer, [[AMD]] RDNA 2 and newer, [[Intel]] Gen11+) that provides fine-grained control over the pixel shading rate.<ref name="TOYF_Paper" /><ref name="OpenXRToolkit">{{cite web |url=https://mbucchia.github.io/OpenXR-Toolkit/fr.html |title=Foveated Rendering - OpenXR Toolkit}}</ref><ref name="microsoft2019">{{cite web |url=https://devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/ |title=Variable Rate Shading: a scalpel in a world of sledgehammers |publisher=Microsoft DirectX Blog |year=2019}}</ref> It allows a single pixel shader operation to compute the color for a block of pixels, such as a 2x2 or 4x4 block, instead of just a single pixel.<ref name="VarjoAPI">{{cite web |url=https://developer.varjo.com/docs/native/foveated-rendering-api |title=Foveated Rendering - Varjo for Developers}}</ref><ref name="PicoUnrealOpenXR">{{cite web |url=https://developer.picoxr.com/document/unreal-openxr/fixed-foveated-rendering/ |title=Foveated rendering - PICO Unreal OpenXR Plugin}}</ref> Supported shading rates range from 1×1 (full quality) to 4×4 (one shader invocation per 16 pixels). For foveated rendering, the GPU is instructed to use a 1×1 rate for the foveal region and progressively coarser rates (e.g., 2×2 or 4×4) for the periphery, which reduces shading load with minimal CPU overhead.
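In practice a foveated VRS pass is driven by a small per-tile rate image that the application rebuilds (or offsets) around the gaze point each frame. The following sketch fills such a grid on the CPU; the tile size, eccentricity thresholds, and the mechanism for handing the grid to the GPU (a shading-rate image in [[DirectX]] 12, a fragment shading rate attachment in Vulkan) are platform details only hinted at here, and the thresholds are illustrative.

<syntaxhighlight lang="cpp">
#include <cmath>
#include <cstdint>
#include <vector>

// Coarseness of shading per tile: one shader invocation per 1, 4, or 16 pixels.
enum class ShadingRate : uint8_t { Rate1x1, Rate2x2, Rate4x4 };

// Build a per-tile shading-rate grid for one eye. degreesPerTile converts
// tile distance into visual eccentricity; the thresholds are illustrative.
std::vector<ShadingRate> buildRateImage(int tilesX, int tilesY,
                                        float gazeTileX, float gazeTileY,
                                        float degreesPerTile) {
    std::vector<ShadingRate> rates(static_cast<size_t>(tilesX) * tilesY);
    for (int y = 0; y < tilesY; ++y) {
        for (int x = 0; x < tilesX; ++x) {
            float dx = (x + 0.5f) - gazeTileX;
            float dy = (y + 0.5f) - gazeTileY;
            float eccentricityDeg = std::sqrt(dx * dx + dy * dy) * degreesPerTile;

            ShadingRate r = ShadingRate::Rate1x1;                     // foveal: full quality
            if (eccentricityDeg > 10.0f) r = ShadingRate::Rate2x2;    // parafoveal
            if (eccentricityDeg > 25.0f) r = ShadingRate::Rate4x4;    // far periphery
            rates[static_cast<size_t>(y) * tilesX + x] = r;
        }
    }
    return rates;   // uploaded each frame as the API's shading-rate image
}

int main() {
    auto rates = buildRateImage(80, 80, 40.0f, 40.0f, 1.4f);   // example: 80x80 tiles, central gaze
    return rates.empty() ? 1 : 0;
}
</syntaxhighlight>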


==== Multi-View Rendering & Quad Views ====
An alternative approach, notably used by [[Varjo]] and available in [[Unreal Engine]], is to render multiple distinct views for each eye.<ref name="VarjoAPI" /><ref name="DCS_Forum_QuadViews">{{cite web |url=https://www.reddit.com/r/hoggit/comments/15ep59q/dcs_dynamic_foveated_rendering_available_for_more/ |title=DCS Dynamic Foveated Rendering available for more headsets}}</ref> For example, a "Quad Views" implementation renders four views in total for a stereo image: a high-resolution central "focus" view for each eye, and a lower-resolution peripheral "context" view for each eye. These are then composited into the final image.<ref name="PimaxQuadViews">{{cite web |url=https://pimax.com/blogs/blogs/quad-views-foveated-rendering-for-pimax-crystal |title=Quad Views Foveated Rendering for Pimax Crystal}}</ref> This technique can dramatically reduce the total number of pixels that need to be shaded, often yielding greater gains than VRS in fill-rate limited scenarios, but it comes at the cost of increased CPU overhead because the scene's geometry must be submitted and processed once per view.<ref name="DCS_Forum_QuadViews" /> The choice between VRS and a multi-view approach therefore depends on whether an application is bottlenecked by pixel shading (favoring VRS) or by overall pixel throughput, and on whether it has the CPU headroom to manage multiple rendering passes.

Multiview rendering can use the [[OpenGL]] <code>OVR_multiview</code> family of extensions to render these views in a single pass; in one ARM case study this reduced shaded pixels by 74.4%, and adding circular [[stencil]] masks raised the reduction to 78.4%.<ref name="arm2020">{{cite web |url=https://developer.arm.com/-/media/developer/Graphics%20and%20Multimedia/White%20Papers/Foveated%20Rendering%20Whitepaper.pdf |title=Foveated Rendering Current and Future Technologies for Virtual Reality |publisher=ARM Developer |year=2020}}</ref>
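Conceptually, compositing a quad-views frame is just layering the full-density focus view over the upscaled context view at the gaze position. The sketch below shows only the placement arithmetic for one eye with example sizes; real implementations perform this blit on the GPU with a blended border between the two views.

<syntaxhighlight lang="cpp">
#include <algorithm>
#include <cstdio>

struct Viewport { int x, y, w, h; };

// Centre the focus view on the gaze point inside the display, clamped so it
// never leaves the visible area. Gaze is given in normalized [0,1] coordinates.
Viewport placeFocusView(int displayW, int displayH,
                        int focusW, int focusH,
                        float gazeU, float gazeV) {
    int cx = static_cast<int>(gazeU * displayW);
    int cy = static_cast<int>(gazeV * displayH);
    Viewport v{cx - focusW / 2, cy - focusH / 2, focusW, focusH};
    v.x = std::clamp(v.x, 0, displayW - focusW);
    v.y = std::clamp(v.y, 0, displayH - focusH);
    return v;
}

int main() {
    // Example: 2160x2160 panel, 1024x1024 focus view rendered at full density,
    // context view rendered at lower resolution and upscaled underneath it.
    Viewport focus = placeFocusView(2160, 2160, 1024, 1024, 0.62f, 0.45f);
    std::printf("focus view at (%d, %d), %dx%d\n", focus.x, focus.y, focus.w, focus.h);
}
</syntaxhighlight>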


==== Fragment Density Maps (FDM) ====
At a lower level, graphics APIs like [[Vulkan (API)|Vulkan]] provide powerful tools for foveation. The <code>VK_EXT_fragment_density_map</code> extension allows an application to provide the GPU with a small texture, known as a fragment density map, that specifies the desired shading rate for different parts of the render target.<ref name="VulkanMobileVR">{{cite web |url=https://developers.meta.com/horizon/blog/vulkan-for-mobile-vr-rendering/ |title=Vulkan for Mobile VR Rendering}}</ref><ref name="VulkanFDMExtension">{{cite web |url=https://expipiplus1.github.io/vulkan/vulkan-3.8.1-docs/Vulkan-Extensions-VK_EXT_fragment_density_map.html |title=Vulkan API Documentation: VK_EXT_fragment_density_map}}</ref> This gives developers or the VR runtime precise control over the foveation pattern, enabling complex and customized quality distributions beyond simple concentric circles. For dynamic foveation, extensions like <code>VK_QCOM_fragment_density_map_offset</code> allow this map to be shifted efficiently without regenerating it each frame, reducing latency.<ref name="VulkanFDMOffset">{{cite web |url=https://www.qualcomm.com/developer/blog/2022/08/improving-foveated-rendering-fragment-density-map-offset-extension-vulkan |title=Improving Foveated Rendering with the Fragment Density Map Offset Extension for Vulkan}}</ref>
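The density map itself is only a tiny texture that the application fills (or offsets) each frame. The sketch below computes per-texel density values that fall off with distance from the gaze point; the map resolution, the falloff curve, and the exact texel encoding expected by the extension are assumptions here, since those details are negotiated with the Vulkan implementation.

<syntaxhighlight lang="cpp">
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Each texel of the density map covers a tile of the render target. Here 255
// requests full fragment density and lower values request proportionally fewer
// fragments; the radial falloff and the floor value are illustrative choices.
std::vector<uint8_t> buildFragmentDensityMap(int mapW, int mapH,
                                             float gazeU, float gazeV) {
    std::vector<uint8_t> texels(static_cast<size_t>(mapW) * mapH);
    for (int y = 0; y < mapH; ++y) {
        for (int x = 0; x < mapW; ++x) {
            float u = (x + 0.5f) / mapW;
            float v = (y + 0.5f) / mapH;
            float dist = std::sqrt((u - gazeU) * (u - gazeU) + (v - gazeV) * (v - gazeV));
            float density = 1.0f - std::min(1.0f, dist / 0.7f);   // linear falloff (example)
            density = std::max(density, 0.25f);                   // quality floor in the far periphery
            texels[static_cast<size_t>(y) * mapW + x] = static_cast<uint8_t>(density * 255.0f);
        }
    }
    return texels;   // uploaded as the fragment density map attachment for this frame
}

int main() {
    auto map = buildFragmentDensityMap(16, 16, 0.5f, 0.5f);   // e.g. a 16x16 map for one eye
    return map.empty() ? 1 : 0;
}
</syntaxhighlight>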


==== Foveated Transport ====
A related concept, known as foveated transport or foveated streaming, applies the same principles to bandwidth optimization rather than real-time rendering. It is primarily used in cloud-streamed VR and 360° video playback.<ref name="FOVAS_Pixvana">{{cite web |title=Pixvana’s FOVAS Technology Delivers 8K VR Video on Today’s Headsets |url=https://www.businesswire.com/news/home/20160907005500/en/Pixvana’s-FOVAS-Technology-Delivers-8K-VR-Video-on-Today’s-Headsets |publisher=Business Wire |date=2016-09-07 |access-date=2025-10-26}}</ref> In this model, the server sends a high-resolution, high-bitrate video stream only for the small portion of the scene the user is currently looking at, while sending a much lower-quality stream for the periphery. This drastically reduces the bandwidth required to achieve a high-quality experience. Technologies like [[FOVAS]] (Field of View Adaptive Streaming) pioneered this technique.<ref name="FOVAS_Pixvana" />

==== Kernel Foveated Rendering ====
Kernel foveated rendering applies log-polar coordinate transformations based on cortical magnification models.<ref name="kfr2018">{{cite web |url=https://www.researchgate.net/publication/326636875_Kernel_Foveated_Rendering |title=Kernel Foveated Rendering |publisher=ResearchGate |year=2018}}</ref> The forward transform maps screen coordinates to a reduced buffer using mathematical kernels, achieving 2.16× speedup with appropriate parameterization.
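At its core the kernel foveated approach is a change of coordinates: screen positions around the gaze point are mapped into a smaller log-polar buffer, shaded there, and mapped back to the display. The pair of transforms below is a plain log-polar version for illustration; the published method parameterizes the mapping with an adjustable kernel function, which is omitted here.

<syntaxhighlight lang="cpp">
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };
constexpr float kPi = 3.14159265f;

// Forward map: screen position -> log-polar buffer coordinates in [0,1]^2.
// maxRadius is the largest eccentricity covered, in the same units as the input.
Vec2 toLogPolar(Vec2 p, Vec2 gaze, float maxRadius) {
    float dx = p.x - gaze.x, dy = p.y - gaze.y;
    float r = std::sqrt(dx * dx + dy * dy) + 1e-6f;
    float u = std::log(r) / std::log(maxRadius);            // radius compressed logarithmically
    float v = (std::atan2(dy, dx) + kPi) / (2.0f * kPi);    // angle normalized to [0,1]
    return {u, v};
}

// Inverse map: log-polar coordinates -> screen position, used when resampling
// the reduced buffer back to the full-resolution display.
Vec2 fromLogPolar(Vec2 lp, Vec2 gaze, float maxRadius) {
    float r = std::pow(maxRadius, lp.x);
    float theta = lp.y * 2.0f * kPi - kPi;
    return {gaze.x + r * std::cos(theta), gaze.y + r * std::sin(theta)};
}

int main() {
    Vec2 gaze{960.0f, 540.0f};
    Vec2 p{1200.0f, 700.0f};
    Vec2 lp   = toLogPolar(p, gaze, 1100.0f);
    Vec2 back = fromLogPolar(lp, gaze, 1100.0f);
    std::printf("log-polar (%.3f, %.3f) -> back to (%.1f, %.1f)\n", lp.x, lp.y, back.x, back.y);
}
</syntaxhighlight>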


== Types of Foveated Rendering ==
Foveated rendering is not a monolithic technology but a category of techniques that can be broadly classified based on whether they utilize real-time gaze data.

=== Fixed Foveated Rendering (FFR) ===
Fixed Foveated Rendering is the most basic implementation of the concept. It operates without any eye-tracking hardware and instead relies on the assumption that a user will predominantly look towards the center of the screen.<ref name="WikipediaFR" /><ref name="JigSpace">{{cite web |url=https://www.jig.com/spatial-computing/foveated-rendering |title=What Is Foveated Rendering? - JigSpace}}</ref> Consequently, FFR systems render a static, high-resolution region in the center of each eye's display, while the quality degrades in fixed concentric rings towards the edges.<ref name="MetaFFRvsETFR">{{cite web |url=https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/ |title=Save GPU with Eye Tracked Foveated Rendering}}</ref>


'''Disadvantages:'''
* '''Sub-optimal Gains:''' Because the system cannot know where the user is actually looking, the central high-quality region must be made conservatively large to account for natural eye movements. This limits the potential performance savings compared to dynamic systems.<ref name="FFR_vs_TFR_Paper">{{cite web |url=https://3dvar.com/Singh2023Power.pdf |title=Power, Performance, and Quality of Gaze-Tracked Foveated Rendering in Practical XR Systems}}</ref>
* '''Visible Artifacts:''' If a user moves their eyes to look at the periphery without turning their head, they can easily notice the drop in resolution, which can be distracting and break immersion.<ref name="MetaFFR_OS" /><ref name="UnityOpenXR_FR">{{cite web |url=https://docs.unity3d.com/Packages/[email protected]/manual/features/foveatedrendering.html |title=Foveated rendering in OpenXR - Unity Manual}}</ref>


=== Dynamic (Eye-Tracked) Foveated Rendering (ETFR / DFR) ===
Dynamic Foveated Rendering represents the full realization of the concept. It requires a [[head-mounted display]] with integrated eye-tracking cameras to determine the user's precise point of gaze in real-time.<ref name="VarjoWhatIs" /><ref name="WikipediaFR" /> The high-resolution foveal region is then dynamically moved to match this gaze point on a frame-by-frame basis, ensuring that the user is always looking at a fully rendered part of the scene.<ref name="TobiiDFR">{{cite web |url=https://www.tobii.com/resource-center/reports-and-papers/eye-tracking-and-dynamic-foveated-rendering |title=Eye tracking and dynamic foveated rendering - Tobii}}</ref>


'''Advantages:'''
* '''Maximum Performance:''' ETFR allows for much more aggressive foveation—a smaller foveal region and a more significant quality reduction in the periphery—because the low-quality areas are always guaranteed to be outside the user's direct line of sight. This results in substantially greater performance and power savings.<ref name="VRX_FR_Types" /><ref name="PicoUnrealOpenXR" />
* '''Perceptually Seamless:''' When implemented with low latency, the effect is imperceptible to the user. The entire virtual world appears to be rendered in high resolution, because the degradation is always hidden in the natural blur of peripheral vision.<ref name="VarjoWhatIs" />


'''Disadvantages:'''
* '''Hardware Requirements:''' It is entirely dependent on the presence and quality of eye-tracking hardware, which increases the cost, weight, and power consumption of the HMD.
* '''Sensitivity to Latency:''' The technique is highly sensitive to system latency. If the delay between an eye movement and the corresponding display update is too long, the user will perceive artifacts, which negates the primary benefit.<ref name="LatencyRequirements" />


{| class="wikitable"
|+ Comparison of Foveated Rendering Methodologies
! Feature
! Fixed Foveated Rendering (FFR)
! Dynamic (Eye-Tracked) Foveated Rendering (ETFR/DFR)
|-
! Core Principle
| Assumes the user's gaze is fixed at the center of the display.
| Tracks the user's real-time gaze to place the high-quality region precisely where they are looking.
|-
! Hardware Requirement
| None (beyond a capable [[GPU]]).
| Integrated [[eye tracking]] cameras and processing hardware.
|-
! High-Quality Region
| Static, located in the center of the screen. Must be conservatively large.
| Dynamic, moves with the user's fovea. Can be made smaller and more aggressive.
|-
! User Experience
| Generally effective, but artifacts can be noticed if the user looks to the periphery without moving their head.<ref name="MetaFFR_OS" /><ref name="UnityOpenXR_FR" />
| When latency is low, the effect is imperceptible to the user, providing a consistently high-quality experience.<ref name="VarjoWhatIs" />
|-
! Performance Savings
| Moderate. Reduces GPU load by rendering fewer pixels in the periphery (e.g., 26-43% savings reported for Meta Quest).<ref name="QuestProPerformance">{{cite web |url=https://www.uploadvr.com/quest-pro-foveated-rendering-performance/ |title=Quest Pro Foveated Rendering GPU Savings Detailed}}</ref>
| Significant. Allows for more aggressive degradation, leading to greater GPU savings (e.g., 33-52% savings reported for Meta Quest Pro).<ref name="VRX_FR_Types" /><ref name="QuestProPerformance" />
|-
! Ideal Use Cases
| Cost-sensitive standalone headsets, applications without eye-tracking support, platforms where simplicity is prioritized.<ref name="JigSpace" />
| High-end PC VR and standalone headsets, demanding simulations, applications seeking maximum visual fidelity and performance.<ref name="JigSpace" />
|-
! Key Drawback
| Sub-optimal performance gains and potentially visible artifacts.
| Increased hardware cost, complexity, power consumption, and high sensitivity to system latency.
|}


=== Predictive and Attention-Aware Foveation ===
As the technology matures, research is exploring more advanced forms of foveation that incorporate predictive and cognitive models.
* '''Predictive Foveation:''' Some systems attempt to predict the landing point of a saccade based on its initial trajectory and velocity. This allows the rendering system to begin shifting the foveal region to the target destination ''before'' the eye movement is complete, effectively hiding some of the system's latency and making artifacts less likely (see the sketch after this list).<ref name="FoveatedRenderingExplainedReddit" /><ref name="VRX_FR_Types" />
* '''Attention-Aware Foveation:''' This is a cutting-edge research area that aims to model the user's cognitive state of attention. As noted earlier, peripheral visual sensitivity decreases when foveal attention is high. An attention-aware system could leverage this by applying even more aggressive foveation during tasks that require intense focus, further optimizing performance in a way that is tailored to the user's cognitive load, not just their retinal physiology.<ref name="AttentionAware" /><ref name="AttentionAwareSIGGRAPH" />
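A very simple form of the prediction described in the first bullet above extrapolates the landing point from the saccade's current velocity, assuming the eye decelerates at a constant rate until it stops; production systems fit proper ballistic saccade models, so both the model and the numbers below are purely illustrative.

<syntaxhighlight lang="cpp">
#include <cmath>
#include <cstdio>

struct Gaze     { float xDeg, yDeg; };            // gaze direction in degrees
struct Velocity { float xDegPerS, yDegPerS; };    // angular velocity in deg/s

// Predict where a detected saccade will land under a constant-deceleration
// assumption: remaining travel = v^2 / (2a). The deceleration value is an
// illustrative placeholder, not a fitted oculomotor parameter.
Gaze predictLanding(Gaze current, Velocity vel, float decelDegPerS2) {
    float speed = std::sqrt(vel.xDegPerS * vel.xDegPerS + vel.yDegPerS * vel.yDegPerS);
    if (speed < 1e-3f) return current;
    float remainingDeg = (speed * speed) / (2.0f * decelDegPerS2);
    return {current.xDeg + vel.xDegPerS / speed * remainingDeg,
            current.yDeg + vel.yDegPerS / speed * remainingDeg};
}

int main() {
    Gaze g{2.0f, 0.0f};
    Velocity v{400.0f, 100.0f};                   // mid-saccade velocity sample (example)
    Gaze land = predictLanding(g, v, 25000.0f);   // assumed deceleration, deg/s^2
    std::printf("predicted landing point: (%.1f, %.1f) deg\n", land.xDeg, land.yDeg);
}
</syntaxhighlight>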


== Performance, Efficacy, and Benchmarks ==
The primary motivation for implementing foveated rendering is to improve performance. The efficacy of the technique can be measured through several key metrics, and real-world benchmarks demonstrate substantial gains across a variety of hardware platforms.


=== Metrics for Performance Gain ===
The benefits of foveated rendering are quantified using the following metrics:
* '''GPU Frame Time:''' The most direct measurement of performance: the time, in milliseconds (ms), that the GPU takes to render a single frame. Foveated rendering directly reduces this value (see the worked example after this list).<ref name="PSVR2_GDC_Unity">{{cite web |url=https://www.playstationlifestyle.net/2022/03/28/psvr-2-specs-eye-tracking-foveated-rendering/ |title=PSVR 2 Specs Run 3.6x Faster Using Eye-Tracking Technology}}</ref>
* '''Frames Per Second (FPS):''' Lower frame times enable higher and more stable frame rates. Maintaining a high FPS (typically 90 FPS or more) is critical for a comfortable and immersive VR experience, as dropped frames or low FPS can induce [[virtual reality sickness]].<ref name="PimaxDFR">{{cite web |url=https://pimax.com/blogs/blogs/the-crystal-supers-secret-weapon-dynamic-foveated-rendering |title=The Crystal Super's Secret Weapon: Dynamic Foveated Rendering}}</ref>
* '''Power Consumption:''' On battery-powered standalone headsets, reducing the GPU workload directly translates to lower power consumption, leading to longer battery life and reduced thermal output.<ref name="JigSpace" />
* '''Increased Visual Fidelity:''' Instead of simply increasing FPS, developers can reinvest the saved GPU performance. This "performance headroom" can be used to render the scene at a higher base resolution ([[supersampling]]), enable more complex lighting and shader effects, or use higher-quality assets, resulting in a visually richer experience at the same target frame rate.<ref name="AutoVRSE" /><ref name="MetaETFR_Blog">{{cite web |url=https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/ |title=Save GPU with Eye Tracked Foveated Rendering}}</ref>
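The worked example below shows how these metrics interact: the frame budget follows from the refresh rate, and a foveation saving either brings an over-budget frame back under it or frees headroom for quality. All numbers are illustrative.

<syntaxhighlight lang="cpp">
#include <cstdio>

int main() {
    const double refreshHz    = 90.0;
    const double budgetMs     = 1000.0 / refreshHz;   // 11.1 ms per frame at 90 Hz

    const double unfoveatedMs = 14.0;                 // example GPU frame time (over budget)
    const double gpuSaving    = 0.36;                 // e.g. a 36% saving from foveation
    const double foveatedMs   = unfoveatedMs * (1.0 - gpuSaving);

    std::printf("budget %.1f ms | unfoveated %.1f ms | foveated %.1f ms\n",
                budgetMs, unfoveatedMs, foveatedMs);
    std::printf("headroom after foveation: %.1f ms (can be reinvested in quality)\n",
                budgetMs - foveatedMs);
}
</syntaxhighlight>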


=== Real-World Performance Gains ===
Benchmarks from hardware manufacturers and developers illustrate the significant impact of foveated rendering.


{| class="wikitable"
|+ FFR Performance on Meta Quest Devices<ref name="QuestProPerformance" />
|-
! Device !! Foveation Level !! Peripheral Shading Reduction !! GPU Savings
|-
| [[Meta Quest 2]] || Level 1 || 4× (2×2) || 26%
|-
| Meta Quest 2 || Level 3 || 16× (4×4) || 36%
|-
| [[Meta Quest 3]] || Level 1 || 4× (2×2) || 26%
|-
| Meta Quest 3 || Level 3 || 16× (4×4) || 43%
|}

{| class="wikitable"
|+ Reported Performance Benchmarks Across Platforms
|-
! Platform !! Test Content !! Baseline !! FFR Gain !! ETFR Gain !! Notes
|-
| [[Meta Quest 2]] || Various || 100% || 26-36% || N/A || Level 3 FFR achieves up to 43% savings<ref name="QuestProPerformance" />
|-
| [[Meta Quest Pro]] || Default resolution || 100% || 26-43% || 33-52% || ETFR provides 7-9% additional benefit<ref name="uploadvr2022">{{cite web |url=https://www.uploadvr.com/quest-pro-foveated-rendering-performance/ |title=Here's The Exact Performance Benefit Of Foveated Rendering On Quest Pro |publisher=UploadVR |date=October 2022}}</ref>
|-
| Meta Quest Pro || Red Matter 2 || Default pixel density || N/A || +33% pixels || 77% more total pixels in the optical center<ref name="MetaETFR_Blog" />
|-
| [[PlayStation VR2]] || Unity demo || 33.2 ms || 14.3 ms (2.5×) || 9.2 ms (3.6×) || Eye tracking provides the largest improvement<ref name="unity2023">{{cite web |url=https://www.uploadvr.com/psvr-2-eye-tracking-foveated-rendering-gdc/ |title=PSVR 2 Foveated Rendering Provides 3.6x Faster Performance - Unity |publisher=UploadVR |date=March 2023}}</ref>
|-
| [[Varjo Aero]] || Professional applications || 100% || 30-40% || 50-60% || 200 Hz eye tracking enables aggressive foveation<ref name="varjo2023">{{cite web |url=https://developer.varjo.com/docs/native/foveated-rendering-api |title=Foveated Rendering |publisher=Varjo for developers |year=2023}}</ref>
|-
| [[Pimax Crystal]] || VRS method || 100% || N/A || 10-40% || 120 Hz Tobii eye tracking<ref name="PimaxDFR" />
|-
| Pimax Crystal || Quad Views || 100% || N/A || 50-100% || More aggressive peripheral reduction<ref name="PimaxDFR" />
|-
| ARM Mali GPU || CircuitVR || 488M cycles || 397M cycles || N/A || 18.6% cycle reduction<ref name="arm2020" />
|-
| NVIDIA GTX 1080 || Shadow Warrior 2 || 60 FPS || N/A || 78 FPS || 30% performance gain<ref name="nvidia2017" />
|-
| PowerGS || 3D Gaussian Splatting || 100% power || N/A || 37% power || 63% power reduction<ref name="vrsplatting2024">{{cite web |url=https://dl.acm.org/doi/10.1145/3728302 |title=VR-Splatting: Foveated Radiance Field Rendering via 3D Gaussian Splatting and Neural Points |publisher=ACM |year=2024}}</ref>
|}
== History ==
Research into foveated rendering dates back over three decades, evolving from theoretical psychophysics to practical implementations in immersive technologies.


{| class="wikitable"
|+ Historical development of foveated rendering
|-
! Year !! Milestone !! Key Contributors/Devices !! Description
|-
| 1990 || Gaze-directed volume rendering || Levoy and Whitaker || First application of foveation to volume data visualization, reducing samples in peripheral regions.<ref name="levoy1990">{{cite journal |last1=Levoy |first1=Marc |last2=Whitaker |first2=Ross |title=Gaze-directed volume rendering |journal=Proceedings of the 1990 Symposium on Interactive 3D Graphics |pages=361–369 |year=1990 |doi=10.1145/91394.91431}}</ref>
|-
| 1991 || Foundational research || Academic papers || Theoretical concept of adapting rendering to HVS acuity established.<ref name="WikipediaFR" />
|-
| 1996 || Gaze-directed adaptive rendering || Ohshima et al. || Introduced gaze-directed adaptive level-of-detail rendering for virtual environments.<ref name="ohshima1999">{{cite conference |last1=Ohshima |first1=Toshikazu |last2=Yamamoto |first2=Hiroyuki |last3=Tamura |first3=Hideyuki |title=Gaze-directed adaptive rendering for interacting with virtual space |book-title=Proceedings of the IEEE Virtual Reality Annual International Symposium (VRAIS '96) |year=1996}}</ref>
|-
| 2001 || Perceptually-driven simplification || Luebke and Hallen || LOD techniques guided by visual attention models.<ref name="luebke2001">{{cite journal |last=Luebke |first=David |last2=Hallen |first2=Ben |title=Perceptually-driven simplification for interactive rendering |journal=Proceedings 12th International Conference on Parallel Processing |pages=223–230 |year=2001 |doi=10.1109/IPDPS.2001.925025}}</ref>
|-
| 2012 || Foveated 3D graphics || Guenter et al. || Rasterization-based system achieving 6.2× speedup in VR.<ref name="guenter2012">{{cite conference |last=Guenter |first=Brian |last2=Grimes |first2=Mark |last3=Nehab |first3=Diego |last4=Sander |first4=Pedro V. |last5=Summa |first5=João |title=Efficient rerendering in viewport space |journal=ACM Transactions on Graphics |volume=31 |issue=6 |pages=1–13 |year=2012 |doi=10.1145/2366145.2366195}}</ref>
|-
| 2014 || First consumer HMD prototype || [[FOVE]] || Unveiled eye-tracked VR headset with foveated rendering at TechCrunch Disrupt SF. First public demonstration of foveated rendering in a VR headset.<ref name="techcrunch2014">{{cite web |url=https://techcrunch.com/2014/09/09/fove/ |title=FOVE Uses Eye Tracking To Make Virtual Reality More Immersive |publisher=TechCrunch |date=2014-09-10}}</ref>
|-
| 2015 || Kickstarter success || FOVE || Raised funds for production of first commercial eye-tracked HMD with foveated rendering.<ref name="kickstarter2015">{{cite web |url=https://www.kickstarter.com/projects/fove/fove-the-worlds-first-eye-tracking-virtual-reality |title=FOVE: The World's First Eye Tracking Virtual Reality Headset |publisher=Kickstarter |date=2015-09-01}}</ref>
|-
| 2016 (January) || High-speed eye tracking demo || SMI (SensoMotoric Instruments) || Demonstrated 250Hz eye tracking system with foveated rendering at CES, achieving 2-4× performance boost with imperceptible quality loss.<ref name="uploadvr2016">{{cite web |url=https://uploadvr.com/smi-hands-on-250hz-eye-tracking/ |title=SMI's 250Hz Eye Tracking and Foveated Rendering Are For Real, and the Cost May Surprise You |publisher=UploadVR |date=2016-01-15}}</ref>
|-
| 2016 (July) || SIGGRAPH demonstration || [[NVIDIA]] & SMI || NVIDIA demonstrated "perceptually-guided" foveated rendering techniques with 50-66% pixel shading load reduction.<ref name="nvidia2016">{{cite web |url=https://blogs.nvidia.com/blog/2016/07/21/rendering-foveated-vr/ |title=NVIDIA Partners with SMI on Innovative Rendering Technique That Improves VR |publisher=NVIDIA |date=2016-07-21}}</ref><ref name="digitaltrends2016">{{cite web |url=https://www.digitaltrends.com/computing/nvidia-research-foveated-rendering-vr-smi/ |title=Nvidia plans to prove that new method improves image quality in virtual reality |publisher=Digital Trends |date=2016-07-23}}</ref>
|-
| 2016 (November) || First commercial release || FOVE 0 || FOVE 0 headset shipped to developers, first commercially available HMD with integrated eye tracking and foveated rendering support.<ref name="tomshardware2016">{{cite web |url=https://www.tomshardware.com/news/fove-vr-first-look-ces,30964.html |title=Exclusive: Fove's VR HMD At CES 2016 |publisher=Tom's Hardware |date=2016-01-11}}</ref>
|-
| 2017 || Mobile VR support || [[Qualcomm]] || Snapdragon 835 VRDK announced with "Adreno Foveation" for mobile VR, signaling technology's arrival on mobile processors.<ref name="qualcomm2017">{{cite web |url=https://www.qualcomm.com/news/releases/2017/02/23/qualcomm-introduces-snapdragon-835-virtual-reality-development-kit |title=Qualcomm Introduces Snapdragon 835 Virtual Reality Development Kit |publisher=Qualcomm |date=2017-02-23}}</ref>
|-
| 2017 || SMI acquired || [[Apple]] || Apple acquired SMI, indicating growing interest in foveated rendering for AR/VR.<ref name="WikipediaFR" />
|-
| 2018-2019 || Enterprise adoption || [[StarVR One]], [[Varjo]] VR-1 || Professional headsets with integrated [[Tobii]] eye-tracking for foveated rendering. Varjo's "bionic display" used hardware-level foveation.<ref name="starvr2018">{{cite web |url=https://arstechnica.com/gaming/2018/08/starvr-one-is-a-premium-vr-headset-with-built-in-eye-tracking/ |title=StarVR One is a premium VR headset with built-in eye-tracking |publisher=Ars Technica |date=2018-08-14}}</ref>
|-
| 2019 (January) || Consumer eye tracking || [[HTC Vive Pro Eye]] || First mainstream consumer VR headset aimed at general users with dynamic foveated rendering support.<ref name="theverge2019">{{cite web |url=https://www.theverge.com/2019/1/7/18172064/htc-vive-pro-eye-vr-headset-eye-tracking-announced-features-price-release |title=HTC announces new Vive Pro Eye VR headset with native eye tracking |publisher=The Verge |date=2019-01-07}}</ref>
|-
| 2019 (December) || SDK support || [[Meta Quest|Oculus Quest]] || Fixed Foveated Rendering exposed in SDK, marking first large-scale commercial deployment.<ref name="venturebeat2019">{{cite web |url=https://venturebeat.com/2019/12/22/oculus-quest-gets-dynamic-fixed-foveated-rendering/ |title=Oculus Quest gets dynamic fixed foveated rendering |publisher=VentureBeat |date=2019-12-22}}</ref>
|-
|-
! Platform !! FFR Savings !! ETFR Savings !! Additional Benefit
| 2020 || Neural reconstruction || Facebook Reality Labs || [[DeepFovea]] demonstrated AI-based foveated reconstruction with up to 10-14× pixel count reduction.<ref name="deepfovea2019">{{cite web |url=https://dl.acm.org/doi/10.1145/3306307.3328186 |title=DeepFovea: Neural Reconstruction for Foveated Rendering |publisher=ACM SIGGRAPH |year=2019}}</ref>
|-
|-
| [[Meta Quest Pro]] || 26-43% || 33-52% || 7-9%
| 2021 || Chipset integration || Qualcomm XR2 || Built-in support for foveated rendering and eye tracking in standalone VR chipset.<ref name="WikipediaFR" />
|-
|-
| [[PlayStation VR2]] || ~60% (2.5×) || ~72% (3.6×) || ~12%
| 2022 || Consumer ETFR || [[Meta Quest Pro]] || First mainstream standalone headset with Eye-Tracked Foveated Rendering, achieving 33-52% performance gains.<ref name="uploadvr2022" /><ref name="WikipediaFR" />
|-
|-
| [[Varjo Aero]] || 30-40% || 50-60% || 10-20%
| 2023 (February) || Console integration || [[PlayStation VR2]] || Eye tracking with foveated rendering standard in every unit, achieving up to 3.6× speedup.<ref name="unity2023" /><ref name="sonyblog2022">{{cite web |url=https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/ |title=PlayStation VR2 and PlayStation VR2 Sense controller: The next generation of VR gaming on PS5 |publisher=PlayStation Blog |date=2022-01-04}}</ref>
|-
| 2024 (February) || Spatial computing || [[Apple Vision Pro]] || High-end mixed reality headset with sophisticated eye-tracking system for foveated rendering and interaction.<ref name="pcmag2025">{{cite web |url=https://www.pcmag.com/picks/the-best-vr-headsets |title=The Best VR Headsets for 2025 |publisher=PC Magazine}}</ref>
|}
|}


* '''[[Meta Quest Pro]] (ETFR):''' In performance tests conducted by Meta, ETFR demonstrated significant savings over both non-foveated rendering and FFR. At 1.5 times the default resolution, FFR provided a performance saving of 34-43%, while ETFR delivered a saving of 36-52%.<ref name="QuestProPerformance" /><ref name="80lvQuestPro">{{cite web |url=https://80.lv/articles/quest-pro-s-foveated-rendering-saves-up-to-52-performance |title=Quest Pro's Foveated Rendering Saves up to 52% Performance}}</ref> The developers of the game ''Red Matter 2'' utilized ETFR to increase the rendered pixel density by 33%, which equates to 77% more total pixels in the optical center, leading to a much sharper image.<ref name="MetaETFR_Blog" />
== Software Support ==
* '''[[PlayStation VR2]] (ETFR):''' During a GDC presentation, Unity demonstrated that on the PSVR2, FFR alone provides a 2.5x speedup in GPU frame times, while enabling eye-tracked foveated rendering boosts this to a 3.6x speedup.<ref name="PSVR2_GDC_Unity" /><ref name="UploadVR_PSVR2" /> In one demo, this dropped the frame time from 33.2 ms to a much more manageable 14.3 ms.
* '''[[Pimax]] Crystal (DFR):''' Pimax reports that its DFR implementation using [[Variable rate shading|VRS]] can increase FPS by 10-40%. Their more advanced "Quad Views" method, which reduces peripheral resolution more directly, can yield performance boosts of 50-100%.<ref name="PimaxDFR" /><ref name="UnityFR_Test_Reddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/3bls3q/unity_foveated_rendering_test_4x_fps_increase/ |title=unity foveated rendering test 4x fps increase with a pretty simple rendering strategy}}</ref>


=== Factors Influencing Efficacy ===
=== Game Engines ===
The performance benefits of foveated rendering are not absolute but are highly dependent on the specific application and rendering settings.
* '''Shader Complexity:''' The greatest gains are seen in applications that are heavily '''GPU-bound''' due to complex pixel shaders (e.g., realistic lighting, reflections, and post-processing effects). Applications with very simple shaders may see little to no improvement, and in some cases, the overhead of the foveation technique can even lead to a minor performance loss.<ref name="MetaFFR_OS" />
* '''Base Resolution:''' The higher the resolution of the HMD, the more pixels there are to process, and thus the greater the potential savings from foveation. The benefits are more pronounced on 4K and higher resolution displays.<ref name="QuestProPerformance" />
* '''Foveation Level:''' Developers can typically choose from several preset levels of foveation (e.g., Low, Medium, High). A "High" setting will apply more aggressive degradation to the periphery, yielding greater performance at the cost of potentially noticeable artifacts, while a "Low" setting will be more conservative.<ref name="OpenXRToolkit" /><ref name="PicoUnityFFR">{{cite web |url=https://developer.picoxr.com/document/unity/fixed-foveated-rendering/ |title=Fixed foveated rendering - PICO Unity Integration SDK}}</ref>


The performance gains from foveated rendering are not linear. The technique primarily alleviates bottlenecks related to pixel shading and fill rate. Once this bottleneck is removed, the application's performance will become limited by another part of the system, such as CPU performance (e.g., draw calls, physics simulations) or memory bandwidth. At that point, applying more aggressive foveation will not improve frame rates further and will only serve to degrade visual quality. This reality has led to the development of "dynamic foveation" features in some SDKs, which automatically adjust the foveation level based on the current GPU load to maintain a target frame rate, thus finding the optimal balance between performance and quality on-the-fly.<ref name="MetaQuestProUnityETFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/unity-eye-tracked-foveated-rendering/ |title=Eye Tracked Foveated Rendering - Unity}}</ref><ref name="MetaUnrealETFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unreal/unreal-eye-tracked-foveated-rendering/ |title=Eye Tracked Foveated Rendering - Unreal}}</ref>
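The control loop behind such a dynamic foveation feature can be illustrated with a short sketch. The following C++ fragment is a minimal, hypothetical example (the class, thresholds, and level names are illustrative and are not taken from any particular SDK): it raises the foveation level when the measured GPU frame time approaches the frame budget and lowers it again when there is headroom.

<syntaxhighlight lang="cpp">
#include <algorithm>
#include <cstdint>

// Hypothetical foveation levels, mirroring the Low/Medium/High presets
// exposed by most SDKs (0 = off, 3 = most aggressive periphery reduction).
enum class FoveationLevel : std::uint8_t { Off = 0, Low = 1, Medium = 2, High = 3 };

class DynamicFoveationController {
public:
    explicit DynamicFoveationController(double frameBudgetMs)
        : frameBudgetMs_(frameBudgetMs) {}

    // Called once per frame with the GPU time of the previous frame.
    FoveationLevel Update(double gpuFrameTimeMs) {
        const double highWatermark = frameBudgetMs_ * 0.90; // react before the budget is blown
        const double lowWatermark  = frameBudgetMs_ * 0.75; // restore quality when there is headroom

        int level = static_cast<int>(current_);
        if (gpuFrameTimeMs > highWatermark) {
            ++level;   // GPU-bound: degrade the periphery more aggressively.
        } else if (gpuFrameTimeMs < lowWatermark) {
            --level;   // Headroom available: restore peripheral image quality.
        }
        level = std::clamp(level, 0, 3);
        current_ = static_cast<FoveationLevel>(level);
        return current_; // Pass this to the platform's foveation API.
    }

private:
    double frameBudgetMs_;
    FoveationLevel current_ = FoveationLevel::Low;
};
</syntaxhighlight>

At 90 Hz the frame budget is roughly 11.1 ms, so a controller constructed as <code>DynamicFoveationController controller(11.1);</code> would step the level up whenever GPU time exceeds about 10 ms and back down once it falls below about 8.3 ms.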
== History ==
Research into foveated rendering dates back over three decades, evolving from theoretical psychophysics to practical implementations in immersive technologies.

{| class="wikitable"
|-
! Year !! Milestone !! Key Contributors/Devices !! Description
|-
| 1990 || Gaze-directed volume rendering || Levoy and Whitaker || First application of foveation to volume data visualization, reducing samples in peripheral regions.<ref name="levoy1990">{{cite journal |last=Levoy |first=Marc |last2=Whitaker |first2=Robert |title=Gaze-directed volume rendering |journal=Proceedings of the 1990 Symposium on Interactive 3D Graphics |pages=361–369 |year=1990 |doi=10.1145/91394.91431}}</ref>
|-
| 1991 || Foundational research || Academic papers || Theoretical concept of adapting rendering to HVS acuity established.<ref name="WikipediaFR" />
|-
| 1996 || Gaze-directed adaptive rendering || Ohshima et al. || Introduced adaptive resolution based on eye position for virtual environments.<ref name="ohshima1999">{{cite conference |last=Ohshima |first=Takashi |last2=Satoh |first2=Keiichi |last3=Tamaki |first3=Hiroaki |title=AR²HMD: Augmented reality with high resolution head mounted display |journal=Proceedings of the 1st International Symposium on Mixed and Augmented Reality |pages=110–119 |year=1999 |doi=10.1109/ISMAR.1999.803809}}</ref>
|-
| 2001 || Perceptually-driven simplification || Luebke and Hallen || LOD techniques guided by visual attention models.<ref name="luebke2001">{{cite journal |last=Luebke |first=David |last2=Hallen |first2=Ben |title=Perceptually-driven simplification for interactive rendering |journal=Proceedings 12th International Conference on Parallel Processing |pages=223–230 |year=2001 |doi=10.1109/IPDPS.2001.925025}}</ref>
|-
| 2012 || Foveated 3D graphics || Guenter et al. || Rasterization-based system achieving 6.2× speedup in VR.<ref name="guenter2012">{{cite conference |last=Guenter |first=Brian |last2=Grimes |first2=Mark |last3=Nehab |first3=Diego |last4=Sander |first4=Pedro V. |last5=Summa |first5=João |title=Efficient rerendering in viewport space |journal=ACM Transactions on Graphics |volume=31 |issue=6 |pages=1–13 |year=2012 |doi=10.1145/2366145.2366195}}</ref>
|-
| 2014 || First consumer HMD prototype || [[FOVE]] || Unveiled eye-tracked VR headset with foveated rendering at TechCrunch Disrupt SF; the first public demonstration of foveated rendering in a VR headset.<ref name="techcrunch2014">{{cite web |url=https://techcrunch.com/2014/09/09/fove/ |title=FOVE Uses Eye Tracking To Make Virtual Reality More Immersive |publisher=TechCrunch |date=2014-09-10}}</ref>
|-
| 2015 || Kickstarter success || FOVE || Raised funds for production of the first commercial eye-tracked HMD with foveated rendering.<ref name="kickstarter2015">{{cite web |url=https://www.kickstarter.com/projects/fove/fove-the-worlds-first-eye-tracking-virtual-reality |title=FOVE: The World's First Eye Tracking Virtual Reality Headset |publisher=Kickstarter |date=2015-09-01}}</ref>
|-
| 2016 (January) || High-speed eye tracking demo || SMI (SensoMotoric Instruments) || Demonstrated a 250Hz eye tracking system with foveated rendering at CES, achieving a 2-4× performance boost with imperceptible quality loss.<ref name="uploadvr2016">{{cite web |url=https://uploadvr.com/smi-hands-on-250hz-eye-tracking/ |title=SMI's 250Hz Eye Tracking and Foveated Rendering Are For Real, and the Cost May Surprise You |publisher=UploadVR |date=2016-01-15}}</ref>
|-
| 2016 (July) || SIGGRAPH demonstration || [[NVIDIA]] & SMI || NVIDIA demonstrated "perceptually-guided" foveated rendering techniques with a 50-66% pixel shading load reduction.<ref name="nvidia2016">{{cite web |url=https://blogs.nvidia.com/blog/2016/07/21/rendering-foveated-vr/ |title=NVIDIA Partners with SMI on Innovative Rendering Technique That Improves VR |publisher=NVIDIA |date=2016-07-21}}</ref><ref name="digitaltrends2016">{{cite web |url=https://www.digitaltrends.com/computing/nvidia-research-foveated-rendering-vr-smi/ |title=Nvidia plans to prove that new method improves image quality in virtual reality |publisher=Digital Trends |date=2016-07-23}}</ref>
|-
| 2016 (November) || First commercial release || FOVE 0 || FOVE 0 headset shipped to developers, the first commercially available HMD with integrated eye tracking and foveated rendering support.<ref name="tomshardware2016">{{cite web |url=https://www.tomshardware.com/news/fove-vr-first-look-ces,30964.html |title=Exclusive: Fove's VR HMD At CES 2016 |publisher=Tom's Hardware |date=2016-01-11}}</ref>
|-
| 2017 || Mobile VR support || [[Qualcomm]] || Snapdragon 835 VRDK announced with "Adreno Foveation" for mobile VR, signaling the technology's arrival on mobile processors.<ref name="qualcomm2017">{{cite web |url=https://www.qualcomm.com/news/releases/2017/02/23/qualcomm-introduces-snapdragon-835-virtual-reality-development-kit |title=Qualcomm Introduces Snapdragon 835 Virtual Reality Development Kit |publisher=Qualcomm |date=2017-02-23}}</ref>
|-
| 2017 || SMI acquired || [[Apple]] || Apple acquired SMI, indicating growing interest in foveated rendering for AR/VR.<ref name="WikipediaFR" />
|-
| 2018-2019 || Enterprise adoption || [[StarVR One]], [[Varjo]] VR-1 || Professional headsets with integrated [[Tobii]] eye-tracking for foveated rendering. Varjo's "bionic display" used hardware-level foveation.<ref name="starvr2018">{{cite web |url=https://arstechnica.com/gaming/2018/08/starvr-one-is-a-premium-vr-headset-with-built-in-eye-tracking/ |title=StarVR One is a premium VR headset with built-in eye-tracking |publisher=Ars Technica |date=2018-08-14}}</ref>
|-
| 2019 (January) || Consumer eye tracking || [[HTC Vive Pro Eye]] || First mainstream commercial VR headset with native eye tracking and dynamic foveated rendering support.<ref name="theverge2019">{{cite web |url=https://www.theverge.com/2019/1/7/18172064/htc-vive-pro-eye-vr-headset-eye-tracking-announced-features-price-release |title=HTC announces new Vive Pro Eye VR headset with native eye tracking |publisher=The Verge |date=2019-01-07}}</ref>
|-
| 2019 (November) || Neural reconstruction || Facebook Reality Labs || [[DeepFovea]] demonstrated AI-based foveated reconstruction with up to 10-14× pixel count reduction at SIGGRAPH Asia.<ref name="deepfovea2019">{{cite web |url=https://dl.acm.org/doi/10.1145/3306307.3328186 |title=DeepFovea: Neural Reconstruction for Foveated Rendering |publisher=ACM SIGGRAPH |year=2019}}</ref>
|-
| 2019 (December) || SDK support || [[Meta Quest|Oculus Quest]] || Fixed Foveated Rendering exposed in the SDK, marking the first large-scale commercial deployment.<ref name="venturebeat2019">{{cite web |url=https://venturebeat.com/2019/12/22/oculus-quest-gets-dynamic-fixed-foveated-rendering/ |title=Oculus Quest gets dynamic fixed foveated rendering |publisher=VentureBeat |date=2019-12-22}}</ref>
|-
| 2021 || Chipset integration || Qualcomm XR2 || Built-in support for foveated rendering and eye tracking in a standalone VR chipset.<ref name="WikipediaFR" />
|-
| 2022 || Consumer ETFR || [[Meta Quest Pro]] || First mainstream standalone headset with Eye-Tracked Foveated Rendering, achieving 33-52% performance gains.<ref name="uploadvr2022" /><ref name="WikipediaFR" />
|-
| 2023 (February) || Console integration || [[PlayStation VR2]] || Eye tracking with foveated rendering standard in every unit, achieving up to 3.6× speedup.<ref name="unity2023" /><ref name="sonyblog2022">{{cite web |url=https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/ |title=PlayStation VR2 and PlayStation VR2 Sense controller: The next generation of VR gaming on PS5 |publisher=PlayStation Blog |date=2022-01-04}}</ref>
|-
| 2024 (February) || Spatial computing || [[Apple Vision Pro]] || High-end mixed reality headset with a sophisticated eye-tracking system for foveated rendering and interaction.<ref name="pcmag2025">{{cite web |url=https://www.pcmag.com/picks/the-best-vr-headsets |title=The Best VR Headsets for 2025 |publisher=PC Magazine}}</ref>
|}

== Applications and Use Cases ==
While foveated rendering is most prominent in VR/AR gaming and simulation, where it is used to achieve higher frame rates, its principles are also applied in other domains:

* '''360° Video Streaming:''' As noted in the Foveated Transport section, foveated encoding can reduce bandwidth requirements for streaming high-resolution (e.g., 8K) 360° video by up to 80%.<ref name="Foveated360Video">{{Cite conference |last=Li |first=Jian |et al. |title=Foveated video coding for cloud-based VR streaming |journal=2021 IEEE International Conference on Multimedia and Expo |pages=1–6 |year=2021 |doi=10.1109/ICME51207.2021.9428350 |description=Foveated 360° video.}}</ref>
* '''Medical Visualization:''' In high-fidelity medical imaging, foveated rendering allows real-time interaction with massive volume datasets (e.g., from CT or MRI scans) for surgical planning and simulation.<ref name="MedicalVolumeRender">{{Cite conference |last=Gallo |first=Luigi |et al. |title=Foveation for 3D volume rendering |journal=2013 IEEE 15th International Conference on e-Health Networking, Applications and Services |pages=1–6 |year=2013 |doi=10.1109/HealthCom.2013.6720650 |description=Medical volume foveation.}}</ref>
* '''Holographic and Light Field Displays:''' These next-generation displays require enormous computational power to calculate light propagation, and foveated rendering is a critical area of research for making them computationally feasible.<ref name="HolographicFoveation">{{Cite conference |last=Chakravarthula |first=Praneeth |et al. |title=To the holograms born |journal=ACM Transactions on Graphics |volume=40 |issue=6 |pages=1–16 |year=2021 |doi=10.1145/3486108 |description=Holographic foveation.}}</ref>
== Software Support ==
For foveated rendering to be widely adopted, it must be accessible to developers through common game engines and standardized APIs. The ecosystem has matured from requiring bespoke, hardware-specific implementations to being an integrated feature of modern development platforms.

=== Game Engines ===

==== [[Unity (game engine)|Unity]] ====
Unity provides native support for foveated rendering on supported XR platforms through its [[Scriptable Render Pipeline]]s (URP and HDRP).<ref name="UnityFR_Roadmap">{{cite web |url=https://unity.com/roadmap/1356-xr-foveated-rendering |title=XR Foveated Rendering - Unity Roadmap}}</ref><ref name="UnityDocsFR">{{cite web |url=https://docs.unity3d.com/6000.2/Documentation/Manual/xr-foveated-rendering.html |title=Foveated rendering - Unity Manual}}</ref> Unity 6 introduced cross-platform foveated rendering through the Scriptable Render Pipeline (SRP) Foveation API, supporting both Variable Rate Shading and Variable Rate Rasterization.<ref name="unity2024">{{cite web |url=https://docs.unity3d.com/6000.0/Documentation/Manual/xr-foveated-rendering.html |title=Foveated rendering |publisher=Unity Documentation |year=2024}}</ref> Developers can enable the feature within the '''XR Plug-in Management''' project settings. At runtime, the strength of the foveation effect is controlled by setting the <code>XRDisplaySubsystem.foveatedRenderingLevel</code> property to a value between 0 (off) and 1 (maximum). To enable gaze-based foveation on supported hardware, the <code>foveatedRenderingFlags</code> property must be set to allow gaze input.<ref name="UnityOpenXR_FR" /><ref name="UnityDocsFR" />

In addition to this core support, platform-specific SDKs provide their own wrappers and APIs. For example, the Meta XR SDK and the PICO Unity Integration SDK expose dedicated components and functions for enabling and configuring both FFR and ETFR, offering different levels of quality and performance.<ref name="PicoUnityFFR" /><ref name="MetaUnityFFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/unity-fixed-foveated-rendering/ |title=Using Fixed Foveated Rendering - Unity}}</ref><ref name="PicoUnityETFR">{{cite web |url=https://developer.picoxr.com/document/unity/eye-tracked-foveated-rendering/ |title=Eye tracked foveated rendering - PICO Unity Integration SDK}}</ref>

==== [[Unreal Engine]] ====
Support for foveated rendering in Unreal Engine is managed through platform-specific plugins such as the Meta XR Plugin or the PICO Unreal OpenXR Plugin.<ref name="ViveUnrealFFR">{{cite web |url=https://developer.vive.com/resources/openxr/unreal/unreal-tutorials/rendering/foveated-rendering/ |title=Foveated Rendering - VIVE OpenXR Unreal}}</ref><ref name="PicoUnrealOpenXR" /> Unreal Engine 5 implements foveated rendering through Variable Rate Shading for PC VR and through Meta's [[OpenXR]] plugin for Quest devices. Configuration is handled through '''Project Settings''' (under the Plugins section for the specific XR platform) and console variables.<ref name="MetaUnrealETFR" /><ref name="PicoUnrealOpenXR" /> For example, developers can set the foveation level using a console command like <code>xr.OpenXRFBFoveationLevel=2</code> for medium foveation. These settings can also be controlled dynamically at runtime via [[Blueprints (visual scripting)|Blueprints]] or C++.<ref name="PicoUnrealOpenXR" /> On mobile platforms, ETFR support is often a [[Vulkan (API)|Vulkan]]-only feature and may require using a specific version or fork of the engine to ensure compatibility.<ref name="MetaUnrealETFR" /><ref name="PicoUnrealOpenXR" />
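The runtime control mentioned above can be illustrated with a short C++ sketch. It is a minimal, hypothetical example rather than code from any shipping project: it assumes the Meta/PICO OpenXR plugin's <code>xr.OpenXRFBFoveationLevel</code> console variable is available and simply drives it from game code.

<syntaxhighlight lang="cpp">
#include "HAL/IConsoleManager.h"

// Minimal sketch: adjust the OpenXR foveation level console variable at runtime.
// The cvar name follows the Meta/PICO OpenXR plugins; other plugins may differ.
static void SetFoveationLevel(int32 NewLevel)
{
    IConsoleVariable* FoveationCVar =
        IConsoleManager::Get().FindConsoleVariable(TEXT("xr.OpenXRFBFoveationLevel"));

    if (FoveationCVar != nullptr)
    {
        // 0 = off, 1 = low, 2 = medium, 3 = high.
        FoveationCVar->Set(NewLevel, ECVF_SetByGameSetting);
    }
}
</syntaxhighlight>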


=== API and Standards ===

==== Graphics APIs ====
The underlying implementation of foveated rendering relies on features within low-level graphics APIs. [[Variable rate shading|Variable Rate Shading]] (VRS) is a core feature of [[DirectX]] 12, which requires Tier 2 support to provide granular control through shading-rate surfaces, and is also supported in Vulkan.<ref name="UnrealVRS_FFR">{{cite web |url=https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-performance-features?application_version=4.27 |title=VR Performance Features - Unreal Engine 4.27 Documentation}}</ref><ref name="UnityDocsFR" />

The [[Vulkan (API)|Vulkan]] API, in particular, offers powerful and flexible extensions for foveation. The <code>VK_EXT_fragment_density_map</code> extension allows the VR runtime to provide the GPU with a custom texture that dictates the rendering resolution across the framebuffer, enabling highly tailored foveation patterns.<ref name="VulkanMobileVR" /><ref name="VulkanFDMExtension" /> This approach is a cornerstone of foveation on modern Android-based standalone headsets.
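The following C++ sketch shows the two essential pieces of this approach: filling a small R8G8 density texture around the current gaze point, and chaining <code>VkRenderPassFragmentDensityMapCreateInfoEXT</code> into the render pass creation. It is a simplified illustration; the helper names, the radial falloff, and the fixed parameters are assumptions for the example, and a complete application would also enable the device extension and upload the texture each frame.

<syntaxhighlight lang="cpp">
#include <vulkan/vulkan.h>
#include <cmath>
#include <cstdint>
#include <vector>

// Fill an R8G8 fragment density map on the CPU: full density (255) inside the
// foveal circle around the gaze point, falling off toward a peripheral floor.
// The GPU interprets each texel as density = value / 255.
std::vector<uint8_t> BuildDensityMap(uint32_t width, uint32_t height,
                                     float gazeU, float gazeV,        // gaze in [0,1] UV space
                                     float fovealRadius = 0.15f,      // full-rate radius (UV units)
                                     float peripheralDensity = 0.25f) // quarter rate at the edges
{
    std::vector<uint8_t> texels(width * height * 2);
    for (uint32_t y = 0; y < height; ++y) {
        for (uint32_t x = 0; x < width; ++x) {
            float u = (x + 0.5f) / width;
            float v = (y + 0.5f) / height;
            float d = std::hypot(u - gazeU, v - gazeV);
            // Linear falloff from full density to the peripheral floor.
            float t = std::fmin(std::fmax((d - fovealRadius) / fovealRadius, 0.0f), 1.0f);
            float density = 1.0f - t * (1.0f - peripheralDensity);
            uint8_t value = static_cast<uint8_t>(density * 255.0f);
            texels[(y * width + x) * 2 + 0] = value; // horizontal density
            texels[(y * width + x) * 2 + 1] = value; // vertical density
        }
    }
    return texels;
}

// Attach the density map to a render pass. densityMapAttachmentIndex is the
// index of the VK_FORMAT_R8G8_UNORM attachment created with
// VK_IMAGE_USAGE_FRAGMENT_DENSITY_MAP_BIT_EXT.
void ChainFragmentDensityMap(VkRenderPassCreateInfo& renderPassInfo,
                             VkRenderPassFragmentDensityMapCreateInfoEXT& fdmInfo,
                             uint32_t densityMapAttachmentIndex)
{
    fdmInfo = {};
    fdmInfo.sType = VK_STRUCTURE_TYPE_RENDER_PASS_FRAGMENT_DENSITY_MAP_CREATE_INFO_EXT;
    fdmInfo.fragmentDensityMapAttachment.attachment = densityMapAttachmentIndex;
    fdmInfo.fragmentDensityMapAttachment.layout =
        VK_IMAGE_LAYOUT_FRAGMENT_DENSITY_MAP_OPTIMAL_EXT;

    fdmInfo.pNext = renderPassInfo.pNext; // preserve any existing pNext chain
    renderPassInfo.pNext = &fdmInfo;
}
</syntaxhighlight>

With fixed foveated rendering the gaze point is simply the lens center, while an eye-tracked implementation updates <code>gazeU</code>/<code>gazeV</code> from the tracker every frame.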


==== [[OpenXR]] ====
[[OpenXR]] is a royalty-free, open standard from the [[Khronos Group]] that provides high-performance access to AR and VR platforms and devices, and it plays a crucial role in standardizing foveated rendering for developers. OpenXR 1.1 brought foveated rendering into the core specification, consolidating functionality previously provided by vendor extensions.<ref name="openxr2024">{{cite web |url=https://www.uploadvr.com/openxr-1-1/ |title=OpenXR 1.1 Brings Foveated Rendering & More Into The Spec |publisher=UploadVR |year=2024}}</ref> By defining a common set of extensions, OpenXR allows game engines and applications to implement foveation in a vendor-agnostic way. Key extensions include:
* <code>XR_FB_foveation</code>
* <code>XR_FB_foveation_configuration</code>
* <code>XR_META_foveation_eye_tracked</code>
* <code>XR_VARJO_foveated_rendering</code>

These extensions allow an application to query for foveation support, configure its parameters (such as quality levels), and enable or disable it at runtime.<ref name="PicoUnrealOpenXR" /><ref name="ViveUnrealFFR" /> This level of abstraction is a sign of a maturing technology: it moves developers away from writing hardware-specific code and toward a "write once, run anywhere" paradigm. This trend, in which low-level hardware features are abstracted first by graphics APIs, then by standards like OpenXR, and finally by game engines, is critical for lowering the barrier to entry and encouraging widespread adoption of foveated rendering.

Furthermore, tools like the '''OpenXR Toolkit''' demonstrate the power of this layered approach by acting as an API layer that can inject foveated rendering into OpenXR applications that do not natively support it, provided the user's hardware is capable.<ref name="OpenXRToolkit" /><ref name="DCS_Forum_QuadViews" />
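As a concrete illustration of how these extensions are used, the following C++ sketch configures a fixed high-level foveation profile through <code>XR_FB_foveation</code> and <code>XR_FB_foveation_configuration</code> and applies it to a swapchain via <code>XR_FB_swapchain_update_state</code>. It is a minimal example rather than production code: error handling is mostly elided, and it assumes the OpenXR instance was created with these extensions enabled.

<syntaxhighlight lang="cpp">
#include <openxr/openxr.h>

// Sketch: create a static, high-level foveation profile and attach it to an
// existing swapchain. Extension functions must be loaded at runtime.
XrResult ApplyFixedFoveation(XrInstance instance, XrSession session, XrSwapchain swapchain)
{
    PFN_xrCreateFoveationProfileFB  pfnCreateFoveationProfileFB  = nullptr;
    PFN_xrDestroyFoveationProfileFB pfnDestroyFoveationProfileFB = nullptr;
    PFN_xrUpdateSwapchainFB         pfnUpdateSwapchainFB         = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateFoveationProfileFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnCreateFoveationProfileFB));
    xrGetInstanceProcAddr(instance, "xrDestroyFoveationProfileFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnDestroyFoveationProfileFB));
    xrGetInstanceProcAddr(instance, "xrUpdateSwapchainFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnUpdateSwapchainFB));

    // Describe the desired foveation: high level, no vertical offset, static.
    XrFoveationLevelProfileCreateInfoFB levelInfo{XR_TYPE_FOVEATION_LEVEL_PROFILE_CREATE_INFO_FB};
    levelInfo.level = XR_FOVEATION_LEVEL_HIGH_FB;
    levelInfo.verticalOffset = 0.0f;
    levelInfo.dynamic = XR_FOVEATION_DYNAMIC_DISABLED_FB;

    XrFoveationProfileCreateInfoFB profileInfo{XR_TYPE_FOVEATION_PROFILE_CREATE_INFO_FB};
    profileInfo.next = &levelInfo;

    XrFoveationProfileFB profile = XR_NULL_HANDLE;
    XrResult result = pfnCreateFoveationProfileFB(session, &profileInfo, &profile);
    if (XR_FAILED(result)) return result;

    // Attach the profile to the swapchain the application renders into.
    XrSwapchainStateFoveationFB foveationState{XR_TYPE_SWAPCHAIN_STATE_FOVEATION_FB};
    foveationState.profile = profile;
    result = pfnUpdateSwapchainFB(
        swapchain, reinterpret_cast<const XrSwapchainStateBaseHeaderFB*>(&foveationState));

    // The profile can be destroyed once it has been applied.
    pfnDestroyFoveationProfileFB(profile);
    return result;
}
</syntaxhighlight>

Eye-tracked foveation exposed by <code>XR_META_foveation_eye_tracked</code> builds on the same profile-based mechanism, with the runtime supplying the gaze data.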


== Hardware Implementations ==
The availability and effectiveness of foveated rendering are intrinsically linked to the capabilities of the VR hardware itself, with the quality of the integrated eye-tracking system being the most critical component for dynamic variants.

=== Consumer Devices ===
* '''[[Meta Quest Pro]]:''' Released in 2022, the Quest Pro was Meta's first commercial headset to feature integrated eye tracking, making ETFR a key selling point. It leverages this capability to offer significant performance gains over the FFR-only [[Meta Quest 2]].<ref name="WikipediaFR" /><ref name="MetaQuestProUnityETFR" />
* '''[[PlayStation VR2]]:''' Launched in 2023, the PSVR2 made ETFR a central pillar of its architecture. By offloading rendering work, the headset enables the [[PlayStation 5]] console to produce graphics with a higher level of fidelity than would otherwise be possible. The headset integrates one IR camera per eye for tracking.<ref name="WikipediaFR" /><ref name="PSVR2_TechSpecs">{{cite web |url=https://www.playstation.com/en-ca/ps-vr2/ps-vr2-tech-specs/ |title=PlayStation VR2 tech specs | PlayStation}}</ref>
* '''[[Pico 4]] Pro / Enterprise:''' Similar to Meta's lineup, the professional-oriented versions of the Pico 4 headset include eye tracking and support ETFR, offering a performance advantage over the standard model.<ref name="PicoUnityETFR" />
* '''[[Apple Vision Pro]]:''' Apple's "spatial computer," released in 2024, features a sophisticated eye-tracking system as a primary input method and leverages it for foveated rendering to help drive its very high-resolution displays with the onboard Apple M2 chip.<ref name="PCMagBestVR">{{cite web |url=https://www.pcmag.com/picks/the-best-vr-headsets |title=The Best VR Headsets for 2025}}</ref>

{| class="wikitable"
|+ Foveated Rendering Capabilities of Major Commercial VR Headsets
|-
! Headset !! Release Year !! Display Resolution (per eye) !! Eye Tracking !! Eye Tracker Specs !! Foveated Rendering Support
|-
| [[Meta Quest 2]] || 2020 || 1832 x 1920 || No || N/A || Fixed Foveated Rendering (FFR) only<ref name="VRX_FR_Types" />
|-
| [[Meta Quest 3]] || 2023 || 2064 x 2208 || No || N/A || Fixed Foveated Rendering (FFR) with improved efficiency<ref name="VRX_FR_Types" />
|-
| [[Meta Quest Pro]] || 2022 || 1800 x 1920 || Yes || Internal cameras, gaze prediction, 46-57ms latency<ref name="MetaETFR_Blog" /> || Eye-Tracked Foveated Rendering (ETFR) & FFR<ref name="WikipediaFR" /><ref name="MetaQuestProUnityETFR" />
|-
| [[PlayStation VR2]] || 2023 || 2000 x 2040 || Yes || 1x IR camera per eye, [[Tobii]] technology<ref name="PSVR2_TechSpecs" /> || Eye-Tracked Foveated Rendering (ETFR)<ref name="WikipediaFR" /><ref name="PSVR2_GDC_Unity" />
|-
| [[HTC Vive Pro Eye]] || 2019 || 1440 x 1600 || Yes || [[Tobii]] eye tracking, 120Hz || Dynamic Foveated Rendering<ref name="theverge2019" />
|-
| [[HTC Vive Pro 2]] || 2021 || 2448 x 2448 || No (Add-on available) || N/A || Fixed Foveated Rendering (via VRS)<ref name="UnrealVRS_FFR" />
|-
| [[HTC Vive Focus 3]] || 2021 || 2448 x 2448 || No (Add-on available) || N/A || Fixed Foveated Rendering<ref name="VRX_FR_Types" />
|-
| [[Varjo Aero]] || 2021 || 2880 x 2720 || Yes || 200Hz, sub-degree accuracy<ref name="VarjoAeroSpecs">{{cite web |url=https://varjo.com/products/aero |title=Varjo Aero - Varjo}}</ref><ref name="VarjoEyeTrackingSpecs">{{cite web |url=https://developer.varjo.com/docs/get-started/eye-tracking-with-varjo-headset |title=Eye tracking with Varjo headset}}</ref> || Dynamic Foveated Rendering (Dynamic Projection & VRS)<ref name="VarjoAPI" /><ref name="VarjoFoveationPage">{{cite web |url=https://support.varjo.com/hc/en-us/foveated-rendering |title=Foveated rendering - Varjo Support}}</ref>
|-
| [[Pimax Crystal]] || 2023 || 2880 x 2880 || Yes || 120Hz, Tobii-powered<ref name="PimaxDFR" /><ref name="PimaxDFR_About">{{cite web |url=https://pimax.com/blogs/blogs/about-dynamic-foveated-rendering-dfr-in-virtual-reality-vr |title=About Dynamic Foveated Rendering (DFR) in Virtual Reality (VR)}}</ref> || Dynamic Foveated Rendering (VRS & Quad Views)<ref name="PimaxDFR" /><ref name="PimaxQuadViews" />
|-
| [[Pico 4]] (Standard) || 2022 || 2160 x 2160 || No || N/A || Fixed Foveated Rendering<ref name="PicoUnityFFR" />
|-
| [[Pico 4 Pro]] || 2022 || 2160 x 2160 || Yes || Internal cameras || Eye-Tracked Foveated Rendering (ETFR) & FFR<ref name="PicoUnityETFR" /><ref name="PicoUnrealLegacy">{{cite web |url=https://developer.picoxr.com/document/unreal/fixed-foveated-rendering/ |title=Foveated rendering - PICO Unreal Integration SDK}}</ref>
|-
| [[Apple Vision Pro]] || 2024 || ~3660 x 3200 (est.) || Yes || High-speed cameras and IR illuminators, M2 chip processing || Eye-Tracked Foveated Rendering<ref name="pcmag2025" />
|}

=== Professional & Enthusiast Devices ===
* '''[[Varjo]] (Aero, XR-4, VR-3, etc.):''' Professional headsets with industry-leading visual fidelity featuring a 200Hz [[Tobii]] eye-tracking system. They support advanced foveation techniques, including both VRS and Varjo's proprietary "dynamic projection" method, a form of multi-view rendering, and achieve over 70 pixels per degree in the focus area using bionic displays.<ref name="VarjoAeroSpecs" /><ref name="VarjoEyeTrackingSpecs" /><ref name="VarjoFoveationPage" />
* '''[[Pimax]] (Crystal, Crystal Super):''' Enthusiast headsets with 2880 x 2880 per-eye resolution and integrated 120Hz [[Tobii]]-powered eye tracking. They support both VRS (10-40% gain) and Quad Views rendering (50-100% gain), and the companion Pimax Play software can even inject DFR into some games that lack native support.<ref name="PimaxCrystalSuper">{{cite web |url=https://pimax.com/pages/pimax-crystal-super |title=Pimax Crystal Super}}</ref><ref name="PimaxDFR" /><ref name="PimaxDFR_About" />
* '''[[StarVR One]]:''' Enterprise headset with a 210° field of view and integrated [[Tobii]] eye tracking for foveated rendering across its ultra-wide displays.<ref name="starvr2018" />
* '''[[FOVE|FOVE 0]]:''' The first commercially available HMD with integrated eye tracking and foveated rendering support (2016-2017), featuring infrared eye tracking and a 100° field of view.<ref name="tomshardware2016" />

=== Hardware Components ===
The quality of eye-tracking hardware directly impacts ETFR effectiveness; dynamic foveated rendering is entirely dependent on a high-quality, low-latency eye-tracking system. Key specifications include:

* '''Frequency:''' Commercial headsets feature tracking frequencies from 120Hz to 200Hz. Higher frequencies reduce the time between an eye movement and its detection, contributing to lower overall latency.<ref name="PimaxDFR" /><ref name="VarjoAeroSpecs" />
* '''Accuracy:''' How close the tracked gaze point is to the true gaze point, expressed in degrees of visual angle. Sub-degree accuracy (typically 0.5-1.0°) is necessary to ensure correct foveal region placement; mobile VR headsets typically achieve 0.5-1.1° accuracy, compared with sub-0.5° for research-grade systems.<ref name="tobii2023" />
* '''Latency:''' The total end-to-end delay from an eye movement to the tracker's data being available to the application. It must remain below roughly 50-70ms for artifacts to stay imperceptible, with latencies beyond 80-150ms causing significant quality degradation.<ref name="nvidia2017" />
* '''Implementation:''' Typical systems use one or more small infrared (IR) cameras mounted inside the headset, aimed at each eye and illuminated by IR LEDs, capturing images of the pupil and corneal reflections to calculate the gaze direction (see the sketch following this list).<ref name="PimaxDFR" /><ref name="PSVR2_TechSpecs" />
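To make the connection between tracker output and renderer input concrete, the following C++ sketch (illustrative, not drawn from any particular SDK) converts a gaze direction reported in per-eye view space into normalized coordinates on the eye buffer, which is where the high-resolution foveal region is centred. Real headsets use asymmetric per-eye projections, so production code would use the runtime's actual projection parameters rather than symmetric half-angles.

<syntaxhighlight lang="cpp">
#include <array>
#include <cmath>

// Normalized [0,1] position of the fovea center on the eye buffer.
struct FoveaCenter { float u; float v; };

// gazeDirViewSpace: unit gaze direction in view space (+x right, +y up, -z forward).
// halfFovXRadians / halfFovYRadians: half-angles of a symmetric perspective projection.
FoveaCenter GazeToBufferUV(const std::array<float, 3>& gazeDirViewSpace,
                           float halfFovXRadians, float halfFovYRadians)
{
    float x = gazeDirViewSpace[0];
    float y = gazeDirViewSpace[1];
    float z = -gazeDirViewSpace[2];            // distance along the view axis

    // Project onto the image plane and normalize by the tangent of the FOV.
    float ndcX = (x / z) / std::tan(halfFovXRadians);   // -1 .. +1
    float ndcY = (y / z) / std::tan(halfFovYRadians);   // -1 .. +1

    // Map from normalized device coordinates to texture coordinates in [0,1].
    return { 0.5f * (ndcX + 1.0f), 0.5f * (ndcY + 1.0f) };
}
</syntaxhighlight>

Looking straight ahead (direction {0, 0, -1}) yields the buffer center (0.5, 0.5), and a gaze at the edge of the field of view maps to the edge of the buffer, which is exactly where the foveal region of a density map or VRS rate surface would be placed.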


== Challenges and Limitations ==
Despite its significant benefits, implementing foveated rendering effectively presents several technical and perceptual challenges that must be addressed by hardware manufacturers and software developers.

=== Eye-Tracking Latency, Accuracy, and Jitter ===
The quality of the eye-tracking subsystem is the single most critical factor for the success of ETFR.
* '''Latency:''' High end-to-end latency is the primary antagonist of foveated rendering. If the system cannot update the foveal region before the user's saccade completes and saccadic masking wears off, the user will perceive a blurry or low-resolution image at their new point of focus, a jarring artifact known as "pop-in."<ref name="UnityFR_Test_Reddit" /> Research indicates that while latencies of 80-150 ms cause significant issues, a total system latency of 50-70 ms can be tolerated without a major impact on the acceptable level of foveation.<ref name="LatencyRequirements" /> Meta reports the end-to-end latency on the Quest Pro to be in the 46-57 ms range, within this acceptable threshold.<ref name="MetaETFR_Blog" />
* '''Accuracy and Jitter:''' The tracking system must be accurate enough to place the foveal region correctly; inaccuracies can leave the user's actual fovea on a medium- or low-resolution part of the image. The tracking data must also be stable: "jitter," or small, rapid fluctuations in the reported gaze position, can cause the high-resolution area to shimmer or vibrate, which is highly distracting.


=== Perceptual Artifacts and Mitigation Strategies ===
Even with good eye tracking, aggressive or poorly implemented foveation can introduce noticeable visual artifacts.
* '''"Tunnel Vision":''' If the peripheral region is blurred too aggressively, or if the filtering process causes a significant loss of contrast, it can create a subjective feeling of looking through a narrow tunnel, which detracts from the sense of immersion.<ref name="SIGA16_GazeTracked">{{cite web |url=http://cwyman.org/papers/siga16_gazeTrackedFoveatedRendering.pdf |title=A Perceptually-Based Foveated Real-Time Renderer}}</ref>
* '''Flicker and Aliasing:''' Certain quality-reduction methods, especially simple subsampling, can introduce temporal artifacts such as shimmering and flickering, or spatial artifacts such as jagged edges ([[aliasing]]) in the periphery. While the periphery is less sensitive to detail, it is highly sensitive to motion and flicker, making these artifacts particularly noticeable.<ref name="PerceptuallyBasedFR_Nvidia">{{cite web |url=https://research.nvidia.com/publication/2016-07_perceptually-based-foveated-virtual-reality |title=Perceptually-Based Foveated Virtual Reality | Research}}</ref>
* '''Edge Artifacts:''' Jagged boundaries can appear between quality regions, particularly during smooth-pursuit eye movements.<ref name="franke2021">{{cite web |url=https://onlinelibrary.wiley.com/doi/10.1111/cgf.14176 |title=Time-Warped Foveated Rendering for Virtual Reality Headsets |publisher=Computer Graphics Forum |year=2021}}</ref>
* '''"Chasing" Effect:''' With excessive latency, users may perceive the sharp region visibly following their gaze.<ref name="franke2021" />

To combat these issues, developers employ several mitigation strategies. Creating a smooth "blend" region between the high- and medium-quality zones, rather than a sharp cutoff, makes the transition less obvious.<ref name="IntegrativeView" /> Some advanced renderers apply a contrast-enhancement pass to the periphery to counteract the contrast loss caused by blurring, which helps restore apparent detail and reduce the sense of tunnel vision.<ref name="SIGA16_GazeTracked" /> Finally, anti-aliasing algorithms that are aware of the multi-resolution nature of the image can help stabilize the periphery and reduce flicker.<ref name="SIGA16_GazeTracked" />


=== Developer Adoption and Implementation Complexity ===
While modern game engines and APIs have made implementation easier, foveated rendering is not always a simple "flick of a switch."
* '''Rendering Pipeline Incompatibility:''' Foveation can be incompatible with certain rendering techniques. For example, some [[post-processing]] effects that operate on the full-screen image may not work correctly with a multi-resolution input, and rendering to intermediate textures, common in techniques like camera stacking, can break the foveation pipeline in some engines.<ref name="PicoUnityFFR" /><ref name="QualcommSpacesUnityFR">{{cite web |url=https://docs.spaces.qualcomm.com/unity/setup/foveated-rendering |title=Foveated Rendering - Snapdragon Spaces}}</ref>
* '''Tuning and Testing:''' There is no universal "best" foveation setting. The optimal balance between performance gain and visual quality depends heavily on the specific content of an application, so developers must test and tune foveation levels for different scenes to ensure artifacts are not visible during normal gameplay.<ref name="MetaFFR_OS" /><ref name="PSVR2_DevTime">{{cite web |url=https://www.reddit.com/r/PSVR/comments/1eacq3v/do_all_psvr2_games_use_foverated_rendering/ |title=Do all PSVR2 games use foverated rendering?}}</ref>
* '''Fallback Support:''' For applications targeting devices with ETFR, developers must also handle the case where eye tracking is unavailable (for example, the user disables it for privacy reasons or calibration fails). The application must gracefully fall back to FFR or no foveation while still maintaining its target frame rate, which adds another layer of complexity to performance management.<ref name="MetaUnrealETFR" />

=== Hardware Limitations ===
* '''Mobile vs Desktop Performance:''' Mobile GPU architectures see smaller benefits than console or desktop GPUs; the Quest Pro achieves 33-45% savings while the PSVR2 reaches up to 72%.<ref name="uploadvr2022" /><ref name="unity2023" />
* '''Cost and Complexity:''' Eye-tracking hardware increases headset cost, weight, and power consumption.
* '''Calibration Requirements:''' Individual calibration is typically required for each user to map eye movements accurately.
 
== Neural Reconstruction Approaches ==

A cutting-edge frontier in foveation involves [[neural rendering]], which uses [[artificial intelligence]] to reconstruct a high-quality image from sparsely rendered data.

=== DeepFovea ===
 
[[DeepFovea]], developed by Facebook Reality Labs and presented at SIGGRAPH Asia 2019, pioneered neural reconstruction for foveated rendering. The system renders only 10% of peripheral pixels and reconstructs missing pixels using a [[convolutional neural network]], enabling up to 10-14× pixel count reduction with minimal perceptual impact.<ref name="deepfovea2019" />
 
=== Recent Advances ===
 
* '''FoVolNet (2022):''' Achieved 25× speedup over DeepFovea through hybrid direct and kernel prediction for volume rendering.<ref name="fovolnet2022">{{cite web |url=https://arxiv.org/abs/2209.09965 |title=FoVolNet: Fast Volume Rendering using Foveated Deep Neural Networks |publisher=arXiv |year=2022}}</ref>
* '''VR-Splatting (2024):''' Combines [[3D Gaussian Splatting]] with foveated rendering for photorealistic VR at 90Hz, achieving 63% power reduction.<ref name="vrsplatting2024" />
* '''FovealNet (2024):''' Integrates gaze prediction using AI to compensate for latency, advancing real-time performance.<ref name="fovealnet2024">{{cite web |url=https://arxiv.org/abs/2412.10456 |title=FovealNet: Advancing AI-Driven Gaze Tracking Solutions for Optimized Foveated Rendering System Performance in Virtual Reality |publisher=arXiv |year=2024}}</ref>
 
== Current Research Frontiers ==


The evolution of foveated rendering continues, with researchers exploring more sophisticated models of human perception:

* '''Luminance-Contrast-Aware Foveation:''' Recognizes that HVS sensitivity to detail depends not just on eccentricity but also on local image content, applying more aggressive foveation in very dark or low-contrast areas.<ref name="LuminanceContrastAware">{{cite web |url=https://history.siggraph.org/learning/luminance-contrast-aware-foveated-rendering-by-tursun-arabadzhiyska-koleva-wernikowski-mantiuk-seidel-et-al/ |title="Luminance-Contrast-Aware Foveated Rendering" by Tursun, Arabadzhiyska-Koleva, Wernikowski, Mantiuk, Seidel, et al.}}</ref>
* '''Attention-Aware Foveation:''' Incorporates cognitive factors, using task difficulty to dynamically adjust the level of peripheral degradation.<ref name="AttentionAware" /><ref name="AttentionAwareSIGGRAPH" />
* '''Individualized Foveated Rendering (IFR):''' Tailors foveation parameters to the unique perceptual abilities of each user through a brief calibration process.<ref name="IndividualizedFR">{{cite web |url=https://www.researchgate.net/publication/377532315_Individualized_foveated_rendering_with_eye-tracking_head-mounted_display |title=Individualized foveated rendering with eye-tracking head-mounted display}}</ref>
* '''Eye-Dominance-Guided Foveation:''' Renders the image for the [[dominant eye]] at slightly higher quality, providing performance savings without a noticeable impact on stereo perception.<ref name="EyeDominanceGuided">{{cite web |url=https://research.google/pubs/eye-dominance-guided-foveated-rendering/ |title=Eye-Dominance-Guided Foveated Rendering}}</ref>
* '''Predictive Foveation:''' Predicts saccade landing points from the initial trajectory and velocity of an eye movement, allowing the rendering system to begin shifting the foveal region before the movement completes.<ref name="FoveatedRenderingExplainedReddit" /><ref name="VRX_FR_Types" />

== Future Developments ==
 
The progression of foveated rendering represents a paradigm shift in computer graphics, moving away from a brute-force approach toward an intelligent, perceptually-driven allocation of resources.

=== Near-term (2025-2026) ===
* Production deployment of neural reconstruction techniques in consumer headsets
* Software-only gaze prediction enabling foveated rendering without eye tracking hardware
* OpenXR standardization eliminating platform fragmentation
* NPU acceleration for neural reconstruction on mobile VR platforms
 
=== Mid-term (2026-2028) ===
* Power optimization critical for wireless VR and AR glasses
* Adaptive foveated rendering personalizing quality curves per user
* Retinal resolution displays (60-70 pixels per degree) making foveated rendering mandatory
* Multi-modal foveation extending to audio and haptics
 
=== Long-term (2028+) ===
* [[Neural Radiance Fields]] (NeRF) with foveated rendering
* Cloud and edge rendering with dynamic foveated transport
* Theoretical limit of 20-100× improvements versus current rendering
* Foveated rendering as therapeutic tool for [[virtual reality sickness|cybersickness]] mitigation


== See Also ==
* [[3D Gaussian Splatting]]
* [[Augmented reality]]
* [[Eye tracking]]
* [[Fixed foveated rendering]]
* [[FOVAS]]
* [[Fovea]]
* [[Foveated imaging]]
* [[Gaze-contingency paradigm]]
* [[Head-mounted display]]
* [[Human visual system]]
* [[Level of detail]]
* [[Neural rendering]]
* [[Occlusion culling]]
* [[Variable rate shading]]
* [[Virtual reality]]


== References ==
<references>
<ref name="MetaETFRvsFFR">{{cite web |url=https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/ |title=Save GPU with Eye Tracked Foveated Rendering}}</ref>
<ref name="GazeContingentPipeline">{{cite web |url=https://graphics.tu-bs.de/upload/publications/stengel2016adaptsampling.pdf |title=Gaze-Contingent Rendering for Deferred Shading}}</ref>
<ref name="StateOfArtSurvey">{{cite web |url=https://www.researchgate.net/publication/366842988_Foveated_rendering_A_state_of_the_art_survey |title=Foveated rendering: A state-of-the-art survey}}</ref>
<ref name="tobii2023">{{cite web |url=https://www.tobii.com/blog/what-is-foveated-rendering |title=What is foveated rendering? |publisher=Tobii |date=2023-03-15}}</ref>
<ref name="TOYF_Paper">{{cite web |url=https://research.manchester.ac.uk/files/296585058/toyf.pdf |title=Type of Movement and Attentional Task Affect the Efficacy of a Foveated Rendering Method in Virtual Reality}}</ref>
<ref name="TOYF_Paper">{{cite web |url=https://research.manchester.ac.uk/files/296585058/toyf.pdf |title=Type of Movement and Attentional Task Affect the Efficacy of a Foveated Rendering Method in Virtual Reality}}</ref>
<ref name="EyeTrackingVRReview">{{cite web |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/ |title=Eye tracking in virtual reality: a comprehensive overview of the human visual system, eye movement types, and technical considerations}}</ref>
<ref name="EyeTrackingVRReview">{{cite web |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/ |title=Eye tracking in virtual reality: a comprehensive overview of the human visual system, eye movement types, and technical considerations}}</ref>
<ref name="survey">{{cite arxiv |eprint=2211.07969 |title=Foveated rendering: A state-of-the-art survey |date=2022-11-15 |last1=Wang |first1=Lili |last2=Shi |first2=Xuehuai |last3=Liu |first3=Yi |url=https://arxiv.org/abs/2211.07969}}</ref>
<ref name="nvidia2017">{{cite web |url=https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf |title=Latency Requirements for Foveated Rendering in Virtual Reality |publisher=NVIDIA Research |year=2017}}</ref>
<ref name="HVS_VR_Context_2">{{cite web |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/ |title=Eye tracking in virtual reality: a comprehensive overview}}</ref>
<ref name="HVS_VR_Context_2">{{cite web |url=https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/ |title=Eye tracking in virtual reality: a comprehensive overview}}</ref>
<ref name="ieee2023">{{cite web |url=https://link.springer.com/article/10.1007/s41095-022-0306-4 |title=Foveated rendering: A state-of-the-art survey |publisher=Computational Visual Media |year=2023}}</ref>
<ref name="FoveatedRenderingExplainedReddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/afj50w/eye_tracking_foveated_rendering_explained_what_it/ |title=Eye Tracking & Foveated Rendering Explained}}</ref>
<ref name="FoveatedRenderingExplainedReddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/afj50w/eye_tracking_foveated_rendering_explained_what_it/ |title=Eye Tracking & Foveated Rendering Explained}}</ref>
<ref name="LatencyRequirements">{{cite web |url=https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf |title=Latency Requirements for Eye-Tracked Foveated Rendering}}</ref>
<ref name="LatencyRequirements">{{cite web |url=https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf |title=Latency Requirements for Eye-Tracked Foveated Rendering}}</ref>
<ref name="roadtovr2016">{{cite web |url=https://www.roadtovr.com/a-pocket-guide-to-foveated-rendering-from-smi/ |title=A Quick-Start Guide to Foveated Rendering |publisher=Road to VR |date=2016-02-16}}</ref>
<ref name="AttentionAware">{{cite web |url=https://www.computationalimaging.org/publications/attention-aware/ |title=Towards Attention-Aware Foveated Rendering}}</ref>
<ref name="AttentionAware">{{cite web |url=https://www.computationalimaging.org/publications/attention-aware/ |title=Towards Attention-Aware Foveated Rendering}}</ref>
<ref name="AttentionAwareSIGGRAPH">{{cite web |url=https://history.siggraph.org/learning/towards-attention-aware-foveated-rendering-by-krajancich-kellnhofer-and-wetzstein/ |title=“Towards Attention–Aware Foveated Rendering” by Krajancich, Kellnhofer and Wetzstein}}</ref>
<ref name="AttentionAwareSIGGRAPH">{{cite web |url=https://history.siggraph.org/learning/towards-attention-aware-foveated-rendering-by-krajancich-kellnhofer-and-wetzstein/ |title="Towards Attention–Aware Foveated Rendering" by Krajancich, Kellnhofer and Wetzstein}}</ref>
<ref name="GazeContingentMultiresolution">{{cite web |url=https://vgl.cs.usfca.edu/assets/Foveated_Visualization___VDA_2020.pdf |title=Gaze-Contingent Multiresolution Visualization for Large-Scale Vector and Volume Data}}</ref>
<ref name="GazeContingentMultiresolution">{{cite web |url=https://vgl.cs.usfca.edu/assets/Foveated_Visualization___VDA_2020.pdf |title=Gaze-Contingent Multiresolution Visualization for Large-Scale Vector and Volume Data}}</ref>
<ref name="GazeContingent2D">{{cite web |url=http://stanford.edu/class/ee367/Winter2017/mehra_sankar_ee367_win17_report.pdf |title=Gaze Contingent Foveated Rendering for 2D Displays}}</ref>
<ref name="GazeContingent2D">{{cite web |url=http://stanford.edu/class/ee367/Winter2017/mehra_sankar_ee367_win17_report.pdf |title=Gaze Contingent Foveated Rendering for 2D Displays}}</ref>
<ref name="AutoVRSE">{{cite web |url=https://www.autovrse.com/foveated-rendering |title=What is Foveated Rendering? - autovrse}}</ref>
<ref name="AutoVRSE">{{cite web |url=https://www.autovrse.com/foveated-rendering |title=What is Foveated Rendering? - autovrse}}</ref>
<ref name="Guenter2012">{{Cite conference |last=Guenter |first=Brian |last2=Grimes |first2=Mark |last3=Nehab |first3=Diego |last4=Sander |first4=Pedro V. |last5=Summa |first5=João |title=Efficient rerendering in viewport space |journal=ACM Transactions on Graphics |volume=31 |issue=6 |pages=1–13 |year=2012 |doi=10.1145/2366145.2366195 |description=Key 2012 foveated graphics paper.}}</ref>
<ref name="DeepFovea">{{Cite conference |last=Kaplanyan |first=Anton |et al. |title=DeepFovea: Neural reconstruction for foveated rendering |journal=ACM Transactions on Graphics |volume=38 |issue=6 |pages=1–15 |year=2019 |doi=10.1145/3355089.3356559 |description=Neural upsampling for foveation.}}</ref>
<ref name="OpenXRToolkit">{{cite web |url=https://mbucchia.github.io/OpenXR-Toolkit/fr.html |title=Foveated Rendering - OpenXR Toolkit}}</ref>
<ref name="OpenXRToolkit">{{cite web |url=https://mbucchia.github.io/OpenXR-Toolkit/fr.html |title=Foveated Rendering - OpenXR Toolkit}}</ref>
<ref name="microsoft2019">{{cite web |url=https://devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/ |title=Variable Rate Shading: a scalpel in a world of sledgehammers |publisher=Microsoft DirectX Blog |year=2019}}</ref>
<ref name="VarjoAPI">{{cite web |url=https://developer.varjo.com/docs/native/foveated-rendering-api |title=Foveated Rendering - Varjo for Developers}}</ref>
<ref name="VarjoAPI">{{cite web |url=https://developer.varjo.com/docs/native/foveated-rendering-api |title=Foveated Rendering - Varjo for Developers}}</ref>
<ref name="PicoUnrealOpenXR">{{cite web |url=https://developer.picoxr.com/document/unreal-openxr/fixed-foveated-rendering/ |title=Foveated rendering - PICO Unreal OpenXR Plugin}}</ref>
<ref name="PicoUnrealOpenXR">{{cite web |url=https://developer.picoxr.com/document/unreal-openxr/fixed-foveated-rendering/ |title=Foveated rendering - PICO Unreal OpenXR Plugin}}</ref>
<ref name="DCS_Forum_QuadViews">{{cite web |url=https://www.reddit.com/r/hoggit/comments/15ep59q/dcs_dynamic_foveated_rendering_available_for_more/ |title=DCS Dynamic Foveated Rendering available for more headsets}}</ref>
<ref name="DCS_Forum_QuadViews">{{cite web |url=https://www.reddit.com/r/hoggit/comments/15ep59q/dcs_dynamic_foveated_rendering_available_for_more/ |title=DCS Dynamic Foveated Rendering available for more headsets}}</ref>
<ref name="PimaxQuadViews">{{cite web |url=https://pimax.com/blogs/blogs/quad-views-foveated-rendering-for-pimax-crystal |title=Quad Views Foveated Rendering for Pimax Crystal}}</ref>
<ref name="PimaxQuadViews">{{cite web |url=https://pimax.com/blogs/blogs/quad-views-foveated-rendering-for-pimax-crystal |title=Quad Views Foveated Rendering for Pimax Crystal}}</ref>
<ref name="arm2020">{{cite web |url=https://developer.arm.com/-/media/developer/Graphics%20and%20Multimedia/White%20Papers/Foveated%20Rendering%20Whitepaper.pdf |title=Foveated Rendering Current and Future Technologies for Virtual Reality |publisher=ARM Developer |year=2020}}</ref>
<ref name="VulkanMobileVR">{{cite web |url=https://developers.meta.com/horizon/blog/vulkan-for-mobile-vr-rendering/ |title=Vulkan for Mobile VR Rendering}}</ref>
<ref name="VulkanMobileVR">{{cite web |url=https://developers.meta.com/horizon/blog/vulkan-for-mobile-vr-rendering/ |title=Vulkan for Mobile VR Rendering}}</ref>
<ref name="VulkanFDMExtension">{{cite web |url=https://expipiplus1.github.io/vulkan/vulkan-3.8.1-docs/Vulkan-Extensions-VK_EXT_fragment_density_map.html |title=Vulkan API Documentation: VK_EXT_fragment_density_map}}</ref>
<ref name="VulkanFDMExtension">{{cite web |url=https://expipiplus1.github.io/vulkan/vulkan-3.8.1-docs/Vulkan-Extensions-VK_EXT_fragment_density_map.html |title=Vulkan API Documentation: VK_EXT_fragment_density_map}}</ref>
<ref name="VulkanFDMOffset">{{cite web |url=https://www.qualcomm.com/developer/blog/2022/08/improving-foveated-rendering-fragment-density-map-offset-extension-vulkan |title=Improving Foveated Rendering with the Fragment Density Map Offset Extension for Vulkan}}</ref>
<ref name="VulkanFDMOffset">{{cite web |url=https://www.qualcomm.com/developer/blog/2022/08/improving-foveated-rendering-fragment-density-map-offset-extension-vulkan |title=Improving Foveated Rendering with the Fragment Density Map Offset Extension for Vulkan}}</ref>
<ref name="FOVAS_Pixvana">{{cite web |title=Pixvana’s FOVAS Technology Delivers 8K VR Video on Today’s Headsets |url=https://www.businesswire.com/news/home/20160907005500/en/Pixvana’s-FOVAS-Technology-Delivers-8K-VR-Video-on-Today’s-Headsets |publisher=Business Wire |date=2016-09-07 |access-date=2025-10-26}}</ref>
<ref name="kfr2018">{{cite web |url=https://www.researchgate.net/publication/326636875_Kernel_Foveated_Rendering |title=Kernel Foveated Rendering |publisher=ResearchGate |year=2018}}</ref>
<ref name="JigSpace">{{cite web |url=https://www.jig.com/spatial-computing/foveated-rendering |title=What Is Foveated Rendering? - JigSpace}}</ref>
<ref name="JigSpace">{{cite web |url=https://www.jig.com/spatial-computing/foveated-rendering |title=What Is Foveated Rendering? - JigSpace}}</ref>
<ref name="MetaFFR_OS">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/os-fixed-foveated-rendering/ |title=Fixed foveated rendering (FFR) - Meta Quest}}</ref>
<ref name="MetaFFR_OS">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/os-fixed-foveated-rendering/ |title=Fixed foveated rendering (FFR) - Meta Quest}}</ref>
Line 443: Line 475:
<ref name="QuestProPerformance">{{cite web |url=https://www.uploadvr.com/quest-pro-foveated-rendering-performance/ |title=Quest Pro Foveated Rendering GPU Savings Detailed}}</ref>
<ref name="QuestProPerformance">{{cite web |url=https://www.uploadvr.com/quest-pro-foveated-rendering-performance/ |title=Quest Pro Foveated Rendering GPU Savings Detailed}}</ref>
<ref name="PSVR2_GDC_Unity">{{cite web |url=https://www.playstationlifestyle.net/2022/03/28/psvr-2-specs-eye-tracking-foveated-rendering/ |title=PSVR 2 Specs Run 3.6x Faster Using Eye-Tracking Technology}}</ref>
<ref name="PSVR2_GDC_Unity">{{cite web |url=https://www.playstationlifestyle.net/2022/03/28/psvr-2-specs-eye-tracking-foveated-rendering/ |title=PSVR 2 Specs Run 3.6x Faster Using Eye-Tracking Technology}}</ref>
<ref name="PimaxDFR">{{cite web |url=httpshttps://pimax.com/blogs/blogs/the-crystal-supers-secret-weapon-dynamic-foveated-rendering |title=The Crystal Super's Secret Weapon: Dynamic Foveated Rendering}}</ref>
<ref name="PimaxDFR">{{cite web |url=https://pimax.com/blogs/blogs/the-crystal-supers-secret-weapon-dynamic-foveated-rendering |title=The Crystal Super's Secret Weapon: Dynamic Foveated Rendering}}</ref>
<ref name="MetaETFR_Blog">{{cite web |url=https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/ |title=Save GPU with Eye Tracked Foveated Rendering}}</ref>
<ref name="MetaETFR_Blog">{{cite web |url=https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/ |title=Save GPU with Eye Tracked Foveated Rendering}}</ref>
<ref name="UploadVR_PSVR2">{{cite web |url=https://www.uploadvr.com/psvr-2-eye-tracking-foveated-rendering-gdc/ |title=PSVR 2 Foveated Rendering Provides 3.6x Faster Performance - Unity}}</ref>
<ref name="uploadvr2022">{{cite web |url=https://www.uploadvr.com/quest-pro-foveated-rendering-performance/ |title=Here's The Exact Performance Benefit Of Foveated Rendering On Quest Pro |publisher=UploadVR |date=October 2022}}</ref>
<ref name="VarjoDevs">{{cite web |title=Foveated Rendering |url=https://developer.varjo.com/docs/native/foveated-rendering-api |publisher=Varjo for developers |year=2023}}</ref>
<ref name="unity2023">{{cite web |url=https://www.uploadvr.com/psvr-2-eye-tracking-foveated-rendering-gdc/ |title=PSVR 2 Foveated Rendering Provides 3.6x Faster Performance - Unity |publisher=UploadVR |date=March 2023}}</ref>
<ref name="80lvQuestPro">{{cite web |url=https://80.lv/articles/quest-pro-s-foveated-rendering-saves-up-to-52-performance |title=Quest Pro's Foveated Rendering Saves up to 52% Performance}}</ref>
<ref name="varjo2023">{{cite web |url=https://developer.varjo.com/docs/native/foveated-rendering-api |title=Foveated Rendering |publisher=Varjo for developers |year=2023}}</ref>
<ref name="UnityFR_Test_Reddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/3bls3q/unity_foveated_rendering_test_4x_fps_increase/ |title=unity foveated rendering test 4x fps increase with a pretty simple rendering strategy}}</ref>
<ref name="vrsplatting2024">{{cite web |url=https://dl.acm.org/doi/10.1145/3728302 |title=VR-Splatting: Foveated Radiance Field Rendering via 3D Gaussian Splatting and Neural Points |publisher=ACM |year=2024}}</ref>
<ref name="PicoUnityFFR">{{cite web |url=https://developer.picoxr.com/document/unity/fixed-foveated-rendering/ |title=Fixed foveated rendering - PICO Unity Integration SDK}}</ref>
<ref name="levoy1990">{{cite journal |last=Levoy |first=Marc |last2=Whitaker |first2=Robert |title=Gaze-directed volume rendering |journal=Proceedings of the 1990 Symposium on Interactive 3D Graphics |pages=361–369 |year=1990 |doi=10.1145/91394.91431}}</ref>
<ref name="MetaQuestProUnityETFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/unity-eye-tracked-foveated-rendering/ |title=Eye Tracked Foveated Rendering - Unity}}</ref>
<ref name="ohshima1999">{{cite conference |last=Ohshima |first=Takashi |last2=Satoh |first2=Keiichi |last3=Tamaki |first3=Hiroaki |title=AR²HMD: Augmented reality with high resolution head mounted display |journal=Proceedings of the 1st International Symposium on Mixed and Augmented Reality |pages=110–119 |year=1999 |doi=10.1109/ISMAR.1999.803809}}</ref>
<ref name="MetaUnrealETFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unreal/unreal-eye-tracked-foveated-rendering/ |title=Eye Tracked Foveated Rendering - Unreal}}</ref>
<ref name="luebke2001">{{cite journal |last=Luebke |first=David |last2=Hallen |first2=Ben |title=Perceptually-driven simplification for interactive rendering |journal=Proceedings 12th International Conference on Parallel Processing |pages=223–230 |year=2001 |doi=10.1109/IPDPS.2001.925025}}</ref>
<ref name="Foveated360Video">{{Cite conference |last=Li |first=Jian |et al. |title=Foveated video coding for cloud-based VR streaming |journal=2021 IEEE International Conference on Multimedia and Expo |pages=1–6 |year=2021 |doi=10.1109/ICME51207.2021.9428350 |description=Foveated 360° video.}}</ref>
<ref name="guenter2012">{{cite conference |last=Guenter |first=Brian |last2=Grimes |first2=Mark |last3=Nehab |first3=Diego |last4=Sander |first4=Pedro V. |last5=Summa |first5=João |title=Efficient rerendering in viewport space |journal=ACM Transactions on Graphics |volume=31 |issue=6 |pages=1–13 |year=2012 |doi=10.1145/2366145.2366195}}</ref>
<ref name="MedicalVolumeRender">{{Cite conference |last=Gallo |first=Luigi |et al. |title=Foveation for 3D volume rendering |journal=2013 IEEE 15th International Conference on e-Health Networking, Applications and Services |pages=1–6 |year=2013 |doi=10.1109/HealthCom.2013.6720650 |description=Medical volume foveation.}}</ref>
<ref name="techcrunch2014">{{cite web |url=https://techcrunch.com/2014/09/09/fove/ |title=FOVE Uses Eye Tracking To Make Virtual Reality More Immersive |publisher=TechCrunch |date=2014-09-10}}</ref>
<ref name="HolographicFoveation">{{Cite conference |last=Chakravarthula |first=Praneeth |et al. |title=To the holograms born |journal=ACM Transactions on Graphics |volume=40 |issue=6 |pages=1–16 |year=2021 |doi=10.1145/3486108 |description=Holographic foveation.}}</ref>
<ref name="kickstarter2015">{{cite web |url=https://www.kickstarter.com/projects/fove/fove-the-worlds-first-eye-tracking-virtual-reality |title=FOVE: The World's First Eye Tracking Virtual Reality Headset |publisher=Kickstarter |date=2015-09-01}}</ref>
<ref name="uploadvr2016">{{cite web |url=https://uploadvr.com/smi-hands-on-250hz-eye-tracking/ |title=SMI's 250Hz Eye Tracking and Foveated Rendering Are For Real, and the Cost May Surprise You |publisher=UploadVR |date=2016-01-15}}</ref>
<ref name="nvidia2016">{{cite web |url=https://blogs.nvidia.com/blog/2016/07/21/rendering-foveated-vr/ |title=NVIDIA Partners with SMI on Innovative Rendering Technique That Improves VR |publisher=NVIDIA |date=2016-07-21}}</ref>
<ref name="digitaltrends2016">{{cite web |url=https://www.digitaltrends.com/computing/nvidia-research-foveated-rendering-vr-smi/ |title=Nvidia plans to prove that new method improves image quality in virtual reality |publisher=Digital Trends |date=2016-07-23}}</ref>
<ref name="tomshardware2016">{{cite web |url=https://www.tomshardware.com/news/fove-vr-first-look-ces,30964.html |title=Exclusive: Fove's VR HMD At CES 2016 |publisher=Tom's Hardware |date=2016-01-11}}</ref>
<ref name="qualcomm2017">{{cite web |url=https://www.qualcomm.com/news/releases/2017/02/23/qualcomm-introduces-snapdragon-835-virtual-reality-development-kit |title=Qualcomm Introduces Snapdragon 835 Virtual Reality Development Kit |publisher=Qualcomm |date=2017-02-23}}</ref>
<ref name="starvr2018">{{cite web |url=https://arstechnica.com/gaming/2018/08/starvr-one-is-a-premium-vr-headset-with-built-in-eye-tracking/ |title=StarVR One is a premium VR headset with built-in eye-tracking |publisher=Ars Technica |date=2018-08-14}}</ref>
<ref name="theverge2019">{{cite web |url=https://www.theverge.com/2019/1/7/18172064/htc-vive-pro-eye-vr-headset-eye-tracking-announced-features-price-release |title=HTC announces new Vive Pro Eye VR headset with native eye tracking |publisher=The Verge |date=2019-01-07}}</ref>
<ref name="venturebeat2019">{{cite web |url=https://venturebeat.com/2019/12/22/oculus-quest-gets-dynamic-fixed-foveated-rendering/ |title=Oculus Quest gets dynamic fixed foveated rendering |publisher=VentureBeat |date=2019-12-22}}</ref>
<ref name="deepfovea2019">{{cite web |url=https://dl.acm.org/doi/10.1145/3306307.3328186 |title=DeepFovea: Neural Reconstruction for Foveated Rendering |publisher=ACM SIGGRAPH |year=2019}}</ref>
<ref name="sonyblog2022">{{cite web |url=https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/ |title=PlayStation VR2 and PlayStation VR2 Sense controller: The next generation of VR gaming on PS5 |publisher=PlayStation Blog |date=2022-01-04}}</ref>
<ref name="pcmag2025">{{cite web |url=https://www.pcmag.com/picks/the-best-vr-headsets |title=The Best VR Headsets for 2025 |publisher=PC Magazine}}</ref>
<ref name="UnityFR_Roadmap">{{cite web |url=https://unity.com/roadmap/1356-xr-foveated-rendering |title=XR Foveated Rendering - Unity Roadmap}}</ref>
<ref name="UnityFR_Roadmap">{{cite web |url=https://unity.com/roadmap/1356-xr-foveated-rendering |title=XR Foveated Rendering - Unity Roadmap}}</ref>
<ref name="UnityDocsFR">{{cite web |url=https://docs.unity3d.com/6000.2/Documentation/Manual/xr-foveated-rendering.html |title=Foveated rendering - Unity Manual}}</ref>
<ref name="UnityDocsFR">{{cite web |url=https://docs.unity3d.com/6000.2/Documentation/Manual/xr-foveated-rendering.html |title=Foveated rendering - Unity Manual}}</ref>
<ref name="unity2024">{{cite web |url=https://docs.unity3d.com/6000.0/Documentation/Manual/xr-foveated-rendering.html |title=Foveated rendering |publisher=Unity Documentation |year=2024}}</ref>
<ref name="PicoUnityFFR">{{cite web |url=https://developer.picoxr.com/document/unity/fixed-foveated-rendering/ |title=Fixed foveated rendering - PICO Unity Integration SDK}}</ref>
<ref name="MetaUnityFFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/unity-fixed-foveated-rendering/ |title=Using Fixed Foveated Rendering - Unity}}</ref>
<ref name="MetaUnityFFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/unity-fixed-foveated-rendering/ |title=Using Fixed Foveated Rendering - Unity}}</ref>
<ref name="PicoUnityETFR">{{cite web |url=https://developer.picoxr.com/document/unity/eye-tracked-foveated-rendering/ |title=Eye tracked foveated rendering - PICO Unity Integration SDK}}</ref>
<ref name="PicoUnityETFR">{{cite web |url=https://developer.picoxr.com/document/unity/eye-tracked-foveated-rendering/ |title=Eye tracked foveated rendering - PICO Unity Integration SDK}}</ref>
<ref name="ViveUnrealFFR">{{cite web |url=https://developer.vive.com/resources/openxr/unreal/unreal-tutorials/rendering/foveated-rendering/ |title=Foveated Rendering - VIVE OpenXR Unreal}}</ref>
<ref name="ViveUnrealFFR">{{cite web |url=https://developer.vive.com/resources/openxr/unreal/unreal-tutorials/rendering/foveated-rendering/ |title=Foveated Rendering - VIVE OpenXR Unreal}}</ref>
<ref name="UnrealVRS_FFR">{{cite web |url=https.dev.epicgames.com/documentation/en-us/unreal-engine/vr-performance-features?application_version=4.27 |title=VR Performance Features - Unreal Engine 4.27 Documentation}}</ref>
<ref name="MetaUnrealETFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unreal/unreal-eye-tracked-foveated-rendering/ |title=Eye Tracked Foveated Rendering - Unreal}}</ref>
<ref name="PSVR2_TechSpecs">{{cite web |url=https.www.playstation.com/en-ca/ps-vr2/ps-vr2-tech-specs/ |title=PlayStation VR2 tech specs | PlayStation}}</ref>
<ref name="UnrealVRS_FFR">{{cite web |url=https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-performance-features?application_version=4.27 |title=VR Performance Features - Unreal Engine 4.27 Documentation}}</ref>
<ref name="openxr2024">{{cite web |url=https://www.uploadvr.com/openxr-1-1/ |title=OpenXR 1.1 Brings Foveated Rendering & More Into The Spec |publisher=UploadVR |year=2024}}</ref>
<ref name="MetaQuestProUnityETFR">{{cite web |url=https://developers.meta.com/horizon/documentation/unity/unity-eye-tracked-foveated-rendering/ |title=Eye Tracked Foveated Rendering - Unity}}</ref>
<ref name="PSVR2_TechSpecs">{{cite web |url=https://www.playstation.com/en-ca/ps-vr2/ps-vr2-tech-specs/ |title=PlayStation VR2 tech specs | PlayStation}}</ref>
<ref name="PicoUnrealLegacy">{{cite web |url=https://developer.picoxr.com/document/unreal/fixed-foveated-rendering/ |title=Foveated rendering - PICO Unreal Integration SDK}}</ref>
<ref name="VarjoAeroSpecs">{{cite web |url=https://varjo.com/products/aero |title=Varjo Aero - Varjo}}</ref>
<ref name="VarjoAeroSpecs">{{cite web |url=https://varjo.com/products/aero |title=Varjo Aero - Varjo}}</ref>
<ref name="VarjoEyeTrackingSpecs">{{cite web |url=https://developer.varjo.com/docs/get-started/eye-tracking-with-varjo-headset |title=Eye tracking with Varjo headset}}</ref>
<ref name="VarjoEyeTrackingSpecs">{{cite web |url=https://developer.varjo.com/docs/get-started/eye-tracking-with-varjo-headset |title=Eye tracking with Varjo headset}}</ref>
<ref name="VarjoFoveationPage">{{cite web |url=https://support.varjo.com/hc/en-us/foveated-rendering |title=Foveated rendering - Varjo Support}}</ref>
<ref name="PimaxCrystalSuper">{{cite web |url=https://pimax.com/pages/pimax-crystal-super |title=Pimax Crystal Super}}</ref>
<ref name="PimaxCrystalSuper">{{cite web |url=https://pimax.com/pages/pimax-crystal-super |title=Pimax Crystal Super}}</ref>
<ref name="PCMagBestVR">{{cite web |url=https.www.pcmag.com/picks/the-best-vr-headsets |title=The Best VR Headsets for 2025}}</ref>
<ref name="PimaxDFR_About">{{cite web |url=https://pimax.com/blogs/blogs/about-dynamic-foveated-rendering-dfr-in-virtual-reality-vr |title=About Dynamic Foveated Rendering (DFR) in Virtual Reality (VR)}}</ref>
<ref name="VarjoFoveationPage">{{cite web |url=https://support.varjo.com/hc/en-us/foveated-rendering |title=Foveated rendering - Varjo Support}}</ref>
<ref name="UnityFR_Test_Reddit">{{cite web |url=https://www.reddit.com/r/oculus/comments/3bls3q/unity_foveated_rendering_test_4x_fps_increase/ |title=unity foveated rendering test 4x fps increase with a pretty simple rendering strategy}}</ref>
<ref name="PimaxDFR_About">{{cite web |url=httpshttps://pimax.com/blogs/blogs/about-dynamic-foveated-rendering-dfr-in-virtual-reality-vr |title=About Dynamic Foveated Rendering (DFR) in Virtual Reality (VR)}}</ref>
<ref name="PicoUnrealLegacy">{{cite web |url=https://developer.picoxr.com/document/unreal/fixed-foveated-rendering/ |title=Foveated rendering - PICO Unreal Integration SDK}}</ref>
<ref name="SIGA16_GazeTracked">{{cite web |url=http://cwyman.org/papers/siga16_gazeTrackedFoveatedRendering.pdf |title=A Perceptually-Based Foveated Real-Time Renderer}}</ref>
<ref name="SIGA16_GazeTracked">{{cite web |url=http://cwyman.org/papers/siga16_gazeTrackedFoveatedRendering.pdf |title=A Perceptually-Based Foveated Real-Time Renderer}}</ref>
<ref name="PerceptuallyBasedFR_Nvidia">{{cite web |url=https::research.nvidia.com/publication/2016-07_perceptually-based-foveated-virtual-reality |title=Perceptually-Based Foveated Virtual Reality | Research}}</ref>
<ref name="PerceptuallyBasedFR_Nvidia">{{cite web |url=https://research.nvidia.com/publication/2016-07_perceptually-based-foveated-virtual-reality |title=Perceptually-Based Foveated Virtual Reality | Research}}</ref>
<ref name="QualcommSpacesUnityFR">{{cite web |url=httpss://docs.spaces.qualcomm.com/unity/setup/foveated-rendering |title=Foveated Rendering - Snapdragon Spaces}}</ref>
<ref name="franke2021">{{cite web |url=https://onlinelibrary.wiley.com/doi/10.1111/cgf.14176 |title=Time-Warped Foveated Rendering for Virtual Reality Headsets |publisher=Computer Graphics Forum |year=2021}}</ref>
<ref name="PSVR2_DevTime">{{cite web |url=httpshttps://www.reddit.com/r/PSVR/comments/1eacq3v/do_all_psvr2_games_use_foverated_rendering/ |title=Do all PSVR2 games use foverated rendering?}}</ref>
<ref name="QualcommSpacesUnityFR">{{cite web |url=https://docs.spaces.qualcomm.com/unity/setup/foveated-rendering |title=Foveated Rendering - Snapdragon Spaces}}</ref>
<ref name="Levoy1990">{{Cite journal |last=Levoy |first=Marc |last2=Whitaker |first2=Robert |title=Gaze-directed volume rendering |journal=Proceedings of the 1990 Symposium on Interactive 3D Graphics |pages=361–369 |year=1990 |doi=10.1145/91394.91431 |description=Foundational gaze-directed rendering paper.}}</ref>
<ref name="PSVR2_DevTime">{{cite web |url=https://www.reddit.com/r/PSVR/comments/1eacq3v/do_all_psvr2_games_use_foverated_rendering/ |title=Do all PSVR2 games use foverated rendering?}}</ref>
<ref name="FOVE_TechCrunch">{{Cite web |title=FOVE Uses Eye Tracking To Make Virtual Reality More Immersive |url=https://techcrunch.com/2014/09/09/fove/ |publisher=TechCrunch |date=2014-09-10 |access-date=2025-10-26}}</ref>
<ref name="fovolnet2022">{{cite web |url=https://arxiv.org/abs/2209.09965 |title=FoVolNet: Fast Volume Rendering using Foveated Deep Neural Networks |publisher=arXiv |year=2022}}</ref>
<ref name="FOVE_Kickstarter">{{Cite web |title=FOVE: The World's First Eye Tracking Virtual Reality Headset |url=https://www.kickstarter.com/projects/fove/fove-the-worlds-first-eye-tracking-virtual-reality |publisher=Kickstarter |date=2015-09-01 |access-date=2025-10-26}}</ref>
<ref name="fovealnet2024">{{cite web |url=https://arxiv.org/abs/2412.10456 |title=FovealNet: Advancing AI-Driven Gaze Tracking Solutions for Optimized Foveated Rendering System Performance in Virtual Reality |publisher=arXiv |year=2024}}</ref>
<ref name="SMI_UploadVR">{{Cite web |title=SMI's 250Hz Eye Tracking and Foveated Rendering Are For Real |url=https://uploadvr.com/smi-hands-on-250hz-eye-tracking/ |publisher=UploadVR |date=2016-01-15 |access-date=2025-10-26}}</ref>
<ref name="LuminanceContrastAware">{{cite web |url=https://history.siggraph.org/learning/luminance-contrast-aware-foveated-rendering-by-tursun-arabadzhiyska-koleva-wernikowski-mantiuk-seidel-et-al/ |title="Luminance-Contrast-Aware Foveated Rendering" by Tursun, Arabadzhiyska-Koleva, Wernikowski, Mantiuk, Seidel, et al.}}</ref>
<ref name="Nvidia_Blog_SMI">{{Cite web |title=NVIDIA Partners with SMI on Innovative Rendering Technique That Improves VR |url=https://blogs.nvidia.com/blog/2016/07/21/rendering-foveated-vr/ |publisher=NVIDIA |date=2016-07-21 |access-date=2025-10-26}}</ref>
<ref name="IndividualizedFR">{{cite web |url=https://www.researchgate.net/publication/377532315_Individualized_foveated_rendering_with_eye-tracking_head-mounted_display |title=Individualized foveated rendering with eye-tracking head-mounted display}}</ref>
<ref name="Qualcomm835">{{Cite web |title=Qualcomm Introduces Snapdragon 835 Virtual Reality Development Kit |url=https://www.qualcomm.com/news/releases/2017/02/23/qualcomm-introduces-snapdragon-835-virtual-reality-development-kit |publisher=Qualcomm |date=2017-02-23 |access-date=2025-10-26}}</ref>
<ref name="EyeDominanceGuided">{{cite web |url=https://research.google/pubs/eye-dominance-guided-foveated-rendering/ |title=Eye-Dominance-Guided Foveated Rendering}}</ref>
<ref name="ViveProEye_Verge">{{Cite web |title=HTC announces new Vive Pro Eye virtual reality headset with native eye tracking |url=https://www.theverge.com/2019/1/7/18172700/htc-vive-pro-eye-tracking-virtual-reality-headset-features-pricing-release-date-ces-2019 |publisher=The Verge |date=2019-01-07 |access-date=2025-10-26}}</ref>
<ref name="QuestFFR_VentureBeat">{{Cite web |title=Oculus Quest gets dynamic fixed foveated rendering |url=https://venturebeat.com/2019/12/22/oculus-quest-gets-dynamic-fixed-foveated-rendering/ |publisher=VentureBeat |date=2019-12-22 |access-date=2025-10-26}}</ref>
<ref name="PSVR2_Blog">{{Cite web |title=PlayStation VR2 and PlayStation VR2 Sense controller: The next generation of VR gaming on PS5 |url=https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/ |publisher=PlayStation Blog |date=2Details. |access-date=2025-10-26}}</ref>
<ref name="VisionPro_Dev">{{Cite web |title=Discover visionOS |url=https://developer.apple.com/visionos/ |publisher=Apple Developer |date=2023-06-07 |access-date=2025-10-26}}</ref>
<ref name="DeepFovea_Paper">{{cite web |title=DeepFovea: Neural Reconstruction for Foveated Rendering |url=https://dl.acm.org/doi/10.1145/3306307.3328186 |publisher=ACM SIGGRAPH |year=2019}}</ref>
<ref name="fovolnet2022">{{cite web |title=FoVolNet: Fast Volume Rendering using Foveated Deep Neural Networks |url=https://arxiv.org/abs/2209.09965 |publisher=arXiv |year=2022}}</ref>
<ref name="vrsplatting2024">{{cite web |title=VR-Splatting: Foveated Radiance Field Rendering via 3D Gaussian Splatting and Neural Points |url=https://dl.acm.org/doi/10.1145/3728302 |publisher=ACM |year=2024}}</ref>
</references>
</references>
==External links==
* [https://www.tobii.com/products/integration/xr-headsets Tobii XR Eye Tracking]
* [https://developers.meta.com/horizon/documentation/unity/unity-eye-tracked-foveated-rendering Meta Quest Eye Tracked Foveated Rendering Documentation]
* [https://docs.unity3d.com/Manual/xr-foveated-rendering.html Unity Foveated Rendering Documentation]
* [https://microsoft.github.io/DirectX-Specs/d3d/VariableRateShading.html DirectX Variable Rate Shading Specification]
* [https://developer.varjo.com/docs/native/foveated-rendering-api Varjo Foveated Rendering API Documentation]
* [https://mbucchia.github.io/OpenXR-Toolkit/fr.html OpenXR Toolkit Foveated Rendering]


[[Category:Terms]]
[[Category:Technology]]
[[Category:Rendering techniques]]
[[Category:Virtual reality]]
[[Category:Augmented reality]]
[[Category:Computer graphics]]
[[Category:Eye tracking]]
[[Category:Graphics optimization]]

Revision as of 22:21, 25 October 2025


Foveated rendering is a computer graphics performance optimization technique that leverages the known properties of the human visual system (HVS) to reduce the computational workload on a GPU.[1][2] The technique is based on the biological fact that human visual acuity is not uniform across the visual field; it is highest in the very center of the gaze, a region known as the fovea, and drops off sharply in the peripheral vision.[3][4]

By rendering the area of the image that falls on the user's fovea at the highest resolution and progressively reducing the quality of the image in the periphery, foveated rendering can achieve significant performance gains with little to no perceptible loss in visual quality.[5][6] This makes it a critical enabling technology for virtual reality (VR) and augmented reality (AR) head-mounted displays (HMDs), which must render high-resolution, stereoscopic images at very high frame rates to provide a comfortable and immersive experience.[7]

Implementations of foveated rendering are broadly categorized into two types: fixed foveated rendering (FFR), which assumes the user is always looking at the center of the screen, and dynamic (or eye-tracked) foveated rendering (ETFR or DFR), which uses integrated eye tracking hardware to update the high-quality region in real-time to match the user's gaze.[8]

Biological Foundation: The Human Visual System

The efficacy of foveated rendering is entirely dependent on the unique, non-uniform characteristics of the human visual system. The design of the human retina is the biological blueprint that computer graphics engineers seek to mimic for performance optimization.

Foveal vs. Peripheral Vision

The retina is not a uniform sensor. It contains a small, specialized central region called the fovea, which is responsible for sharp, detailed, and color-rich central vision (also known as foveal vision).[9] This region is densely packed with cone cells, the photoreceptors responsible for high-acuity and color perception. The fovea covers only about 1-2 degrees of the visual field (approximately 2.6-3.6° in total span), yet it consumes approximately 50% of the neural resources in the visual cortex.[10][11]

As one moves away from the fovea into the peripheral vision, the density of cone cells decreases rapidly, while the density of rod cells, which are more sensitive to light and motion but not to color or fine detail, increases.[12] This anatomical arrangement means that our ability to perceive detail, color, and stereoscopic depth diminishes significantly with increasing eccentricity (the angular distance from the point of gaze).[13]

Human visual acuity falls off rapidly with eccentricity: it is highest within 5 degrees of the gaze point and drops to about 10% of peak at 30 degrees.[14] Visual acuity follows a hyperbolic decay model in which the minimum resolvable angle increases linearly with eccentricity from the gaze point, described by the equation ω(e) = me + ω₀, where e is the eccentricity in degrees, m is the slope of the acuity falloff, and ω₀ is the smallest resolvable angle at the fovea.[15]
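
This falloff can be turned directly into a per-region quality budget. The following Python sketch is illustrative only: the slope m and foveal limit ω₀ are assumed constants chosen so that relative acuity drops to roughly 10% at 30° (matching the figure above), and the tier thresholds are arbitrary rather than taken from any published system.

<syntaxhighlight lang="python">
def minimum_angle_of_resolution(eccentricity_deg, m=0.005, omega_0=1.0 / 60.0):
    """Linear MAR model from the text: omega(e) = m*e + omega_0 (degrees).
    m and omega_0 are illustrative constants, not values for a specific user."""
    return m * eccentricity_deg + omega_0

def relative_acuity(eccentricity_deg):
    """Acuity relative to the fovea: 1.0 at e = 0, ~0.1 at e = 30 with these constants."""
    return minimum_angle_of_resolution(0.0) / minimum_angle_of_resolution(eccentricity_deg)

def quality_tier(eccentricity_deg):
    """Map relative acuity onto coarse foveal / transition / peripheral tiers."""
    acuity = relative_acuity(eccentricity_deg)
    if acuity > 0.5:
        return "full resolution"
    if acuity > 0.2:
        return "half resolution"
    return "quarter resolution"

for e in (0, 2, 5, 10, 20, 30):
    print(f"{e:2d} deg  acuity={relative_acuity(e):.2f}  ->  {quality_tier(e)}")
</syntaxhighlight>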

However, our peripheral vision is highly attuned to detecting motion and flicker.[16] Contrast sensitivity also varies with eccentricity, with peak sensitivity occurring at 3-5 cycles per degree in the fovea but shifting below 1 cycle per degree at 30° eccentricity.[17]

Foveated rendering exploits this exact trade-off. It allocates the bulk of the GPU's rendering budget to the small foveal region of the image where the user's eye can actually perceive high detail, and saves resources by rendering the much larger peripheral areas at a lower quality. The subjective experience of a uniformly high-resolution world is maintained because the brain naturally integrates the high-resolution "snapshots" from the fovea as the eyes rapidly scan the environment through quick movements called saccades.[13]

Perceptual Phenomena: Saccadic Masking and Visual Attention

Two key perceptual phenomena make foveated rendering even more effective and are critical for its implementation.

The first is saccadic masking (also known as saccadic suppression), a mechanism where the brain selectively blocks visual processing during a saccade.[18] This prevents the perception of motion blur as the eyes sweep across the visual field, effectively creating a brief window of functional blindness. This period of suppressed sensitivity begins about 50 ms before a saccade and lasts until about 100 ms after it begins.[19] Human eyes can perform saccades at up to 900-1000 degrees per second.[20]
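
As a rough illustration of how these timings interact with rendering, the sketch below checks whether an end-to-end gaze update can land inside the post-onset suppression window quoted above. The individual stage latencies are assumptions for the example, not measurements of any particular headset.

<syntaxhighlight lang="python">
SUPPRESSION_AFTER_ONSET_MS = 100.0   # approximate post-onset masking window (from the text)

def update_hidden_by_masking(tracker_ms, processing_ms, render_ms, scanout_ms):
    """True if the foveal region can be moved to the new gaze point before the
    saccadic suppression window closes (deliberately simplified model)."""
    total_latency_ms = tracker_ms + processing_ms + render_ms + scanout_ms
    return total_latency_ms, total_latency_ms <= SUPPRESSION_AFTER_ONSET_MS

# Example: 8 ms tracker sample, 4 ms gaze processing, 11.1 ms render (90 Hz), 11 ms scan-out
print(update_hidden_by_masking(8.0, 4.0, 11.1, 11.0))   # about 34 ms, inside the window
</syntaxhighlight>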

The second phenomenon is visual attention. Research has shown that the HVS's capabilities are not static but are modulated by cognitive factors. When a user is concentrating on a visually demanding task at their point of gaze, their contrast sensitivity in the periphery drops significantly.[21][22]

Core Principles and Technical Methodologies

Transitioning from the biological "why" to the technical "how," foveated rendering is implemented through a combination of gaze-tracking paradigms and specific GPU-level rendering techniques.

The Gaze-Contingent Paradigm

At its core, dynamic foveated rendering is an application of the gaze-contingency paradigm, a concept in human-computer interaction where a system's display changes in real-time based on where the user is looking.[1][23] The typical rendering pipeline for a gaze-contingent foveated system operates on a per-frame basis:[9]

  1. Gaze Capture: An integrated eye tracker, typically using infrared cameras, captures images of the user's eyes.
  2. Gaze Vector Calculation: Image processing algorithms determine the orientation of each eye to calculate a precise gaze vector.
  3. Fixation Point Determination: The gaze vector is projected into the virtual scene to find the fixation point on the 2D display surface.
  4. Region Definition: The system defines concentric regions of varying quality around the fixation point. These typically include a high-resolution foveal region, a medium-resolution parafoveal or transition region, and a low-resolution peripheral region.
  5. Instruction to GPU: The graphics pipeline is instructed to render each of these regions at its designated quality level using one of the methods described below.
  6. Display Update: The final, composited multi-resolution image is presented to the user.

This entire loop must be completed within the frame budget (e.g., under 11.1 ms for a 90 Hz display) to ensure a smooth experience.
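
A minimal sketch of this per-frame loop is shown below. It is purely schematic: the eye_tracker, renderer, and compositor objects and their methods are hypothetical placeholders, and the region radii and resolution scales are example values rather than any vendor's defaults.

<syntaxhighlight lang="python">
from dataclasses import dataclass

@dataclass
class FoveationRegion:
    radius_deg: float         # angular radius around the fixation point
    resolution_scale: float   # 1.0 = full resolution, lower = cheaper

# Foveal, transition, and peripheral regions (illustrative values)
REGIONS = [
    FoveationRegion(radius_deg=5.0,   resolution_scale=1.0),
    FoveationRegion(radius_deg=15.0,  resolution_scale=0.5),
    FoveationRegion(radius_deg=180.0, resolution_scale=0.25),
]

FRAME_BUDGET_MS = 1000.0 / 90.0   # ~11.1 ms per frame at 90 Hz

def render_frame(eye_tracker, scene, renderer, compositor):
    """One iteration of the gaze-contingent loop described above (steps 1-6),
    expressed with hypothetical engine objects."""
    sample = eye_tracker.sample()                        # 1. gaze capture
    gaze_vector = eye_tracker.to_gaze_vector(sample)     # 2. gaze vector calculation
    fixation = renderer.project_to_display(gaze_vector)  # 3. fixation point on the display
    for region in REGIONS:                               # 4.-5. region definition and GPU work
        renderer.draw(scene,
                      center=fixation,
                      radius_deg=region.radius_deg,
                      resolution_scale=region.resolution_scale)
    compositor.present()                                 # 6. display update
</syntaxhighlight>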

Methods of Quality Reduction

The term "reducing quality" encompasses several distinct techniques that can be applied to the peripheral regions to save computational power. These methods can be used individually or in combination:[5]

  • Resolution Scaling / Subsampling: This is the most common and intuitive method. The peripheral regions are rendered into a smaller off-screen buffer (e.g., at half or quarter resolution) and then upscaled to fit the final display. This directly reduces the number of pixels that need to be processed and shaded.[24] A simplified cost model combining this idea with reduced shading rates is sketched after this list.
  • Shading Rate Reduction: This method focuses on reducing the workload of the pixel shader (also known as a fragment shader). Instead of executing a complex shading program for every single pixel in the periphery, a single shader result can be applied to a block of multiple pixels. This is the core mechanism behind Variable Rate Shading (VRS).[12][25]
  • Geometric Simplification: The geometric complexity of the scene can be reduced in the periphery. This involves using lower-polygon level of detail models for objects that are outside the user's direct gaze.
  • Other Methods: More advanced or experimental techniques include chromatic degradation (reducing color precision, since the periphery is less sensitive to color), simplifying lighting and shadow calculations, and spatio-temporal deterioration.
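
The resolution-scaling and shading-rate ideas above can be combined into a back-of-the-envelope cost model. The sketch below is illustrative only: it assumes a square field of view, uniform pixel density per degree, and three concentric circular regions, which is much cruder than any real implementation.

<syntaxhighlight lang="python">
import math

def shaded_pixel_fraction(fov_deg, fovea_radius_deg, transition_radius_deg,
                          fovea_scale=1.0, transition_scale=0.5, periphery_scale=0.25):
    """Rough fraction of full-resolution pixel-shading work remaining after a
    three-region foveated layout (scales are per-axis resolution factors)."""
    total = fov_deg ** 2
    fovea = min(math.pi * fovea_radius_deg ** 2, total)
    transition = max(min(math.pi * transition_radius_deg ** 2, total) - fovea, 0.0)
    periphery = total - fovea - transition
    work = (fovea * fovea_scale ** 2 +
            transition * transition_scale ** 2 +
            periphery * periphery_scale ** 2)
    return work / total

# Example: 100 deg square FOV, 10 deg foveal radius, 20 deg transition radius
print(round(shaded_pixel_fraction(100, 10, 20), 2))   # ~0.11, i.e. roughly 89% fewer shaded pixels
</syntaxhighlight>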

Key Implementation Technologies

Modern GPUs and graphics APIs provide specialized features that make implementing foveated rendering highly efficient.

Variable Rate Shading (VRS)

Variable Rate Shading (VRS) is a hardware feature available on modern GPUs (e.g., NVIDIA Turing architecture and newer, AMD RDNA 2 and newer, Intel Gen11+) that provides fine-grained control over the pixel shading rate.[12][26][27] It allows a single pixel shader operation to compute the color for a block of pixels, such as a 2x2 or 4x4 block, instead of just a single pixel.[28][29] The technique supports shading rates from 1×1 (full quality) to 4×4 (coarse, one shade per 16 pixels).
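
The sketch below illustrates the idea of a gaze-driven shading-rate map at tile granularity, using the 1x1 / 2x2 / 4x4 tiers mentioned above. The tile size, radii, and string labels are illustrative assumptions; real implementations write these rates into an API-specific shading-rate image or surface rather than a Python list.

<syntaxhighlight lang="python">
def build_shading_rate_map(width_px, height_px, gaze_px, tile_px=16,
                           fovea_radius_px=300, transition_radius_px=700):
    """Illustrative VRS-style rate map: one entry per tile of pixels, with
    coarser shading rates the farther a tile is from the gaze point."""
    gx, gy = gaze_px
    tiles_x = (width_px + tile_px - 1) // tile_px
    tiles_y = (height_px + tile_px - 1) // tile_px
    rate_map = []
    for ty in range(tiles_y):
        row = []
        for tx in range(tiles_x):
            cx = tx * tile_px + tile_px / 2
            cy = ty * tile_px + tile_px / 2
            dist = ((cx - gx) ** 2 + (cy - gy) ** 2) ** 0.5
            if dist < fovea_radius_px:
                row.append("1x1")     # full-rate shading near the gaze point
            elif dist < transition_radius_px:
                row.append("2x2")     # one shade per 4 pixels
            else:
                row.append("4x4")     # one shade per 16 pixels in the far periphery
        rate_map.append(row)
    return rate_map

rates = build_shading_rate_map(2064, 2208, gaze_px=(1100, 1000))
print(rates[0][0], rates[len(rates) // 2][len(rates[0]) // 2])  # corner tile vs near-gaze tile
</syntaxhighlight>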

Multi-View Rendering & Quad Views

An alternative approach, notably used by Varjo and available in Unreal Engine, is to render multiple distinct views for each eye.[28][30] For example, a "Quad Views" implementation renders four views in total for a stereo image: a high-resolution central "focus" view for each eye, and a lower-resolution peripheral "context" view for each eye. These are then composited into the final image.[31]

Multiview rendering uses the OpenGL OVR_multiview family of extensions to render these views in a single pass; in one reported benchmark this reduced the shaded pixel count by 74.4%, and adding circular stencil masks increased the reduction to 78.4%.[32]

Fragment Density Maps (FDM)

At a lower level, graphics APIs like Vulkan provide powerful tools for foveation. The VK_EXT_fragment_density_map extension allows an application to provide the GPU with a small texture, known as a fragment density map, that specifies the desired shading rate for different parts of the render target.[33][34] Extensions like VK_QCOM_fragment_density_map_offset allow this map to be shifted efficiently without regenerating it each frame, reducing latency.[35]
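
The sketch below only illustrates what the CPU-side contents of such a density map might look like; it does not use the Vulkan API, and the map resolution and falloff curve are assumptions chosen for readability.

<syntaxhighlight lang="python">
def build_fragment_density_map(map_w=32, map_h=32, gaze_uv=(0.5, 0.5), min_density=0.125):
    """Illustrative fragment-density-map contents: a small grid of values in
    (0, 1], where 1.0 requests full shading density for that part of the
    framebuffer and lower values request proportionally fewer fragments."""
    gx, gy = gaze_uv
    fdm = []
    for y in range(map_h):
        row = []
        for x in range(map_w):
            u, v = (x + 0.5) / map_w, (y + 0.5) / map_h
            dist = ((u - gx) ** 2 + (v - gy) ** 2) ** 0.5    # 0 at the gaze point
            density = max(min_density, 1.0 - 1.5 * dist)     # linear falloff (illustrative)
            row.append(round(density, 3))
        fdm.append(row)
    return fdm

# Extensions such as VK_QCOM_fragment_density_map_offset let the runtime shift an
# existing map to follow the gaze; this sketch simply regenerates it for clarity.
fdm = build_fragment_density_map(gaze_uv=(0.6, 0.45))
print(fdm[16][19], fdm[0][0])   # high density near the gaze point, clamped minimum in the corner
</syntaxhighlight>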

Kernel Foveated Rendering

Kernel foveated rendering applies log-polar coordinate transformations based on cortical magnification models.[36] The forward transform maps screen coordinates to a reduced buffer using mathematical kernels, achieving 2.16× speedup with appropriate parameterization.
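
A minimal sketch of one possible forward and inverse mapping is shown below. The kernel shape (a normalized logarithmic curve) and the alpha parameter are illustrative stand-ins, not the published parameterization of kernel foveated rendering.

<syntaxhighlight lang="python">
import math

def to_log_polar(px, py, gaze, buf_w, buf_h, screen_radius, alpha=4.0):
    """Forward transform: map a screen-space pixel to reduced log-polar buffer
    coordinates centred on the gaze point. alpha controls how strongly samples
    are concentrated near the fovea (illustrative kernel parameter)."""
    dx, dy = px - gaze[0], py - gaze[1]
    r = math.hypot(dx, dy)
    theta = math.atan2(dy, dx)                                      # [-pi, pi]
    u = math.log1p(alpha * r / screen_radius) / math.log1p(alpha)   # radius -> [0, 1]
    v = (theta + math.pi) / (2 * math.pi)                           # angle  -> [0, 1]
    return u * (buf_w - 1), v * (buf_h - 1)

def from_log_polar(u, v, gaze, buf_w, buf_h, screen_radius, alpha=4.0):
    """Inverse transform used when resampling the reduced buffer back to the
    full-resolution display."""
    un, vn = u / (buf_w - 1), v / (buf_h - 1)
    r = screen_radius * math.expm1(un * math.log1p(alpha)) / alpha
    theta = vn * 2 * math.pi - math.pi
    return gaze[0] + r * math.cos(theta), gaze[1] + r * math.sin(theta)
</syntaxhighlight>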

Types of Foveated Rendering

Foveated rendering is not a monolithic technology but a category of techniques that can be broadly classified based on whether they utilize real-time gaze data.

Fixed Foveated Rendering (FFR)

Fixed Foveated Rendering is the most basic implementation of the concept. It operates without any eye-tracking hardware and instead relies on the assumption that a user will predominantly look towards the center of the screen.[1][37] Consequently, FFR systems render a static, high-resolution region in the center of each eye's display, while the quality degrades in fixed concentric rings towards the edges.[38]

Advantages:

  • No Eye Tracking Required: The primary benefit is that it does not require the additional cost, power consumption, and complexity of integrated eye-tracking cameras. This makes it an ideal optimization for more affordable standalone headsets like the Meta Quest 2 and Meta Quest 3.[39][40]
  • Simplicity: It is relatively simple for developers to implement and for hardware to support.

Disadvantages:

  • Sub-optimal Gains: Because the system cannot know where the user is actually looking, the central high-quality region must be made conservatively large to account for natural eye movements. This limits the potential performance savings compared to dynamic systems.[41]
  • Visible Artifacts: If a user moves their eyes to look at the periphery without turning their head, they can easily notice the drop in resolution.[39][42]

Dynamic (Eye-Tracked) Foveated Rendering (ETFR / DFR)

Dynamic Foveated Rendering represents the full realization of the concept. It requires a head-mounted display with integrated eye-tracking cameras to determine the user's precise point of gaze in real-time.[6][1] The high-resolution foveal region is then dynamically moved to match this gaze point on a frame-by-frame basis, ensuring that the user is always looking at a fully rendered part of the scene.[43]

Advantages:

  • Maximum Performance: ETFR allows for much more aggressive foveation—a smaller foveal region and a more significant quality reduction in the periphery—resulting in substantially greater performance and power savings.[40][29]
  • Perceptually Seamless: When implemented with low latency, the effect is imperceptible to the user.[6]

Disadvantages:

  • Hardware Requirements: It is entirely dependent on the presence and quality of eye-tracking hardware, which increases the cost, weight, and power consumption of the HMD.
  • Sensitivity to Latency: The technique is highly sensitive to system latency. If the delay between an eye movement and the corresponding display update is too long, the user will perceive artifacts.[19]
Comparison of Foveated Rendering Methodologies

  • Core Principle: FFR assumes the user is looking toward the center of the display and keeps the high-quality region fixed there. ETFR/DFR tracks the user's real-time gaze to place the high-quality region precisely where they are looking.
  • Hardware Requirement: FFR needs none (beyond a capable GPU). ETFR/DFR requires integrated eye-tracking cameras and processing hardware.
  • High-Quality Region: In FFR it is static, centered, and conservatively large to cover natural eye movements. In ETFR/DFR it is dynamic, moves with the user's fovea, and can be made smaller and more aggressive.
  • User Experience: With FFR, the quality drop in the periphery can be noticed if the user looks away from the center without turning their head.[39][42] With ETFR/DFR, when latency is low the effect is imperceptible to the user, providing a consistently high-quality experience.[6]
  • Performance Savings: FFR offers moderate savings, limited by the conservatively sized central region.[41] ETFR/DFR savings are significant; more aggressive degradation leads to greater GPU savings (e.g., 33-52% reported for the Meta Quest Pro).[40][44]
  • Ideal Use Cases: FFR suits affordable standalone headsets without eye tracking, such as the Meta Quest 2 and Quest 3.[39][40] ETFR/DFR suits high-end PC VR and standalone headsets, demanding simulations, and applications seeking maximum visual fidelity and performance.[37]
  • Key Drawback: FFR provides sub-optimal gains and visible artifacts when gaze leaves the center. ETFR/DFR adds hardware cost, complexity, and power consumption, and is highly sensitive to system latency.

Predictive and Attention-Aware Foveation

As the technology matures, research is exploring more advanced forms of foveation that incorporate predictive and cognitive models.

  • Predictive Foveation: Some systems attempt to predict the landing point of a saccade based on its initial trajectory and velocity. This allows the rendering system to begin shifting the foveal region to the target destination before the eye movement is complete (see the sketch after this list).[18][40]
  • Attention-Aware Foveation: This is a cutting-edge research area that aims to model the user's cognitive state of attention. Peripheral visual sensitivity decreases when foveal attention is high.[21][22]
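
A deliberately naive sketch of the predictive idea is shown below; the velocity threshold and the amplitude heuristic are placeholder constants, and published systems rely on fitted main-sequence models and filtering rather than this simple extrapolation.

<syntaxhighlight lang="python">
def predict_saccade_landing(gaze_deg, velocity_deg_s, peak_velocity_deg_s,
                            onset_velocity_threshold=100.0):
    """Very rough saccade landing-point prediction from the current gaze sample.
    The amplitude estimate below is a placeholder, not a fitted relationship."""
    vx, vy = velocity_deg_s
    speed = (vx ** 2 + vy ** 2) ** 0.5
    if speed < onset_velocity_threshold:
        return gaze_deg                          # no saccade in flight: keep current gaze
    predicted_amplitude = peak_velocity_deg_s / 50.0   # placeholder amplitude heuristic (deg)
    ux, uy = vx / speed, vy / speed              # direction of travel
    return (gaze_deg[0] + ux * predicted_amplitude,
            gaze_deg[1] + uy * predicted_amplitude)

# Example: gaze at (0, 0) deg, moving right at 400 deg/s with a 500 deg/s peak
print(predict_saccade_landing((0.0, 0.0), (400.0, 0.0), 500.0))   # (10.0, 0.0)
</syntaxhighlight>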

Performance, Efficacy, and Benchmarks

The primary motivation for implementing foveated rendering is to improve performance. The efficacy of the technique can be measured through several key metrics, and real-world benchmarks demonstrate substantial gains across a variety of hardware platforms.

Metrics for Performance Gain

The benefits of foveated rendering are quantified using the following metrics:

  • GPU Frame Time: The most direct measurement of performance. This is the time, in milliseconds (ms), that the GPU takes to render a single frame.[45]
  • Frames Per Second (FPS): Lower frame times enable higher and more stable frame rates. Maintaining a high FPS (typically 90 FPS or more) is critical for a comfortable and immersive VR experience.[46]
  • Power Consumption: On battery-powered standalone headsets, reducing the GPU workload directly translates to lower power consumption, leading to longer battery life and reduced thermal output.[37]
  • Increased Visual Fidelity: Instead of simply increasing FPS, developers can reinvest the saved GPU performance for higher base resolution (supersampling), more complex lighting and shader effects, or higher-quality assets.[25][47]
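
The first two metrics are different views of the same quantity. The small sketch below converts between them and shows how much per-frame headroom a given GPU frame time leaves against a 90 Hz budget; the 9.2 ms input echoes the PlayStation VR2 figure in the benchmarks below.

<syntaxhighlight lang="python">
def fps_from_frame_time(frame_time_ms):
    """Frames per second implied by a GPU frame time in milliseconds."""
    return 1000.0 / frame_time_ms

def headroom_ms(frame_time_ms, target_hz=90.0):
    """Spare milliseconds per frame that could be reinvested in higher resolution
    or richer shading while still hitting the target refresh rate."""
    return (1000.0 / target_hz) - frame_time_ms

print(round(fps_from_frame_time(9.2), 1))   # ~108.7 FPS
print(round(headroom_ms(9.2), 2))           # ~1.91 ms spare per frame at 90 Hz
</syntaxhighlight>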

Real-World Performance Gains

Comprehensive Performance Benchmarks Across Platforms

  • Meta Quest 2 (various content): baseline 100%; FFR gain 26-36%; ETFR not available. Level 3 FFR achieves up to 43% savings.[44]
  • Meta Quest Pro (default resolution): baseline 100%; FFR gain 26-43%; ETFR gain 33-52%. ETFR provides a 7-9% additional benefit over FFR.[48]
  • Meta Quest Pro (Red Matter 2, default pixel density): ETFR enabled 33% more pixels, with 77% more total pixels in the optical center.[47]
  • PlayStation VR2 (Unity demo): baseline 33.2 ms GPU frame time; FFR 14.3 ms (2.5×); ETFR 9.2 ms (3.6×). Eye tracking provides the dramatic improvement.[49]
  • Varjo Aero (professional applications): baseline 100%; FFR gain 30-40%; ETFR gain 50-60%. 200 Hz eye tracking enables aggressive foveation.[50]
  • Pimax Crystal (VRS method): ETFR gain 10-40% with 120 Hz Tobii eye tracking.[46]
  • Pimax Crystal (Quad Views): ETFR gain 50-100% through more aggressive peripheral reduction.[46]
  • ARM Mali GPU (CircuitVR): baseline 488M GPU cycles reduced to 397M with FFR, an 18.6% cycle reduction.[32]
  • NVIDIA GTX 1080 (Shadow Warrior 2): baseline 60 FPS increased to 78 FPS with eye-tracked foveation, a 30% performance gain.[15]
  • PowerGS (3D Gaussian Splatting): power consumption reduced to 37% of baseline, a 63% power reduction.[51]

History

Research into foveated rendering dates back over three decades, evolving from theoretical psychophysics to practical implementations in immersive technologies.

  • 1990: Gaze-directed volume rendering (Levoy and Whitaker). First application of foveation to volume data visualization, reducing samples in peripheral regions.[52]
  • 1991: Foundational research (academic papers). Theoretical concept of adapting rendering to HVS acuity established.[1]
  • 1996: Gaze-directed adaptive rendering (Ohshima et al.). Introduced adaptive resolution based on eye position for virtual environments.[53]
  • 2001: Perceptually-driven simplification (Luebke and Hallen). LOD techniques guided by visual attention models.[54]
  • 2012: Foveated 3D graphics (Guenter et al.). Rasterization-based system achieving a 6.2× speedup in VR.[55]
  • 2014: First consumer HMD prototype (FOVE). Unveiled an eye-tracked VR headset with foveated rendering at TechCrunch Disrupt SF, the first public demonstration of foveated rendering in a VR headset.[56]
  • 2015: Kickstarter success (FOVE). Raised funds for production of the first commercial eye-tracked HMD with foveated rendering.[57]
  • 2016 (January): High-speed eye tracking demo (SMI). Demonstrated a 250 Hz eye tracking system with foveated rendering at CES, achieving a 2-4× performance boost with imperceptible quality loss.[58]
  • 2016 (July): SIGGRAPH demonstration (NVIDIA and SMI). NVIDIA demonstrated "perceptually-guided" foveated rendering techniques with a 50-66% pixel shading load reduction.[59][60]
  • 2016 (November): First commercial release (FOVE 0). FOVE 0 headset shipped to developers, the first commercially available HMD with integrated eye tracking and foveated rendering support.[61]
  • 2017: Mobile VR support (Qualcomm). Snapdragon 835 VRDK announced with "Adreno Foveation" for mobile VR, signaling the technology's arrival on mobile processors.[62]
  • 2017: SMI acquired (Apple). Apple acquired SMI, indicating growing interest in foveated rendering for AR/VR.[1]
  • 2018-2019: Enterprise adoption (StarVR One, Varjo VR-1). Professional headsets with integrated Tobii eye tracking for foveated rendering; Varjo's "bionic display" used hardware-level foveation.[63]
  • 2019 (January): Consumer eye tracking (HTC Vive Pro Eye). First mainstream consumer VR headset aimed at general users with dynamic foveated rendering support.[64]
  • 2019 (December): SDK support (Oculus Quest). Fixed Foveated Rendering exposed in the SDK, marking the first large-scale commercial deployment.[65]
  • 2020: Neural reconstruction (Facebook Reality Labs). DeepFovea demonstrated AI-based foveated reconstruction with up to a 10-14× pixel count reduction.[66]
  • 2021: Chipset integration (Qualcomm XR2). Built-in support for foveated rendering and eye tracking in a standalone VR chipset.[1]
  • 2022: Consumer ETFR (Meta Quest Pro). First mainstream standalone headset with Eye-Tracked Foveated Rendering, achieving 33-52% performance gains.[48][1]
  • 2023 (February): Console integration (PlayStation VR2). Eye tracking with foveated rendering standard in every unit, achieving up to a 3.6× speedup.[49][67]
  • 2024 (February): Spatial computing (Apple Vision Pro). High-end mixed reality headset with a sophisticated eye-tracking system for foveated rendering and interaction.[68]

Software Support

Game Engines

Unity

Unity provides native support for foveated rendering on supported XR platforms through its Scriptable Render Pipelines (URP and HDRP).[69][70] Unity 6 introduced cross-platform foveated rendering through the Scriptable Render Pipeline (SRP) Foveation API, supporting both VRS and Variable Rate Rasterization.[71]

Developers can enable the feature within the XR Plug-in Management project settings. At runtime, the strength of the foveation effect is controlled by setting the XRDisplaySubsystem.foveatedRenderingLevel property to a value between 0 (off) and 1 (maximum). To enable gaze-based foveation on supported hardware, the foveatedRenderingFlags property must be set to allow gaze input.[42][70]

Platform-specific SDKs provide their own wrappers and APIs. The Meta XR SDK and the PICO Unity Integration SDK expose dedicated components and functions for enabling and configuring both FFR and ETFR.[72][73][74]

Unreal Engine

Support for foveated rendering in Unreal Engine is managed through platform-specific plugins like the Meta XR Plugin or the PICO Unreal OpenXR Plugin.[75][29] Unreal Engine 5 implements foveated rendering through Variable Rate Shading for PCVR and Meta's OpenXR plugin for Quest devices.

Configuration is handled through Project Settings and console variables.[76][29] For example, developers can set the foveation level using a console command like xr.OpenXRFBFoveationLevel=2 for medium foveation. On mobile platforms, ETFR support is often a Vulkan-only feature.[76][29]

API and Standards

Graphics APIs

Variable Rate Shading (VRS) is a core feature of DirectX 12 (requiring Tier 2 support) and is also supported in Vulkan.[77][70] DirectX 12 Tier 2 Variable Rate Shading provides granular control through shading rate surfaces.

The Vulkan API offers powerful extensions for foveation. The VK_EXT_fragment_density_map extension allows the VR runtime to provide the GPU with a custom texture that dictates the rendering resolution across the framebuffer.[33][34]

OpenXR

OpenXR is a royalty-free, open standard from the Khronos Group that provides high-performance access to AR and VR platforms. OpenXR 1.1 standardized foveated rendering through multiple vendor extensions.[78] Key extensions include:

  • XR_FB_foveation
  • XR_FB_foveation_configuration
  • XR_META_foveation_eye_tracked
  • XR_VARJO_foveated_rendering

These extensions allow an application to query for foveation support, configure its parameters, and enable or disable it at runtime.[29][75] The OpenXR Toolkit can inject foveated rendering capabilities into OpenXR applications that may not natively support it.[26][30]
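
The C sketch below shows roughly how an application might use the FB extensions on a runtime that exposes them. It assumes the instance enabled XR_FB_foveation, XR_FB_foveation_configuration, and XR_FB_swapchain_update_state, that the swapchain was created with foveation allowed, and that error handling is omitted; structure and field names should be checked against the OpenXR registry:

  #include <openxr/openxr.h>

  // Hedged sketch: create an XR_FB_foveation profile at medium level and
  // apply it to an existing swapchain.
  void EnableFixedFoveationFB(XrInstance instance, XrSession session, XrSwapchain swapchain)
  {
      // Extension functions must be loaded through xrGetInstanceProcAddr.
      PFN_xrCreateFoveationProfileFB createProfile   = NULL;
      PFN_xrUpdateSwapchainFB        updateSwapchain = NULL;
      xrGetInstanceProcAddr(instance, "xrCreateFoveationProfileFB",
                            (PFN_xrVoidFunction*)&createProfile);
      xrGetInstanceProcAddr(instance, "xrUpdateSwapchainFB",
                            (PFN_xrVoidFunction*)&updateSwapchain);

      // Describe the desired foveation: medium level, no vertical offset,
      // and no runtime-driven (dynamic) level changes.
      XrFoveationLevelProfileCreateInfoFB levelInfo = {0};
      levelInfo.type           = XR_TYPE_FOVEATION_LEVEL_PROFILE_CREATE_INFO_FB;
      levelInfo.level          = XR_FOVEATION_LEVEL_MEDIUM_FB;
      levelInfo.verticalOffset = 0.0f;
      levelInfo.dynamic        = XR_FOVEATION_DYNAMIC_DISABLED_FB;

      XrFoveationProfileCreateInfoFB profileInfo = {0};
      profileInfo.type = XR_TYPE_FOVEATION_PROFILE_CREATE_INFO_FB;
      profileInfo.next = &levelInfo;

      XrFoveationProfileFB profile = XR_NULL_HANDLE;
      createProfile(session, &profileInfo, &profile);

      // Attach the profile to the swapchain (XR_FB_swapchain_update_state).
      XrSwapchainStateFoveationFB foveationState = {0};
      foveationState.type    = XR_TYPE_SWAPCHAIN_STATE_FOVEATION_FB;
      foveationState.profile = profile;
      updateSwapchain(swapchain, (const XrSwapchainStateBaseHeaderFB*)&foveationState);
  }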

Hardware Implementations

Consumer Devices

Foveated Rendering Capabilities of Major Commercial VR Headsets
Headset | Display Resolution (per eye) | Eye Tracking | Eye Tracker Specs | Foveated Rendering Support
Meta Quest 2 | 1832 x 1920 | No | N/A | Fixed Foveated Rendering (FFR) only[40]
Meta Quest 3 | 2064 x 2208 | No | N/A | Fixed Foveated Rendering (FFR) with improved efficiency[40]
Meta Quest Pro | 1800 x 1920 | Yes | Internal cameras, gaze prediction, 46-57 ms latency[47] | Eye-Tracked Foveated Rendering (ETFR) & FFR[1][79]
PlayStation VR2 | 2000 x 2040 | Yes | 1x IR camera per eye, Tobii technology[80] | Eye-Tracked Foveated Rendering (ETFR)[1][45]
HTC Vive Pro Eye | 1440 x 1600 | Yes | Tobii eye tracking, 120 Hz | Dynamic Foveated Rendering[64]
HTC Vive Focus 3 | 2448 x 2448 | No (add-on available) | N/A | Fixed Foveated Rendering (via VRS)[77]
Pico 4 Standard | 2160 x 2160 | No | N/A | Fixed Foveated Rendering[72]
Pico 4 Pro | 2160 x 2160 | Yes | Internal cameras | Eye-Tracked Foveated Rendering (ETFR) & FFR[74][81]
Apple Vision Pro | ~3660 x 3200 (est.) | Yes | High-speed cameras and IR illuminators, M2 chip processing | Eye-Tracked Foveated Rendering[68]

Professional & Enthusiast Devices

  • Varjo (Aero, XR-4, VR-3, etc.): Professional headsets with industry-leading visual fidelity featuring 200Hz Tobii eye-tracking system. Support advanced foveation techniques including both VRS and proprietary "dynamic projection" method. Achieve over 70 pixels per degree in focus area using bionic displays.[82][83][84]
  • Pimax (Crystal, Crystal Super): Enthusiast headsets with 2880 x 2880 per eye resolution, integrating 120Hz Tobii-powered eye tracking. Support both VRS (10-40% gain) and Quad Views rendering (50-100% gain).[85][46][86]
  • StarVR One: Enterprise headset with 210° field of view, integrated Tobii eye-tracking for foveated rendering across ultra-wide displays.[63]
  • FOVE 0: First commercially available HMD with integrated eye tracking and foveated rendering support (2016-2017). Featured infrared eye tracking and an approximately 100° field of view.[61]

Hardware Components

The quality of eye-tracking hardware directly impacts ETFR effectiveness. Key specifications include:

  • Frequency: Commercial headsets feature tracking frequencies from 120Hz to 200Hz. Higher frequencies reduce time between eye movement and detection.[46][82]
  • Accuracy: Sub-degree accuracy (typically 0.5-1.0°) is necessary to ensure correct foveal region placement. Mobile VR headsets typically achieve 0.5-1.1° accuracy, compared to sub-0.5° for research-grade systems.[11]
  • Latency: Total end-to-end delay from eye movement to data availability. It must remain below 50-70ms for artifacts to stay imperceptible, with latencies beyond 80-150ms causing significant quality degradation.[15]
  • Implementation: Typical systems use one or more small infrared (IR) cameras mounted inside the headset, aimed at each eye, and illuminated by IR LEDs to capture pupil and corneal reflections.[46][80]

Challenges and Limitations

Eye-Tracking Latency, Accuracy, and Jitter

The quality of the eye-tracking subsystem is the single most critical factor for the success of ETFR.

  • Latency: High end-to-end latency is the primary obstacle for eye-tracked foveated rendering. If the system cannot update the foveal region before the user's saccade completes and saccadic masking wears off, the user will perceive artifacts known as "pop-in."[87] Research indicates that while latencies of 80-150ms cause significant issues, a total system latency of 50-70ms can be tolerated.[19]
  • Accuracy and Jitter: The tracking system must be accurate enough to place the foveal region correctly. "Jitter," or small fluctuations in the reported gaze position, can cause the high-resolution area to shimmer or vibrate.

Perceptual Artifacts and Mitigation Strategies

Even with good eye tracking, aggressive or poorly implemented foveation can introduce noticeable visual artifacts.

  • "Tunnel Vision": If the peripheral region is blurred too aggressively or if the filtering process causes significant contrast loss, it creates a subjective feeling of looking through a narrow tunnel.[88]
  • Flicker and Aliasing: Simple subsampling can introduce temporal artifacts like shimmering and flickering, or spatial artifacts like jagged edges (aliasing) in the periphery.[89]
  • Edge Artifacts: Jagged boundaries between quality regions that become visible during smooth-pursuit eye movements.[90]
  • "Chasing" Effect: With excessive latency, users perceive the sharp high-resolution region visibly trailing behind their gaze.[90]

Mitigation strategies include creating smooth "blend" regions between quality zones, applying contrast enhancement to the periphery, and using sophisticated anti-aliasing algorithms.[5][88]
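
To make the blend-region idea concrete, the generic C sketch below (not tied to any vendor API; the angles and rates are illustrative, not recommendations) maps angular eccentricity from the gaze point to a shading-rate fraction with a full-quality foveal plateau, a smooth transition band, and a peripheral floor:

  // Generic illustration: map eccentricity from the gaze point (degrees) to a
  // shading-rate fraction, avoiding a hard seam between quality zones.
  static float smooth_step(float edge0, float edge1, float x)
  {
      float t = (x - edge0) / (edge1 - edge0);
      if (t < 0.0f) t = 0.0f;
      if (t > 1.0f) t = 1.0f;
      return t * t * (3.0f - 2.0f * t);
  }

  // Returns a value in [peripheral_rate, 1.0]; 1.0 means full shading rate.
  float shading_rate_for_eccentricity(float eccentricity_deg)
  {
      const float foveal_radius_deg = 5.0f;   // plateau kept at full quality (illustrative)
      const float blend_end_deg     = 20.0f;  // end of the smooth transition band
      const float peripheral_rate   = 0.25f;  // coarse rate used in the far periphery

      float t = smooth_step(foveal_radius_deg, blend_end_deg, eccentricity_deg);
      return 1.0f - t * (1.0f - peripheral_rate);
  }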

Developer Adoption and Implementation Complexity

While modern game engines and APIs have made implementation easier, foveated rendering is not always simple.

  • Rendering Pipeline Incompatibility: Foveation can be incompatible with certain post-processing effects that operate on full-screen images. Rendering to intermediate textures can break the foveation pipeline.[72][91]
  • Tuning and Testing: No universal "best" foveation setting exists. Optimal balance depends on specific content.[39][92]
  • Fallback Support: Applications must gracefully fall back to FFR or no foveation when eye tracking is unavailable.[76]

Hardware Limitations

  • Mobile vs Desktop Performance: Mobile GPU architectures see smaller benefits than console and desktop GPUs; for example, Quest Pro achieves 33-45% savings while PSVR2 reaches up to 72%.[48][49]
  • Cost and Complexity: Eye-tracking hardware increases headset cost, weight, and power consumption.
  • Calibration Requirements: Individual calibration typically required for each user to map eye movements accurately.

Neural Reconstruction Approaches

DeepFovea

DeepFovea, developed by Facebook Reality Labs and presented at SIGGRAPH Asia 2019, pioneered neural reconstruction for foveated rendering. The system renders only 10% of peripheral pixels and reconstructs missing pixels using a convolutional neural network, enabling up to 10-14× pixel count reduction with minimal perceptual impact.[66]

Recent Advances

  • FoVolNet (2022): Achieved 25× speedup over DeepFovea through hybrid direct and kernel prediction for volume rendering.[93]
  • VR-Splatting (2024): Combines 3D Gaussian Splatting with foveated rendering for photorealistic VR at 90Hz, achieving 63% power reduction.[51]
  • FovealNet (2024): Integrates gaze prediction using AI to compensate for latency, advancing real-time performance.[94]

Current Research Frontiers

The evolution of foveated rendering continues with researchers exploring more sophisticated models of human perception:

  • Luminance-Contrast-Aware Foveation: Recognizes that HVS sensitivity to detail depends not just on eccentricity but also on local image content. Applies more aggressive foveation in very dark or low-contrast areas.[95]
  • Attention-Aware Foveation: Incorporates cognitive factors, using task difficulty to dynamically adjust peripheral degradation level.[21][22]
  • Individualized Foveated Rendering (IFR): Tailors foveation parameters to unique perceptual abilities of each user through brief calibration processes.[96]
  • Eye-Dominance-Guided Foveation: Renders the image for the dominant eye at slightly higher quality, providing performance savings without noticeable impact on stereo perception.[97]
  • Predictive Foveation: Systems predict saccade landing points based on initial trajectory and velocity, allowing rendering systems to begin shifting the foveal region before eye movement completes.[18][40]

Future Developments

Near-term (2025-2026)

  • Production deployment of neural reconstruction techniques in consumer headsets
  • Software-only gaze prediction enabling foveated rendering without eye tracking hardware
  • OpenXR standardization eliminating platform fragmentation
  • NPU acceleration for neural reconstruction on mobile VR platforms

Mid-term (2026-2028)

  • Power optimization critical for wireless VR and AR glasses
  • Adaptive foveated rendering personalizing quality curves per user
  • Retinal resolution displays (60-70 pixels per degree) making foveated rendering mandatory
  • Multi-modal foveation extending to audio and haptics

Long-term (2028+)

  • Neural Radiance Fields (NeRF) with foveated rendering
  • Cloud and edge rendering with dynamic foveated transport
  • Theoretical limit of 20-100× improvements versus current rendering
  • Foveated rendering as therapeutic tool for cybersickness mitigation

See Also

References

  1. 1.00 1.01 1.02 1.03 1.04 1.05 1.06 1.07 1.08 1.09 "Foveated rendering - Wikipedia". https://en.wikipedia.org/wiki/Foveated_rendering.
  2. "What is Foveated Rendering - Unity". https://unity.com/glossary/foveated-rendering.
  3. "Foveated rendering - Unity Manual". https://docs.unity3d.com/6000.2/Documentation/Manual/xr-foveated-rendering.html.
  4. "Foveated Rendering". https://unity.com/glossary/foveated-rendering.
  5. 5.0 5.1 5.2 "An integrative view of foveated rendering". https://www.researchgate.net/publication/355503409_An_integrative_view_of_foveated_rendering.
  6. 6.0 6.1 6.2 6.3 "What is foveated rendering?". https://support.varjo.com/hc/en-us/what-is-foveated-rendering.
  7. "Eye tracking in virtual reality: a comprehensive overview of the human visual system, eye movement types, and technical considerations". https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/.
  8. "Save GPU with Eye Tracked Foveated Rendering". https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/.
  9. 9.0 9.1 "Gaze-Contingent Rendering for Deferred Shading". https://graphics.tu-bs.de/upload/publications/stengel2016adaptsampling.pdf.
  10. "Foveated rendering: A state-of-the-art survey". https://www.researchgate.net/publication/366842988_Foveated_rendering_A_state-of-the-art_survey.
  11. 11.0 11.1 "What is foveated rendering?". Tobii. 2023-03-15. https://www.tobii.com/blog/what-is-foveated-rendering.
  12. 12.0 12.1 12.2 "Type of Movement and Attentional Task Affect the Efficacy of a Foveated Rendering Method in Virtual Reality". https://research.manchester.ac.uk/files/296585058/toyf.pdf.
  13. 13.0 13.1 "Eye tracking in virtual reality: a comprehensive overview of the human visual system, eye movement types, and technical considerations". https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/.
  14. Template:Cite arxiv
  15. 15.0 15.1 15.2 "Latency Requirements for Foveated Rendering in Virtual Reality". NVIDIA Research. 2017. https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf.
  16. "Eye tracking in virtual reality: a comprehensive overview". https://pmc.ncbi.nlm.nih.gov/articles/PMC10449001/.
  17. "Foveated rendering: A state-of-the-art survey". Computational Visual Media. 2023. https://link.springer.com/article/10.1007/s41095-022-0306-4.
  18. 18.0 18.1 18.2 "Eye Tracking & Foveated Rendering Explained". https://www.reddit.com/r/oculus/comments/afj50w/eye_tracking_foveated_rendering_explained_what_it/.
  19. 19.0 19.1 19.2 "Latency Requirements for Eye-Tracked Foveated Rendering". https://research.nvidia.com/sites/default/files/pubs/2017-09_Latency-Requirements-for/a25-albert.pdf.
  20. "A Quick-Start Guide to Foveated Rendering". Road to VR. 2016-02-16. https://www.roadtovr.com/a-pocket-guide-to-foveated-rendering-from-smi/.
  21. 21.0 21.1 21.2 "Towards Attention-Aware Foveated Rendering". https://www.computationalimaging.org/publications/attention-aware/.
  22. 22.0 22.1 22.2 ""Towards Attention–Aware Foveated Rendering" by Krajancich, Kellnhofer and Wetzstein". https://history.siggraph.org/learning/towards-attention-aware-foveated-rendering-by-krajancich-kellnhofer-and-wetzstein/.
  23. "Gaze-Contingent Multiresolution Visualization for Large-Scale Vector and Volume Data". https://vgl.cs.usfca.edu/assets/Foveated_Visualization___VDA_2020.pdf.
  24. "Gaze Contingent Foveated Rendering for 2D Displays". http://stanford.edu/class/ee367/Winter2017/mehra_sankar_ee367_win17_report.pdf.
  25. 25.0 25.1 "What is Foveated Rendering? - autovrse". https://www.autovrse.com/foveated-rendering.
  26. 26.0 26.1 "Foveated Rendering - OpenXR Toolkit". https://mbucchia.github.io/OpenXR-Toolkit/fr.html.
  27. "Variable Rate Shading: a scalpel in a world of sledgehammers". Microsoft DirectX Blog. 2019. https://devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/.
  28. 28.0 28.1 "Foveated Rendering - Varjo for Developers". https://developer.varjo.com/docs/native/foveated-rendering-api.
  29. 29.0 29.1 29.2 29.3 29.4 29.5 "Foveated rendering - PICO Unreal OpenXR Plugin". https://developer.picoxr.com/document/unreal-openxr/fixed-foveated-rendering/.
  30. 30.0 30.1 "DCS Dynamic Foveated Rendering available for more headsets". https://www.reddit.com/r/hoggit/comments/15ep59q/dcs_dynamic_foveated_rendering_available_for_more/.
  31. "Quad Views Foveated Rendering for Pimax Crystal". https://pimax.com/blogs/blogs/quad-views-foveated-rendering-for-pimax-crystal.
  32. 32.0 32.1 "Foveated Rendering Current and Future Technologies for Virtual Reality". ARM Developer. 2020. https://developer.arm.com/-/media/developer/Graphics%20and%20Multimedia/White%20Papers/Foveated%20Rendering%20Whitepaper.pdf.
  33. 33.0 33.1 "Vulkan for Mobile VR Rendering". https://developers.meta.com/horizon/blog/vulkan-for-mobile-vr-rendering/.
  34. 34.0 34.1 "Vulkan API Documentation: VK_EXT_fragment_density_map". https://expipiplus1.github.io/vulkan/vulkan-3.8.1-docs/Vulkan-Extensions-VK_EXT_fragment_density_map.html.
  35. "Improving Foveated Rendering with the Fragment Density Map Offset Extension for Vulkan". https://www.qualcomm.com/developer/blog/2022/08/improving-foveated-rendering-fragment-density-map-offset-extension-vulkan.
  36. "Kernel Foveated Rendering". ResearchGate. 2018. https://www.researchgate.net/publication/326636875_Kernel_Foveated_Rendering.
  37. 37.0 37.1 37.2 37.3 "What Is Foveated Rendering? - JigSpace". https://www.jig.com/spatial-computing/foveated-rendering.
  38. "Save GPU with Eye Tracked Foveated Rendering". https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/.
  39. 39.0 39.1 39.2 39.3 "Fixed foveated rendering (FFR) - Meta Quest". https://developers.meta.com/horizon/documentation/unity/os-fixed-foveated-rendering/.
  40. 40.0 40.1 40.2 40.3 40.4 40.5 40.6 "What is foveated rendering and what does it mean for VR?". https://vrx.vr-expert.com/what-is-foveated-rendering-and-what-does-it-mean-for-vr/.
  41. "Power, Performance, and Quality of Gaze-Tracked Foveated Rendering in Practical XR Systems". https://3dvar.com/Singh2023Power.pdf.
  42. 42.0 42.1 42.2 "Foveated rendering in OpenXR - Unity Manual". https://docs.unity3d.com/Packages/[email protected]/manual/features/foveatedrendering.html.
  43. "Eye tracking and dynamic foveated rendering - Tobii". https://www.tobii.com/resource-center/reports-and-papers/eye-tracking-and-dynamic-foveated-rendering.
  44. 44.0 44.1 44.2 "Quest Pro Foveated Rendering GPU Savings Detailed". https://www.uploadvr.com/quest-pro-foveated-rendering-performance/.
  45. 45.0 45.1 "PSVR 2 Specs Run 3.6x Faster Using Eye-Tracking Technology". https://www.playstationlifestyle.net/2022/03/28/psvr-2-specs-eye-tracking-foveated-rendering/.
  46. 46.0 46.1 46.2 46.3 46.4 46.5 "The Crystal Super's Secret Weapon: Dynamic Foveated Rendering". https://pimax.com/blogs/blogs/the-crystal-supers-secret-weapon-dynamic-foveated-rendering.
  47. 47.0 47.1 47.2 "Save GPU with Eye Tracked Foveated Rendering". https://developers.meta.com/horizon/blog/save-gpu-with-eye-tracked-foveated-rendering/.
  48. 48.0 48.1 48.2 "Here's The Exact Performance Benefit Of Foveated Rendering On Quest Pro". UploadVR. October 2022. https://www.uploadvr.com/quest-pro-foveated-rendering-performance/.
  49. 49.0 49.1 49.2 "PSVR 2 Foveated Rendering Provides 3.6x Faster Performance - Unity". UploadVR. March 2023. https://www.uploadvr.com/psvr-2-eye-tracking-foveated-rendering-gdc/.
  50. "Foveated Rendering". Varjo for developers. 2023. https://developer.varjo.com/docs/native/foveated-rendering-api.
  51. 51.0 51.1 "VR-Splatting: Foveated Radiance Field Rendering via 3D Gaussian Splatting and Neural Points". ACM. 2024. https://dl.acm.org/doi/10.1145/3728302.
  52. Levoy, Marc; Whitaker, Ross (1990). "Gaze-directed volume rendering". pp. 361–369. doi:10.1145/91394.91431.
  53. Ohshima, Takashi; Satoh, Keiichi; Tamaki, Hiroaki (1999). "AR²HMD: Augmented reality with high resolution head mounted display". pp. 110–119.
  54. Luebke, David; Hallen, Ben (2001). "Perceptually-driven simplification for interactive rendering". pp. 223–230. doi:10.1109/IPDPS.2001.925025.
  55. Guenter, Brian; Grimes, Mark; Nehab, Diego; Sander, Pedro V.; Summa, João (2012). "Efficient rerendering in viewport space". Vol. 31. pp. 1–13.
  56. "FOVE Uses Eye Tracking To Make Virtual Reality More Immersive". TechCrunch. 2014-09-10. https://techcrunch.com/2014/09/09/fove/.
  57. "FOVE: The World's First Eye Tracking Virtual Reality Headset". Kickstarter. 2015-09-01. https://www.kickstarter.com/projects/fove/fove-the-worlds-first-eye-tracking-virtual-reality.
  58. "SMI's 250Hz Eye Tracking and Foveated Rendering Are For Real, and the Cost May Surprise You". UploadVR. 2016-01-15. https://uploadvr.com/smi-hands-on-250hz-eye-tracking/.
  59. "NVIDIA Partners with SMI on Innovative Rendering Technique That Improves VR". NVIDIA. 2016-07-21. https://blogs.nvidia.com/blog/2016/07/21/rendering-foveated-vr/.
  60. "Nvidia plans to prove that new method improves image quality in virtual reality". Digital Trends. 2016-07-23. https://www.digitaltrends.com/computing/nvidia-research-foveated-rendering-vr-smi/.
  61. 61.0 61.1 "Exclusive: Fove's VR HMD At CES 2016". Tom's Hardware. 2016-01-11. https://www.tomshardware.com/news/fove-vr-first-look-ces,30964.html.
  62. "Qualcomm Introduces Snapdragon 835 Virtual Reality Development Kit". Qualcomm. 2017-02-23. https://www.qualcomm.com/news/releases/2017/02/23/qualcomm-introduces-snapdragon-835-virtual-reality-development-kit.
  63. 63.0 63.1 "StarVR One is a premium VR headset with built-in eye-tracking". Ars Technica. 2018-08-14. https://arstechnica.com/gaming/2018/08/starvr-one-is-a-premium-vr-headset-with-built-in-eye-tracking/.
  64. 64.0 64.1 "HTC announces new Vive Pro Eye VR headset with native eye tracking". The Verge. 2019-01-07. https://www.theverge.com/2019/1/7/18172064/htc-vive-pro-eye-vr-headset-eye-tracking-announced-features-price-release.
  65. "Oculus Quest gets dynamic fixed foveated rendering". VentureBeat. 2019-12-22. https://venturebeat.com/2019/12/22/oculus-quest-gets-dynamic-fixed-foveated-rendering/.
  66. 66.0 66.1 "DeepFovea: Neural Reconstruction for Foveated Rendering". ACM SIGGRAPH. 2019. https://dl.acm.org/doi/10.1145/3306307.3328186.
  67. "PlayStation VR2 and PlayStation VR2 Sense controller: The next generation of VR gaming on PS5". PlayStation Blog. 2022-01-04. https://blog.playstation.com/2022/01/04/playstation-vr2-and-playstation-vr2-sense-controller-the-next-generation-of-vr-gaming-on-ps5/.
  68. 68.0 68.1 "The Best VR Headsets for 2025". PC Magazine. https://www.pcmag.com/picks/the-best-vr-headsets.
  69. "XR Foveated Rendering - Unity Roadmap". https://unity.com/roadmap/1356-xr-foveated-rendering.
  70. 70.0 70.1 70.2 "Foveated rendering - Unity Manual". https://docs.unity3d.com/6000.2/Documentation/Manual/xr-foveated-rendering.html.
  71. "Foveated rendering". Unity Documentation. 2024. https://docs.unity3d.com/6000.0/Documentation/Manual/xr-foveated-rendering.html.
  72. 72.0 72.1 72.2 "Fixed foveated rendering - PICO Unity Integration SDK". https://developer.picoxr.com/document/unity/fixed-foveated-rendering/.
  73. "Using Fixed Foveated Rendering - Unity". https://developers.meta.com/horizon/documentation/unity/unity-fixed-foveated-rendering/.
  74. 74.0 74.1 "Eye tracked foveated rendering - PICO Unity Integration SDK". https://developer.picoxr.com/document/unity/eye-tracked-foveated-rendering/.
  75. 75.0 75.1 "Foveated Rendering - VIVE OpenXR Unreal". https://developer.vive.com/resources/openxr/unreal/unreal-tutorials/rendering/foveated-rendering/.
  76. 76.0 76.1 76.2 "Eye Tracked Foveated Rendering - Unreal". https://developers.meta.com/horizon/documentation/unreal/unreal-eye-tracked-foveated-rendering/.
  77. 77.0 77.1 "VR Performance Features - Unreal Engine 4.27 Documentation". https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-performance-features?application_version=4.27.
  78. "OpenXR 1.1 Brings Foveated Rendering & More Into The Spec". UploadVR. 2024. https://www.uploadvr.com/openxr-1-1/.
  79. "Eye Tracked Foveated Rendering - Unity". https://developers.meta.com/horizon/documentation/unity/unity-eye-tracked-foveated-rendering/.
  80. 80.0 80.1 "PlayStation VR2 tech specs". https://www.playstation.com/en-ca/ps-vr2/ps-vr2-tech-specs/.
  81. "Foveated rendering - PICO Unreal Integration SDK". https://developer.picoxr.com/document/unreal/fixed-foveated-rendering/.
  82. 82.0 82.1 "Varjo Aero - Varjo". https://varjo.com/products/aero.
  83. "Eye tracking with Varjo headset". https://developer.varjo.com/docs/get-started/eye-tracking-with-varjo-headset.
  84. "Foveated rendering - Varjo Support". https://support.varjo.com/hc/en-us/foveated-rendering.
  85. "Pimax Crystal Super". https://pimax.com/pages/pimax-crystal-super.
  86. "About Dynamic Foveated Rendering (DFR) in Virtual Reality (VR)". https://pimax.com/blogs/blogs/about-dynamic-foveated-rendering-dfr-in-virtual-reality-vr.
  87. "unity foveated rendering test 4x fps increase with a pretty simple rendering strategy". https://www.reddit.com/r/oculus/comments/3bls3q/unity_foveated_rendering_test_4x_fps_increase/.
  88. 88.0 88.1 "A Perceptually-Based Foveated Real-Time Renderer". http://cwyman.org/papers/siga16_gazeTrackedFoveatedRendering.pdf.
  89. "Perceptually-Based Foveated Virtual Reality". https://research.nvidia.com/publication/2016-07_perceptually-based-foveated-virtual-reality.
  90. 90.0 90.1 "Time-Warped Foveated Rendering for Virtual Reality Headsets". Computer Graphics Forum. 2021. https://onlinelibrary.wiley.com/doi/10.1111/cgf.14176.
  91. "Foveated Rendering - Snapdragon Spaces". https://docs.spaces.qualcomm.com/unity/setup/foveated-rendering.
  92. "Do all PSVR2 games use foverated rendering?". https://www.reddit.com/r/PSVR/comments/1eacq3v/do_all_psvr2_games_use_foverated_rendering/.
  93. "FoVolNet: Fast Volume Rendering using Foveated Deep Neural Networks". arXiv. 2022. https://arxiv.org/abs/2209.09965.
  94. "FovealNet: Advancing AI-Driven Gaze Tracking Solutions for Optimized Foveated Rendering System Performance in Virtual Reality". arXiv. 2024. https://arxiv.org/abs/2412.10456.
  95. ""Luminance-Contrast-Aware Foveated Rendering" by Tursun, Arabadzhiyska-Koleva, Wernikowski, Mantiuk, Seidel, et al.". https://history.siggraph.org/learning/luminance-contrast-aware-foveated-rendering-by-tursun-arabadzhiyska-koleva-wernikowski-mantiuk-seidel-et-al/.
  96. "Individualized foveated rendering with eye-tracking head-mounted display". https://www.researchgate.net/publication/377532315_Individualized_foveated_rendering_with_eye-tracking_head-mounted_display.
  97. "Eye-Dominance-Guided Foveated Rendering". https://research.google/pubs/eye-dominance-guided-foveated-rendering/.

External links