Stereoscopic rendering

[[File:stereoscopic rendering1.jpg|300px|right]]
[[File:stereoscopic rendering2.jpg|300px|right]]


'''Stereoscopic rendering''' is the foundational [[computer graphics]] technique that creates the perception of three-dimensional depth in [[virtual reality]] (VR) and [[augmented reality]] (AR) systems by generating two slightly different images from distinct viewpoints corresponding to the left and right eyes.<ref name="arm2021">ARM Software. "Introduction to Stereo Rendering - VR SDK for Android." ARM Developer Documentation, 2021. https://arm-software.github.io/vr-sdk-for-android/IntroductionToStereoRendering.html</ref> This technique exploits [[binocular disparity]]—the horizontal displacement between corresponding points in the two images—enabling the [[visual cortex]] to reconstruct depth information through [[stereopsis]], the same process human eyes use to perceive the real world.<ref name="numberanalytics2024">Number Analytics. "Stereoscopy in VR: A Comprehensive Guide." 2024. https://www.numberanalytics.com/blog/ultimate-guide-stereoscopy-vr-ar-development</ref> By delivering two offset images (one per eye) that the brain combines into a single scene, stereoscopic rendering produces an illusion of depth that mimics natural [[binocular vision]].<ref name="drawandcode">Draw & Code. "What Is Stereoscopic VR Technology." January 23, 2024. https://drawandcode.com/learning-zone/what-is-stereoscopic-vr-technology/</ref>
{| class="wikitable"
|+ Milestones in stereoscopic rendering
! Year !! Milestone !! Significance
|-
| 1991 || Virtuality VR arcades || Real-time stereoscopic multiplayer VR
|-
| 1995 || [[Nintendo Virtual Boy]] || Portable stereoscopic gaming console
|-
| 2010 || [[Oculus Rift]] prototype || Modern stereoscopic HMD revival
|-
| 2016 || [[HTC Vive]]/[[Oculus Rift CV1]] release || Consumer room-scale stereoscopic VR
|-
| 2023 || [[Apple Vision Pro]] || High-resolution stereoscopic mixed reality (70 pixels per degree)
|}




Single-pass stereo rendering improves performance by traversing the scene graph only once while rendering to both eye buffers.<ref name="nvidia2018">NVIDIA Developer. "Turing Multi-View Rendering in VRWorks." NVIDIA Technical Blog, 2018. https://developer.nvidia.com/blog/turing-multi-view-rendering-vrworks/</ref> The single-pass instanced approach uses GPU instancing with an instance count of 2, so that the [[vertex shader]] outputs positions for both views simultaneously. Example shader code:
<pre>
// One model matrix per eye, indexed by the active view
uniform EyeUniforms {
    mat4 mMatrix[2];
};

// Select the left (0) or right (1) eye's transform for this invocation
vec4 pos = mMatrix[gl_InvocationID] * vertex;
</pre>
 
This technique halves the draw-call count compared to multi-pass rendering, reducing CPU bottlenecks in complex scenes.<ref name="iquilez">Quilez, Inigo. "Stereo rendering." 2024. https://iquilezles.org/articles/stereo/</ref>
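The CPU-side setup behind this can be sketched as follows (an illustrative, engine-agnostic example; the matrix layout and the 0.064 m default interpupillary distance are assumptions, not taken from a specific SDK). The two per-eye view matrices submitted for the instanced draw differ from the shared head-view matrix only by a half-IPD horizontal offset:

```python
# Illustrative sketch: per-eye view matrices for single-pass instanced stereo
# differ only by a half-IPD translation applied to the shared head-view matrix.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translate_x(dx):
    """4x4 translation matrix along the x (inter-eye) axis."""
    m = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    m[0][3] = dx
    return m

def eye_view_matrices(head_view, ipd=0.064):
    """Return (left, right) view matrices for interpupillary distance ipd
    in metres; shifting the world by +ipd/2 corresponds to the left eye."""
    half = ipd / 2.0
    left = mat_mul(translate_x(+half), head_view)
    right = mat_mul(translate_x(-half), head_view)
    return left, right

identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
left, right = eye_view_matrices(identity)
```

Both matrices would be uploaded into the `mMatrix[2]` uniform array shown above, and the per-view index selects between them on the GPU.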
 
=== Hardware-Accelerated Techniques ===
 
[[NVIDIA]]'s [[Pascal (microarchitecture)|Pascal]] architecture introduced '''Simultaneous Multi-Projection (SMP)''', enabling true Single Pass Stereo in which geometry is processed once and projected to both eyes simultaneously using hardware acceleration.<ref name="anandtech2016">AnandTech. "Simultaneous Multi-Projection: Reusing Geometry on the Cheap." 2016. https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/11</ref> The [[Turing (microarchitecture)|Turing]] architecture expanded this to Multi-View Rendering supporting up to 4 projection views in a single pass.
 
'''Lens Matched Shading''' divides each eye's view into 4 quadrants with adjusted projections approximating the barrel-distorted output shape after lens correction, reducing rendered pixels from 2.1 megapixels to 1.4 megapixels per eye—a 50% increase in available pixel shading throughput.<ref name="roadtovr2016">Road to VR. "NVIDIA Explains Pascal's 'Lens Matched Shading' for VR." 2016. https://www.roadtovr.com/nvidia-explains-pascal-simultaneous-multi-projection-lens-matched-shading-for-vr/</ref>
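The figures above can be checked directly; a minimal sketch of the arithmetic:

```python
# Checking the Lens Matched Shading figures quoted above: shading 1.4 Mpix
# instead of 2.1 Mpix per eye saves a third of the pixel work and leaves
# 2.1/1.4 = 1.5x the shading throughput per rendered pixel.
full_render = 2.1   # megapixels per eye, conventional stereo rendering
lms_render = 1.4    # megapixels per eye, with lens matched shading

pixel_reduction = 1.0 - lms_render / full_render    # fraction of pixels saved
throughput_gain = full_render / lms_render - 1.0    # extra shading headroom

print(f"{pixel_reduction:.0%} fewer pixels, {throughput_gain:.0%} more throughput")
```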
 
=== Advanced Optimization Techniques ===
 
==== Stereo Shading Reprojection ====
 
This technique, developed by [[Oculus]] in 2017, reuses the rendered image from one eye to help construct the other eye's image. The scene is rendered normally for one eye; the depth buffer is then used to reproject that image to the second eye's viewpoint, and any gaps from occluded areas are filled in with a secondary rendering pass. Oculus reported that in shader-heavy scenes this method can save substantial GPU work (approximately 20% in their tests).<ref name="meta-reprojection">Meta for Developers. "Introducing Stereo Shading Reprojection." 2024. https://developers.meta.com/horizon/blog/introducing-stereo-shading-reprojection-for-unity/</ref>
 
==== Foveated Rendering ====
 
[[Foveated rendering]] renders the small area corresponding to the user's fovea at full resolution while progressively reducing peripheral quality. Fixed foveated rendering (no eye tracking required) achieves 34-43% GPU savings on [[Meta Quest 2]], while eye-tracked dynamic foveated rendering on [[PlayStation VR2]] demonstrates approximately 72% savings.<ref name="vrx2024">VRX. "What is Foveated Rendering? - VR Expert Blog." 2024. https://vrx.vr-expert.com/what-is-foveated-rendering-and-what-does-it-mean-for-vr/</ref>
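As an illustrative model only (the zone sizes below are assumptions, not the actual Quest 2 layout), the savings can be estimated by summing shading work across density zones:

```python
# Toy model of fixed foveated rendering: the central region is shaded at
# full pixel density, the periphery at reduced density. Zone parameters
# are illustrative, not a real headset's foveation map.
def foveated_savings(fovea_area_fraction, periphery_density):
    """Fraction of pixel-shading work saved versus full-resolution rendering."""
    shaded = fovea_area_fraction + (1.0 - fovea_area_fraction) * periphery_density
    return 1.0 - shaded

# Central 25% of the image at full density, outer 75% at half density:
savings = foveated_savings(fovea_area_fraction=0.25, periphery_density=0.5)
print(f"{savings:.1%} of shading work saved")  # 37.5%, within the quoted 34-43%
```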
 
==== Asynchronous Reprojection ====
 
Technologies like Oculus's [[Asynchronous Spacewarp]] (ASW) and [[Timewarp|Asynchronous Timewarp]] do not directly reduce rendering cost but help maintain smooth output by re-projecting images if a new frame is late. This allows applications to render at lower framerates (45 FPS rendered doubled to 90 FPS displayed) without the user perceiving jitter.<ref name="google_atw2024">Google. "Asynchronous Reprojection." Google VR Developers, 2024. https://developers.google.com/vr/discover/async-reprojection</ref>
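A minimal sketch of this frame pacing, assuming an ideal 90 Hz compositor and a 45 FPS application with no dropped frames:

```python
# Sketch of asynchronous reprojection frame pacing: the app renders at
# 45 FPS while the compositor refreshes at 90 Hz, so every other
# displayed frame is synthesized by re-projecting the previous one.
DISPLAY_HZ = 90
APP_FPS = 45

def displayed_frames(n_vsyncs):
    """Label each vsync as a freshly rendered or a re-projected frame."""
    ratio = DISPLAY_HZ // APP_FPS  # vsyncs per rendered frame (here, 2)
    return ["rendered" if i % ratio == 0 else "reprojected"
            for i in range(n_vsyncs)]

frames = displayed_frames(90)  # one second of output at 90 Hz
print(frames.count("rendered"), "rendered,",
      frames.count("reprojected"), "synthesized")
```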
 
{| class="wikitable"
|+ Comparison of Stereo Rendering Techniques
! Technique !! Core Principle !! CPU Overhead !! GPU Overhead !! Primary Advantage !! Primary Disadvantage
|-
| '''Multi-Pass''' || Render entire scene twice sequentially || Very High || High || Simple implementation; maximum compatibility || Extremely inefficient; doubles CPU workload
|-
| '''Single-Pass (Double-Wide)''' || Render to double-width render target || Medium || High || Reduces render state changes || Still duplicates some work; largely deprecated
|-
| '''Single-Pass Instanced''' || Use GPU instancing with instance count=2 || Very Low || High || Drastically reduces CPU overhead || Requires GPU/API support (DirectX 11+)
|-
| '''Multiview''' || Single draw renders to multiple texture array slices || Very Low || High || Most efficient for mobile/standalone VR || Requires specific GPU/API support
|-
| '''Stereo Shading Reprojection''' || Reuse one eye's pixels for the other via depth || Low || Medium || 20% GPU savings in shader-heavy scenes || Can introduce artifacts; complex implementation
|}
 
== Hardware Requirements ==
 
{| class="wikitable"
! Tier !! GPU Examples !! Use Case !! Resolution Support !! Performance Target
|-
| Minimum || [[GeForce GTX 1060|GTX 1060]] 6GB, [[Radeon RX 480|RX 480]] || Baseline VR || 1080×1200 per eye @ 90Hz || Low settings
|-
| Recommended || [[GeForce RTX 2060|RTX 2060]], [[GeForce GTX 1070|GTX 1070]] || Comfortable VR || 1440×1600 per eye @ 90Hz || Medium settings
|-
| Premium || [[GeForce RTX 3070|RTX 3070]], [[Radeon RX 6800|RX 6800]] || High-fidelity VR || 2160×2160 per eye @ 90Hz || High settings
|-
| Enthusiast || [[GeForce RTX 4090|RTX 4090]], [[Radeon RX 7900 XTX|RX 7900 XTX]] || Maximum quality || 4K+ per eye @ 120Hz+ || Ultra settings with ray tracing
|}
 
Modern VR rendering demands GPU capabilities significantly beyond traditional gaming.<ref name="computercity2024">ComputerCity. "VR PC Hardware Requirements: Minimum and Recommended Specs." 2024. https://computercity.com/hardware/vr/vr-pc-hardware-requirements</ref> To prevent [[simulation sickness]], VR applications must maintain consistently high frame rates—typically 90 frames per second or higher—and motion-to-photon latency under 20 milliseconds.<ref name="daqri2024">DAQRI. "Motion to Photon Latency in Mobile AR and VR." Medium, 2024. https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926</ref>
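The frame-time budget implied by these targets is simple arithmetic:

```python
# Frame-time budgets implied by the targets above: at 90 Hz each frame
# must complete in about 11.1 ms, well under the 20 ms motion-to-photon limit.
def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame at the given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (72, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```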
 
== Display Technologies ==
 
Various technologies have been developed to present the stereoscopic image pair to viewers:
 
* '''[[Head-Mounted Display]]s (HMDs)''': Modern VR and AR headsets achieve perfect image separation using either two separate micro-displays (one for each eye) or a single display partitioned by optics. This direct-view approach completely isolates the left and right eye views, eliminating [[crosstalk]].<ref name="drawandcode"/>
 
* '''[[Color Filtering]] ([[Anaglyph 3D|Anaglyph]])''': Uses glasses with filters of different colors, typically red and cyan. Very inexpensive but suffers from severe color distortion and ghosting.<ref name="basic_principles"/>
 
* '''[[Polarized 3D system|Polarization]]''': Uses glasses with differently polarized lenses. Linear polarization orients filters at 90 degrees; circular polarization uses opposite clockwise/counter-clockwise polarization. Commonly used in 3D cinemas.<ref name="palušová2023">Palušová, P. "Stereoscopy in Extended Reality: Utilizing Natural Binocular Disparity." 2023. https://www.petrapalusova.com/stereoscopy</ref>
 
* '''[[Time Multiplexing]] (Active Shutter)''': Display alternates between left and right images at high speed (120+ Hz). Viewer wears LCD shutter glasses synchronized to the display. Delivers full resolution to each eye.<ref name="basic_principles"/>
 
* '''[[Autostereoscopy]] (Glasses-Free 3D)''': Uses optical elements like [[parallax barrier]]s or [[lenticular lens]]es to direct different pixels to each eye. Limited by narrow optimal viewing angle.<ref name="palušová2023"/>
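The anaglyph approach in particular is simple enough to sketch: the composite image takes its red channel from the left-eye view and its green and blue channels from the right-eye view (a minimal illustration on raw RGB tuples, ignoring the color-correction matrices real anaglyph encoders apply):

```python
# Minimal red-cyan anaglyph composition: red channel from the left-eye
# image, green and blue channels from the right-eye image.
# Pixels are (r, g, b) tuples; images are lists of rows.
def anaglyph(left_img, right_img):
    return [[(l[0], r[1], r[2]) for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left_img, right_img)]

left = [[(200, 10, 10), (200, 20, 20)]]   # left-eye test row
right = [[(5, 100, 150), (5, 110, 160)]]  # right-eye test row
print(anaglyph(left, right))  # [[(200, 100, 150), (200, 110, 160)]]
```

Red-cyan glasses then route the red channel to one eye and the cyan (green + blue) channels to the other, which is why the method is cheap but cannot reproduce full color for either eye.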
 
== Applications ==
 
=== Gaming ===
 
Gaming dominates current VR usage with 48.3% of market revenue and 70% of VR users playing games regularly.<ref name="marketgrowth2024">Marketgrowthreports. "Virtual and Augmented Reality Industry Market Size, Trends 2033." 2024. https://www.marketgrowthreports.com/market-reports/virtual-and-augmented-reality-market-100490</ref> The depth cues from stereoscopic rendering prove essential for gameplay mechanics requiring accurate spatial judgment—from grabbing objects in [[Beat Saber]] to navigating complex environments in [[Half-Life: Alyx]].
 
=== Healthcare ===
 
Healthcare is the fastest-growing sector, with a 28.2% compound annual growth rate. Medical applications include:
* Surgical simulation and pre-surgical planning
* FDA-cleared VR therapies for pain management
* Mental health treatment for phobias and PTSD
* Medical student training with virtual cadavers
 
Medical students practice procedures in stereoscopic VR environments that provide depth perception critical for developing hand-eye coordination without requiring physical cadavers.<ref name="mordor2024"/>
 
=== Enterprise Training ===
 
Corporate training has achieved remarkable ROI with 67,000+ enterprises adopting VR-based training by 2024. Notable examples:
* [[Boeing]] reduced training hours by 75% using VR simulation
* [[Delta Air Lines]] increased technician proficiency by 5,000% through immersive maintenance training
* Break-even occurs at 375 learners; beyond 3,000 learners VR becomes 52% more cost-effective than traditional methods
 
=== Industrial Visualization ===
 
Industrial and manufacturing applications leverage stereoscopic rendering for product visualization, enabling engineers to examine [[CAD]] models at natural scale before physical prototyping. The AR/VR IoT manufacturing market projects $40-50 billion by 2025, with 75% of implementing companies reporting 10% operational efficiency improvements.
 
=== Augmented Reality ===
 
In AR, stereoscopic rendering is used in see-through devices to overlay 3D graphics into the user's view of the real world. Modern AR smartglasses like the [[Microsoft HoloLens]] and [[Magic Leap 2]] have dual waveguide displays that project virtual images with slight differences to each eye, ensuring virtual objects appear at specific depths in the real environment.<ref name="palušová2023"/>
 
== Industry Standards ==
 
=== OpenXR ===
 
The [[OpenXR]] standard, managed by the [[Khronos Group]] and finalized as version 1.0 in 2019, represents the VR/AR industry's convergence on a unified API.<ref name="khronos2024">Khronos Group. "OpenXR - High-performance access to AR and VR." 2024. https://www.khronos.org/openxr/</ref> OpenXR 1.1 includes stereo rendering with foveated rendering as a core capability, along with:
* [[Varjo]]'s quad-view configuration for bionic displays
* Local floor coordinate spaces for mixed reality
* 13 new interaction profiles spanning controllers and styluses
 
=== Engine Support ===
 
[[Unity (game engine)|Unity]]'s stereo rendering paths include:<ref name="unity_manual2024">Unity. "Unity - Manual: Stereo rendering." Unity Documentation, 2024. https://docs.unity3d.com/Manual/SinglePassStereoRendering.html</ref>
* Multi-Pass: Separate pass per eye, most compatible
* Single-Pass: Double-wide render target, modest savings
* Single-Pass Instanced: GPU instancing with texture arrays, optimal on supported platforms
 
[[Unreal Engine]] provides instanced stereo rendering through project settings, delivering 50% draw call reduction once enabled.
 
== Challenges and Solutions ==
 
=== Vergence-Accommodation Conflict ===
 
The vergence-accommodation conflict represents stereoscopic rendering's most fundamental limitation.<ref name="wikipedia_vac"/> In natural vision, vergence (eye rotation) and accommodation (lens focusing) work in synchrony. Stereoscopic displays decouple these: vergence indicates virtual object distance based on binocular disparity, but accommodation remains fixed at the physical display distance (typically 2-3 meters for HMDs).
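The mismatch can be quantified with the vergence angle, the angle between the two eyes' lines of sight (a worked example assuming a 64 mm interpupillary distance):

```python
# Worked example of the conflict: binocular disparity drives vergence to
# the virtual object's distance, while accommodation stays at the
# display's fixed focal distance.
import math

def vergence_angle_deg(distance_m, ipd_m=0.064):
    """Angle between the two eyes' lines of sight for a target at distance_m."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

virtual = vergence_angle_deg(0.5)  # object rendered 0.5 m away: ~7.3 deg
focal = vergence_angle_deg(2.0)    # display focal plane at 2 m: ~1.8 deg
print(f"vergence cue: {virtual:.1f} deg, accommodation cue: {focal:.1f} deg")
```

The eyes converge as if the object were at 0.5 m while the lenses focus at 2 m; that several-degree discrepancy between the two cues is what causes the discomfort described above.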
 
Current mitigation strategies:
* Software best practices: Keep virtual content >0.5 meters from user
* Avoid rapid depth changes
* Limit session duration for sensitive users
 
Emerging solutions:
* '''[[Light field display]]s''': Reproduce the complete light field, allowing natural focus at different depths
* '''Varifocal displays''': Physically or optically adjust focal distance based on eye tracking
* '''[[Holographic display]]s''': Use wavefront reconstruction to create true 3D images
 
=== Latency and Motion-to-Photon Delay ===
 
Total system latency typically ranges from 40 to 90 ms, far exceeding the sub-20 ms target required for presence. [[Asynchronous Time Warp]] (ATW) mitigates tracking latency by warping the last rendered frame using the latest head pose just before display refresh.<ref name="google_atw2024"/>
 
=== Common Artifacts ===
 
* '''Crosstalk (Ghosting)''': Incomplete separation of left/right images
* '''Cardboarding''': Objects appear flat due to insufficient interaxial distance
* '''Window Violations''': Objects with negative parallax clipped by screen edges
* '''Vertical Parallax''': Results from incorrect camera toe-in, causes immediate eye strain
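Vertical parallax is avoided by the standard alternative to toe-in: parallel cameras with asymmetric (off-axis) frusta, each skewed toward a shared convergence plane. A minimal sketch of the per-eye frustum bounds (the parameter values are illustrative):

```python
# Off-axis ("parallel axis asymmetric frustum") stereo setup: both cameras
# face straight ahead, and each eye's frustum is skewed horizontally so
# the two frusta coincide at the convergence (screen) plane.
import math

def eye_frustum(fov_y_deg, aspect, near, convergence, ipd, eye):
    """Frustum bounds (left, right, bottom, top) at the near plane.
    eye is -1 for the left eye, +1 for the right eye."""
    top = near * math.tan(math.radians(fov_y_deg) / 2.0)
    half_w = top * aspect
    # Horizontal skew toward the convergence plane: the left eye's
    # frustum shifts right, the right eye's shifts left.
    skew = -eye * (ipd / 2.0) * near / convergence
    return (-half_w + skew, half_w + skew, -top, top)

l = eye_frustum(90, 1.0, 0.1, 2.0, 0.064, eye=-1)  # left eye
r = eye_frustum(90, 1.0, 0.1, 2.0, 0.064, eye=+1)  # right eye
```

Because both image planes stay vertical and parallel, corresponding points differ only horizontally, eliminating the vertical disparity that toe-in introduces.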
 
== Emerging Technologies ==
 
=== Light Field Displays ===
 
[[Light field display]]s represent the most promising solution to the vergence-accommodation conflict. [[CREAL]], founded in 2017, has demonstrated functional prototypes progressing from table-top systems (2019) to AR glasses form factors (2024+).<ref name="roadtovr_creal2024">Road to VR. "Hands-on: CREAL's Light-field Display Brings a New Layer of Immersion to AR." 2024. https://www.roadtovr.com/creal-light-field-display-new-immersion-ar/</ref>
 
=== Neural Rendering ===
 
[[Neural Radiance Fields]] (NeRF) represent 3D scenes as continuous volumetric functions rather than discrete polygons. FoV-NeRF extends this for VR by incorporating human visual and stereo acuity, achieving 99% latency reduction compared to standard NeRF.<ref name="arxiv2024">arXiv. "Neural Rendering and Its Hardware Acceleration: A Review." 2024. https://arxiv.org/html/2402.00028v1</ref>
 
=== Holographic Displays ===
 
Recent breakthroughs include Meta/Stanford research demonstrating full-color 3D holographic AR with metasurface waveguides in eyeglass-scale form factors, and NVIDIA's "Holographic Glasses" with 2.5mm optical stack thickness.
 
== Future Trajectory ==
 
Stereoscopic rendering remains indispensable for VR and AR experiences requiring depth perception. The technique's evolution from mechanical stereoscopes to real-time GPU-accelerated rendering reflects advancing hardware capabilities and algorithmic innovations. Modern implementations reduce computational overhead by 30-70% compared to naive approaches, making immersive VR accessible on $300 standalone headsets rather than requiring $2000 gaming PCs.
 
The fundamental vergence-accommodation conflict is a limitation of current display technology rather than of stereoscopic rendering itself, and it is being actively addressed through light field displays, holographic waveguides, and varifocal systems. The industry's convergence on OpenXR as a unified standard, combined with mature optimization techniques integrated into Unity and Unreal Engine, enables developers to target diverse platforms efficiently. The 171 million VR users in 2024 represent early adoption, with enterprise applications demonstrating that stereoscopic rendering's value extends far beyond entertainment into training, healthcare, and industrial visualization.
 
== See Also ==
* [[Virtual reality]]
* [[Augmented reality]]
* [[Stereoscopy]]
* [[Binocular vision]]
* [[Depth perception]]
* [[Foveated rendering]]
* [[Light field display]]
* [[Neural rendering]]
* [[OpenXR]]
* [[Vergence-accommodation conflict]]
* [[Head-mounted display]]
* [[Computer graphics pipeline]]
* [[Projection matrix]]
* [[GPU]]
* [[Parallax]]
* [[Stereoscopic 3D]]
* [[Interpupillary distance]]
* [[Asynchronous Spacewarp]]
* [[Asynchronous Timewarp]]
* [[VR SLI]]
 
== References ==
<references>
<ref name="arm2021">ARM Software. "Introduction to Stereo Rendering - VR SDK for Android." ARM Developer Documentation, 2021. https://arm-software.github.io/vr-sdk-for-android/IntroductionToStereoRendering.html</ref>
<ref name="numberanalytics2024">Number Analytics. "Stereoscopy in VR: A Comprehensive Guide." 2024. https://www.numberanalytics.com/blog/ultimate-guide-stereoscopy-vr-ar-development</ref>
<ref name="mordor2024">Mordor Intelligence. "Virtual Reality (VR) Market Size, Report, Share & Growth Trends 2025-2030." 2024. https://www.mordorintelligence.com/industry-reports/virtual-reality-market</ref>
<ref name="drawandcode">Draw & Code. "What Is Stereoscopic VR Technology." January 23, 2024. https://drawandcode.com/learning-zone/what-is-stereoscopic-vr-technology/</ref>
<ref name="borisfx2024">Boris FX. "Monoscopic vs Stereoscopic 360 VR: Key Differences." 2024. https://borisfx.com/blog/monoscopic-vs-stereoscopic-360-vr-key-differences/</ref>
<ref name="afifi2020">Afifi, Mahmoud. "Basics of stereoscopic imaging in virtual and augmented reality systems." Medium, 2020. https://medium.com/@mahmoudnafifi/basics-of-stereoscopic-imaging-6f69a7916cfd</ref>
<ref name="wikipedia_depth">Wikipedia. "Depth perception." https://en.wikipedia.org/wiki/Depth_perception</ref>
<ref name="basic_principles">Newcastle University. "Basic Principles of Stereoscopic 3D." 2024. https://www.ncl.ac.uk/</ref>
<ref name="scratchapixel2024">Scratchapixel. "The Perspective and Orthographic Projection Matrix." 2024. https://www.scratchapixel.com/lessons/3d-basic-rendering/perspective-and-orthographic-projection-matrix/building-basic-perspective-projection-matrix.html</ref>
<ref name="wikipedia_vac">Wikipedia. "Vergence-accommodation conflict." https://en.wikipedia.org/wiki/Vergence-accommodation_conflict</ref>
<ref name="packet39_2017">Packet39. "The Accommodation-Vergence conflict and how it affects your kids (and yourself)." 2017. https://packet39.com/blog/2017/12/25/the-accommodation-vergence-conflict-and-how-it-affects-your-kids-and-yourself/</ref>
<ref name="googlearts2024">Google Arts & Culture. "Stereoscopy: the birth of 3D technology." 2024. https://artsandculture.google.com/story/stereoscopy-the-birth-of-3d-technology-the-royal-society/pwWRTNS-hqDN5g</ref>
<ref name="nextgen2024">Nextgeninvent. "Virtual Reality's Evolution From Science Fiction to Mainstream Technology." 2024. https://nextgeninvent.com/blogs/the-evolution-of-virtual-reality/</ref>
<ref name="siggraph2024">ACM SIGGRAPH. "Remember Stereo 3D on the PC? Have You Ever Wondered What Happened to It?" 2024. https://blog.siggraph.org/2024/10/stereo-3d-pc-history-decline.html/</ref>
<ref name="cavendish2024">Cavendishprofessionals. "The Evolution of VR and AR in Gaming: A Historical Perspective." 2024. https://www.cavendishprofessionals.com/the-evolution-of-vr-and-ar-in-gaming-a-historical-perspective/</ref>
<ref name="songho2024">Song Ho Ahn. "OpenGL Projection Matrix." 2024. https://www.songho.ca/opengl/gl_projectionmatrix.html</ref>
<ref name="bourke">Bourke, P. "Stereoscopic Rendering." 2024. http://paulbourke.net/stereographics/stereorender/</ref>
<ref name="utah2024">University of Utah. "Projection and View Frustums." Computer Graphics Course Material, 2024. https://my.eng.utah.edu/~cs6360/Lectures/frustum.pdf</ref>
<ref name="unity2024">Unity. "How to maximize AR and VR performance with advanced stereo rendering." Unity Blog, 2024. https://blog.unity.com/technology/how-to-maximize-ar-and-vr-performance-with-advanced-stereo-rendering</ref>
<ref name="nvidia2018">NVIDIA Developer. "Turing Multi-View Rendering in VRWorks." NVIDIA Technical Blog, 2018. https://developer.nvidia.com/blog/turing-multi-view-rendering-vrworks/</ref>
<ref name="iquilez">Quilez, Inigo. "Stereo rendering." 2024. https://iquilezles.org/articles/stereo/</ref>
<ref name="anandtech2016">AnandTech. "Simultaneous Multi-Projection: Reusing Geometry on the Cheap." 2016. https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/11</ref>
<ref name="roadtovr2016">Road to VR. "NVIDIA Explains Pascal's 'Lens Matched Shading' for VR." 2016. https://www.roadtovr.com/nvidia-explains-pascal-simultaneous-multi-projection-lens-matched-shading-for-vr/</ref>
<ref name="meta-reprojection">Meta for Developers. "Introducing Stereo Shading Reprojection." 2024. https://developers.meta.com/horizon/blog/introducing-stereo-shading-reprojection-for-unity/</ref>
<ref name="vrx2024">VRX. "What is Foveated Rendering? - VR Expert Blog." 2024. https://vrx.vr-expert.com/what-is-foveated-rendering-and-what-does-it-mean-for-vr/</ref>
<ref name="google_atw2024">Google. "Asynchronous Reprojection." Google VR Developers, 2024. https://developers.google.com/vr/discover/async-reprojection</ref>
<ref name="computercity2024">ComputerCity. "VR PC Hardware Requirements: Minimum and Recommended Specs." 2024. https://computercity.com/hardware/vr/vr-pc-hardware-requirements</ref>
<ref name="daqri2024">DAQRI. "Motion to Photon Latency in Mobile AR and VR." Medium, 2024. https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926</ref>
<ref name="palušová2023">Palušová, P. "Stereoscopy in Extended Reality: Utilizing Natural Binocular Disparity." 2023. https://www.petrapalusova.com/stereoscopy</ref>
<ref name="marketgrowth2024">Marketgrowthreports. "Virtual and Augmented Reality Industry Market Size, Trends 2033." 2024. https://www.marketgrowthreports.com/market-reports/virtual-and-augmented-reality-market-100490</ref>
<ref name="khronos2024">Khronos Group. "OpenXR - High-performance access to AR and VR." 2024. https://www.khronos.org/openxr/</ref>
<ref name="unity_manual2024">Unity. "Unity - Manual: Stereo rendering." Unity Documentation, 2024. https://docs.unity3d.com/Manual/SinglePassStereoRendering.html</ref>
<ref name="roadtovr_creal2024">Road to VR. "Hands-on: CREAL's Light-field Display Brings a New Layer of Immersion to AR." 2024. https://www.roadtovr.com/creal-light-field-display-new-immersion-ar/</ref>
<ref name="arxiv2024">arXiv. "Neural Rendering and Its Hardware Acceleration: A Review." 2024. https://arxiv.org/html/2402.00028v1</ref>
</references>
 
[[Category:Terms]]
[[Category:Computer graphics]]
[[Category:3D rendering]]
[[Category:Display technology]]
[[Category:Human–computer interaction]]