| users = 171 million (2024)
}}
[[File:stereoscopic rendering1.jpg|300px|right]]
[[File:stereoscopic rendering2.jpg|300px|right]]


'''Stereoscopic rendering''' is the foundational [[computer graphics]] technique that creates the perception of three-dimensional depth in [[virtual reality]] (VR) and [[augmented reality]] (AR) systems by generating two slightly different images from distinct viewpoints corresponding to the left and right eyes.<ref name="arm2021">ARM Software. "Introduction to Stereo Rendering - VR SDK for Android." ARM Developer Documentation, 2021. https://arm-software.github.io/vr-sdk-for-android/IntroductionToStereoRendering.html</ref> This technique exploits [[binocular disparity]]—the horizontal displacement between corresponding points in the two images—enabling the [[visual cortex]] to reconstruct depth information through [[stereopsis]], the same process human eyes use to perceive the real world.<ref name="numberanalytics2024">Number Analytics. "Stereoscopy in VR: A Comprehensive Guide." 2024. https://www.numberanalytics.com/blog/ultimate-guide-stereoscopy-vr-ar-development</ref> By delivering two offset images (one per eye) that the brain combines into a single scene, stereoscopic rendering produces an illusion of depth that mimics natural [[binocular vision]].<ref name="drawandcode">Draw & Code. "What Is Stereoscopic VR Technology." January 23, 2024. https://drawandcode.com/learning-zone/what-is-stereoscopic-vr-technology/</ref>
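The two offset viewpoints can be sketched in a few lines: each eye's view matrix is the central (cyclopean) camera shifted by half the interpupillary distance (IPD) along the horizontal eye axis. This is a minimal illustration in Python/NumPy; the helper name and the 64 mm default IPD are assumptions for the example, not taken from any particular SDK.

```python
import numpy as np

def eye_view_matrices(center_view, ipd=0.064):
    """Derive left/right eye view matrices from a central view matrix
    by translating half the IPD along the eye-space x axis.
    (Illustrative sketch; ipd defaults to a typical 64 mm.)"""
    half = ipd / 2.0
    left, right = np.eye(4), np.eye(4)
    # Moving the camera left shifts the world right in eye space, and
    # vice versa, hence the opposite signs on the translation column.
    left[0, 3] = +half
    right[0, 3] = -half
    return left @ center_view, right @ center_view

center = np.eye(4)  # camera at the origin, looking down -z
L, R = eye_view_matrices(center)
# A point on the optical axis projects to opposite horizontal offsets
# in the two eye spaces -- the source of binocular disparity.
p = np.array([0.0, 0.0, -2.0, 1.0])
print((L @ p)[0], (R @ p)[0])  # 0.032 -0.032
```

The opposite signs of the resulting offsets are exactly the horizontal disparity that stereopsis converts back into perceived depth.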
| 1991 || Virtuality VR arcades || Real-time stereoscopic multiplayer VR
|-
| 1995 || [[Nintendo Virtual Boy]] || Portable stereoscopic gaming console
|-
| 2010 || [[Oculus Rift]] prototype || Modern stereoscopic HMD revival
|-
| 2016 || [[HTC Vive]]/[[Oculus Rift CV1]] release || Consumer room-scale stereoscopic VR
|-
| 2023 || [[Apple Vision Pro]] || High-resolution stereoscopic mixed reality (23 million pixels across both displays)
|}




Single-pass stereo rendering optimizes by traversing the scene graph once while rendering to both eye buffers.<ref name="nvidia2018">NVIDIA Developer. "Turing Multi-View Rendering in VRWorks." NVIDIA Technical Blog, 2018. https://developer.nvidia.com/blog/turing-multi-view-rendering-vrworks/</ref> The single-pass instanced approach uses GPU instancing with an instance count of 2, where the [[vertex shader]] outputs positions for both views simultaneously. Example shader code:
<pre>
// Per-eye transform matrices, one entry per view
uniform EyeUniforms {
    mat4 mMatrix[2];
};
// With instanced stereo (instance count = 2), the instance ID
// selects which eye's matrix transforms this vertex
vec4 pos = mMatrix[gl_InstanceID] * vertex;
</pre>


This technique halves draw call count compared to multi-pass, reducing CPU bottlenecks in complex scenes.<ref name="iquilez">Quilez, Inigo. "Stereo rendering." 2024. https://iquilezles.org/articles/stereo/</ref>
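The draw-call saving can be made concrete with a toy count: multi-pass traverses the scene once per eye, while the instanced approach traverses it once and lets the GPU replicate each draw for both views. The function names and scene list below are purely illustrative, not a real rendering API.

```python
# Hypothetical draw-call counting sketch (names are illustrative).
def multi_pass(objects):
    calls = 0
    for eye in ("left", "right"):   # full scene traversal per eye
        for obj in objects:
            calls += 1              # one draw call per object per eye
    return calls

def single_pass_instanced(objects):
    calls = 0
    for obj in objects:             # single traversal of the scene
        calls += 1                  # GPU instancing covers both eyes
    return calls

scene = ["terrain", "avatar", "skybox"]
print(multi_pass(scene), single_pass_instanced(scene))  # 6 3
```

The CPU-side cost (culling, state changes, draw submission) scales with the call count, which is why the instanced path relieves CPU-bound scenes even though the GPU still shades both views.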
* '''[[Head-Mounted Display]]s (HMDs)''': Modern VR and AR headsets achieve perfect image separation using either two separate micro-displays (one for each eye) or a single display partitioned by optics. This direct-view approach completely isolates the left and right eye views, eliminating [[crosstalk]].<ref name="drawandcode"/>


* '''[[Color Filtering]] ([[Anaglyph 3D|Anaglyph]])''': Uses glasses with filters of different colors, typically red and cyan. Very inexpensive but suffers from severe color distortion and ghosting.<ref name="basic_principles"/>


* '''[[Polarized 3D system|Polarization]]''': Uses glasses with differently polarized lenses. Linear polarization orients filters at 90 degrees; circular polarization uses opposite clockwise/counter-clockwise polarization. Commonly used in 3D cinemas.<ref name="palušová2023">Palušová, P. "Stereoscopy in Extended Reality: Utilizing Natural Binocular Disparity." 2023. https://www.petrapalusova.com/stereoscopy</ref>


* '''[[Time Multiplexing]] (Active Shutter)''': Display alternates between left and right images at high speed (120+ Hz). Viewer wears LCD shutter glasses synchronized to the display. Delivers full resolution to each eye.<ref name="basic_principles"/>


* '''[[Autostereoscopy]] (Glasses-Free 3D)''': Uses optical elements like [[parallax barrier]]s or [[lenticular lens]]es to direct different pixels to each eye. Limited by narrow optimal viewing angle.<ref name="palušová2023"/>
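The color-filtering principle is simple enough to sketch directly: a red/cyan anaglyph takes its red channel from the left-eye image and its green and blue channels from the right-eye image, so matching filter glasses route each view to the intended eye. This is a minimal NumPy illustration of the principle, not a production anaglyph encoder (real implementations compensate for crosstalk and color distortion).

```python
import numpy as np

def red_cyan_anaglyph(left_rgb, right_rgb):
    """Compose a red/cyan anaglyph from two HxWx3 uint8 images:
    red channel from the left view, green and blue from the right.
    (Illustrative sketch of the color-filtering principle.)"""
    out = right_rgb.copy()          # start from right view (green + blue)
    out[..., 0] = left_rgb[..., 0]  # red channel <- left eye view
    return out

# Two tiny flat-colored "views" stand in for real renders.
left = np.full((2, 2, 3), 200, dtype=np.uint8)
right = np.full((2, 2, 3), 50, dtype=np.uint8)
ana = red_cyan_anaglyph(left, right)
print(ana[0, 0])  # [200  50  50]
```

Because each eye sees only the channels its filter passes, any color present in both views cannot be reproduced faithfully, which is the source of the color distortion noted above.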
</references>


[[Category:Virtual reality]]
[[Category:Terms]]
[[Category:Augmented reality]]
[[Category:Computer graphics]]
[[Category:3D rendering]]
[[Category:Display technology]]
[[Category:Human–computer interaction]]