Passthrough

{{see also|Terms|Technical Terms}}
[[Passthrough]], often referred to as '''video passthrough''', is a feature found in [[Virtual Reality]] (VR) and [[Mixed Reality]] (MR) [[head-mounted display|headsets]] that utilizes external [[camera|cameras]] to capture a live video feed of the physical environment around the user and display it on the internal screens within the headset.<ref name="XRToday_def">[https://www.xrtoday.com/virtual-reality/what-is-vr-passthrough-and-how-is-it-shaping-the-future-of-xr/ XR Today – What is VR Passthrough and How is it Shaping the Future of XR?] (Dec 2024)</ref><ref name="VIVE_Blog_Sauce">[https://blog.vive.com/us/what-is-vr-passthrough-mixed-realitys-secret-sauce/ VIVE Blog: What is VR Passthrough? Mixed Reality's Secret Sauce]</ref> This capability effectively allows users to see the "real world" without removing the headset, bridging the gap between fully immersive virtual experiences and the user's actual surroundings.


While primarily a feature of VR headsets aiming to add environmental awareness or MR capabilities, it functions as a form of [[Augmented Reality]] (AR), often termed "Video See-Through AR" (VST AR) or sometimes "pseudo-AR," as opposed to "[[Optical See-Through]] AR" (OST AR) systems which use transparent displays.<ref name="SkarbezVSTvsOST">[https://www.researchgate.net/publication/315722770_Revisiting_Milgram_and_Kishino%27s_Reality-Virtuality_Continuum Revisiting Milgram and Kishino's Reality-Virtuality Continuum] - Discusses the spectrum including Video See-Through.</ref> Passthrough is a key enabler of [[mixed reality]] and [[spatial computing]] experiences on modern headsets.
The fundamental principle of passthrough involves a real-time processing pipeline (a simplified code sketch follows the list below):


# '''Capture:''' One or more outward-facing digital cameras mounted on the headset capture video of the external world. Early or basic systems might use a single camera (providing a monoscopic view), while more advanced systems use two or more cameras to capture [[stereoscopic]] video, enabling [[depth perception]].<ref name="StereoPassthrough">[https://ieeexplore.ieee.org/document/9191148 Example paper discussing stereoscopic passthrough challenges]</ref> Modern systems often use a combination of [[RGB]] color cameras and monochrome (grayscale) sensors for different purposes (for example capturing color data vs. motion/detail).<ref name="MixedNews_Cambria">[https://mixed-news.com/en/project-cambria-meta-explains-new-passthrough-technology/ MIXED News – Project Cambria: Meta explains new passthrough technology] (May 16 2022)</ref>
# '''Processing:''' The captured video footage is sent to the headset's [[processor]] (either an onboard [[System on a Chip|SoC]] or a connected PC's [[GPU]]). This stage is computationally intensive and critical for a usable and comfortable experience. It typically involves several steps:
#* '''Rectification/Undistortion:''' Correcting [[lens distortion]] inherent in the wide-angle cameras typically used to maximize [[field of view|FOV]].
#* '''Reprojection/Warping:''' Adjusting the captured image perspective to align with the user's eye position inside the headset, rather than the camera's physical position on the outside. This difference in viewpoint causes [[parallax]], and correcting it ("perspective correction") is crucial for accurate spatial representation, correct scale perception, and minimizing [[motion sickness]].<ref name="PassthroughChallengesUploadVR">[https://www.uploadvr.com/quest-3-passthrough-will-improve-meta-cto/ UploadVR – Passthrough AR: The Technical Challenges of Blending Realities] (Oct 23 2023)</ref><ref name="KGuttag_Align">[https://kguttag.com/2023/09/26/apple-vision-pro-part-6-passthrough-mixed-reality-ptmr-problems/ KGOnTech – Perspective‑Correct Passthrough] (Sept 26 2023)</ref> Algorithms based on [[Computer Vision]] and potentially [[Inertial Measurement Unit|IMU]] sensor data are used. Some modern headsets, like the [[Meta Quest Pro]] and [[Meta Quest 3]], employ [[Machine Learning]] or [[Neural Network|neural networks]] to improve the realism and accuracy of this reconstruction.<ref name="QuestProPassthrough">[https://www.meta.com/blog/quest/meta-reality-passthrough-quest-pro/ Meta Blog: Inside Meta Reality and Passthrough on Quest Pro]</ref>
#* '''[[Sensor Fusion]]:''' Combining data from multiple cameras (for example fusing monochrome detail with RGB color<ref name="MixedNews_Cambria"/>) and integrating tracking data (for example from [[inside-out tracking]] sensors or [[depth sensor]]s) to ensure the passthrough view remains stable, depth-correct, and aligned with the user's head movements.
#* '''Color Correction & Enhancement:''' Adjusting colors, brightness, and contrast to appear more natural, especially under varying lighting conditions. This can also involve [[Artificial Intelligence|AI]]-based denoising or upscaling.<ref name="UploadVR_Q3Review">[https://www.uploadvr.com/quest-3-review/ UploadVR – Quest 3 Review: Excellent VR With Limited MR] (Oct 9 2023)</ref>
# '''Display:''' The processed video feed is rendered onto the headset's internal [[display|displays]], replacing or being overlaid upon the virtual content. The primary goal is to achieve this entire pipeline with minimal [[latency (engineering)|latency]] (ideally under 20 milliseconds<ref name="LatencyThreshold">[https://research.nvidia.com/publication/2016-07_Latency-Requirements-Plausible-Interaction-Augmented-and-Virtual-Reality Latency Requirements for Plausible Interaction in Augmented and Virtual Reality] - Research discussing latency impact.</ref>) to avoid discomfort and maintain realism.
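
A minimal, illustrative sketch of this per-frame pipeline is shown below. It assumes OpenCV-style calibration data and a hypothetical headset interface (<code>grab_stereo_frame</code>, <code>get_head_pose</code>, <code>late_reproject</code>, and <code>submit_passthrough_layer</code> are placeholder names, not any vendor's actual API); production passthrough runs largely on dedicated hardware paths with depth-aware, often ML-based reconstruction rather than a simple serial loop.

<syntaxhighlight lang="python">
# Illustrative per-frame passthrough loop (hypothetical headset interface).
import cv2

def passthrough_frame(headset, K, dist_coeffs, H_left, H_right, eye_size=(1600, 1600)):
    # 1. Capture: grab a synchronized stereo pair from the outward-facing cameras.
    frame_left, frame_right, capture_time = headset.grab_stereo_frame()

    # 2a. Rectification/undistortion: remove wide-angle lens distortion using
    #     the camera intrinsics K and distortion coefficients.
    und_left = cv2.undistort(frame_left, K, dist_coeffs)
    und_right = cv2.undistort(frame_right, K, dist_coeffs)

    # 2b. Reprojection/warping: shift each camera view toward the corresponding
    #     eye position. A single homography per eye is only a crude planar
    #     approximation; real systems use per-pixel depth estimates.
    eye_left = cv2.warpPerspective(und_left, H_left, eye_size)
    eye_right = cv2.warpPerspective(und_right, H_right, eye_size)

    # 2c. Sensor fusion / late correction: compensate for head motion since
    #     capture using the freshest IMU/tracking pose (akin to timewarp).
    current_pose = headset.get_head_pose()
    eye_left, eye_right = headset.late_reproject(eye_left, eye_right,
                                                 capture_time, current_pose)

    # 3. Display: hand the corrected images to the compositor, which blends
    #    them with virtual content; the whole loop targets well under ~20 ms.
    headset.submit_passthrough_layer(eye_left, eye_right)
</syntaxhighlight>

In practice these stages are pipelined and largely offloaded to image signal processors and XR co-processors rather than executed serially on a general-purpose CPU.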


While the concept of video passthrough existed in research labs for decades,<ref name="MilgramKishino1994">Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information Systems, E77-D(12), 1321-1329.</ref> its implementation in consumer VR headsets evolved significantly:


*'''Early Stages (Mid-2010s):''' Passthrough began appearing primarily as a safety feature. In 2016, the [[HTC Vive]] prototype (Vive Pre) introduced a front-facing camera providing a basic, monochrome, 2D view for obstacle avoidance. Valve's software projected this onto a virtual sphere to approximate perspective.<ref name="RoadToVR_Vive">[https://www.roadtovr.com/htc-vive-front-facing-camera-video/ Road to VR – 8 Minutes of the HTC Vive’s Front‑facing Camera in Action] (Mar 10 2016)</ref> It was low-resolution and intended for brief checks.
*'''Integrated Monochrome (Late 2010s):''' Headsets using inside-out tracking leveraged their tracking cameras for improved passthrough. The [[Oculus Rift S]] (2019) offered "Passthrough+" using its multiple monochrome cameras for a stereoscopic view.<ref name="RiftS_Docs">[https://www.meta.com/help/quest/459034058728993/ Oculus Rift S Product Documentation] (2019)</ref> The original [[Meta Quest|Oculus Quest]] (2019) and [[Meta Quest 2]] (2020) provided similar basic monochrome passthrough, mainly for setting up the [[Guardian system]] and quick environment checks.<ref name="Quest2Passthrough">[https://www.meta.com/blog/quest/oculus-quest-2-passthrough-public-api-passthrough/ Quest Blog on Quest 2 Passthrough improvements]</ref>
*'''Early Mixed Reality Steps (Early 2020s):''' In 2021, Meta released an experimental Passthrough API for Quest 2 developers, allowing apps to overlay virtual elements onto the monochrome feed, marking a step towards consumer MR.<ref name="PCMag_passthrough">[https://developers.meta.com/horizon/blog/mixed-reality-with-passthrough/ Meta Developer Blog – Oculus Experiments With Mixed Reality via New Passthrough API] (July 25 2021)</ref> Simultaneously, enterprise headsets like the [[Varjo]] XR-1 (2019) and XR-3 (2021) pushed high-fidelity color passthrough with dual high-resolution cameras, setting a benchmark for quality.<ref name="Skarredghost_Varjo">[https://skarredghost.com/2022/06/08/varjo-xr-3-hands-on-review/ The Ghost Howls – Varjo XR‑3 hands‑on review] (June 8 2022)</ref>
*'''Mainstream Color Passthrough (2022-Present):'''
**The [[Meta Quest Pro]] (2022) was the first major consumer headset featuring high-quality, stereoscopic color passthrough, using a novel camera array (monochrome for depth/detail, RGB for color) and ML reconstruction.<ref name="MixedNews_Cambria"/>
**Competitors like the [[Pico 4]] (late 2022) and [[HTC Vive XR Elite]] (2023) also introduced color passthrough, although early implementations like the Pico 4's were initially monoscopic and lacked depth correction.<ref name="Reddit_PicoMono">[https://www.reddit.com/r/virtualreality/comments/ynz7yv/meta_quest_pro_vs_pico_4_passthrough_comparison/ Reddit – Meta Quest Pro vs PICO 4 Passthrough Comparison] (2022 thread)</ref><ref name="ViveXRElite">[https://www.vive.com/us/product/vive-xr-elite/overview/ HTC Vive XR Elite Product Page]</ref>
**Sony's [[PlayStation VR2]] (2023) included stereo passthrough, but kept it black-and-white, accessible via a dedicated button for quick checks.<ref name="RoadToVR_PSVR2">[https://www.roadtovr.com/psvr-2-review/ Road to VR – PSVR 2 Review] (Feb 22 2023)</ref>
**The [[Meta Quest 3]] (late 2023) brought high-resolution stereo color passthrough with an active depth sensor (structured light projector) to the mainstream consumer market, offering significantly improved clarity and depth accuracy over Quest 2 and Quest Pro.<ref name="UploadVR_Q3Review"/><ref name="Quest3PassthroughReview">[https://www.roadtovr.com/meta-quest-3-review-vr-mixed-reality-ar/ RoadToVR Quest 3 Review detailing passthrough improvements]</ref>
**The [[Apple Vision Pro]] (2023 announcement, 2024 release) emphasized passthrough-based MR ("[[spatial computing]]"), using dual high-resolution color cameras, advanced processing ([[Apple R1]] chip), and a [[LiDAR]] scanner for precise depth mapping.<ref name="VisionProPassthrough">[https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/ Apple Vision Pro Announcement]</ref><ref name="Verge_VisionPro">The Verge – Apple Vision Pro review: magic, until it’s not (Nilay Patel, January 2024)</ref>
**Other high-end devices like the [[Pimax Crystal]] (2023) and [[Varjo XR-4]] (late 2023) continued to push resolution and fidelity.<ref name="PimaxCrystal">[https://pimax.com/crystal/ Pimax Crystal Product Page]</ref><ref name="VarjoXR4">[https://varjo.com/products/xr-4/ Varjo XR-4 Product Page]</ref>
**Even mid-range devices began incorporating improved color passthrough and depth sensing, such as the Pico 4 Ultra (2024).<ref name="Auganix_Pico4Ultra">[https://www.auganix.org/vr-news-pico-launches-pico-4-ultra/ Auganix – Pico Unveils ‘Pico 4 Ultra’] (Aug 21 2024)</ref>


Passthrough has evolved from a basic safety utility to a core feature enabling sophisticated mixed reality experiences, blurring the lines between traditional VR and AR.


===Color Passthrough===
Uses [[RGB]] color cameras for a full-color view of the real world, greatly enhancing realism and enabling use cases like reading phone screens or interacting with colored objects. The first widely available consumer example was the Meta Quest Pro.<ref name="MixedNews_Cambria"/> Quality varies significantly based on camera resolution, processing, and calibration (for example, Quest 3 offers ~10x the passthrough pixels of Quest 2).<ref name="UploadVR_specs">[https://www.uploadvr.com/quest-3-specs/ UploadVR – Quest 3 Specs Compared to Quest 2 & Apple Vision Pro] (Sept 27 2023)</ref> High-quality color passthrough (for example, the Varjo XR series and Vision Pro) aims for near-photorealism.<ref name="Skarredghost_Varjo"/><ref name="VisionProPassthrough"/> It requires more powerful hardware and sophisticated software.


===Monoscopic vs. Stereoscopic===
Creating high-quality, comfortable passthrough involves overcoming significant hurdles:


*'''[[Latency (engineering)|Latency]]:''' The delay between real-world motion and the passthrough display update ([[photon-to-photon latency]]). High latency (>~20ms<ref name="LatencyThreshold"/>) causes disorientation, [[motion sickness]] ("world swimming"), and breaks immersion. Fast processing pipelines are essential.<ref name="PassthroughChallengesUploadVR"/> Residual latency can cause ghosting or trailing artifacts on moving objects.<ref name="UploadVR_ghosting">[https://www.uploadvr.com/quest-3-review/#passthrough Ghosting issue noted in UploadVR Quest 3 Review] (Oct 9 2023)</ref>
*'''Resolution and Image Quality:''' Camera feeds are often lower resolution than human vision, leading to pixelation or blurriness, making fine details (like text) hard to see.<ref name="CameraLimitations">[https://arstechnica.com/gadgets/2023/10/quest-3-review-finally-real-mixed-reality-for-under-500/ Ars Technica Quest 3 Review discussing passthrough quality]</ref> Limited [[dynamic range]] struggles with bright highlights and dark shadows compared to the human eye. Poor [[low-light performance]] results in noisy, grainy images.<ref name="CameraLimitations"/> Achieving high resolution and good image quality requires better sensors and significant processing power. (A rough pixels-per-degree comparison is given after this list.)
*'''Camera Placement and Perspective Mismatch:''' Cameras are offset from the user's eyes, causing [[parallax]] errors if not corrected. Naive display leads to distorted views, incorrect scale, and depth perception issues, especially for close objects.<ref name="PassthroughChallengesUploadVR"/> Sophisticated [[reprojection]] algorithms are needed to warp the camera view to match the eye's perspective, but perfect correction is difficult.<ref name="KGuttag_Align"/> This geometric misalignment can cause eye strain or discomfort.<ref name="KGuttag_Align"/> Close objects (<~0.5m) often appear warped even in good systems due to sensor/lens limitations and reprojection challenges.<ref name="UploadVR_Q3Review_MR"/> (A simple geometric estimate of this parallax error is given after this list.)
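
To make the resolution gap concrete, consider a rough, illustrative pixels-per-degree (PPD) comparison (the camera figures here are placeholders, not any specific headset's specification). Normal 20/20 acuity corresponds to resolving about one arcminute, i.e. roughly 60 pixels per degree, whereas a camera that spreads 2,000 horizontal pixels across a 100° field of view delivers only about

:<math>\text{PPD} \approx \frac{2000\ \text{px}}{100^{\circ}} = 20\ \text{px/degree},</math>

around a third of what the eye can resolve, which is why small text that is easy to read directly can be illegible through passthrough.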
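
The perspective mismatch can likewise be quantified with simple geometry (illustrative numbers, not measurements of any particular device). For a camera displaced laterally by a baseline <math>b</math> from the eye it feeds, a point straight ahead at distance <math>d</math> appears angularly shifted by approximately

:<math>\theta \approx \arctan\!\left(\frac{b}{d}\right),</math>

so a 5 cm camera-to-eye offset yields only about 0.3° of error for an object 10 m away, but roughly 5.7° for an object 0.5 m away. Uncorrected passthrough therefore looks most wrong for nearby objects such as the user's own hands, and reprojection quality matters most at close range.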


===Enterprise and Professional Uses===
*'''Collaboration:''' Design reviews where virtual prototypes are viewed in a real meeting room alongside physical mockups or colleagues.<ref name="XRToday_enterprise">[https://www.xrtoday.com/mixed-reality/meta-quest-for-enterprise-the-ultimate-guide-to-meta-quest-for-enterprise-applications/ XR Today – VR Passthrough in Enterprise] (2024)</ref> Remote collaboration where experts guide on-site technicians using virtual annotations overlaid on the real equipment view.
*'''Training and Simulation:''' Combining virtual scenarios with physical controls or environments (for example flight simulation using a real cockpit visible via passthrough, medical training on physical manikins with virtual overlays).<ref name="VIVE_Blog_Sauce"/>
*'''Visualization:''' Architects visualizing 3D models on a real site, designers overlaying virtual concepts onto physical products.
*'''Productivity:''' Creating expansive virtual workspaces integrated with the physical office environment, improving multitasking while maintaining awareness.<ref name="XRToday_benefits">[https://www.xrtoday.com/virtual-reality/what-is-vr-passthrough-and-how-is-it-shaping-the-future-of-xr/ XR Today – What is VR Passthrough… (benefits)] (Dec 2024)</ref>


===Industrial and Field Uses===
*'''[[Lynx R1]]:''' Standalone headset project focusing specifically on delivering quality color passthrough at a competitive price point.<ref name="LynxR1">[https://www.lynx-r.com/ Lynx R1 Official Website]</ref>
*'''[[PlayStation VR2]]:''' Features stereo black-and-white passthrough primarily for setup and quick environment checks.<ref name="RoadToVR_PSVR2"/>
*'''[[Valve Index]]:''' Basic stereoscopic monochrome passthrough via front cameras.<ref name="Index_Docs">[https://steamcdn-a.akamaihd.net/store/valve_index/ValveIndexManual032021.pdf Valve Index Hardware Manual (PDF)] (Mar 2021)</ref>


==Future Developments==
*Implementing selective passthrough (showing only specific real-world elements like hands or keyboards) and potentially "augmented reality" filters applied to the real-world view (a simple compositing sketch is given after this list).
*Utilizing [[eye tracking]] for [[foveated rendering]] of the passthrough feed or dynamic depth-of-field adjustments.
*Exploring novel camera technologies like light field cameras (for example Meta's "Flamera" concept<ref name="KGuttag_Flamera">[https://kguttag.com/2023/09/26/apple-vision-pro-part-6-passthrough-mixed-reality-ptmr-problems/#flamera KGOnTech – Meta Flamera Light‑Field Passthrough] (Sept 26 2023)</ref>) to better solve perspective issues.
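
As an illustration of the compositing idea behind selective passthrough (a sketch under assumptions, not any platform's actual API; the segmentation mask is presumed to come from a hand- or object-segmentation component), the blend can be as simple as a per-pixel alpha mix:

<syntaxhighlight lang="python">
# Illustrative selective-passthrough compositing: show real-world pixels only
# where a segmentation mask (e.g. hands or a keyboard) is active.
import numpy as np

def composite_selective_passthrough(virtual_rgb, passthrough_rgb, mask):
    """Blend passthrough into the rendered frame wherever mask is ~1.

    virtual_rgb, passthrough_rgb: HxWx3 float arrays in [0, 1].
    mask: HxW float array in [0, 1] from an (assumed) segmentation model.
    """
    alpha = mask[..., np.newaxis]  # broadcast HxW -> HxWx1 over color channels
    return alpha * passthrough_rgb + (1.0 - alpha) * virtual_rgb
</syntaxhighlight>

Real implementations would perform this blend per eye in the compositor, often with soft mask edges and latency compensation, but the principle is the same.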


As technology matures, VST passthrough aims to provide a near-seamless blend between the virtual and physical worlds, potentially unifying VR and AR capabilities into single, versatile devices.
<ref name="UploadVR_ghosting">[https://www.uploadvr.com/quest-3-review/#passthrough Ghosting issue noted in UploadVR Quest 3 Review] (Oct 9 2023)</ref>
<ref name="CameraLimitations">[https://arstechnica.com/gadgets/2023/10/quest-3-review-finally-real-mixed-reality-for-under-500/ Ars Technica Quest 3 Review discussing passthrough quality]</ref>
<ref name="Varjo_blog">[https://varjo.com/blog/video-pass-through-xr-changes-reality-as-you-know-it/ Varjo Blog – Video Pass‑Through XR: Merge Real and Virtual] (2020)</ref>
<ref name="TechTarget_ARdef">[https://www.techtarget.com/whatis/definition/augmented-reality-AR TechTarget: What is augmented reality (AR)?]</ref>
<ref name="XRToday_enterprise">[https://www.xrtoday.com/mixed-reality/meta-quest-for-enterprise-the-ultimate-guide-to-meta-quest-for-enterprise-applications/ XR Today – VR Passthrough in Enterprise] (2024)</ref>