Spatial mapping
=== Google ARCore ===
[[Google ARCore]] launched in 2017 as the company's platform-agnostic augmented reality SDK, providing cross-platform APIs for Android, iOS, Unity, and Web after Google discontinued the hardware-dependent Project Tango. ARCore achieves spatial understanding without specialized sensors through depth-from-motion algorithms that compare multiple device images captured from different angles, fusing this visual information with IMU measurements sampled at 1000 Hz. The system performs motion tracking at 60 fps using Simultaneous Localization and Mapping with visual-inertial data fusion.<ref name="arcore">{{cite web |url=https://developers.google.com/ar |title=ARCore Overview |publisher=Google Developers |access-date=2025-10-27}}</ref>
The public launch of the Depth API in ARCore 1.18 (June 2020) brought occlusion capabilities to hundreds of millions of compatible Android devices. The depth-from-motion algorithm creates depth images from the RGB camera and device movement, selectively applying machine learning to improve depth estimates even when the device moves minimally. Depth images store a 16-bit unsigned integer per pixel representing the distance from the camera to the environment, giving a depth range of 0 to 65 meters, with the most accurate results between 0.5 and 5 meters from the real-world scene.<ref name="RoadtoVR"/>
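The 16-bit-per-pixel encoding described above can be illustrated with a short sketch. This is a minimal decode example, not ARCore's own API surface: it assumes a raw little-endian depth buffer where each pixel stores an unsigned millimeter distance (hence the 0–65 m range, since 65535 mm ≈ 65 m); the class and method names are illustrative.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

// Sketch: reading a depth value from an ARCore-style 16-bit depth image.
// Assumes 2 bytes per pixel, little-endian, value = distance in millimeters.
public class DepthDecode {
    /** Returns the depth in meters at pixel (x, y) of a width-pixel-wide image. */
    public static float depthMeters(ByteBuffer depthData, int width, int x, int y) {
        ShortBuffer mm = depthData.order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
        int raw = mm.get(y * width + x) & 0xFFFF; // mask: treat the short as unsigned
        return raw / 1000.0f;                     // millimeters to meters
    }
}
```

The unsigned mask matters because Java's `short` is signed: without `& 0xFFFF`, any distance beyond about 32.7 m would decode as negative.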