Depending on the primary sensors used, SLAM can be categorized into several types:


* '''[[Visual SLAM]] (vSLAM)''': Uses one or more cameras to track visual features (a minimal feature-tracking sketch follows this list).<ref name="MathWorksSLAM"/>
* '''[[LiDAR SLAM]]''': Uses a LiDAR sensor to build a precise geometric map.<ref name="MathWorksSLAM"/>
* '''[[Multi-Sensor SLAM]]''': Fuses data from various sources (e.g., cameras, IMU, LiDAR) for enhanced robustness and accuracy.<ref name="MathWorksSLAM"/>
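
In a typical visual SLAM front end, distinctive image features are detected in one camera frame and tracked into the next, and the resulting point correspondences are what the back end uses to estimate camera motion and triangulate map points. The following Python sketch (using OpenCV) is a minimal illustration of that step with Shi-Tomasi corner detection and Lucas-Kanade optical flow; the function name, parameter values, and overall structure are illustrative assumptions rather than part of any particular vSLAM system.

<syntaxhighlight lang="python">
# Minimal sketch of a visual SLAM front end: detect corners in the previous
# frame and track them into the current frame. Parameter values are
# illustrative assumptions, not taken from any specific vSLAM implementation.
import cv2
import numpy as np

def track_features(prev_gray, curr_gray):
    """Return matched (previous, current) point pairs between two grayscale frames."""
    # Detect up to 500 strong Shi-Tomasi corners in the previous frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    if prev_pts is None:
        return np.empty((0, 2)), np.empty((0, 2))

    # Track those corners into the current frame with pyramidal Lucas-Kanade flow.
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None)

    # Keep only the points that were tracked successfully; a SLAM back end would
    # use these correspondences for pose estimation and map-point triangulation.
    good = status.ravel() == 1
    return prev_pts[good].reshape(-1, 2), curr_pts[good].reshape(-1, 2)
</syntaxhighlight>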


Spatial mapping is typically accomplished via SLAM algorithms, which build a map of the environment in real time while tracking the device's position within it.<ref name="Adeia">{{cite web |url=https://adeia.com/blog/spatial-mapping-empowering-the-future-of-ar |title=Spatial Mapping: Empowering the Future of AR |publisher=Adeia |date=2022-03-02 |access-date=2025-10-27}}</ref>
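
As a simplified illustration of that loop, the Python sketch below alternates between predicting the device pose from odometry and refining a landmark map from range-bearing observations. All names are arbitrary, and it replaces the probabilistic filtering or graph optimization used by real SLAM systems with plain dead reckoning and averaging, so it is a didactic sketch rather than a working SLAM implementation.

<syntaxhighlight lang="python">
# Toy 2-D "map while localizing" loop: predict the pose from odometry, then
# update landmark estimates from observations made at that pose. Didactic
# sketch only: no uncertainty model, no loop closure.
import math

def slam_step(pose, landmarks, odometry, observations):
    """One iteration of the loop.

    pose:         (x, y, heading) of the device
    landmarks:    dict mapping landmark id -> estimated (x, y) position
    odometry:     (forward distance, heading change) since the last step
    observations: list of (landmark id, range, bearing) seen from the new pose
    """
    x, y, theta = pose
    dist, dtheta = odometry

    # Predict: dead-reckon the new pose from the odometry reading.
    theta = (theta + dtheta) % (2 * math.pi)
    x += dist * math.cos(theta)
    y += dist * math.sin(theta)

    # Update: convert each observation to world coordinates and blend it with
    # the existing landmark estimate (plain averaging stands in for a Kalman
    # or graph-based update).
    for lid, rng, bearing in observations:
        lx = x + rng * math.cos(theta + bearing)
        ly = y + rng * math.sin(theta + bearing)
        if lid in landmarks:
            ox, oy = landmarks[lid]
            landmarks[lid] = ((ox + lx) / 2.0, (oy + ly) / 2.0)
        else:
            landmarks[lid] = (lx, ly)

    return (x, y, theta), landmarks
</syntaxhighlight>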