{{stub}}
{{see also|Markerless inside-out tracking|Positional tracking|Outside-in tracking}}
[[File:Acer mixed reality headset inside-out.png|thumb|Figure 1. A Windows Mixed Reality headset, an example of an HMD using markerless inside-out tracking via cameras visible on the front (image: www.wareable.com)]]
[[File:F2Ak4iE.jpg|thumbnail|Figure 2. Early [[Lighthouse]] prototype, an inside-out tracking system with 2-dimensional barcodes as [[fiducial markers]].]]
'''Inside-out tracking''' is a method of [[positional tracking]] commonly used in [[virtual reality]] (VR), [[augmented reality]] (AR), and [[mixed reality]] (MR) technologies. It enables devices like [[head-mounted display|head-mounted displays]] (HMDs) and [[motion controller]]s to determine their position and orientation in 3D space by using [[sensor]]s located ''on'' the device itself, looking outward at the surrounding environment.


It contrasts with [[outside-in tracking]], where external sensors (e.g., cameras or laser emitters) are placed in the environment to track sensors or markers located on the HMD or controllers. In inside-out tracking, the "eyes" (cameras or other sensors) are on the moving object, making the system inherently egocentric.<ref name="Ribo2001" /><ref name="Boger2014InOut" />
__NOTOC__
== How it Works ==
An inside-out tracking system observes the external world from the perspective of the device being tracked. As the device moves, the sensors detect changes relative to the environment, allowing the system to calculate the device's updated position and orientation ([[6DoF|six degrees of freedom]]: 3 translational, 3 rotational). This calculation typically happens in real time, so the virtual environment can respond immediately.
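The following Python fragment sketches this loop in the simplest possible terms: a 6DoF pose (position plus orientation) is repeatedly refreshed from incoming sensor data, with fast inertial readings propagating the estimate and slower camera-based estimates correcting it, which is one common arrangement (see the markerless section below). It is a minimal sketch only; the <code>read_imu()</code> and <code>estimate_pose_from_camera()</code> helpers are hypothetical stand-ins for real sensor drivers and a real vision pipeline.

<syntaxhighlight lang="python">
from dataclasses import dataclass, field

import numpy as np


@dataclass
class Pose:
    """A 6DoF pose: 3 translational plus 3 rotational degrees of freedom."""
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))
    # Orientation stored as a unit quaternion (w, x, y, z).
    orientation: np.ndarray = field(default_factory=lambda: np.array([1.0, 0.0, 0.0, 0.0]))


def read_imu():
    """Hypothetical stand-in for an IMU driver: linear acceleration and angular velocity."""
    return np.zeros(3), np.zeros(3)


def estimate_pose_from_camera():
    """Hypothetical stand-in for the vision pipeline (marker- or feature-based)."""
    return Pose()


def track(steps=1000, dt=1.0 / 500.0, alpha=0.02):
    pose = Pose()
    velocity = np.zeros(3)
    for step in range(steps):
        # 1. Predict from high-rate inertial data (dead reckoning).
        accel, gyro = read_imu()
        velocity += accel * dt
        pose.position += velocity * dt
        # (A real system would also integrate gyro into the orientation quaternion here.)

        # 2. Correct the prediction with a lower-rate camera-based estimate,
        #    which keeps inertial drift bounded.
        if step % 10 == 0:
            visual = estimate_pose_from_camera()
            pose.position = (1 - alpha) * pose.position + alpha * visual.position
    return pose


if __name__ == "__main__":
    print(track())
</syntaxhighlight>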


There are two main approaches:


=== Marker-based Inside-out Tracking ===
This approach relies on placing artificial markers, known as [[fiducial markers]], in the environment. The sensors on the tracked device are designed to detect these specific markers.
*  '''Mechanism:''' Cameras or other optical sensors on the device identify the known patterns or shapes of the fiducial markers (e.g., QR codes, specific geometric patterns, [[infra-red|infrared]] (IR) LEDs). By analyzing the position, size, and orientation of these markers in the sensor's view, the system can calculate the device's pose relative to them (a minimal code sketch follows this list).<ref name="Ishii2010" /><ref name="Mehling2006" />
*  '''Limitations:''' Tracking only functions while the markers are within the sensor's [[field of view]] and not occluded, and the environment must be prepared with markers in advance.
*  '''Example Concept:''' While not typically used for modern VR HMDs, the [[Nintendo Wii Remote]] utilizes a form of marker-based inside-out tracking: the IR camera in the remote tracks the positions of IR LEDs in the stationary Sensor Bar to determine where the remote is pointing (primarily for orientation and relative pointing, not full 6DoF positional tracking).
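As a rough illustration of the marker-based idea, the sketch below recovers the device pose from a single square fiducial whose corner geometry is known, using OpenCV's <code>cv2.solvePnP</code>. The marker size, camera intrinsics, and detected pixel coordinates are made-up example values, and the 2D corners are assumed to come from some marker detector (for example OpenCV's ArUco module); this is a minimal sketch under those assumptions, not a production tracker.

<syntaxhighlight lang="python">
import cv2
import numpy as np

# Known geometry: corners of a 10 cm square marker, in the marker's own frame (metres).
MARKER_SIZE = 0.10
object_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float64)

# Example camera intrinsics (focal length and principal point in pixels) - illustrative only.
camera_matrix = np.array([
    [600.0,   0.0, 320.0],
    [  0.0, 600.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)  # assume an undistorted image for simplicity

# 2D pixel coordinates of the same four corners, as a marker detector might report them.
image_points = np.array([
    [300.0, 200.0],
    [360.0, 202.0],
    [358.0, 262.0],
    [298.0, 260.0],
], dtype=np.float64)

# Solve the Perspective-n-Point problem: where must the camera be for the known
# 3D corners to project onto the observed 2D pixels?
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
assert ok

# rvec/tvec give the marker's pose in the camera frame; invert it to get the
# camera (i.e. the tracked device) position in the marker's frame.
R, _ = cv2.Rodrigues(rvec)
camera_position = (-R.T @ tvec).ravel()
print("Device position relative to the marker (m):", camera_position)
</syntaxhighlight>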


=== Markerless Inside-out Tracking ===
{{main|Markerless inside-out tracking}}
This is the most common approach used in modern standalone VR/MR headsets. It uses [[computer vision]] techniques to track the device's position based on recognizing natural features in the surrounding environment, without requiring any artificial markers.
*  '''Mechanism:''' Cameras on the HMD capture video of the environment. Sophisticated algorithms, often involving [[SLAM]] (Simultaneous Localization and Mapping) or [[VIO]] (Visual-Inertial Odometry), identify and track distinct features (corners, edges, textures) in the environment. As the headset moves, the system tracks how these features shift across the cameras' views to calculate the device's motion (a simplified sketch follows this list). Data from onboard [[IMU|Inertial Measurement Units]] ([[accelerometer]]s and [[gyroscope]]s) is typically fused with the visual data; the IMU provides high-frequency motion data and helps predict position during moments when visual tracking is temporarily lost (e.g., fast movements, poor lighting).<ref name="m249" /><ref name="Boger2014Overview" />
*  '''Environment Mapping:''' SLAM-based systems often create and update a map of the environment as the user moves around, allowing the device to re-localize itself within a known space.
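The sketch below illustrates only the visual-odometry portion of such a pipeline, under simplifying assumptions and not reflecting any vendor's actual implementation: it tracks corner features between two consecutive grayscale frames with Lucas-Kanade optical flow and recovers the relative camera motion from the essential matrix using OpenCV. The intrinsics are placeholder values, the translation is only recovered up to scale, and real systems layer IMU fusion, mapping, and re-localization on top of this.

<syntaxhighlight lang="python">
import cv2
import numpy as np

# Placeholder intrinsics; a real headset would use calibrated values for each camera.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])


def relative_motion(prev_gray, curr_gray):
    """Estimate rotation and (unit-scale) translation between two grayscale frames."""
    # 1. Find distinctive corner features in the previous frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=8)
    # 2. Track those features into the current frame with pyramidal Lucas-Kanade optical flow.
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)
    good = status.ravel() == 1
    p0, p1 = prev_pts[good], curr_pts[good]
    # 3. Recover the camera motion that best explains how the features moved.
    E, _ = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    E = E[:3]  # keep the first solution if several are stacked
    _, R, t, _ = cv2.recoverPose(E, p0, p1, K)
    # Monocular vision alone cannot recover the metric scale of t; fusing IMU
    # data (VIO) is one way real systems resolve it.
    return R, t


if __name__ == "__main__":
    # Two synthetic frames: blurred random texture shifted by a few pixels,
    # standing in for consecutive camera images of a static scene.
    rng = np.random.default_rng(0)
    frame0 = cv2.GaussianBlur((rng.random((480, 640)) * 255).astype(np.uint8), (5, 5), 0)
    shift = np.float32([[1, 0, 3], [0, 1, 0]])  # 3-pixel horizontal shift
    frame1 = cv2.warpAffine(frame0, shift, (640, 480))
    R, t = relative_motion(frame0, frame1)
    print("Rotation:\n", R, "\nTranslation direction:", t.ravel())
</syntaxhighlight>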


== Advantages ==
*  '''Simpler Setup:''' No need to install external sensors, base stations, or markers in the room.<ref name="Langley2017" />
*  '''Portability:''' Systems are often self-contained, making them easier to move between different locations.
*  '''Potentially Unlimited Tracking Volume:''' Tracking is limited by the environment the sensors can see, not by the placement of external hardware. The device can theoretically be tracked in any sufficiently featured area.
*  '''Lower System Cost (Potentially):''' Eliminating external tracking hardware can reduce the overall cost of the VR system, although it increases the complexity and [[computational cost]] on the headset itself.


== Disadvantages ==
*  '''Controller [[Occlusion]]:''' Tracking of controllers can be lost if they move outside the [[field of view]] of the HMD's cameras (e.g., behind the user's back, too close to the HMD). Predictive algorithms using IMU data help mitigate this for short periods.
*  '''Environmental Dependence:''' Tracking performance can degrade in environments with poor lighting, few visual features (e.g., blank walls), or highly repetitive textures. Reflective surfaces can also sometimes cause issues.
*  '''Computational Load:''' Processing sensor data and running tracking algorithms (especially SLAM/VIO) requires significant processing power on the device itself (HMD or connected computer), potentially impacting battery life or requiring more powerful onboard processors.
*  '''Drift:''' While modern systems are robust, markerless tracking can sometimes accumulate small positional errors over time (drift), potentially requiring occasional recalibration or re-localization, although [[environment mapping]] helps significantly.
*  '''Initialization:''' The system needs to initialize and potentially map a new area before tracking becomes fully stable.


== Examples of Devices/Platforms Using Inside-out Tracking ==
Many modern VR/AR/MR systems utilize markerless inside-out tracking:<ref name="ARVRTips2020" />
*  '''[[Meta Quest]] series''' ([[Oculus Quest]], Quest 2, Quest 3, Quest Pro) - Uses the "[[Oculus Insight]]" tracking system.
*  '''[[Oculus Rift S]]''' - Also used Oculus Insight.
*  '''[[Windows Mixed Reality]]''' platform headsets (e.g., [[HP Reverb]] G1 & G2, [[Samsung HMD Odyssey]], Acer AH101, Lenovo Explorer).
*  '''[[HTC Vive Cosmos]]''' (with standard faceplate), '''[[HTC Vive Focus]]''' series, '''[[HTC Vive XR Elite]]'''.
*  '''Pico Neo''' series (e.g., Pico Neo 3, Pico 4).
*  '''[[Microsoft HoloLens]]''' 1 and 2 (AR headsets).
*  '''[[Valve Index]]''' (Uses inside-out cameras primarily for passthrough view, room setup scanning, and potentially some experimental features, but relies on [[Lighthouse]] (outside-in) for its primary HMD and controller tracking).
*  '''[[PlayStation VR2]]''' (Uses inside-out cameras for HMD and controller tracking).
*  '''[[Apple Vision Pro]]''' (MR headset).
*  '''[[Nintendo Wii Remote]]''' (non-VR example using marker-based inside-out tracking for pointing/orientation).


[[Category:Terms]] [[Category:Technical Terms]] [[Category:Tracking Technology]]


== References ==
<references>
<ref name="Ribo2001">Ribo, M., Pinz, A. and Fuhrmann, A. L. (2001). A new optical tracking system for virtual and augmented reality applications. Instrumentation and Measurement Technology Conference Proceedings.</ref>
<ref name="Boger2014InOut">Boger, Y. (2014). Positional tracking: "Outside-in" vs. "Inside-out". Retrieved from http://vrguy.blogspot.pt/2014/08/positional-tracking-outside-in-vs.html</ref>
<ref name="Ishii2010">Ishii, K. (2010). Augmented Reality: Fundamentals and nuclear related applications. Nuclear Safety and Simulation, 1(1).</ref>
<ref name="Mehling2006">Mehling, M. (2006). Implementation of a Low Cost Marker Based Infrared Optical Tracking System. PhD thesis, Fachhochschule Stuttgart.</ref>
<ref name="m249">{{cite web | title=Inside-out tracking | website=XVRWiki | date=2024-09-30 | url=https://www.xvrwiki.org/wiki/Inside-out_tracking | access-date=2024-10-05}}</ref>
<ref name="Boger2014Overview">Boger, Y. (2014). Overview of positional tracking technologies for virtual reality. Retrieved from http://www.roadtovr.com/overview-of-positional-tracking-technologies-virtual-reality/</ref>
<ref name="Langley2017">Langley, H. (2017). Inside-out v Outside-in: How VR tracking works, and how it's going to change. Retrieved from https://www.wareable.com/vr/inside-out-vs-outside-in-vr-tracking-343</ref>
<ref name="ARVRTips2020">AR/VR Tips (2020). VR Headset Comparison. Retrieved from https://arvrtips.com/vr-headset-comparison-tool/</ref>
</references>
