Developer Resource

From VR & AR Wiki
{{TOCRIGHT}}
{{infoBox
| title = Developer Resource
| image =
| caption =
| content = This guide provides comprehensive information for developers working on [[Virtual Reality]] ([[VR]]) and [[Augmented Reality]] ([[AR]]) applications. Updated for 2025, it covers the latest [[SDKs]], [[runtimes]], [[app stores]], development tools, and best practices for creating immersive experiences across modern [[XR]] platforms.
}}
==Overview==
This article provides comprehensive information about resources for [[VR]] and [[AR]] development as of 2025, including [[SDKs]], [[game engines]], [[runtimes]], [[app stores]], and development tools for modern [[virtual reality]] and [[augmented reality]] platforms. The [[XR development]] landscape has evolved significantly, with [[OpenXR]] emerging as the industry-standard cross-platform API, supported by all major hardware manufacturers and game engines.
==The XR Development Landscape==
[[Extended Reality]] ([[XR]]) encompasses [[Virtual Reality]] ([[VR]]), [[Augmented Reality]] ([[AR]]), and [[Mixed Reality]] ([[MR]]), representing a paradigm shift in how users interact with digital content. For developers, this landscape is defined by a rich ecosystem of game engines, [[Software Development Kits]] ([[SDKs]]), and cross-platform standards designed to build immersive experiences.<ref name="UnityXRManual">https://docs.unity3d.com/6000.2/Documentation/Manual/XR.html - XR - Unity Documentation</ref>
At the heart of modern XR development are powerful game engines, primarily [[Unity]] and [[Unreal Engine]], which provide the rendering, physics, and scripting environments necessary to create interactive 3D worlds.<ref name="UnitySolutionsXR">https://unity.com/solutions/xr - XR Development in Unity: AR, VR and Spatial Solutions</ref> These engines are complemented by platform-specific SDKs that unlock the unique hardware features of devices like the [[Meta Quest]].<ref name="MetaDevOverview">https://developers.meta.com/horizon/documentation/unity/unity-development-overview/ - Unity Development Overview for Meta Quest</ref>
Underpinning this entire ecosystem is the [[OpenXR]] standard, a crucial initiative by the [[Khronos Group]] to create a unified [[API]] that allows developers to write code that can run across multiple hardware platforms with minimal modification, thereby reducing fragmentation and simplifying the development process.<ref name="KhronosOpenXR">https://www.khronos.org/OpenXR - OpenXR - High-performance access to AR and VR</ref>
==How to Get Started==
{{see also|How to Get Started in VR Development}}
{{:How to Get Started in VR Development}}


==SDKs==
Getting started with [[XR development]] in 2025 requires choosing a target platform and appropriate development tools. Modern [[VR development]] primarily uses [[OpenXR]] as a cross-platform standard, with platform-specific [[SDKs]] providing additional features.
[[SDKs]], or [[Software Development Kits]], are used to build [[VR apps]]. Historically, an app implemented [[OVR]], [[OpenVR]], or both, giving it access to native functionality in the corresponding runtime; today, most apps target [[OpenXR]] instead. '''SDKs do not handle [[asynchronous timewarp]] or [[reprojection]]; those are handled by the runtime!'''
 
===Beginner Path===
For beginners, the recommended path is to:
# Choose a game engine ([[Unity]], [[Unreal Engine]], or [[Godot]])
# Select target [[VR headset|VR headsets]] (e.g., [[Meta Quest 3]], [[Meta Quest 3S]], [[Apple Vision Pro]], [[PSVR2]])
# Set up the appropriate [[SDK]] and development environment
# Study platform-specific guidelines and best practices
# Build a simple app and iterate
 
===Development Environment Setup===
'''1. Select a Development Environment:''' Use a game engine such as [[Unity]] or [[Unreal Engine]], both of which support XR out of the box. Download the latest version: Unity 6.x or Unreal Engine 5.4+.
 
'''2. Install SDKs:''' For Meta devices, use the [[Meta XR SDK]]. For Apple, the [[visionOS SDK]]. Adopt [[OpenXR]] for cross-platform compatibility.
 
'''3. Set Up Hardware:''' Ensure your development PC meets requirements (e.g., high-end GPU for tethered VR). For standalone devices, develop directly on the headset or use simulators like [[Meta XR Simulator]].
 
'''4. Learn Basics:''' Study motion tracking, anchors, environmental understanding, and depth APIs. Resources include official documentation from Meta, Apple, and Khronos Group.
 
'''5. Test and Iterate:''' Use tools like [[Meta XR Simulator]] for testing without hardware. Optimize for performance, considering battery life and refresh rates.
 
===Essential Concepts===
* Understand [[6DOF]] (six degrees of freedom) motion detection for immersive experiences
* Focus on user comfort to avoid motion sickness, using techniques like [[asynchronous timewarp]] or [[reprojection]]
* Integrate [[haptics]] and [[controllers]] for better interaction
* For [[AR]], leverage location anchors and scene semantics for real-world integration
 
==Modern SDKs and Development Kits==
[[SDKs]] or [[Software Development Kits]] are essential tools for building [[VR apps]] and [[AR apps]]. As of 2025, the [[XR]] development landscape has evolved significantly from the early [[Oculus Rift]] and [[HTC Vive]] era.
 
===OpenXR===
[[OpenXR]] is an open, royalty-free standard developed by the [[Khronos Group]] that provides a unified [[API]] for [[VR]] and [[AR]] platforms.<ref name="openxr-meta">https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR | Meta Horizon OS Developers</ref> [[OpenXR]] eliminates the need for multiple proprietary [[APIs]], allowing developers to create applications that work across different [[VR headsets]] and [[AR devices]].
 
As of 2025, [[Meta]] recommends [[OpenXR]] as the primary development path starting with SDK version 74.<ref name="openxr-recommended">https://www.uploadvr.com/meta-recommends-using-unity-unreal-built-in-openxr-support/ - Meta Will Recommend Using Unity & Unreal's Built-In OpenXR Support</ref> [[OpenXR]] support is built into major game engines and provides cross-platform compatibility.
 
====Core Architecture and Concepts====
The OpenXR API provides a set of abstractions for developers to interact with the XR system:<ref name="OpenXRWiki">https://en.wikipedia.org/wiki/OpenXR - OpenXR - Wikipedia</ref>
* '''XrInstance:''' Represents the connection between the application and the OpenXR runtime. It is the first object created.
* '''XrSystem:''' Represents the set of XR devices, including the headset and controllers.
* '''XrSession:''' Manages the interaction session between the application and the user. It controls the application's lifecycle, such as when it should render frames.
* '''XrSpace:''' Defines a 3D coordinate system. This is used to track the position and orientation of the headset, controllers, and other tracked objects.
* '''XrActions:''' Provides a high-level, action-based input system. Instead of querying for "button A press," a developer defines an action like "Jump" or "Grab" and maps it to different physical inputs on various controllers.
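The action system is the abstraction developers interact with most. The following stand-alone sketch (plain C++ with hypothetical types, not actual OpenXR calls) illustrates the idea: the application defines a logical action, suggests bindings to physical input paths, and then queries the action rather than any specific button.

```cpp
#include <map>
#include <set>
#include <string>
#include <vector>

// Stand-alone sketch of OpenXR's action-based input model. The class and
// method names are illustrative; the input paths follow OpenXR's path style.
class ActionSet {
    std::map<std::string, std::vector<std::string>> bindings_;  // action -> input paths
public:
    // Map a logical action ("grab") to a physical input path.
    void suggestBinding(const std::string& action, const std::string& path) {
        bindings_[action].push_back(path);
    }
    // An action is active if any of its bound physical inputs is pressed.
    bool getActionState(const std::string& action,
                        const std::set<std::string>& pressed) const {
        auto it = bindings_.find(action);
        if (it == bindings_.end()) return false;
        for (const auto& path : it->second)
            if (pressed.count(path)) return true;
        return false;
    }
};
```

In real OpenXR the same pattern appears as `xrCreateAction`, `xrSuggestInteractionProfileBindings`, and `xrGetActionStateBoolean`, which lets one code path serve controllers from different vendors.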
 
====Key OpenXR Features====
* Cross-platform [[VR]] and [[AR]] development
* Support for multiple [[VR headsets]] from different vendors
* Standardized input handling and [[tracking]]
* [[Hand tracking]], [[eye tracking]], and [[spatial anchors]]
* [[Passthrough]] and [[mixed reality]] capabilities
 
====Industry Adoption====
OpenXR has achieved widespread industry adoption. All major hardware and software platforms provide conformant OpenXR runtimes, including:<ref name="MetaOpenXRSupport">https://developers.meta.com/horizon/blog/oculus-all-in-on-openxr-deprecates-proprietary-apis/ - Oculus All-in on OpenXR, Deprecates Proprietary APIs</ref><ref name="AutoVRSEOpenXR">https://www.autovrse.com/openxr - What is OpenXR?</ref>
* Meta (for Quest and Rift devices)
* Valve ([[SteamVR]])
* Microsoft (for [[Windows Mixed Reality]] and [[HoloLens 2]])
* HTC (for Vive headsets)
* Pico (for Pico headsets)


====OpenXR vs. OpenVR====
A common point of confusion is the distinction between OpenXR and [[OpenVR]]:<ref name="steamvr-openvr">https://en.wikipedia.org/wiki/OpenVR - OpenVR - Wikipedia</ref><ref name="DCSForumsOpenVR">https://forum.dcs.world/topic/318110-confusion-steamvr-vs-openxr-opencomposite-8/ - Confusion SteamVR vs OpenXR / OpenComposite</ref>
* '''OpenVR''' is an older API developed by Valve. It was the primary API for the SteamVR platform and was "open" in the sense that other hardware could create drivers to be compatible with it. However, its development was ultimately controlled by Valve.
* '''OpenXR''' is the modern, multi-company standard managed by the Khronos Group. It is the successor to proprietary APIs like OpenVR and the old Oculus SDK.


Today, the SteamVR platform itself is an OpenXR runtime. This means it can run applications built with the OpenXR API, while also maintaining backward compatibility with older applications built with the OpenVR API.<ref name="PimaxOpenVRvsOpenXR">https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?</ref>


====Cross-Platform Considerations====
While OpenXR has successfully standardized the foundational layer of XR development, its "write once, run anywhere" promise is not absolute. For standalone Android-based headsets, developers often still need to create separate application packages (APKs) for different platforms like Meta Quest and Pico, as these platforms may use their own vendor-specific OpenXR loaders.<ref name="RedditOpenXRUniversal">https://www.reddit.com/r/virtualreality/comments/wi6w2x/vr_devs_just_how_universal_is_openxr_anyways/ - VR Devs, just how universal is OpenXR anyways?</ref>
 
Furthermore, innovative platform-specific features—such as Meta's advanced Passthrough capabilities, hand tracking enhancements, or [[spatial anchors]]—are exposed to developers through vendor-specific extensions to the OpenXR standard.<ref name="MetaOpenXRAdvancements">https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR</ref> To leverage these powerful features, a developer must write code that specifically targets that vendor's hardware.
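The standard pattern is to gate such feature code on extension availability at startup. This stand-alone sketch (the helper is hypothetical; a real application obtains the advertised list via `xrEnumerateInstanceExtensionProperties`) enables only the requested extensions the runtime actually offers — `XR_FB_passthrough` and `XR_EXT_hand_tracking` are real OpenXR extension names.

```cpp
#include <set>
#include <string>
#include <vector>

// Sketch of extension negotiation: intersect the extensions the app wants
// with the extensions the runtime advertises, enabling only the overlap.
// The helper itself is illustrative, not part of the OpenXR API.
std::vector<std::string> selectExtensions(
        const std::vector<std::string>& wanted,
        const std::set<std::string>& advertised) {
    std::vector<std::string> enabled;
    for (const auto& ext : wanted)
        if (advertised.count(ext)) enabled.push_back(ext);
    return enabled;
}
```

Feature code then checks the enabled list before calling into a vendor extension, falling back gracefully on hardware that lacks it.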
 
===Meta Quest SDK===
The [[Meta Quest SDK]] (formerly [[Oculus SDK]]) supports [[Meta Quest 2]], [[Meta Quest 3]], [[Meta Quest 3S]], and [[Meta Quest Pro]] headsets. As of March 2025, [[Meta]] transitioned to recommending [[OpenXR]] as the primary development approach, while still providing the [[Meta XR Core SDK]] for [[Horizon OS]]-specific features.<ref name="openxr-meta"></ref>
 
====Supported Platforms====
* [[Meta Quest 2]]
* [[Meta Quest 3]]
* [[Meta Quest 3S]]
* [[Meta Quest Pro]]
 
====Key Features====
* Native [[OpenXR]] integration
* [[Hand tracking]] via [[OpenXR]] hand skeleton (as of SDK version 78)<ref name="meta-hand-tracking">https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Meta XR Interaction SDK Overview</ref>
* [[Passthrough API]] for [[mixed reality]]
* [[Spatial anchors]] and [[scene understanding]]
* [[Haptic feedback]] and [[controller]] support
* [[Meta XR Simulator]] for desktop testing
 
===Apple Vision Pro SDK (visionOS)===
The [[visionOS SDK]] enables development for [[Apple Vision Pro]], [[Apple]]'s [[spatial computing]] platform launched in February 2024.<ref name="visionos-launch">https://en.wikipedia.org/wiki/VisionOS - visionOS - Wikipedia</ref>
 
====Development Tools====
* [[Xcode]] 15.2 or later with [[visionOS SDK]]
* [[SwiftUI]] for [[UI]] development
* [[RealityKit]] for [[3D rendering]]
* [[ARKit]] for [[spatial awareness]]
* [[Reality Composer Pro]] for [[3D content]] creation
* [[Unity]] support (Apple Silicon Mac required)
 
====Key Features====
* [[Eye tracking]] and [[hand tracking]] as primary inputs
* [[Spatial audio]] with [[head tracking]]
* [[Passthrough]] and [[mixed reality]]
* [[Foveated rendering]] (Dynamically Foveated Rendering)
* [[Mac Virtual Display]] integration
* [[WebXR]] support in Safari
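The core decision behind Dynamically Foveated Rendering can be sketched in a few lines (a toy model with made-up thresholds, not Apple's implementation): shading resolution drops as a pixel's angular distance from the tracked gaze point grows.

```cpp
// Toy sketch of eye-tracked foveation: pick a shading-rate reduction from
// a pixel's angular eccentricity relative to the gaze point. The threshold
// angles and divisors are invented for illustration.
int shadingRateDivisor(float eccentricityDegrees) {
    if (eccentricityDegrees < 10.0f) return 1;   // full resolution at the fovea
    if (eccentricityDegrees < 25.0f) return 2;   // half rate in the near periphery
    return 4;                                    // quarter rate in the far periphery
}
```

Because human visual acuity falls off sharply away from the fovea, spending GPU time only where the user is looking saves substantial rendering cost without visible quality loss.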
 
====Requirements====
* Mac with [[Apple Silicon]] (M1 or later)<ref name="visionos-unity">https://www.qualium-systems.com/blog/everything-youd-like-to-know-about-visionos-development/ - Everything about visionOS Development</ref>
* macOS Ventura or later (required by Xcode 15.2)
* [[Xcode]] 15.2 or later
* For [[Unity]]: Unity 2022 LTS (2022.3.19f1 or newer), Apple Silicon version only
 
===PlayStation VR2 SDK===
[[PSVR2]] development uses the same [[SDK]] as [[PlayStation 5]], making porting between platforms more straightforward.<ref name="psvr2-sdk">https://www.pushsquare.com/news/2022/09/developers-should-have-an-easier-time-porting-their-games-to-psvr2 - Developers Should Have an Easier Time Porting Games to PSVR2</ref>
 
====Supported Features====
* [[Unity]] and [[Unreal Engine]] support
* [[Hand tracking]] (added in 2024 SDK update)<ref name="psvr2-hand-tracking">https://gamerant.com/playstation-vr2-update-new-feature-hand-tracking/ - PlayStation VR2 SDK Update Adds Hand Tracking</ref>
* [[Eye tracking]]
* [[HDR]] [[OLED]] displays
* [[Haptic feedback]] via [[Sense controllers]]
* [[Foveated rendering]]
* [[PC VR]] support via [[PSVR2 PC adapter]] (launched August 2024)
 
===Pico SDK===
[[Pico SDK]], developed by [[ByteDance]]'s [[Pico]] division, supports the [[Pico 4]], [[Pico 4 Pro]], and [[Pico 4 Ultra]] headsets available in Europe and Asia.<ref name="pico-sdk">https://developer.picoxr.com/ - PICO Developer - Official Developer Portal</ref>
 
====Supported Platforms====
* [[Pico 4]]
* [[Pico 4 Pro]]
* [[Pico 4 Ultra]] (launched 2024)
 
====Key Features====
* [[6DoF]] head and [[hand tracking]]
* [[Mixed reality]] capabilities
* [[Body tracking]]
* [[Face tracking]]
* [[Eye tracking]]
* [[Spatial anchors]]
* [[Unity]] and [[Unreal Engine]] support
* [[OpenXR]] compatibility
 
===AR-Specific SDKs===
 
====ARKit====
[[ARKit]] is Apple's framework for iOS [[AR]], current version 6. Features include:<ref name="ARKitDocs">https://developer.apple.com/augmented-reality/arkit/ - ARKit - Augmented Reality - Apple Developer</ref>
* 4K video capture
* Depth API with [[LiDAR]]
* Motion capture
* Scene geometry
* People occlusion
 
====ARCore====
[[ARCore]] is Google's SDK for Android [[AR]]. Supports:<ref name="ARCoreDocs">https://developers.google.com/ar - Build new augmented reality experiences - Google AR</ref>
* Motion tracking
* Anchors
* Environmental understanding
* Depth API
* Geospatial API
* Scene semantics
 
==Game Engines==
 
===Choosing a Game Engine: Unity vs. Unreal===
The choice of a game engine is one of the most fundamental decisions in XR development. Unity and Unreal Engine are the two dominant forces in the industry, each with a distinct set of strengths, weaknesses, and development philosophies.
 
====Unity====
[[Unity]] remains one of the most popular engines for [[VR development]]. As of 2025, [[Unity]] provides native [[OpenXR]] support through the [[XR Interaction Toolkit]].<ref name="unity-xr-toolkit">https://docs.unity3d.com/Packages/[email protected]/manual/index.html - XR Interaction Toolkit - Unity Documentation</ref>
 
=====Core Strengths=====
Unity's core is built around the [[C#]] programming language, which is widely regarded as having a more gentle learning curve compared to C++.<ref name="AnimostUnityVsUnreal">https://animost.com/ideas-inspirations/unity-vs-unreal-engine-for-xr-development/ - Unity vs. Unreal Engine for XR Development</ref> This accessibility, combined with a user-friendly interface, makes it an attractive option for developers of all experience levels, from indie creators to large studios.<ref name="DailyDevUnityVsUnreal">https://daily.dev/blog/unity-vs-unreal-engine-for-vrar-development - Unity vs Unreal Engine for VR/AR Development</ref>
 
=====Current Version=====
Unity 6 (2025)
 
=====Asset Ecosystem=====
A significant accelerator for Unity development is the [[Unity Asset Store]]. It offers a vast library of pre-built tools, 3D models, scripts, and plugins that can dramatically reduce development time and cost.<ref name="WebarooUnityVsUnreal">https://www.webaroo.us/insights/building-ar-vr/ - Building for AR/VR: Unity vs. Unreal Engine</ref> This rich ecosystem allows teams to prototype rapidly and bring products to market faster.
 
=====VR Features=====
* [[XR Interaction Toolkit]] (current version 3.1.2)
* [[OpenXR Plugin]] with native support (version 1.14 achieves feature parity with legacy [[Oculus XR Plugin]])<ref name="unity-openxr-parity">https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Unity's OpenXR Plugin 1.14 Feature Parity</ref>
* [[AR Foundation]] for [[AR development]]
* Native support for [[Meta Quest]], [[PSVR2]], and other [[VR platforms]]
* [[Meta Quest Link]] for rapid iteration
 
=====Key Packages=====
{| class="wikitable"
|-
! Package !! Purpose !! Current Version
|-
| [[XR Interaction Toolkit]] || High-level interaction system || 3.1.2
|-
| [[OpenXR Plugin]] || [[OpenXR]] runtime support || 1.14+
|-
| [[XR Plugin Management]] || [[XR]] backend management || Latest
|-
| [[AR Foundation]] || [[AR]] development framework || Latest
|-
| [[Meta XR SDK]] || [[Meta Quest]]-specific features || Latest
|}
 
====Unreal Engine====
[[Unreal Engine 5]] provides robust [[VR development]] capabilities with built-in [[OpenXR]] support. As of 2025, [[Unreal Engine 5.6]] is the current major version.<ref name="ue56-release">https://www.unrealengine.com/en-US/news/unreal-engine-5-6-is-now-available - Unreal Engine 5.6 Release Announcement</ref>
 
=====Core Strengths=====
Unreal Engine's primary advantage is its advanced rendering engine, which delivers exceptional visual quality with minimal setup. Features like [[Lumen]] for dynamic global illumination and [[Nanite]] for virtualized micropolygon geometry enable developers to create incredibly realistic and detailed worlds, making it a powerhouse for PCVR and high-end simulations.<ref name="RedditUnityVsUnreal">https://www.reddit.com/r/virtualreality/comments/z5i23c/unity_or_unreal_for_vr_dev/ - Unity or Unreal for VR dev? - Reddit Discussion</ref>
 
=====Current Versions=====
* [[Unreal Engine 5.6]] (June 2025)
* [[Unreal Engine 5.5]] (December 2024)
* [[Unreal Engine 5.7 Preview]] (September 2025)
 
=====Scripting Model=====
Unreal Engine employs a dual-language development model. For maximum performance and low-level system control, developers use [[C++]].<ref name="AnimostUnityVsUnreal"></ref> Complementing this is the [[Blueprints]] system, a powerful node-based interface that allows designers, artists, and programmers to build complex game logic without writing traditional code.<ref name="WebarooUnityVsUnreal"></ref>
 
=====VR Features=====
* Built-in [[OpenXR]] support
* [[Nanite]] virtualized geometry system
* [[Lumen]] global illumination
* [[Virtual Scouting]] toolset with [[OpenXR]]-compatible [[HMDs]]<ref name="ue55-vr">https://dev.epicgames.com/documentation/en-us/unreal-engine/unreal-engine-5-5-release-notes - Unreal Engine 5.5 Release Notes</ref>
* [[MetaHuman]] creation tools
* [[Mobile Forward Renderer]] improvements for [[PC VR]]
* [[XR]] dynamic resolution support
 
=====VR Development Tools=====
* [[Virtual Scouting]] with [[VR Content Browser]]
* [[OpenXR]] native integration
* [[SteamVR]] support
* [[Meta]] fork of [[Unreal Engine]] for [[Horizon OS]]-specific features
 
====Godot====
[[Godot Engine]] has significantly improved [[XR]] support with native [[OpenXR]] integration in [[Godot 4]].<ref name="godot-openxr">https://godotengine.org/article/godot-xr-update-feb-2025/ - Godot XR Update February 2025</ref> [[Meta]] has sponsored improvements to [[Godot]]'s [[OpenXR]] support.<ref name="godot-khronos">https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - Advancing OpenXR Development in Godot</ref>
 
=====Current Version=====
[[Godot 4.3]] (2025)
 
=====XR Features=====
* Built-in [[OpenXR]] support (no plugin required for [[Godot 4]])
* [[Godot XR Tools]] addon (version 4.4.0)
* [[Meta XR Simulator]] integration
* [[WebXR]] support for browser-based [[VR]]
* [[Godot Meta Toolkit]] (version 1.0.2) for [[Meta Platform SDK]]
* [[OpenXR Vendors]] plugin for platform-specific features
 
=====Supported Platforms=====
* [[Meta Quest]] via [[OpenXR]]
* [[SteamVR]]
* [[PSVR2]]
* [[WebXR]] (browser-based [[VR]])
 
====Comparative Analysis====
{| class="wikitable"
|+ Unity vs. Unreal Engine for XR Development
|-
! Feature !! Unity !! Unreal Engine
|-
| Primary Scripting || [[C#]]<ref name="AnimostUnityVsUnreal"></ref> || [[C++]]<ref name="AnimostUnityVsUnreal"></ref>
|-
| Visual Scripting || Unity Visual Scripting (formerly Bolt)<ref name="AnimostUnityVsUnreal"></ref> || [[Blueprints]] (deeply integrated)<ref name="WebarooUnityVsUnreal"></ref>
|-
| Learning Curve || More gentle, user-friendly interface<ref name="DailyDevUnityVsUnreal"></ref> || Steeper, especially for its custom C++ framework<ref name="WebarooUnityVsUnreal"></ref>
|-
| Graphical Fidelity (Out-of-the-box) || Good, but often requires configuration to achieve high-end results<ref name="AnimostUnityVsUnreal"></ref> || Excellent, industry-leading visuals with Lumen and Nanite<ref name="RedditUnityVsUnreal"></ref>
|-
| Asset Ecosystem || Extensive ([[Unity Asset Store]]), a major strength for rapid development<ref name="WebarooUnityVsUnreal"></ref> || Growing, but smaller than Unity's
|-
| Community Size || Larger, with more beginner-friendly resources<ref name="AnimostUnityVsUnreal"></ref> || Smaller but strong, particularly for high-end development<ref name="AnimostUnityVsUnreal"></ref>
|-
| Primary Target Platform (XR) || Strongest in standalone and mobile VR (e.g., Meta Quest)<ref name="RedditUnityVsUnreal"></ref> || Strongest in PCVR and high-fidelity simulations<ref name="RedditUnityVsUnreal"></ref>
|-
| Ideal Use Cases || Indie games, mobile/standalone VR/AR, rapid prototyping, projects prioritizing speed-to-market || AAA games, architectural visualization, cinematic experiences, enterprise simulations requiring photorealism<ref name="WebarooUnityVsUnreal"></ref>
|}
 
==Developing with Unity==
Setting up a Unity project for XR development involves a series of specific configuration steps to enable communication with XR hardware and import the necessary SDKs.
 
===Project Setup for XR===
 
====Prerequisites====
Before creating a project, developers must use the [[Unity Hub]] to install a supported version of the Unity Editor (e.g., 2022.3 LTS or newer). During installation, it is crucial to include the [[Android Build Support]] module, as this is a requirement for developing applications for Android-based standalone headsets like the Meta Quest.<ref name="MetaUnitySetup">https://developers.meta.com/horizon/documentation/unity/unity-project-setup/ - Set Up Your Unity Project for Meta Quest Development</ref>
 
====Creating a Project and Configuring the Build Platform====
A new project should be created using the '''3D (URP)''' template. The [[Universal Render Pipeline]] provides a modern, performant rendering foundation suitable for the majority of XR applications.<ref name="MetaUnitySetup"></ref> Once the project is created, the first step is to set the target build platform. This is done by navigating to `File > Build Settings`. For Meta Quest development, select '''Meta Quest''' (in Unity 6.1 and later) or '''Android''' (in older versions) and click the "Switch Platform" button.<ref name="MetaUnitySetup"></ref>
 
====XR Plug-in Management====
Unity communicates with XR runtimes through its [[XR Plugin Management]] system. This package must be installed from the Package Manager if it is not already present.
# Navigate to `Edit > Project Settings`.
# Select the `XR Plug-in Management` tab.
# In both the Standalone (PC icon) and Android tabs, check the box for '''OpenXR'''. This tells Unity to use the OpenXR API to interface with the headset's runtime.<ref name="MetaUnitySetup"></ref>
 
This step is critical as it enables the core connection between the engine and the XR hardware.
 
===The Meta XR SDK for Unity===
For developers targeting Meta Quest devices, the [[Meta XR SDK]] is essential. It provides access to the full suite of the platform's features.
 
====Installation and Core Components====
The primary package is the [[Meta XR Core SDK]], which is installed from the [[Unity Package Manager]] via the `Window > Package Manager` interface.<ref name="MetaXRCoreSDKDocs">https://developers.meta.com/horizon/documentation/unity/unity-core-sdk/ - Meta XR Core SDK for Unity</ref> This SDK includes several key components:
* '''OVRCameraRig:''' A pre-configured camera prefab that serves as the XR rig. It replaces the standard Unity camera and automatically handles head and controller tracking, mapping the user's physical movements into the virtual scene.<ref name="MetaDevOverview"></ref>
* '''OVRInput:''' A robust API for handling input from the Touch controllers and hand tracking.<ref name="MetaXRCoreSDKDocs"></ref>
* '''Project Setup Tool:''' A utility that analyzes the project for common configuration errors and provides one-click fixes to apply recommended settings.<ref name="MetaXRCoreSDKDocs"></ref>
 
====Project Setup Tool====
After importing the Core SDK, developers should immediately run the [[Project Setup Tool]] by navigating to `Meta XR > Tools > Project Setup Tool`. This tool checks for dozens of required settings related to graphics, physics, and build configurations. Clicking the '''Fix All''' and '''Apply All''' buttons will automatically configure the project according to Meta's best practices.<ref name="MetaUnitySetup"></ref><ref name="UnityBestPractices">https://developers.meta.com/horizon/documentation/unity/unity-best-practices-intro/ - Best Practices for Unity</ref>
 
====Other Meta SDKs====
The Meta ecosystem is composed of several specialized SDKs that build upon the Core SDK:
* [[Meta XR Interaction SDK]]: A high-level framework for creating natural and robust interactions like grabbing, poking, and interacting with UI using both controllers and hands.<ref name="MetaInteractionSDKDocs">https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Interaction SDK</ref>
* [[Meta XR Audio SDK]]: Provides advanced spatial audio features, including [[HRTF]]-based spatialization and room acoustics simulation.<ref name="MetaAudioSDKDocs">https://developers.meta.com/horizon/documentation/unity/meta-xr-audio-sdk-unity/ - Meta XR Audio SDK Overview</ref>
* [[Meta XR Platform SDK]]: Enables integration with Meta's platform services, such as leaderboards, achievements, user profiles, and multiplayer matchmaking.<ref name="MetaPlatformSDKDocs">https://developers.meta.com/horizon/downloads/package/meta-xr-platform-sdk/ - Meta XR Platform SDK</ref>
 
The '''Meta XR All-in-One SDK''' is available on the Asset Store as a convenient package to manage these various SDKs.<ref name="MetaAllInOneSDK">https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657 - Meta XR All-in-One SDK</ref>
 
===Unity's XR Interaction Toolkit (XRI)===
The [[XR Interaction Toolkit]] (XRI) is Unity's own high-level, component-based framework for building XR interactions. It is designed to be flexible and extensible, providing a solid foundation for VR and AR projects.<ref name="XRI2.0Docs">https://docs.unity3d.com/Packages/[email protected]/ - XR Interaction Toolkit 2.0.0</ref>
 
====Core Concepts====
XRI's architecture is built around a few key concepts:
* '''Interaction Manager:''' A singleton component that acts as the central hub, mediating all interactions between Interactors and Interactables in a scene.<ref name="XRI2.0Docs"></ref>
* '''Interactors:''' These are components that initiate actions. They represent the user's hands or controllers. Common types include:
** `XR Ray Interactor`: For pointing at and selecting objects from a distance.<ref name="XRILearnTutorial">https://learn.unity.com/tutorial/using-interactors-and-interactables-with-the-xr-interaction-toolkit - Using Interactors and Interactables with the XR Interaction Toolkit</ref>
** `XR Direct Interactor`: For directly touching and grabbing objects that are within arm's reach.<ref name="XRILearnTutorial"></ref>
** `XR Socket Interactor`: A static interactor that objects can be snapped into, useful for puzzles or placing items in specific locations.<ref name="XRILearnTutorial"></ref>
* '''Interactables:''' These are components placed on objects in the scene that can be acted upon by Interactors. The most common is the `XR Grab Interactable`, which allows an object to be picked up, held, and thrown.<ref name="XRILearnTutorial"></ref>
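The division of labour between these components can be modelled compactly (an illustrative C++ analogue of the selection logic, not XRI's actual C# classes): the helper below decides which interactable a direct interactor should select, namely the nearest one inside its grab radius.

```cpp
#include <cmath>
#include <string>
#include <vector>

// Illustrative model of interaction mediation: given an interactor position
// and grab radius, pick the closest interactable in range. Names are
// hypothetical, not from the XR Interaction Toolkit.
struct Interactable { std::string name; float x, y, z; };

// Returns the name of the nearest interactable within `radius` of the
// interactor at (ix, iy, iz), or "" if none is in range.
std::string selectNearest(float ix, float iy, float iz, float radius,
                          const std::vector<Interactable>& scene) {
    std::string best;
    float bestDist = radius;
    for (const auto& obj : scene) {
        float dx = obj.x - ix, dy = obj.y - iy, dz = obj.z - iz;
        float d = std::sqrt(dx*dx + dy*dy + dz*dz);
        if (d <= bestDist) { bestDist = d; best = obj.name; }
    }
    return best;
}
```

In XRI the Interaction Manager performs this kind of arbitration centrally, which is what prevents two interactors from grabbing the same object at once.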
 
====Locomotion and UI====
XRI includes a complete locomotion system that can be added to the [[XR Origin]] (the main camera rig). This includes a [[Teleportation Provider]] for point-and-click movement and a [[Continuous Move Provider]] for smooth, joystick-based locomotion. It also provides tools for interacting with world-space UI canvases, using components like the `Tracked Device Graphic Raycaster` to allow ray interactors to click on buttons and other UI elements.<ref name="XRILatestDocs">https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@latest/ - XR Interaction Toolkit</ref>
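The smooth-locomotion math reduces to rotating the thumbstick vector by the headset's yaw, so that pushing "forward" moves in the direction the user is facing. A minimal sketch (hypothetical helper, not the toolkit's implementation):

```cpp
#include <cmath>

// Sketch of a continuous-move provider's per-frame displacement: rotate the
// 2D thumbstick input by head yaw, then scale by speed and frame time.
// Rotation convention is illustrative.
struct Vec2 { float x, y; };

// stick: thumbstick axes in -1..1, yaw: head yaw in radians,
// speed: metres per second, dt: frame time in seconds.
// Returns this frame's displacement on the ground (XZ) plane.
Vec2 continuousMoveDelta(Vec2 stick, float yaw, float speed, float dt) {
    float c = std::cos(yaw), s = std::sin(yaw);
    // Rotate the stick vector into world space around the vertical axis.
    Vec2 world{stick.x * c + stick.y * s, -stick.x * s + stick.y * c};
    return {world.x * speed * dt, world.y * speed * dt};
}
```

Scaling by frame time keeps movement speed independent of the headset's refresh rate, which matters when the same app runs at 72 Hz, 90 Hz, or 120 Hz.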
 
==Developing with Unreal Engine==
Unreal Engine offers a powerful, high-fidelity platform for XR development, with a streamlined setup process centered around its VR Template and OpenXR integration.
 
===Project Setup for XR===
 
====Prerequisites====
Developers must first install Unreal Engine via the [[Epic Games Launcher]]. For development targeting standalone Android headsets like the Meta Quest, it is also necessary to install and configure [[Android Studio]] and the required Android SDK and NDK components.<ref name="MetaUnrealSetup">https://developers.meta.com/horizon/documentation/unreal/unreal-create-and-configure-new-project/ - Create and Configure a New Project in Unreal Engine</ref>
 
====Creating a Project and Plugin Configuration====
The recommended way to start a new XR project is by using the built-in '''Virtual Reality''' template. This can be selected from the "Games" category in the Unreal Project Browser when creating a new project.<ref name="MetaUnrealSetup"></ref> This template provides a pre-configured project with essential plugins enabled, a basic level, and a functional player pawn with locomotion and interaction systems.<ref name="BrownVRWiki">https://www.vrwiki.cs.brown.edu/vr-development-software/unreal-engine-5/adding-vr-to-an-existing-ue5-world - Adding VR to an Existing UE5 World</ref>
 
After creation, developers should verify that the necessary plugins are enabled by navigating to `Edit > Plugins`. The most important plugin is '''OpenXR'''. Platform-specific plugins like '''OculusVR''' (for Meta devices) or '''SteamVR''' should also be enabled depending on the target hardware.<ref name="UnrealXRDocs">https://dev.epicgames.com/documentation/en-us/unreal-engine/xr-development?application_version=4.27 - XR Development - Unreal Engine Documentation</ref>
 
===The Meta XR Plugin for Unreal===
The '''Meta XR Plugin''' (often referred to as the OculusVR plugin) is the key to unlocking the full feature set of Meta Quest devices in Unreal Engine. It provides access to platform-specific functionalities not covered by the core OpenXR standard.<ref name="MetaUnrealSetup"></ref> This includes:
* Advanced tracking features like Hand Tracking, Body Tracking, and Face/Eye Tracking
* Mixed Reality features such as Passthrough, Spatial Anchors, and Scene understanding
* Performance optimization tools like [[Application SpaceWarp]] and [[Fixed Foveated Rendering]]
* Platform services integration for leaderboards, achievements, and parties
* The Movement SDK for more realistic avatar motion and the Voice SDK for voice commands and dictation
 
===Leveraging the VR Template===
The standard VR Template is a powerful starting point that encapsulates logic for many common VR features.<ref name="UnrealVRTemplateDocs">https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-template-in-unreal-engine - VR Template in Unreal Engine</ref>
 
====Core Components====
The template is built around a few key assets:
* '''VRPawn:''' This [[Blueprint]] is the user's representation in the virtual world. It contains the camera component, motion controller components for tracking the hands, and all the logic for handling input and movement.<ref name="UnrealVRTemplateDocs"></ref><ref name="BrownVRWiki"></ref>
* '''VRGameMode:''' This object defines the rules for the level, most importantly specifying that the `VRPawn` should be the default pawn class for the player.<ref name="UnrealVRTemplateDocs"></ref>
 
====Locomotion Systems====
The template includes two primary, comfort-oriented locomotion methods:<ref name="UnrealVRTemplateDocs"></ref>
* '''Teleportation:''' The user can aim with one controller, which displays an arc and a target location. Releasing the thumbstick instantly moves the pawn to that location. This system relies on a [[Nav Mesh Bounds Volume]] placed in the level.
* '''Snap Turn:''' Using the other controller's thumbstick, the user can rotate their view in discrete increments (e.g., 45 degrees). This avoids the smooth rotation that can cause motion sickness for some users.
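The arc-and-target teleport flow described above is essentially ballistic projectile math. As an engine-agnostic illustration (plain Python; the `speed` and `gravity` values are illustrative, not the template's actual defaults), the landing point of such an arc on a flat ground plane could be computed like this:

```python
import math

def teleport_arc_landing(origin, direction, speed=10.0, gravity=9.81):
    """Return the (x, z) point where a ballistic teleport arc, launched
    from `origin` along normalized `direction` at `speed` m/s, crosses
    the y=0 ground plane. Coordinates are (x, y, z) with +y up."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    vy = speed * dy
    # Solve oy + vy*t - 0.5*g*t^2 = 0 for the positive root.
    disc = vy * vy + 2.0 * gravity * oy
    if disc < 0:
        return None  # arc never reaches the ground
    t = (vy + math.sqrt(disc)) / gravity
    return (ox + speed * dx * t, oz + speed * dz * t)
```

In practice the engine also validates the landing point against the Nav Mesh before allowing the teleport; this sketch only covers the geometric part.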


==Runtimes==
[[Runtimes]] handle core [[VR]] functionality including rendering, [[tracking]], and [[reprojection]].
 
===OpenXR Runtime===
[[OpenXR]] provides a unified runtime layer that works across multiple [[VR platforms]]. As of 2025, most major [[VR headsets]] support [[OpenXR]] runtime.<ref name="openxr-adoption">https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?</ref>
 
'''Supported by:'''
* [[Meta Quest Runtime]]
* [[SteamVR Runtime]]
* [[Windows Mixed Reality]]
* [[Pico Runtime]]
* [[Monado]] (open-source runtime for Linux)
 
===SteamVR / OpenVR===
[[SteamVR]] supports both [[OpenVR]] (legacy) and [[OpenXR]] runtimes. [[OpenVR]] was developed by [[Valve]] and has been the default runtime for [[SteamVR]], though [[OpenXR]] is increasingly recommended.<ref name="steamvr-openvr"></ref>
 
====Current State (2025)====
* [[SteamVR]] defaults to [[OpenVR]] but supports [[OpenXR]]
* [[Lighthouse]] tracking system still used for high-precision [[tracking]]
* [[Full body tracking]] via [[Vive Trackers]]
* Support for multiple [[VR headsets]] including [[Valve Index]], [[HTC Vive]], and others
 
====Features====
* [[Reprojection]] for smooth frame rates
* [[Room-scale]] [[VR]] setup
* [[Controller]] and [[tracker]] support
* [[Overlay]] system for [[UI]]
 
===Meta Quest Runtime===
The [[Meta Quest Runtime]] handles [[VR]] rendering and [[tracking]] for [[Meta Quest]] devices, now with integrated [[OpenXR]] support.
 
====Features====
* [[Asynchronous SpaceWarp]] (ASW) - [[Meta]]'s [[reprojection]] technology, which synthesizes frames to maintain frame rate
* [[Application SpaceWarp]] (AppSW) - developer-enabled frame synthesis for Quest applications
* [[Guardian]] boundary system
* Native [[hand tracking]]
* [[Passthrough]] [[API]]
 
==Essential Best Practices for XR Development==
Beyond the specifics of any single engine, creating a successful XR application requires adherence to a set of universal principles focused on user comfort, intuitive interaction, and technical performance.
 
===User Experience and Comfort===
The primary goal of XR design is to create a sense of presence and immersion. This can be easily broken by experiences that are uncomfortable or unintuitive.
 
====Locomotion Design====
Movement in VR is the single greatest challenge for user comfort. The sensory conflict between seeing motion in the headset while the body's vestibular system reports being stationary can lead to [[Visually induced motion sickness]] (VIMS), also known as VR sickness.<ref name="SpurqLabsBestPractices">https://spurqlabs.com/vr-testing-challenges-best-practices-what-developers-should-know/ - VR Testing Challenges & Best Practices</ref>
 
=====Comfort-First Techniques=====
* '''[[Teleportation]]:''' This method instantly moves the user from one point to another, completely avoiding the continuous visual flow that causes motion sickness. It is consistently the most comfortable and widely preferred option.<ref name="UWLocomotionStudy">https://faculty.washington.edu/wobbrock/pubs/assets-23.02.pdf - Which VR Locomotion Techniques are Most Accessible?</ref><ref name="MetaLocomotionBestPractices">https://xrdesignhandbook.com/docs/Meta/Locomotion%20Best%20Practices.html - Locomotion Best Practices</ref>
* '''Snap Turns:''' Replace smooth, continuous rotation with discrete, instantaneous jumps in orientation. This is significantly more comfortable for many users than smooth turning.<ref name="SpurqLabsBestPractices"></ref>
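The discrete jumps described above amount to quantizing yaw changes instead of rotating continuously. A minimal Python sketch (the `increment` and `deadzone` values are illustrative defaults, not taken from any particular SDK):

```python
def snap_turn(current_yaw_deg, thumbstick_x, increment=45.0, deadzone=0.7):
    """Apply a discrete snap turn: a firm left/right thumbstick push
    rotates the rig by a fixed increment; the result stays in [0, 360)."""
    if abs(thumbstick_x) < deadzone:
        return current_yaw_deg          # ignore small stick deflections
    step = increment if thumbstick_x > 0 else -increment
    return (current_yaw_deg + step) % 360.0
```

Real implementations additionally debounce the stick so a held deflection produces one turn, not one per frame.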
 
=====Continuous Locomotion and Comfort Options=====
While more immersive for some, smooth joystick-based movement is the most likely to cause discomfort. If offered, it should always be an option, not a requirement, and should be accompanied by comfort settings. A common technique is '''vignetting''' (also called tunneling), which narrows the user's field of view during movement, reducing peripheral [[optic flow]].<ref name="MetaLocomotionDesignGuide">https://developers.meta.com/horizon/blog/now-available-vr-locomotion-design-guide/ - VR Locomotion Design Guide</ref>
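One way to picture vignetting is as a simple mapping from locomotion speed to tunnel strength; a sketch under assumed, illustrative thresholds (real comfort systems often also react to angular velocity and let the user tune the intensity):

```python
def vignette_strength(speed, max_speed=3.0, threshold=0.5):
    """Map locomotion speed (m/s) to a 0..1 vignette strength:
    no tunneling when nearly still, full tunnel at max speed."""
    if speed <= threshold:
        return 0.0
    return min(1.0, (speed - threshold) / (max_speed - threshold))
```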
 
=====Environmental Design=====
The design of the virtual world itself can impact comfort. Developers should:
* Avoid forcing movement on steep slopes or stairs
* Keep walls and large objects at a comfortable distance to reduce optic flow
* Consider using an [[Independent Visual Background]] (IVB), such as a static skybox that only moves with head rotation<ref name="MetaLocomotionBestPractices"></ref>
 
====Interaction Design====
Intuitive interaction is key to maintaining presence. The choice of input modality has a significant impact on the user experience.
 
=====Controllers vs. Hand Tracking=====
* '''Controllers''' offer high precision, tactile feedback through buttons and triggers, and haptic feedback, making them ideal for gaming, creative tools, and any application requiring reliable and precise input.<ref name="MetaInputModalities">https://developers.meta.com/horizon/design/interactions-input-modalities/ - Interactions and Input Modalities</ref>
* '''Hand tracking''' uses the headset's cameras to track the user's bare hands, offering a highly natural and intuitive interaction method. However, its performance can be affected by poor lighting, fast movements, and occlusion. It is generally less precise than controllers and is best suited for more casual experiences.<ref name="MetaHandTrackingDesign">https://developers.meta.com/horizon/design/hands - Hands - Meta for Developers</ref>
 
=====Multimodal Input=====
The best practice is often to support multiple input modalities. Allowing a user to seamlessly switch between controllers, hands, and even voice commands provides maximum accessibility.<ref name="MetaHandTrackingDesign"></ref>
 
====UI/UX in 3D Space====
Designing User Interfaces (UI) for a 3D, immersive space presents unique challenges.
 
=====Placement and Comfort=====
UI panels should be placed at a comfortable viewing distance, typically around 1-2 meters from the user, and positioned slightly below their natural line of sight to avoid neck strain.<ref name="MetaMRDesignGuidelines">https://developers.meta.com/horizon/design/mr-design-guideline/ - Mixed Reality Design Guidelines</ref>
 
=====World-Locking vs. Head-Locking=====
A critical rule is to '''avoid''' locking UI elements directly to the user's head (a "head-locked" HUD). This is extremely uncomfortable and can cause motion sickness. Instead, UI should be "world-locked" (anchored to a position in the environment) or, if it must follow the user, it should do so with a gentle, smoothed animation.<ref name="MetaMRDesignGuidelines"></ref>
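The "gentle, smoothed animation" mentioned above is commonly implemented as frame-rate-independent exponential smoothing toward a target anchor in front of the user. A minimal sketch (the `smoothing` constant and parameter names are illustrative):

```python
import math

def lazy_follow(panel_pos, target_pos, dt, smoothing=4.0):
    """Move a UI panel a fraction of the way toward its target each
    frame (exponential smoothing, independent of frame rate), instead
    of rigidly head-locking it to the camera."""
    a = 1.0 - math.exp(-smoothing * dt)   # per-frame blend factor
    return tuple(p + (t - p) * a for p, t in zip(panel_pos, target_pos))
```

Because the panel lags smoothly behind head motion, it reads as a world object rather than a head-locked overlay.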
 
=====Feedback=====
Clear and immediate feedback is essential. Interactive elements should have distinct hover states (e.g., highlighting or scaling up), and actions like button presses should be accompanied by both visual and auditory cues.<ref name="MetaMRDesignGuidelines"></ref>
 
===Technical Performance and Optimization===
Maintaining a consistently high and stable frame rate is the most important technical requirement for a comfortable VR experience.
 
====Understanding VR Performance Metrics====
* '''Framerate vs. Frame Time:''' '''Framerate''', measured in Frames Per Second (FPS), is the average number of frames rendered over a second. '''Frame Time''', measured in milliseconds (ms), is the time it takes to render a single frame. Frame time is a more accurate indicator of performance smoothness.<ref name="VRPerfNotesGitHub">https://github.com/authorTom/notes-on-VR-performance - Notes on VR Performance</ref>
* '''Performance Targets:''' For standalone headsets like the Meta Quest series, applications must achieve a minimum of 72 FPS. For PCVR, the target is typically 90 FPS or higher.<ref name="MetaUnityPerfDocs">https://developers.meta.com/horizon/documentation/unity/unity-perf/ - Performance and Profiling for Unity</ref>
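The relationship between the two metrics is simple arithmetic, and framing targets as frame-time budgets makes them concrete:

```python
def frame_budget_ms(target_fps):
    """Per-frame render budget in milliseconds for a given refresh rate.
    Exceeding this budget causes a dropped frame and triggers the
    runtime's reprojection safety net."""
    return 1000.0 / target_fps

# 72 Hz standalone -> ~13.9 ms; 90 Hz PCVR -> ~11.1 ms; 120 Hz -> ~8.3 ms
```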
 
====Profiling and Debugging====
The first step in optimization is identifying the bottleneck. Performance issues are typically either '''CPU-bound''' (caused by complex game logic, physics, or too many draw calls) or '''GPU-bound''' (caused by high resolution, complex shaders, or too much geometry).<ref name="MetaPCPerfGuidelines">https://developers.meta.com/horizon/documentation/native/pc/dg-performance-guidelines/ - Performance Guidelines</ref>
 
Essential profiling tools include:
* '''Engine Profilers:''' [[Unity Profiler]] and [[Unreal Insights]] are powerful, built-in tools for analyzing CPU and GPU usage.<ref name="VRPerfNotesGitHub"></ref>
* '''Platform Tools:''' [[Meta Quest Developer Hub]] (MQDH) provides real-time performance metrics and analysis tools for Quest development.<ref name="MetaMQDHDocs">https://developers.meta.com/horizon/documentation/unity/ts-mqdh/ - Meta Quest Developer Hub</ref>
* '''Graphics Debuggers:''' Tools like [[RenderDoc]] allow for in-depth analysis of every command sent to the GPU.<ref name="VRPerfNotesGitHub"></ref>
 
====Core Optimization Strategies====
* '''Reduce Draw Calls:''' Combine multiple textures into a single '''texture atlas''' and use '''static batching''' to draw multiple similar objects in a single call. For standalone Quest, developers should aim for 50-200 draw calls per frame.<ref name="MetaUnityPerfDocs"></ref><ref name="MetaDevBlogDrawCalls">https://developers.meta.com/horizon/blog/down-the-rabbit-hole-w-oculus-quest-developer-best-practices-the-store/ - Developer Best Practices</ref>
* '''Optimize Geometry:''' Keep polygon counts within the budget for the target platform (e.g., 750k-1M triangles for Quest 2). Use [[Level of Detail]] (LOD) systems.<ref name="MetaUnityPerfDocs"></ref>
* '''Simplify Lighting and Shaders:''' Use '''baked lighting''' whenever possible. Avoid complex, multi-pass shaders and full-screen post-processing effects on mobile hardware.<ref name="UnityBestPractices"></ref>
* '''Use Occlusion Culling:''' This prevents the engine from rendering objects that are completely hidden from the camera's view.<ref name="MediumVRPerfOpt">https://medium.com/@lemapp09/beginning-game-development-vr-performance-optimization-78553530ca83 - VR Performance Optimization</ref>
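At its core, an LOD system just selects a mesh variant by camera distance. A minimal sketch of that selection logic (the threshold distances are invented for illustration, not engine defaults):

```python
def select_lod(distance, thresholds=(5.0, 15.0, 40.0)):
    """Pick a Level-of-Detail index from camera distance in meters:
    0 = full-detail mesh, higher indices = progressively coarser meshes."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)          # beyond the last threshold: coarsest LOD
```

Engines layer hysteresis and screen-space size heuristics on top of this to avoid visible popping at the boundaries.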
 
====Understanding Reprojection Technologies====
When an application fails to render a new frame in time for the headset's display refresh, VR platforms employ [[asynchronous reprojection]] techniques. These systems act as a safety net, but they are not a substitute for good performance.
 
{| class="wikitable"
|+ Key Reprojection Technologies Explained
|-
! Technology !! Developer !! Mechanism !! Corrects For !! Common Artifacts
|-
| [[Asynchronous Timewarp]] (ATW) || Meta || Takes the last successfully rendered frame and re-projects it based on the newest head '''rotation''' data.<ref name="MetaATWBlog">https://developers.meta.com/horizon/blog/asynchronous-timewarp-on-oculus-rift/ - Asynchronous Timewarp on Oculus Rift</ref><ref name="MetaATWDocs">https://developers.meta.com/horizon/documentation/native/android/mobile-timewarp-overview/ - Asynchronous TimeWarp (ATW)</ref> || Head Rotation only. Does not account for positional movement or in-scene animation.<ref name="XinrealityTimewarp">https://xinreality.com/wiki/Timewarp - Timewarp</ref> || Positional judder, animation judder.<ref name="MetaASWBlog">https://developers.meta.com/horizon/blog/asynchronous-spacewarp/ - Asynchronous Spacewarp</ref>
|-
| [[Asynchronous SpaceWarp]] (ASW) || Meta || When an app's framerate drops to half the display's refresh rate, ASW analyzes the motion between two previous frames to synthesize and insert a new, predicted frame.<ref name="MetaASWDocs">https://developers.meta.com/horizon/documentation/native/pc/asynchronous-spacewarp/ - Asynchronous SpaceWarp</ref> || Head rotation, head position, controller movement, and in-scene animation.<ref name="MetaASWBlog"></ref> || Warping, ghosting, or "bubbling" artifacts, especially around fast-moving objects.<ref name="MetaASWBlog"></ref><ref name="RedditASWDisable">https://www.reddit.com/r/oculus/comments/bvcoh8/always_disable_asynchronous_spacewarp/ - Always disable Asynchronous Spacewarp</ref>
|-
| [[Motion Smoothing]] || Valve || Similar to ASW, it activates when an application cannot maintain framerate. It looks at the last two frames to estimate motion and extrapolates a new frame.<ref name="SteamVRMotionSmoothing">https://steamcommunity.com/games/250820/announcements/detail/1705071932992003492 - Introducing SteamVR Motion Smoothing</ref> || Head rotation, head position, controller movement, and in-scene animation.<ref name="SteamVRMotionSmoothing"></ref> || Similar to ASW, it can produce visual artifacts like warping or wobbling.<ref name="MSFSForumMotionReprojection">https://forums.flightsimulator.com/t/motion-reprojection-explained/548659 - Motion Reprojection Explained</ref>
|}
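The extrapolation step that ASW and Motion Smoothing share can be sketched in a few lines. Real runtimes operate on per-pixel motion vectors (and, with depth submission, per-pixel depth), not object positions, so this only conveys the core idea:

```python
def extrapolate_frame(prev_pos, curr_pos):
    """Synthesize the next frame's position by linearly extrapolating
    the motion observed between the last two rendered frames -- the
    basic principle behind synthesized-frame reprojection."""
    return tuple(c + (c - p) for p, c in zip(prev_pos, curr_pos))
```

The characteristic warping artifacts in the table above arise exactly where this linear-motion assumption breaks down, e.g. at disocclusions and fast direction changes.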
 
==App Stores and Distribution==
 
===Meta Quest Store===
The [[Meta Quest Store]] (formerly [[Oculus Store]]) is the primary distribution platform for [[Meta Quest]] applications on [[Horizon OS]].<ref name="meta-store">https://www.meta.com/experiences/ - Official Meta Quest Store</ref>
 
====Supported Devices====
* [[Meta Quest 2]]
* [[Meta Quest 3]]
* [[Meta Quest 3S]]
* [[Meta Quest Pro]]
 
====Distribution Methods====
* Official [[Meta Quest Store]] - Highly curated marketplace requiring stringent technical and content guidelines. Reserved for polished, high-quality experiences.<ref name="MetaDevBlogDrawCalls"></ref>
* [[App Lab]] - Alternative distribution path allowing developers to publish without full store curation. Apps are not browsable in the main store and can only be accessed via direct URL. Ideal for early access builds, experimental applications, and apps still in development.<ref name="MetaDevHome">https://developers.meta.com/ - Meta for Developers</ref>
* [[SideQuest]] - Third-party platform for sideloading applications
 
The submission process involves creating an app page, uploading builds (often using the [[Meta Quest Developer Hub]] desktop tool), providing store assets and metadata, and undergoing technical review.<ref name="MetaMQDHDocs"></ref><ref name="MetaSubmitApp">https://developers.meta.com/horizon/resources/publish-submit - Submitting your app</ref>
 
===Steam / SteamVR===
For PC-based VR, [[SteamVR]] is the dominant platform. It is known for its broad hardware compatibility, supporting not only Valve's own Index headset but also devices from HTC, Meta (via Link or [[Air Link]]), Windows Mixed Reality, and others.<ref name="SteamVRStore">https://www.steamvr.com/ - SteamVR</ref><ref name="SteamVRHomePage">https://store.steampowered.com/app/250820/SteamVR/ - SteamVR on Steam</ref>
 
====Supported Headsets====
* [[Valve Index]]
* [[HTC Vive]] series
* [[Meta Quest]] (via [[Meta Quest Link]] or [[Air Link]])
* [[Windows Mixed Reality]] headsets
* [[Pico]] headsets (via [[SteamVR]])
 
Publishing on Steam is handled through [[Steamworks]], Valve's suite of tools and services for developers. The barrier to entry is lower than the curated Meta Quest Store.<ref name="SteamVRPluginUnity">https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647 - SteamVR Plugin</ref> The recent release of the official Steam Link app on the Meta Quest Store has further solidified Steam's role as a central hub for PCVR.<ref name="MetaSteamLinkApp">https://www.meta.com/experiences/steam-link/5841245619310585/ - Steam Link on Meta Quest Store</ref>
 
===PlayStation Store (PSVR2)===
[[PSVR2]] games are distributed through the [[PlayStation Store]] for [[PlayStation 5]].<ref name="psvr2-games">https://www.playstation.com/en-us/ps-vr2/games/ - PlayStation VR2 games official page</ref>
 
====Notable Features====
* Integrated with [[PS5]] library
* Support for [[PSVR2]]-exclusive titles
* [[Cross-buy]] support for some titles
 
===Apple Vision Pro App Store===
[[Vision Pro]] applications are distributed through a dedicated section of the [[App Store]] for [[visionOS]].<ref name="visionos-apps">https://developer.apple.com/visionos/ - visionOS developer page</ref>
 
====App Types====
* Native [[visionOS]] apps
* Compatible [[iOS]]/[[iPadOS]] apps
* [[Unity]]-based [[VR]] experiences
* [[WebXR]] experiences via Safari
 
===Pico Store===
The [[Pico Store]] distributes content for [[Pico]] headsets in Europe and Asia markets.<ref name="pico-store">https://www.picoxr.com/global - PICO Global official website</ref> Pico offers a '''PICO Developer Program''' which provides qualifying developers with financial support, technical assistance, and marketing resources.<ref name="PicoDevProgram">https://developer.picoxr.com/developer-program/?enter_from=picoweb - PICO Developer Program</ref>
 
===Alternative Distribution: SideQuest===
[[SideQuest]] is the leading third-party platform and community for the Meta Quest ecosystem. Its primary function is to facilitate the "sideloading" of applications—the process of installing an app directly onto the headset via a PC, bypassing the official store.<ref name="SideQuestSetup">https://sidequestvr.com/setup-howto - Get SideQuest</ref>
 
SideQuest has become a vital hub for:<ref name="SideQuestHome">https://sidequestvr.com/ - SideQuest</ref>
* Early access and experimental games
* Unofficial ports of classic games to VR
* Content that may not meet the curation guidelines of the official store
 
{| class="wikitable"
|+ XR Distribution Platform Overview
|-
! Platform !! Primary Devices !! Curation Level !! Target Audience !! Key Feature
|-
| Meta Quest Store || Meta Quest series || High (Strict VRCs)<ref name="MetaDevBlogDrawCalls"></ref> || Mainstream consumers || Highest visibility and monetization potential for standalone VR<ref name="MetaDevHome"></ref>
|-
| App Lab || Meta Quest series || Low (Basic technical review)<ref name="MetaDevHome"></ref> || Early adopters, testers, niche communities || Distribute to Quest users via direct link without full store curation<ref name="MetaDevHome"></ref>
|-
| SteamVR || PCVR Headsets (Index, Vive, Quest via Link, etc.) || Low (Self-publishing model)<ref name="SteamVRPluginUnity"></ref> || PC gamers, VR enthusiasts || Broadest hardware compatibility for PCVR; large existing user base<ref name="SteamVRStore"></ref>
|-
| SideQuest || Meta Quest series || Low (Community-driven)<ref name="SideQuestHome"></ref> || VR enthusiasts, modders, indie game followers || Primary platform for sideloading, experimental content, and early access<ref name="SideQuestSetup"></ref>
|-
| Pico Store || Pico headsets (e.g., Pico 4) || High (Similar to Quest Store)<ref name="PicoDevFAQ">https://sdk.picovr.com/docs/FAQ/chapter_twentyeight.html - Developer Platform FAQ</ref> || Consumers in markets where Pico is prevalent || Growing ecosystem with developer incentive programs<ref name="PicoDevProgram"></ref>
|}
 
==Development Tools and Resources==
 
===XR Interaction Toolkits===
 
{| class="wikitable"
|-
! Toolkit !! Engine !! Features !! Current Version
|-
| [[XR Interaction Toolkit]] || [[Unity]] || [[Locomotion]], [[UI interaction]], [[object grabbing]] || 3.1.2
|-
| [[Godot XR Tools]] || [[Godot]] || [[Teleportation]], [[snap turning]], [[hand interactions]] || 4.4.0
|-
| [[Unreal VR Template]] || [[Unreal Engine]] || [[Motion controllers]], [[VR preview]] || Built-in
|-
| [[Meta XR Interaction SDK]] || [[Unity]] || [[Hand interactions]], [[controller support]] || Latest
|-
| [[PICO Unity Integration SDK]] || [[Unity]] || [[Pico]]-specific features || Latest
|}
 
===SDKs and Plugins===
 
====Meta XR SDKs====
* [[Meta XR Core SDK]] - Core [[Horizon OS]] functionality
* [[Meta XR Interaction SDK]] - Interaction systems
* [[Meta XR Simulator]] - Desktop testing without headset
* [[Meta XR Building Blocks]] - Pre-built [[VR]] components
 
====Other SDKs====
* [[OpenXR Toolkit]] - Enhanced [[OpenXR]] features
* [[Godot Meta Toolkit]] - [[Meta Platform SDK]] for [[Godot]]
* [[VRTK]] (Virtual Reality Toolkit) - Cross-platform [[VR]] interactions
 
===Testing and Debugging Tools===
 
* [[Meta XR Simulator]] - Test [[Meta Quest]] apps on PC/Mac
* [[visionOS Simulator]] - Test [[Vision Pro]] apps in [[Xcode]]
* [[WebXR Emulator Extension]] - Browser-based [[VR]] testing
* [[PICO Debugger]] - Debugging tool for [[Pico]] development
* [[SteamVR]] [[VR]] Preview - In-editor [[VR]] preview
 
==3D Models and Asset Marketplaces==
 
===Major 3D Asset Marketplaces===
 
{| class="wikitable"
|-
! Marketplace !! Specialty !! Notable Features !! URL
|-
| [[Fab]] || Unified content marketplace || Supports [[Unity]], [[Unreal Engine]], [[UEFN]]. Successor to [[Unreal Engine Marketplace]] and [[Sketchfab Store]]. Free [[Quixel Megascans]] through 2024. In 2025, 1,500+ Megascans free to all users.<ref name="fab-launch">https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - Fab launched October 2024</ref><ref name="fab-megascans">https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - In 2025, 1,500 Megascans will be free to all users</ref> || fab.com
|-
| [[CGTrader]] || General [[3D models]] || Over 2 million [[3D models]], supports [[VR]]/[[AR]] content, auction system<ref name="cgtrader">https://www.cgtrader.com/ - CGTrader 3D Models Marketplace</ref> || cgtrader.com
|-
| [[TurboSquid]] || Professional [[3D models]] || [[CheckMate]] certification, enterprise licensing, owned by [[Shutterstock]]<ref name="turbosquid">https://www.turbosquid.com/ - TurboSquid professional 3D models marketplace</ref> || turbosquid.com
|-
| [[Sketchfab]] || Web-based [[3D viewer]] || Real-time [[3D]] preview, [[VR]]/[[AR]] optimized, [[WebGL]] viewer || sketchfab.com
|-
| [[Unity Asset Store]] || [[Unity]] assets || Engine-specific assets, plugins, templates || assetstore.unity.com
|-
| [[Quixel Megascans]] || Photorealistic scans || Now part of [[Fab]], 1,500+ free assets in 2025<ref name="fab-megascans"></ref> || fab.com
|-
| [[Yobi3D]] || 3D model search engine || Aggregates models from various sites, supports [[VR]]/[[AR]]<ref name="yobi3d">https://www.yobi3d.com - Yobi3D 3D model search engine</ref> || yobi3d.com
|}
 
===Asset Types for VR Development===
 
* '''[[3D Models]]''': Characters, props, environments
* '''[[Textures]] and [[Materials]]''': [[PBR]] materials for realistic rendering
* '''[[Audio Assets]]''': [[Spatial audio]], sound effects
* '''[[Animations]]''': [[Rigged]] characters, [[mocap]] data
* '''[[VFX]]''': Particle effects, shaders
* '''[[Plugins]]''': Engine extensions and tools
 
===Free Asset Resources===


* [[Fab]] Free Content (monthly rotating selection)
* [[Quixel Megascans]] (1,500+ assets free in 2025)
* [[Unity Asset Store]] free section
* [[Sketchfab]] [[Creative Commons]] content
* [[Free3D]] community marketplace
* [[Poly Haven]] (formerly [[Poly]]): Free high-quality [[PBR]] assets


==Cross-Platform Development==


===OpenXR for Cross-Platform Development===
[[OpenXR]] enables developers to create applications that work across multiple [[VR platforms]] without requiring separate codebases for each [[headset]].<ref name="openxr-cross-platform">https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - OpenXR provides unified API for VR and AR across platforms</ref>


====Supported Platforms via OpenXR====
* [[Meta Quest]] family
* [[SteamVR]]-compatible headsets
* [[Windows Mixed Reality]]
* [[PSVR2]] (limited support)
* [[Pico]] headsets


====Engine Support====
* [[Unity]] (via [[OpenXR Plugin]] 1.14+)
* [[Unreal Engine]] (built-in)
* [[Godot]] (built-in in Godot 4)


===Platform-Specific Considerations===


{| class="wikitable"
|-
! Platform !! Input Methods !! Unique Features !! Distribution
|-
| [[Meta Quest]] || [[Controllers]], [[hand tracking]] || [[Passthrough]] [[MR]], [[spatial anchors]], standalone || [[Meta Quest Store]], [[Steam]]
|-
| [[Apple Vision Pro]] || [[Eye tracking]], [[hand tracking]], voice || [[Mac Virtual Display]], [[spatial video]] || [[App Store]]
|-
| [[PSVR2]] || [[Sense controllers]], [[hand tracking]] || [[Eye tracking]], [[HDR]] [[OLED]] || [[PlayStation Store]]
|-
| [[PCVR]] (Steam) || Various controllers || High-end graphics, [[full body tracking]] || [[Steam]]
|-
| [[Pico]] || [[Controllers]], [[hand tracking]], [[body tracking]] || [[Mixed reality]], [[face tracking]] || [[Pico Store]]
|}


==Best Practices and Guidelines==


===Performance Optimization===
* Target 72 FPS minimum (90 FPS for [[PCVR]])
* Use [[foveated rendering]] when available
* Implement [[dynamic resolution scaling]]
* Optimize [[draw calls]] and polygon counts
* Use [[occlusion culling]] effectively


===User Comfort===
* Minimize artificial [[locomotion]] sickness
* Provide comfort options ([[teleportation]], [[snap turning]])
* Maintain stable frame rates
* Design [[UI]] within comfortable viewing angles
* Test with diverse users


===Platform-Specific Guidelines===
* Follow [[Meta Quest]] [[Human Interface Guidelines]]
* Adhere to [[Apple Vision Pro]] [[Design Guidelines]]
* Meet [[PlayStation VR2]] [[Technical Requirements]]
* Consider [[OpenXR]] best practices for cross-platform apps


==Wrappers and Compatibility Layers==
Some community tools exist to help run content across platforms:


* '''[[ReVive]]''' - An open-source compatibility layer that lets Rift-exclusive games run on Vive or Index by translating Oculus SDK calls to OpenVR/SteamVR. (It does not bypass Oculus entitlement checks; games must still be owned on Oculus Home.)
* '''[[OpenXR]]/Runtime Bridges''' - Modern runtimes like SteamVR now support OpenXR, reducing the need for hacks.
* '''Legacy solutions''' (e.g. [[LibOVRWrapper]]) enabled old Rift apps to work with newer runtime DLLs, but these are rarely needed now.


In general, using the official SDK or OpenXR is recommended for compatibility.


Sidenote: technically one could even write a wrapper that routes [[OpenVR]] calls to the Oculus runtime directly, bypassing [[SteamVR]] altogether. But since SteamVR can do it already, there is little need for that
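The redirection mechanism described above can be sketched as a dispatch table whose identically named entry points get swapped out from under the game. A minimal C++ model, assuming hypothetical interface and backend names (not the real OVR exports):

```cpp
#include <cassert>
#include <functional>
#include <string>

// A stand-in for the runtime interface a game loads from the OVR DLLs.
// Real wrappers export identically named C functions; here we model the
// dispatch table the game ends up calling through.
struct RuntimeInterface {
    std::function<std::string()> submitFrame;  // where rendered images go
    std::function<std::string()> getHeadPose;  // where tracking data comes from
};

// The stock DLLs talk to the Oculus runtime.
RuntimeInterface oculusRuntime() {
    return { [] { return std::string("frame->OculusRuntime"); },
             [] { return std::string("pose<-OculusRuntime"); } };
}

// A wrapper like ReVive substitutes functions that talk to SteamVR instead;
// the game is unaware anything changed.
RuntimeInterface reviveWrapper() {
    return { [] { return std::string("frame->SteamVR"); },
             [] { return std::string("pose<-SteamVR"); } };
}

// The game's render loop only ever sees the interface, never the backend.
std::string gameFrame(const RuntimeInterface& rt) {
    return rt.submitFrame() + " | " + rt.getHeadPose();
}
```

Because the game binds to the interface rather than a specific backend, swapping the table reroutes both rendering and tracking in one step.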
==Communities and Forums==


===Developer Communities===
* [[Meta Quest Developer]] Forum
* [[Unity XR]] Forums
* [[Unreal Engine]] Forums
* [[Godot XR]] GitHub Community
* [[r/vrdev]] (Reddit)
* [[OpenXR]] Developer Portal


===Learning Resources===
* [[Unity Learn]] XR Courses
* [[Unreal Engine]] Learning Portal
* [[Meta Quest Developer]] Hub
* [[Apple Developer]] Documentation
* [[Godot XR]] Documentation
* [[YouTube]] channels for [[VR development]]


==Resources==
===Official Documentation===
* [[Meta]] Developers: developers.meta.com/horizon
* [[Apple Vision Pro]]: developer.apple.com/visionos
* [[Unity XR]]: docs.unity3d.com/Manual/XR.html
* [[Unreal Engine VR]]: dev.epicgames.com/documentation/en-us/unreal-engine/virtual-reality-development
* [[Godot XR]]: docs.godotengine.org/en/stable/tutorials/xr
* [[OpenXR]]: khronos.org/openxr
* [[SteamVR]]: partner.steamgames.com/doc/features/steamvr
* [[Pico Developer]]: developer.picoxr.com


==See Also==
* [[How to Get Started in VR Development]]
* [[VR Headsets]]
* [[AR Development]]
* [[Mixed Reality]]
* [[Spatial Computing]]
* [[WebXR]]


==References==
<references>
<ref name="UnityXRManual">https://docs.unity3d.com/6000.2/Documentation/Manual/XR.html - XR - Unity Documentation</ref>
<ref name="UnitySolutionsXR">https://unity.com/solutions/xr - XR Development in Unity: AR, VR and Spatial Solutions</ref>
<ref name="MetaDevOverview">https://developers.meta.com/horizon/documentation/unity/unity-development-overview/ - Unity Development Overview for Meta Quest</ref>
<ref name="SteamVRStore">https://www.steamvr.com/ - SteamVR</ref>
<ref name="KhronosOpenXR">https://www.khronos.org/OpenXR - OpenXR - High-performance access to AR and VR</ref>
<ref name="openxr-meta">https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR | Meta Horizon OS Developers</ref>
<ref name="openxr-recommended">https://www.uploadvr.com/meta-recommends-using-unity-unreal-built-in-openxr-support/ - Meta Will Recommend Using Unity & Unreal's Built-In OpenXR Support</ref>
<ref name="meta-hand-tracking">https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Meta XR Interaction SDK Overview</ref>
<ref name="visionos-launch">https://en.wikipedia.org/wiki/VisionOS - visionOS - Wikipedia</ref>
<ref name="visionos-unity">https://www.qualium-systems.com/blog/everything-youd-like-to-know-about-visionos-development/ - Everything about visionOS Development</ref>
<ref name="psvr2-sdk">https://www.pushsquare.com/news/2022/09/developers-should-have-an-easier-time-porting-their-games-to-psvr2 - Developers Should Have an Easier Time Porting Games to PSVR2</ref>
<ref name="psvr2-hand-tracking">https://gamerant.com/playstation-vr2-update-new-feature-hand-tracking/ - PlayStation VR2 SDK Update Adds Hand Tracking</ref>
<ref name="pico-sdk">https://developer.picoxr.com/ - PICO Developer - Official Developer Portal</ref>
<ref name="unity-xr-toolkit">https://docs.unity3d.com/Packages/[email protected]/manual/index.html - XR Interaction Toolkit - Unity Documentation</ref>
<ref name="unity-openxr-parity">https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Unity's OpenXR Plugin 1.14 Feature Parity</ref>
<ref name="ue56-release">https://www.unrealengine.com/en-US/news/unreal-engine-5-6-is-now-available - Unreal Engine 5.6 Release Announcement</ref>
<ref name="ue55-vr">https://dev.epicgames.com/documentation/en-us/unreal-engine/unreal-engine-5-5-release-notes - Unreal Engine 5.5 Release Notes</ref>
<ref name="godot-openxr">https://godotengine.org/article/godot-xr-update-feb-2025/ - Godot XR Update February 2025</ref>
<ref name="godot-khronos">https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - Advancing OpenXR Development in Godot</ref>
<ref name="openxr-adoption">https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?</ref>
<ref name="steamvr-openvr">https://en.wikipedia.org/wiki/OpenVR - OpenVR - Wikipedia</ref>
<ref name="meta-store">https://www.meta.com/experiences/ - Official Meta Quest Store</ref>
<ref name="psvr2-games">https://www.playstation.com/en-us/ps-vr2/games/ - PlayStation VR2 games official page</ref>
<ref name="visionos-apps">https://developer.apple.com/visionos/ - visionOS developer page</ref>
<ref name="pico-store">https://www.picoxr.com/global - PICO Global official website</ref>
<ref name="fab-launch">https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - Fab launched October 2024</ref>
<ref name="cgtrader">https://www.cgtrader.com/ - CGTrader 3D Models Marketplace</ref>
<ref name="turbosquid">https://www.turbosquid.com/ - TurboSquid professional 3D models marketplace</ref>
<ref name="fab-megascans">https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - In 2025, 1,500 Megascans will be free to all users</ref>
<ref name="openxr-cross-platform">https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - OpenXR provides unified API for VR and AR across platforms</ref>
<ref name="AnimostUnityVsUnreal">https://animost.com/ideas-inspirations/unity-vs-unreal-engine-for-xr-development/ - Unity vs. Unreal Engine for XR Development</ref>
<ref name="DailyDevUnityVsUnreal">https://daily.dev/blog/unity-vs-unreal-engine-for-vrar-development - Unity vs Unreal Engine for VR/AR Development</ref>
<ref name="WebarooUnityVsUnreal">https://www.webaroo.us/insights/building-ar-vr/ - Building for AR/VR: Unity vs. Unreal Engine</ref>
<ref name="RedditUnityVsUnreal">https://www.reddit.com/r/virtualreality/comments/z5i23c/unity_or_unreal_for_vr_dev/ - Unity or Unreal for VR dev? - Reddit Discussion</ref>
<ref name="OpenXRWiki">https://en.wikipedia.org/wiki/OpenXR - OpenXR - Wikipedia</ref>
<ref name="AutoVRSEOpenXR">https://www.autovrse.com/openxr - What is OpenXR?</ref>
<ref name="MetaOpenXRSupport">https://developers.meta.com/horizon/blog/oculus-all-in-on-openxr-deprecates-proprietary-apis/ - Oculus All-in on OpenXR, Deprecates Proprietary APIs</ref>
<ref name="MetaUnitySetup">https://developers.meta.com/horizon/documentation/unity/unity-project-setup/ - Set Up Your Unity Project for Meta Quest Development</ref>
<ref name="DCSForumsOpenVR">https://forum.dcs.world/topic/318110-confusion-steamvr-vs-openxr-opencomposite-8/ - Confusion SteamVR vs OpenXR / OpenComposite</ref>
<ref name="PimaxOpenVRvsOpenXR">https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?</ref>
<ref name="RedditOpenXRUniversal">https://www.reddit.com/r/virtualreality/comments/wi6w2x/vr_devs_just_how_universal_is_openxr_anyways/ - VR Devs, just how universal is OpenXR anyways?</ref>
<ref name="MetaOpenXRAdvancements">https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR</ref>
<ref name="MetaXRCoreSDKDocs">https://developers.meta.com/horizon/documentation/unity/unity-core-sdk/ - Meta XR Core SDK for Unity</ref>
<ref name="UnityBestPractices">https://developers.meta.com/horizon/documentation/unity/unity-best-practices-intro/ - Best Practices for Unity</ref>
<ref name="MetaInteractionSDKDocs">https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Interaction SDK</ref>
<ref name="MetaAudioSDKDocs">https://developers.meta.com/horizon/documentation/unity/meta-xr-audio-sdk-unity/ - Meta XR Audio SDK Overview</ref>
<ref name="MetaPlatformSDKDocs">https://developers.meta.com/horizon/downloads/package/meta-xr-platform-sdk/ - Meta XR Platform SDK</ref>
<ref name="MetaAllInOneSDK">https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657 - Meta XR All-in-One SDK</ref>
<ref name="XRI2.0Docs">https://docs.unity3d.com/Packages/[email protected]/ - XR Interaction Toolkit 2.0.0</ref>
<ref name="XRILearnTutorial">https://learn.unity.com/tutorial/using-interactors-and-interactables-with-the-xr-interaction-toolkit - Using Interactors and Interactables with the XR Interaction Toolkit</ref>
<ref name="XRILatestDocs">https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@latest/ - XR Interaction Toolkit</ref>
<ref name="MetaUnrealSetup">https://developers.meta.com/horizon/documentation/unreal/unreal-create-and-configure-new-project/ - Create and Configure a New Project in Unreal Engine</ref>
<ref name="BrownVRWiki">https://www.vrwiki.cs.brown.edu/vr-development-software/unreal-engine-5/adding-vr-to-an-existing-ue5-world - Adding VR to an Existing UE5 World</ref>
<ref name="UnrealXRDocs">https://dev.epicgames.com/documentation/en-us/unreal-engine/xr-development?application_version=4.27 - XR Development - Unreal Engine Documentation</ref>
<ref name="UnrealVRTemplateDocs">https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-template-in-unreal-engine - VR Template in Unreal Engine</ref>
<ref name="SpurqLabsBestPractices">https://spurqlabs.com/vr-testing-challenges-best-practices-what-developers-should-know/ - VR Testing Challenges & Best Practices</ref>
<ref name="UWLocomotionStudy">https://faculty.washington.edu/wobbrock/pubs/assets-23.02.pdf - Which VR Locomotion Techniques are Most Accessible?</ref>
<ref name="MetaLocomotionBestPractices">https://xrdesignhandbook.com/docs/Meta/Locomotion%20Best%20Practices.html - Locomotion Best Practices</ref>
<ref name="MetaLocomotionDesignGuide">https://developers.meta.com/horizon/blog/now-available-vr-locomotion-design-guide/ - VR Locomotion Design Guide</ref>
<ref name="MetaInputModalities">https://developers.meta.com/horizon/design/interactions-input-modalities/ - Interactions and Input Modalities</ref>
<ref name="MetaHandTrackingDesign">https://developers.meta.com/horizon/design/hands - Hands - Meta for Developers</ref>
<ref name="MetaMRDesignGuidelines">https://developers.meta.com/horizon/design/mr-design-guideline/ - Mixed Reality Design Guidelines</ref>
<ref name="VRPerfNotesGitHub">https://github.com/authorTom/notes-on-VR-performance - Notes on VR Performance</ref>
<ref name="MetaUnityPerfDocs">https://developers.meta.com/horizon/documentation/unity/unity-perf/ - Performance and Profiling for Unity</ref>
<ref name="MetaPCPerfGuidelines">https://developers.meta.com/horizon/documentation/native/pc/dg-performance-guidelines/ - Performance Guidelines</ref>
<ref name="MetaMQDHDocs">https://developers.meta.com/horizon/documentation/unity/ts-mqdh/ - Meta Quest Developer Hub</ref>
<ref name="MediumVRPerfOpt">https://medium.com/@lemapp09/beginning-game-development-vr-performance-optimization-78553530ca83 - VR Performance Optimization</ref>
<ref name="MetaDevBlogDrawCalls">https://developers.meta.com/horizon/blog/down-the-rabbit-hole-w-oculus-quest-developer-best-practices-the-store/ - Developer Best Practices</ref>
<ref name="MetaATWBlog">https://developers.meta.com/horizon/blog/asynchronous-timewarp-on-oculus-rift/ - Asynchronous Timewarp on Oculus Rift</ref>
<ref name="MetaATWDocs">https://developers.meta.com/horizon/documentation/native/android/mobile-timewarp-overview/ - Asynchronous TimeWarp (ATW)</ref>
<ref name="XinrealityTimewarp">https://xinreality.com/wiki/Timewarp - Timewarp</ref>
<ref name="MetaASWBlog">https://developers.meta.com/horizon/blog/asynchronous-spacewarp/ - Asynchronous Spacewarp</ref>
<ref name="MetaASWDocs">https://developers.meta.com/horizon/documentation/native/pc/asynchronous-spacewarp/ - Asynchronous SpaceWarp</ref>
<ref name="RedditASWDisable">https://www.reddit.com/r/oculus/comments/bvcoh8/always_disable_asynchronous_spacewarp/ - Always disable Asynchronous Spacewarp</ref>
<ref name="SteamVRMotionSmoothing">https://steamcommunity.com/games/250820/announcements/detail/1705071932992003492 - Introducing SteamVR Motion Smoothing</ref>
<ref name="MSFSForumMotionReprojection">https://forums.flightsimulator.com/t/motion-reprojection-explained/548659 - Motion Reprojection Explained</ref>
<ref name="MetaSubmitApp">https://developers.meta.com/horizon/resources/publish-submit - Submitting your app</ref>
<ref name="MetaDevHome">https://developers.meta.com/ - Meta for Developers</ref>
<ref name="SteamVRHomePage">https://store.steampowered.com/app/250820/SteamVR/ - SteamVR on Steam</ref>
<ref name="SteamVRPluginUnity">https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647 - SteamVR Plugin</ref>
<ref name="MetaSteamLinkApp">https://www.meta.com/experiences/steam-link/5841245619310585/ - Steam Link on Meta Quest Store</ref>
<ref name="SideQuestSetup">https://sidequestvr.com/setup-howto - Get SideQuest</ref>
<ref name="SideQuestHome">https://sidequestvr.com/ - SideQuest</ref>
<ref name="PicoDevProgram">https://developer.picoxr.com/developer-program/?enter_from=picoweb - PICO Developer Program</ref>
<ref name="PicoDevFAQ">https://sdk.picovr.com/docs/FAQ/chapter_twentyeight.html - Developer Platform FAQ</ref>
<ref name="ARKitDocs">https://developer.apple.com/augmented-reality/arkit/ - ARKit - Augmented Reality - Apple Developer</ref>
<ref name="ARCoreDocs">https://developers.google.com/ar - Build new augmented reality experiences - Google AR</ref>
<ref name="yobi3d">https://www.yobi3d.com - Yobi3D 3D model search engine</ref>
</references>

Latest revision as of 22:51, 13 October 2025



The XR Development Landscape


Underpinning this entire ecosystem is the OpenXR standard, a crucial initiative by the Khronos Group to create a unified API that allows developers to write code that can run across multiple hardware platforms with minimal modification, thereby reducing fragmentation and simplifying the development process.[4]

How to Get Started

See also: How to Get Started in VR Development

Getting started with XR development in 2025 requires choosing a target platform and appropriate development tools. Modern VR development primarily uses OpenXR as a cross-platform standard, with platform-specific SDKs providing additional features.

Beginner Path

For beginners, the recommended path is to:

  1. Choose a game engine (Unity, Unreal Engine, or Godot)
  2. Select target VR headsets (e.g., Meta Quest 3, Meta Quest 3S, Apple Vision Pro, PSVR2)
  3. Set up the appropriate SDK and development environment
  4. Study platform-specific guidelines and best practices
  5. Build a simple app and iterate

Development Environment Setup

1. Select a Development Environment: Use integrated development environments (IDEs) like Unity or Unreal Engine, which support XR out of the box. Download the latest versions: Unity 6.x or Unreal Engine 5.4+.

2. Install SDKs: For Meta devices, use the Meta XR SDK. For Apple, the visionOS SDK. Adopt OpenXR for cross-platform compatibility.

3. Set Up Hardware: Ensure your development PC meets requirements (e.g., high-end GPU for tethered VR). For standalone devices, develop directly on the headset or use simulators like Meta XR Simulator.

4. Learn Basics: Study motion tracking, anchors, environmental understanding, and depth APIs. Resources include official documentation from Meta, Apple, and Khronos Group.

5. Test and Iterate: Use tools like Meta XR Simulator for testing without hardware. Optimize for performance, considering battery life and refresh rates.

Essential Concepts

  • Understand 6DOF (six degrees of freedom) motion detection for immersive experiences
  • Focus on user comfort to avoid motion sickness, using techniques like asynchronous timewarp or reprojection
  • Integrate haptics and controllers for better interaction
  • For AR, leverage location anchors and scene semantics for real-world integration
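As a small worked example of the comfort point: at a given refresh rate, each frame has a fixed time budget, and when an application misses it the runtime falls back to reprojection (re-showing the previous frame warped to the newest head pose). An illustrative C++ sketch, not any runtime's actual API:

```cpp
#include <cassert>
#include <cmath>

// Frame pacing model: a display at refreshHz gives the application
// 1000/refreshHz milliseconds to deliver each frame. Missing the budget
// triggers the runtime's reprojection fallback.
struct FrameBudget {
    double refreshHz;

    // Per-frame budget in milliseconds (e.g. ~11.1 ms at 90 Hz).
    double budgetMs() const { return 1000.0 / refreshHz; }

    // True if a frame took longer than the budget and would be reprojected.
    bool needsReprojection(double frameMs) const {
        return frameMs > budgetMs();
    }
};
```

For example, a 13 ms frame misses the ~11.1 ms budget of a 90 Hz headset, while a 10 ms frame fits comfortably.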

Modern SDKs and Development Kits

SDKs or Software Development Kits are essential tools for building VR apps and AR apps. As of 2025, the XR development landscape has evolved significantly from the early Oculus Rift and HTC Vive era.

OpenXR

OpenXR is an open, royalty-free standard developed by the Khronos Group that provides a unified API for VR and AR platforms.[5] OpenXR eliminates the need for multiple proprietary APIs, allowing developers to create applications that work across different VR headsets and AR devices.

As of 2025, Meta recommends OpenXR as the primary development path starting with SDK version 74.[6] OpenXR support is built into major game engines and provides cross-platform compatibility.

Core Architecture and Concepts

The OpenXR API provides a set of abstractions for developers to interact with the XR system:[7]

  • XrInstance: Represents the connection between the application and the OpenXR runtime. It is the first object created.
  • XrSystem: Represents the set of XR devices, including the headset and controllers.
  • XrSession: Manages the interaction session between the application and the user. It controls the application's lifecycle, such as when it should render frames.
  • XrSpace: Defines a 3D coordinate system. This is used to track the position and orientation of the headset, controllers, and other tracked objects.
  • XrActions: Provides a high-level, action-based input system. Instead of querying for "button A press," a developer defines an action like "Jump" or "Grab" and maps it to different physical inputs on various controllers.
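The action-based input model above can be sketched as a lookup from semantic actions to per-profile physical bindings: the application asks about "Grab", and the binding layer decides which control that means on the active controller. A toy C++ model (the class and method names are illustrative, not the OpenXR C API; the input paths follow OpenXR's path syntax):

```cpp
#include <cassert>
#include <map>
#include <string>

// Toy model of action-based input: actions are defined once by the
// application and bound separately for each interaction profile, so the
// same code works across different controllers.
class ActionSet {
public:
    // Bind a semantic action to a physical input path for one profile.
    void bind(const std::string& profile, const std::string& action,
              const std::string& inputPath) {
        bindings_[profile][action] = inputPath;
    }

    // Resolve which physical input drives an action on the given profile;
    // returns an empty string when no binding exists.
    std::string resolve(const std::string& profile,
                        const std::string& action) const {
        auto p = bindings_.find(profile);
        if (p == bindings_.end()) return "";
        auto a = p->second.find(action);
        return a == p->second.end() ? std::string() : a->second;
    }

private:
    std::map<std::string, std::map<std::string, std::string>> bindings_;
};
```

The gameplay code only ever queries "grab"; swapping controllers swaps the profile, not the game logic.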

Key OpenXR Features

Industry Adoption

OpenXR has achieved widespread industry adoption. All major hardware and software platforms provide conformant OpenXR runtimes, including Meta (Horizon OS), Valve (SteamVR), Microsoft (Windows Mixed Reality), HTC, Pico, and Varjo.[8][9]

OpenXR vs. OpenVR

A common point of confusion is the distinction between OpenXR and OpenVR:[10][11]

  • OpenVR is an older API developed by Valve. It was the primary API for the SteamVR platform and was "open" in the sense that other hardware could create drivers to be compatible with it. However, its development was ultimately controlled by Valve.
  • OpenXR is the modern, multi-company standard managed by the Khronos Group. It is the successor to proprietary APIs like OpenVR and the old Oculus SDK.

Today, the SteamVR platform itself is an OpenXR runtime. This means it can run applications built with the OpenXR API, while also maintaining backward compatibility with older applications built with the OpenVR API.[12]

Cross-Platform Considerations

While OpenXR has successfully standardized the foundational layer of XR development, its "write once, run anywhere" promise is not absolute. For standalone Android-based headsets, developers often still need to create separate application packages (APKs) for different platforms like Meta Quest and Pico, as these platforms may use their own vendor-specific OpenXR loaders.[13]

Furthermore, innovative platform-specific features—such as Meta's advanced Passthrough capabilities, hand tracking enhancements, or spatial anchors—are exposed to developers through vendor-specific extensions to the OpenXR standard.[14] To leverage these powerful features, a developer must write code that specifically targets that vendor's hardware.
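In practice, an application discovers such vendor features by enumerating the runtime's extension list and enabling an extension only when it is present. A simplified C++ sketch of that check: the helper functions are hypothetical, though the extension strings are real OpenXR identifiers (Meta's vendor extensions use the XR_META_ and XR_FB_ prefixes).

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Check whether the runtime reported a given extension by name.
bool hasExtension(const std::vector<std::string>& supported,
                  const std::string& name) {
    return std::find(supported.begin(), supported.end(), name)
           != supported.end();
}

// Collect the vendor-specific extensions (Meta uses the XR_META_ and
// XR_FB_ prefixes); code that uses these is tied to that vendor's hardware.
std::vector<std::string> vendorExtensions(
        const std::vector<std::string>& supported) {
    std::vector<std::string> out;
    for (const auto& e : supported)
        if (e.rfind("XR_META_", 0) == 0 || e.rfind("XR_FB_", 0) == 0)
            out.push_back(e);
    return out;
}
```

A portable application gates each vendor feature behind such a check and falls back to core OpenXR behavior when the extension is absent.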

Meta Quest SDK

The Meta Quest SDK (formerly Oculus SDK) supports Meta Quest 2, Meta Quest 3, Meta Quest 3S, and Meta Quest Pro headsets. As of March 2025, Meta transitioned to recommending OpenXR as the primary development approach, while still providing the Meta XR Core SDK for Horizon OS-specific features.[5]

Supported Platforms

Key Features

Apple Vision Pro SDK (visionOS)

The visionOS SDK enables development for Apple Vision Pro, Apple's spatial computing platform launched in February 2024.[16]

Development Tools

Key Features

Requirements

  • Mac with Apple Silicon (M1 or later)[17]
  • macOS Monterey or later
  • Xcode 15.2 or later
  • For Unity: Unity 2022 LTS (2022.3.19f1 or newer), Apple Silicon version only

PlayStation VR2 SDK

PSVR2 development uses the same SDK as PlayStation 5, making porting between platforms more straightforward.[18]

Supported Features

Pico SDK

Pico SDK, developed by ByteDance's Pico division, supports the Pico 4, Pico 4 Pro, and Pico 4 Ultra headsets available in Europe and Asia.[20]

Supported Platforms

Key Features

AR-Specific SDKs

ARKit

ARKit is Apple's framework for iOS AR, current version 6. Features include:[21]

  • 4K video capture
  • Depth API with LiDAR
  • Motion capture
  • Scene geometry
  • People occlusion

ARCore

ARCore is Google's SDK for Android AR. Supports:[22]

  • Motion tracking
  • Anchors
  • Environmental understanding
  • Depth API
  • Geospatial API
  • Scene semantics

Game Engines

Choosing a Game Engine: Unity vs. Unreal

The choice of a game engine is one of the most fundamental decisions in XR development. Unity and Unreal Engine are the two dominant forces in the industry, each with a distinct set of strengths, weaknesses, and development philosophies.

Unity

Unity remains one of the most popular engines for VR development. As of 2025, Unity provides native OpenXR support through the XR Interaction Toolkit.[23]

Core Strengths

Unity's core is built around the C# programming language, which is widely regarded as having a more gentle learning curve compared to C++.[24] This accessibility, combined with a user-friendly interface, makes it an attractive option for developers of all experience levels, from indie creators to large studios.[25]

Current Version

Unity 6 (2025)

Asset Ecosystem

A significant accelerator for Unity development is the Unity Asset Store. It offers a vast library of pre-built tools, 3D models, scripts, and plugins that can dramatically reduce development time and cost.[26] This rich ecosystem allows teams to prototype rapidly and bring products to market faster.

VR Features
Key Packages

  Package                  Purpose                          Current Version
  XR Interaction Toolkit   High-level interaction system    3.1.2
  OpenXR Plugin            OpenXR runtime support           1.14+
  XR Plugin Management     XR backend management            Latest
  AR Foundation            AR development framework         Latest
  Meta XR SDK              Meta Quest-specific features     Latest
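These packages are declared in the project's `Packages/manifest.json`. A trimmed example is shown below; the package identifiers are Unity's real registry names, but the version numbers are illustrative and should be pinned to whatever your Unity version resolves:

```json
{
  "dependencies": {
    "com.unity.xr.interaction.toolkit": "3.1.2",
    "com.unity.xr.openxr": "1.14.0",
    "com.unity.xr.management": "4.5.0",
    "com.unity.xr.arfoundation": "6.0.3"
  }
}
```

Editing this file (or using the Package Manager UI, which writes to it) is how the XR packages enter a project.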

Unreal Engine

Unreal Engine 5 provides robust VR development capabilities with built-in OpenXR support. As of 2025, Unreal Engine 5.6 is the current major version.[28]

Core Strengths

Unreal Engine's primary advantage is its advanced rendering engine, which delivers exceptional visual quality with minimal setup. Features like Lumen for dynamic global illumination and Nanite for virtualized micropolygon geometry enable developers to create incredibly realistic and detailed worlds, making it a powerhouse for PCVR and high-end simulations.[29]

Current Versions
Scripting Model

Unreal Engine employs a dual-language development model. For maximum performance and low-level system control, developers use C++.[24] Complementing this is the Blueprints system, a powerful node-based interface that allows designers, artists, and programmers to build complex game logic without writing traditional code.[26]

VR Features
VR Development Tools

Godot

Godot Engine has significantly improved XR support with native OpenXR integration in Godot 4.[31] Meta has sponsored improvements to Godot's OpenXR support.[32]

Current Version

Godot 4.3 (2025)

XR Features
Supported Platforms

Comparative Analysis

Unity vs. Unreal Engine for XR Development
  • Primary Scripting: C# in Unity,[24] C++ in Unreal[24]
  • Visual Scripting: Unity offers Bolt (the Visual Scripting package),[24] while Unreal's Blueprints are deeply integrated[26]
  • Learning Curve: Unity is more gentle, with a user-friendly interface;[25] Unreal is steeper, especially for its custom C++ framework[26]
  • Graphical Fidelity (Out-of-the-box): Unity is good, but often requires configuration to achieve high-end results;[24] Unreal is excellent, with industry-leading visuals via Lumen and Nanite[29]
  • Asset Ecosystem: Unity's is extensive (Unity Asset Store) and a major strength for rapid development;[26] Unreal's is growing, but smaller than Unity's
  • Community Size: Unity's is larger, with more beginner-friendly resources;[24] Unreal's is smaller but strong, particularly for high-end development[24]
  • Primary Target Platform (XR): Unity is strongest in standalone and mobile VR (e.g., Meta Quest);[29] Unreal is strongest in PCVR and high-fidelity simulations[29]
  • Ideal Use Cases: Unity suits indie games, mobile/standalone VR/AR, rapid prototyping, and projects prioritizing speed-to-market; Unreal suits AAA games, architectural visualization, cinematic experiences, and enterprise simulations requiring photorealism[26]

Developing with Unity

Setting up a Unity project for XR development involves a series of specific configuration steps to enable communication with XR hardware and import the necessary SDKs.

Project Setup for XR

Prerequisites

Before creating a project, developers must use the Unity Hub to install a supported version of the Unity Editor (e.g., 2022.3 LTS or newer). During installation, it is crucial to include the Android Build Support module, as this is a requirement for developing applications for Android-based standalone headsets like the Meta Quest.[33]

Creating a Project and Configuring the Build Platform

A new project should be created using the 3D (URP) template. The Universal Render Pipeline provides a modern, performant rendering foundation suitable for the majority of XR applications.[33] Once the project is created, the first step is to set the target build platform. This is done by navigating to `File > Build Settings`. For Meta Quest development, select Meta Quest (in Unity 6.1 and later) or Android (in older versions) and click the "Switch Platform" button.[33]

XR Plug-in Management

Unity communicates with XR runtimes through its XR Plugin Management system. This package must be installed from the Package Manager if it is not already present.

  1. Navigate to `Edit > Project Settings`.
  2. Select the `XR Plug-in Management` tab.
  3. In both the Standalone (PC icon) and Android tabs, check the box for OpenXR. This tells Unity to use the OpenXR API to interface with the headset's runtime.[33]

This step is critical as it enables the core connection between the engine and the XR hardware.

The Meta XR SDK for Unity

For developers targeting Meta Quest devices, the Meta XR SDK is essential. It provides access to the full suite of the platform's features.

Installation and Core Components

The primary package is the Meta XR Core SDK, which is installed from the Unity Package Manager via the `Window > Package Manager` interface.[34] This SDK includes several key components:

  • OVRCameraRig: A pre-configured camera prefab that serves as the XR rig. It replaces the standard Unity camera and automatically handles head and controller tracking, mapping the user's physical movements into the virtual scene.[3]
  • OVRInput: A robust API for handling input from the Touch controllers and hand tracking.[34]
  • Project Setup Tool: A utility that analyzes the project for common configuration errors and provides one-click fixes to apply recommended settings.[34]

Project Setup Tool

After importing the Core SDK, developers should immediately run the Project Setup Tool by navigating to `Meta XR > Tools > Project Setup Tool`. This tool checks for dozens of required settings related to graphics, physics, and build configurations. Clicking the Fix All and Apply All buttons will automatically configure the project according to Meta's best practices.[33][35]

Other Meta SDKs

The Meta ecosystem is composed of several specialized SDKs that build upon the Core SDK:

  • Meta XR Interaction SDK: A high-level framework for creating natural and robust interactions like grabbing, poking, and interacting with UI using both controllers and hands.[36]
  • Meta XR Audio SDK: Provides advanced spatial audio features, including HRTF-based spatialization and room acoustics simulation.[37]
  • Meta XR Platform SDK: Enables integration with Meta's platform services, such as leaderboards, achievements, user profiles, and multiplayer matchmaking.[38]

The Meta XR All-in-One SDK is available on the Asset Store as a convenient package to manage these various SDKs.[39]

Unity's XR Interaction Toolkit (XRI)

The XR Interaction Toolkit (XRI) is Unity's own high-level, component-based framework for building XR interactions. It is designed to be flexible and extensible, providing a solid foundation for VR and AR projects.[40]

Core Concepts

XRI's architecture is built around a few key concepts:

  • Interaction Manager: A singleton component that acts as the central hub, mediating all interactions between Interactors and Interactables in a scene.[40]
  • Interactors: These are components that initiate actions. They represent the user's hands or controllers. Common types include:
    • `XR Ray Interactor`: For pointing at and selecting objects from a distance.[41]
    • `XR Direct Interactor`: For directly touching and grabbing objects that are within arm's reach.[41]
    • `XR Socket Interactor`: A static interactor that objects can be snapped into, useful for puzzles or placing items in specific locations.[41]
  • Interactables: These are components placed on objects in the scene that can be acted upon by Interactors. The most common is the `XR Grab Interactable`, which allows an object to be picked up, held, and thrown.[41]
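The mediation between Interactors and Interactables described above can be sketched as a central manager that grants or refuses selection requests, which is what prevents two interactors from grabbing the same object at once. A minimal C++ model (the names are illustrative, not Unity's actual classes):

```cpp
#include <cassert>
#include <string>

// An object in the scene that can be acted upon (e.g. grabbed).
struct Interactable {
    std::string name;
    bool held = false;
};

// Central mediator: every interactor routes its selection attempts through
// here, so exclusivity rules live in one place.
class InteractionManager {
public:
    // Grant the selection only if the target is not already held.
    bool trySelect(Interactable& target) {
        if (target.held) return false;
        target.held = true;
        return true;
    }

    // Release the object so another interactor may select it.
    void deselect(Interactable& target) { target.held = false; }
};
```

An `XR Direct Interactor` touching a cube and an `XR Ray Interactor` pointing at it would both call into the same manager, and only one request succeeds.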

Locomotion and UI

XRI includes a complete locomotion system that can be added to the XR Origin (the main camera rig). This includes a Teleportation Provider for point-and-click movement and a Continuous Move Provider for smooth, joystick-based locomotion. It also provides tools for interacting with world-space UI canvases, using components like the `Tracked Device Graphic Raycaster` to allow ray interactors to click on buttons and other UI elements.[42]
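The core math of a continuous move provider is rotating the 2D joystick vector by the head's yaw, so that pushing forward moves along the gaze direction while staying on the ground plane. A hedged C++ sketch under that assumption (names and axis conventions are illustrative, not XRI's implementation):

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { double x, y; };   // joystick input, y = forward
struct Vec3 { double x, y, z; };  // world-space displacement

// Map a joystick deflection into a world-space displacement relative to
// the head's yaw: rotate the stick vector, scale by speed and frame time,
// and keep the result on the ground plane (y stays 0).
Vec3 continuousMove(Vec2 stick, double headYawRad, double speed, double dt) {
    double c = std::cos(headYawRad), s = std::sin(headYawRad);
    double wx = c * stick.x + s * stick.y;   // rotated sideways component
    double wz = -s * stick.x + c * stick.y;  // rotated forward component
    return { wx * speed * dt, 0.0, wz * speed * dt };
}
```

With yaw 0, a full forward deflection at 2 m/s over half a second yields a 1 m displacement straight ahead; turning the head rotates that displacement with it.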

Developing with Unreal Engine

Unreal Engine offers a powerful, high-fidelity platform for XR development, with a streamlined setup process centered around its VR Template and OpenXR integration.

Project Setup for XR

Prerequisites

Developers must first install Unreal Engine via the Epic Games Launcher. For development targeting standalone Android headsets like the Meta Quest, it is also necessary to install and configure Android Studio and the required Android SDK and NDK components.[43]

Creating a Project and Plugin Configuration

The recommended way to start a new XR project is by using the built-in Virtual Reality template. This can be selected from the "Games" category in the Unreal Project Browser when creating a new project.[43] This template provides a pre-configured project with essential plugins enabled, a basic level, and a functional player pawn with locomotion and interaction systems.[44]

After creation, developers should verify that the necessary plugins are enabled by navigating to `Edit > Plugins`. The most important plugin is OpenXR. Platform-specific plugins like OculusVR (for Meta devices) or SteamVR should also be enabled depending on the target hardware.[45]

The Meta XR Plugin for Unreal

The Meta XR Plugin (often referred to as the OculusVR plugin) is the key to unlocking the full feature set of Meta Quest devices in Unreal Engine. It provides access to platform-specific functionalities not covered by the core OpenXR standard.[43] This includes:

  • Advanced tracking features like Hand Tracking, Body Tracking, and Face/Eye Tracking
  • Mixed Reality features such as Passthrough, Spatial Anchors, and Scene understanding
  • Performance optimization tools like Application SpaceWarp and Fixed Foveated Rendering
  • Platform services integration for leaderboards, achievements, and parties
  • The Movement SDK for more realistic avatar motion and the Voice SDK for voice commands and dictation

Leveraging the VR Template

The standard VR Template is a powerful starting point that encapsulates logic for many common VR features.[46]

Core Components

The template is built around a few key assets:

  • VRPawn: This Blueprint is the user's representation in the virtual world. It contains the camera component, motion controller components for tracking the hands, and all the logic for handling input and movement.[46][44]
  • VRGameMode: This object defines the rules for the level, most importantly specifying that the `VRPawn` should be the default pawn class for the player.[46]

Locomotion Systems

The template includes two primary, comfort-oriented locomotion methods:[46]

  • Teleportation: The user can aim with one controller, which displays an arc and a target location. Releasing the thumbstick instantly moves the pawn to that location. This system relies on a Nav Mesh Bounds Volume placed in the level.
  • Snap Turn: Using the other controller's thumbstick, the user can rotate their view in discrete increments (e.g., 45 degrees). This avoids the smooth rotation that can cause motion sickness for some users.
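
Both techniques reduce to simple math. The sketch below is engine-agnostic Python, not Unreal Blueprint logic; the 45-degree increment comes from the text above, while the thumbstick deadzone and the nav-mesh check are illustrative assumptions:

```python
SNAP_INCREMENT = 45.0   # degrees per snap turn, as in the template
DEADZONE = 0.7          # thumbstick deflection needed to trigger a turn (assumed)

def snap_turn(yaw_degrees, thumbstick_x):
    """Rotate in discrete steps instead of smoothly."""
    if thumbstick_x > DEADZONE:
        yaw_degrees += SNAP_INCREMENT
    elif thumbstick_x < -DEADZONE:
        yaw_degrees -= SNAP_INCREMENT
    return yaw_degrees % 360.0

def teleport(pawn_position, target, nav_mesh):
    """Instantly move the pawn, but only to points on the nav mesh."""
    if target in nav_mesh:      # stand-in for a real nav-mesh query
        return target           # instantaneous move: no continuous optic flow
    return pawn_position        # invalid target: stay put

print(snap_turn(0.0, thumbstick_x=0.9))          # 45.0
nav = {(2.0, 3.0)}
print(teleport((0.0, 0.0), (2.0, 3.0), nav))     # (2.0, 3.0)
```

The discrete jump in both cases is deliberate: by never presenting continuous visual motion, neither technique produces the optic flow that triggers motion sickness.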

Runtimes

Runtimes handle core VR functionality including rendering, tracking, and reprojection.

OpenXR Runtime

OpenXR provides a unified runtime interface that works across multiple VR platforms. As of 2025, virtually all major VR headsets ship with an OpenXR-conformant runtime.[47]

It is supported by the Meta Quest (Horizon OS), SteamVR, Pico, and Varjo runtimes, among others.

SteamVR / OpenVR

SteamVR supports both OpenVR (legacy) and OpenXR runtimes. OpenVR was developed by Valve and has been the default runtime for SteamVR, though OpenXR is increasingly recommended.[10]

Meta Quest Runtime

The Meta Quest Runtime handles VR rendering and tracking for Meta Quest devices, now with integrated OpenXR support.

Essential Best Practices for XR Development

Beyond the specifics of any single engine, creating a successful XR application requires adherence to a set of universal principles focused on user comfort, intuitive interaction, and technical performance.

User Experience and Comfort

The primary goal of XR design is to create a sense of presence and immersion. This can be easily broken by experiences that are uncomfortable or unintuitive.

Locomotion Design

Movement in VR is the single greatest challenge for user comfort. The sensory conflict that arises when the eyes see motion in the headset while the body's vestibular system reports being stationary can lead to visually induced motion sickness (VIMS), also known as VR sickness.[48]

Comfort-First Techniques
  • Teleportation: This method instantly moves the user from one point to another, completely avoiding the continuous visual flow that causes motion sickness. It is consistently the most comfortable and widely preferred option.[49][50]
  • Snap Turns: Replace smooth, continuous rotation with discrete, instantaneous jumps in orientation. This is significantly more comfortable for many users than smooth turning.[48]
Continuous Locomotion and Comfort Options

While more immersive for some, smooth joystick-based movement is the most likely to cause discomfort. If offered, it should always be an option, not a requirement, and should be accompanied by comfort settings. A common technique is vignetting (also called tunneling), which narrows the user's field of view during movement, reducing peripheral optic flow.[51]
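
Vignetting is typically driven by the player's locomotion speed. The following is a minimal sketch of that mapping; the speed thresholds are illustrative assumptions, not values from any SDK:

```python
def vignette_strength(speed, min_speed=0.5, max_speed=3.0):
    """Map locomotion speed (m/s) to a 0..1 vignette strength.

    0 means full field of view; 1 means maximum tunneling. The ramp keeps
    the vignette off while (nearly) stationary and fades it in smoothly as
    speed rises, reducing peripheral optic flow during movement.
    """
    if speed <= min_speed:
        return 0.0
    if speed >= max_speed:
        return 1.0
    return (speed - min_speed) / (max_speed - min_speed)

print(vignette_strength(0.0))   # 0.0 -- no tunneling while standing still
print(vignette_strength(3.5))   # 1.0 -- full tunneling at sprint speed
```

In practice the strength value would drive a shader that darkens or masks the periphery of each eye's image.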

Environmental Design

The design of the virtual world itself can impact comfort. Developers should:

  • Avoid forcing movement on steep slopes or stairs
  • Keep walls and large objects at a comfortable distance to reduce optic flow
  • Consider using an Independent Visual Background (IVB), such as a static skybox that responds only to real head rotation and stays fixed during artificial locomotion[50]

Interaction Design

Intuitive interaction is key to maintaining presence. The choice of input modality has a significant impact on the user experience.

Controllers vs. Hand Tracking
  • Controllers offer high precision, tactile feedback through buttons and triggers, and haptic feedback, making them ideal for gaming, creative tools, and any application requiring reliable and precise input.[52]
  • Hand tracking uses the headset's cameras to track the user's bare hands, offering a highly natural and intuitive interaction method. However, its performance can be affected by poor lighting, fast movements, and occlusion. It is generally less precise than controllers and is best suited for more casual experiences.[53]
Multimodal Input

The best practice is often to support multiple input modalities. Allowing a user to seamlessly switch between controllers, hands, and even voice commands provides maximum accessibility.[53]

UI/UX in 3D Space

Designing User Interfaces (UI) for a 3D, immersive space presents unique challenges.

Placement and Comfort

UI panels should be placed at a comfortable viewing distance, typically around 1-2 meters from the user, and positioned slightly below their natural line of sight to avoid neck strain.[54]

World-Locking vs. Head-Locking

A critical rule is to avoid locking UI elements directly to the user's head (a "head-locked" HUD). This is extremely uncomfortable and can cause motion sickness. Instead, UI should be "world-locked" (anchored to a position in the environment) or, if it must follow the user, it should do so with a gentle, smoothed animation.[54]
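
The "gentle, smoothed animation" for follow-along UI is usually an exponential ease toward a target point anchored in front of the user. A minimal, framework-free sketch (the smoothing constant is an assumed value):

```python
import math

def lazy_follow(ui_pos, target_pos, dt, smoothing=5.0):
    """Move the UI a fraction of the remaining distance each frame.

    Exponential smoothing: the panel trails the anchor point in front of
    the user instead of being rigidly head-locked, avoiding the discomfort
    of a fixed HUD while keeping the UI reachable. The exp() form makes
    the blend factor frame-rate independent.
    """
    t = 1.0 - math.exp(-smoothing * dt)
    return ui_pos + (target_pos - ui_pos) * t

pos = 0.0
for _ in range(90):                       # ~1 second at 90 FPS
    pos = lazy_follow(pos, 2.0, dt=1.0 / 90.0)
print(round(pos, 3))                      # close to 2.0 after one second
```

A plain per-frame lerp with a fixed factor would drift faster at higher frame rates; the exponential form gives the same trailing behavior at 72, 90, or 120 FPS.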

Feedback

Clear and immediate feedback is essential. Interactive elements should have distinct hover states (e.g., highlighting or scaling up), and actions like button presses should be accompanied by both visual and auditory cues.[54]

Technical Performance and Optimization

Maintaining a consistently high and stable frame rate is the most important technical requirement for a comfortable VR experience.

Understanding VR Performance Metrics

  • Framerate vs. Frame Time: Framerate, measured in Frames Per Second (FPS), is the average number of frames rendered over a second. Frame Time, measured in milliseconds (ms), is the time it takes to render a single frame. Frame time is a more accurate indicator of performance smoothness.[55]
  • Performance Targets: For standalone headsets like the Meta Quest series, applications must achieve a minimum of 72 FPS. For PCVR, the target is typically 90 FPS or higher.[56]
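
The two metrics are related by frame_time_ms = 1000 / fps. A quick sketch using the budgets mentioned above:

```python
def frame_budget_ms(fps):
    """Convert a target framerate to the per-frame time budget."""
    return 1000.0 / fps

for fps in (72, 90, 120):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
# 72 FPS leaves only ~13.89 ms to run game logic, physics, and rendering
# for BOTH eyes. A single frame over budget invokes the runtime's
# reprojection safety net, which is why frame-time spikes matter even
# when the average FPS looks fine.
```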

Profiling and Debugging

The first step in optimization is identifying the bottleneck. Performance issues are typically either CPU-bound (caused by complex game logic, physics, or too many draw calls) or GPU-bound (caused by high resolution, complex shaders, or too much geometry).[57]
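
A first triage step can be as simple as comparing the CPU and GPU frame times reported by a profiler. The helper below is an illustrative simplification (real profilers break these timings down much further):

```python
def classify_bottleneck(cpu_ms, gpu_ms, budget_ms=13.89):
    """Rough triage from profiler timings (default budget: 72 FPS on Quest).

    Whichever processor takes longer than the frame budget is the one
    worth optimizing first; shrinking GPU work in a CPU-bound app (or
    vice versa) will not improve the frame rate.
    """
    if cpu_ms <= budget_ms and gpu_ms <= budget_ms:
        return "within budget"
    return "CPU-bound" if cpu_ms > gpu_ms else "GPU-bound"

print(classify_bottleneck(cpu_ms=18.0, gpu_ms=9.0))   # CPU-bound
```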

Essential profiling tools include:

  • Unity Profiler and Frame Debugger (Unity)
  • Unreal Insights and the in-engine stat commands (Unreal Engine)
  • RenderDoc for GPU frame captures
  • OVR Metrics Tool and the Meta Quest Developer Hub for on-device Quest metrics[58]

Core Optimization Strategies

  • Reduce Draw Calls: Combine multiple textures into a single texture atlas and use static batching to draw multiple similar objects in a single call. For standalone Quest, developers should aim for 50-200 draw calls per frame.[56][59]
  • Optimize Geometry: Keep polygon counts within the budget for the target platform (e.g., 750k-1M triangles for Quest 2). Use Level of Detail (LOD) systems.[56]
  • Simplify Lighting and Shaders: Use baked lighting whenever possible. Avoid complex, multi-pass shaders and full-screen post-processing effects on mobile hardware.[35]
  • Use Occlusion Culling: This prevents the engine from rendering objects that are completely hidden from the camera's view.[60]
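
The draw-call arithmetic behind atlasing and batching can be illustrated directly: objects that share a material can be submitted together, so the call count collapses to the number of distinct materials. This is a simplification that ignores other render-state changes:

```python
def draw_calls_unbatched(objects):
    # Naive renderer: one draw call per object.
    return len(objects)

def draw_calls_batched(objects):
    # Static batching / texture atlasing: one call per unique material.
    return len(set(material for _, material in objects))

scene = [("crate", "wood"), ("barrel", "wood"), ("rock", "stone"),
         ("wall", "stone"), ("floor", "stone"), ("lamp", "metal")]

print(draw_calls_unbatched(scene))  # 6
print(draw_calls_batched(scene))    # 3 -- "wood", "stone", "metal"
```

This is why combining textures into an atlas pays off: it merges materials, and every merged material removes draw calls across the whole scene.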

Understanding Reprojection Technologies

When an application fails to render a new frame in time for the headset's display refresh, VR platforms employ asynchronous reprojection techniques. These systems act as a safety net, but they are not a substitute for good performance.
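
At their core, ASW and Motion Smoothing both extrapolate motion from the last two delivered frames. A toy one-dimensional version of that prediction (real implementations operate on per-block motion vectors in the rendered image):

```python
def extrapolate(prev_frame_pos, last_frame_pos):
    """Predict the next position by assuming constant velocity.

    The runtime estimates motion between the two most recent frames and
    pushes it forward one more step to synthesize the frame the
    application failed to deliver. Fast direction changes break the
    constant-velocity assumption, which is what produces the warping and
    ghosting artifacts these techniques are known for.
    """
    velocity = last_frame_pos - prev_frame_pos
    return last_frame_pos + velocity

print(extrapolate(1.0, 1.5))   # 2.0 -- the synthesized frame's position
```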

Key Reprojection Technologies Explained

  • Asynchronous Timewarp (ATW) — Meta. Takes the last successfully rendered frame and re-projects it based on the newest head rotation data.[61][62] Corrects for head rotation only; it does not account for positional movement or in-scene animation.[63] Common artifacts: positional judder and animation judder.[64]
  • Asynchronous SpaceWarp (ASW) — Meta. When an app's framerate drops to half the display's refresh rate, ASW analyzes the motion between the two previous frames to synthesize and insert a new, predicted frame.[65] Corrects for head rotation, head position, controller movement, and in-scene animation.[64] Common artifacts: warping, ghosting, or "bubbling", especially around fast-moving objects.[64][66]
  • Motion Smoothing — Valve. Activates when an application cannot maintain framerate; it examines the last two frames to estimate motion and extrapolates a new frame.[67] Corrects for head rotation, head position, controller movement, and in-scene animation.[67] Common artifacts: similar to ASW, including warping or wobbling.[68]

App Stores and Distribution

Meta Quest Store

The Meta Quest Store (formerly Oculus Store) is the primary distribution platform for Meta Quest applications on Horizon OS.[69]

Supported Devices

The store serves the Meta Quest 2, Quest 3, Quest 3S, and Quest Pro headsets running Horizon OS.

Distribution Methods

  • Official Meta Quest Store - Highly curated marketplace requiring stringent technical and content guidelines. Reserved for polished, high-quality experiences.[59]
  • App Lab - Alternative distribution path allowing developers to publish without full store curation. Apps are not browsable in the main store and can only be accessed via direct URL. Ideal for early access builds, experimental applications, and apps still in development.[70]
  • SideQuest - Third-party platform for sideloading applications

The submission process involves creating an app page, uploading builds (often using the Meta Quest Developer Hub desktop tool), providing store assets and metadata, and undergoing technical review.[58][71]

Steam / SteamVR

For PC-based VR, SteamVR is the dominant platform. It is known for its broad hardware compatibility, supporting not only Valve's own Index headset but also devices from HTC, Meta (via Link or Air Link), Windows Mixed Reality, and others.[72][73]

Supported Headsets

SteamVR works with the Valve Index, the HTC Vive family, Meta Quest headsets (via Link, Air Link, or Steam Link), Windows Mixed Reality devices, and other OpenVR/OpenXR-compatible hardware.

Publishing on Steam is handled through Steamworks, Valve's suite of tools and services for developers. The barrier to entry is lower than the curated Meta Quest Store.[74] The recent release of the official Steam Link app on the Meta Quest Store has further solidified Steam's role as a central hub for PCVR.[75]

PlayStation Store (PSVR2)

PSVR2 games are distributed through the PlayStation Store for PlayStation 5.[76]

Notable Features

  • Integrated with PS5 library
  • Support for PSVR2-exclusive titles
  • Cross-buy support for some titles

Apple Vision Pro App Store

Vision Pro applications are distributed through a dedicated section of the App Store for visionOS.[77]

App Types

visionOS apps can present as Windows, Volumes, or Spaces, ranging from 2D panels in the Shared Space to fully immersive experiences.

Pico Store

The Pico Store distributes content for Pico headsets in Europe and Asia markets.[78] Pico offers a PICO Developer Program which provides qualifying developers with financial support, technical assistance, and marketing resources.[79]

Alternative Distribution: SideQuest

SideQuest is the leading third-party platform and community for the Meta Quest ecosystem. Its primary function is to facilitate the "sideloading" of applications—the process of installing an app directly onto the headset via a PC, bypassing the official store.[80]

SideQuest has become a vital hub for:[81]

  • Early access and experimental games
  • Unofficial ports of classic games to VR
  • Content that may not meet the curation guidelines of the official store

XR Distribution Platform Overview

  • Meta Quest Store — Devices: Meta Quest series. Curation: high (strict VRCs).[59] Audience: mainstream consumers. Key feature: highest visibility and monetization potential for standalone VR.[70]
  • App Lab — Devices: Meta Quest series. Curation: low (basic technical review).[70] Audience: early adopters, testers, and niche communities. Key feature: distribution to Quest users via direct link without full store curation.[70]
  • SteamVR — Devices: PCVR headsets (Index, Vive, Quest via Link, etc.). Curation: low (self-publishing model).[74] Audience: PC gamers and VR enthusiasts. Key feature: broadest hardware compatibility for PCVR and a large existing user base.[72]
  • SideQuest — Devices: Meta Quest series. Curation: low (community-driven).[81] Audience: VR enthusiasts, modders, and indie followers. Key feature: primary platform for sideloading, experimental content, and early access.[80]
  • Pico Store — Devices: Pico headsets (e.g., Pico 4). Curation: high (similar to the Quest Store).[82] Audience: consumers in markets where Pico is prevalent. Key feature: growing ecosystem with developer incentive programs.[79]

Development Tools and Resources

XR Interaction Toolkits

  • XR Interaction Toolkit (Unity) — Locomotion, UI interaction, object grabbing. Current version: 3.1.2.
  • Godot XR Tools (Godot) — Teleportation, snap turning, hand interactions. Current version: 4.4.0.
  • Unreal VR Template (Unreal Engine) — Motion controllers, VR preview. Built into the engine.
  • Meta XR Interaction SDK (Unity) — Hand interactions, controller support.
  • PICO Unity Integration SDK (Unity) — Pico-specific features.

SDKs and Plugins

Meta XR SDKs

Other SDKs

Testing and Debugging Tools

3D Models and Asset Marketplaces

Major 3D Asset Marketplaces

  • Fab (fab.com) — Unified content marketplace. Supports Unity, Unreal Engine, and UEFN; successor to the Unreal Engine Marketplace and Sketchfab Store. Quixel Megascans were free through 2024; in 2025, 1,500+ Megascans remain free to all users.[83][84]
  • CGTrader (cgtrader.com) — General 3D models. Over 2 million models, VR/AR content support, auction system.[85]
  • TurboSquid (turbosquid.com) — Professional 3D models. CheckMate certification, enterprise licensing; owned by Shutterstock.[86]
  • Sketchfab (sketchfab.com) — Web-based 3D viewer. Real-time 3D preview, VR/AR-optimized assets, WebGL viewer.
  • Unity Asset Store (assetstore.unity.com) — Unity assets. Engine-specific assets, plugins, templates.
  • Quixel Megascans (fab.com) — Photorealistic scans. Now part of Fab, with 1,500+ free assets in 2025.[84]
  • Yobi3D (yobi3d.com) — 3D model search engine aggregating models from various sites, with VR/AR support.[87]

Asset Types for VR Development

Free Asset Resources

Cross-Platform Development

OpenXR for Cross-Platform Development

OpenXR enables developers to create applications that work across multiple VR platforms without requiring separate codebases for each headset.[88]

Supported Platforms via OpenXR

Engine Support

Platform-Specific Considerations

  • Meta Quest — Input: controllers, hand tracking. Unique features: passthrough MR, spatial anchors, standalone operation. Distribution: Meta Quest Store, Steam.
  • Apple Vision Pro — Input: eye tracking, hand tracking, voice. Unique features: Mac Virtual Display, spatial video. Distribution: App Store.
  • PSVR2 — Input: Sense controllers, hand tracking. Unique features: eye tracking, HDR OLED display. Distribution: PlayStation Store.
  • PCVR (Steam) — Input: various controllers. Unique features: high-end graphics, full-body tracking. Distribution: Steam.
  • Pico — Input: controllers, hand tracking, body tracking. Unique features: mixed reality, face tracking. Distribution: Pico Store.

Best Practices and Guidelines

Performance Optimization

User Comfort

  • Minimize artificial locomotion to reduce motion sickness
  • Provide comfort options (teleportation, snap turning)
  • Maintain stable frame rates
  • Design UI within comfortable viewing angles
  • Test with diverse users

Platform-Specific Guidelines

Wrappers and Compatibility Layers

Some community tools exist to help run content across platforms:

  • ReVive - An open-source compatibility layer that lets Rift-exclusive games run on Vive or Index by translating Oculus SDK calls to OpenVR/SteamVR. (It does not bypass Oculus entitlement checks; games must still be owned on Oculus Home.)
  • OpenXR/Runtime Bridges - Modern runtimes like SteamVR now support OpenXR, reducing the need for hacks.
  • Legacy solutions (e.g. LibOVRWrapper) enabled old Rift apps to work with newer runtime DLLs, but these are rarely needed now.

In general, using the official SDK or OpenXR is recommended for compatibility.

Communities and Forums

Developer Communities

Learning Resources

Official Documentation

  • Meta Developers: developers.meta.com/horizon
  • Apple Vision Pro: developer.apple.com/visionos
  • Unity XR: docs.unity3d.com/Manual/XR.html
  • Unreal Engine VR: dev.epicgames.com/documentation/en-us/unreal-engine/virtual-reality-development
  • Godot XR: docs.godotengine.org/en/stable/tutorials/xr
  • OpenXR: khronos.org/openxr
  • SteamVR: partner.steamgames.com/doc/features/steamvr
  • Pico Developer: developer.picoxr.com

See Also

References

  1. https://docs.unity3d.com/6000.2/Documentation/Manual/XR.html - XR - Unity Documentation
  2. https://unity.com/solutions/xr - XR Development in Unity: AR, VR and Spatial Solutions
  3. https://developers.meta.com/horizon/documentation/unity/unity-development-overview/ - Unity Development Overview for Meta Quest
  4. https://www.khronos.org/OpenXR - OpenXR - High-performance access to AR and VR
  5. https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR | Meta Horizon OS Developers
  6. https://en.wikipedia.org/wiki/OpenXR - OpenXR - Wikipedia
  7. https://developers.meta.com/horizon/blog/oculus-all-in-on-openxr-deprecates-proprietary-apis/ - Oculus All-in on OpenXR, Deprecates Proprietary APIs
  8. https://www.autovrse.com/openxr - What is OpenXR?
  10. https://en.wikipedia.org/wiki/OpenVR - OpenVR - Wikipedia
  11. https://forum.dcs.world/topic/318110-confusion-steamvr-vs-openxr-opencomposite-8/ - Confusion SteamVR vs OpenXR / OpenComposite
  12. https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?
  13. https://www.reddit.com/r/virtualreality/comments/wi6w2x/vr_devs_just_how_universal_is_openxr_anyways/ - VR Devs, just how universal is OpenXR anyways?
  14. https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR
  15. https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Meta XR Interaction SDK Overview
  16. https://en.wikipedia.org/wiki/VisionOS - visionOS - Wikipedia
  17. https://www.qualium-systems.com/blog/everything-youd-like-to-know-about-visionos-development/ - Everything about visionOS Development
  18. https://www.pushsquare.com/news/2022/09/developers-should-have-an-easier-time-porting-their-games-to-psvr2 - Developers Should Have an Easier Time Porting Games to PSVR2
  19. https://gamerant.com/playstation-vr2-update-new-feature-hand-tracking/ - PlayStation VR2 SDK Update Adds Hand Tracking
  20. https://developer.picoxr.com/ - PICO Developer - Official Developer Portal
  21. https://developer.apple.com/augmented-reality/arkit/ - ARKit - Augmented Reality - Apple Developer
  22. https://developers.google.com/ar - Build new augmented reality experiences - Google AR
  23. https://docs.unity3d.com/Packages/[email protected]/manual/index.html - XR Interaction Toolkit - Unity Documentation
  24. https://animost.com/ideas-inspirations/unity-vs-unreal-engine-for-xr-development/ - Unity vs. Unreal Engine for XR Development
  25. https://daily.dev/blog/unity-vs-unreal-engine-for-vrar-development - Unity vs Unreal Engine for VR/AR Development
  26. https://www.webaroo.us/insights/building-ar-vr/ - Building for AR/VR: Unity vs. Unreal Engine
  27. https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Unity's OpenXR Plugin 1.14 Feature Parity
  28. https://www.unrealengine.com/en-US/news/unreal-engine-5-6-is-now-available - Unreal Engine 5.6 Release Announcement
  29. https://www.reddit.com/r/virtualreality/comments/z5i23c/unity_or_unreal_for_vr_dev/ - Unity or Unreal for VR dev? - Reddit Discussion
  30. https://dev.epicgames.com/documentation/en-us/unreal-engine/unreal-engine-5-5-release-notes - Unreal Engine 5.5 Release Notes
  31. https://godotengine.org/article/godot-xr-update-feb-2025/ - Godot XR Update February 2025
  32. https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - Advancing OpenXR Development in Godot
  33. https://developers.meta.com/horizon/documentation/unity/unity-project-setup/ - Set Up Your Unity Project for Meta Quest Development
  34. https://developers.meta.com/horizon/documentation/unity/unity-core-sdk/ - Meta XR Core SDK for Unity
  35. https://developers.meta.com/horizon/documentation/unity/unity-best-practices-intro/ - Best Practices for Unity
  36. https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Interaction SDK
  37. https://developers.meta.com/horizon/documentation/unity/meta-xr-audio-sdk-unity/ - Meta XR Audio SDK Overview
  38. https://developers.meta.com/horizon/downloads/package/meta-xr-platform-sdk/ - Meta XR Platform SDK
  39. https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657 - Meta XR All-in-One SDK
  40. https://docs.unity3d.com/Packages/[email protected]/ - XR Interaction Toolkit 2.0.0
  41. https://learn.unity.com/tutorial/using-interactors-and-interactables-with-the-xr-interaction-toolkit - Using Interactors and Interactables with the XR Interaction Toolkit
  42. https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@latest/ - XR Interaction Toolkit
  43. https://developers.meta.com/horizon/documentation/unreal/unreal-create-and-configure-new-project/ - Create and Configure a New Project in Unreal Engine
  44. https://www.vrwiki.cs.brown.edu/vr-development-software/unreal-engine-5/adding-vr-to-an-existing-ue5-world - Adding VR to an Existing UE5 World
  45. https://dev.epicgames.com/documentation/en-us/unreal-engine/xr-development?application_version=4.27 - XR Development - Unreal Engine Documentation
  46. https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-template-in-unreal-engine - VR Template in Unreal Engine
  47. https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?
  48. https://spurqlabs.com/vr-testing-challenges-best-practices-what-developers-should-know/ - VR Testing Challenges & Best Practices
  49. https://faculty.washington.edu/wobbrock/pubs/assets-23.02.pdf - Which VR Locomotion Techniques are Most Accessible?
  50. https://xrdesignhandbook.com/docs/Meta/Locomotion%20Best%20Practices.html - Locomotion Best Practices
  51. https://developers.meta.com/horizon/blog/now-available-vr-locomotion-design-guide/ - VR Locomotion Design Guide
  52. https://developers.meta.com/horizon/design/interactions-input-modalities/ - Interactions and Input Modalities
  53. https://developers.meta.com/horizon/design/hands - Hands - Meta for Developers
  54. https://developers.meta.com/horizon/design/mr-design-guideline/ - Mixed Reality Design Guidelines
  55. https://github.com/authorTom/notes-on-VR-performance - Notes on VR Performance
  56. https://developers.meta.com/horizon/documentation/unity/unity-perf/ - Performance and Profiling for Unity
  57. https://developers.meta.com/horizon/documentation/native/pc/dg-performance-guidelines/ - Performance Guidelines
  58. https://developers.meta.com/horizon/documentation/unity/ts-mqdh/ - Meta Quest Developer Hub
  59. https://developers.meta.com/horizon/blog/down-the-rabbit-hole-w-oculus-quest-developer-best-practices-the-store/ - Developer Best Practices
  60. https://medium.com/@lemapp09/beginning-game-development-vr-performance-optimization-78553530ca83 - VR Performance Optimization
  61. https://developers.meta.com/horizon/blog/asynchronous-timewarp-on-oculus-rift/ - Asynchronous Timewarp on Oculus Rift
  62. https://developers.meta.com/horizon/documentation/native/android/mobile-timewarp-overview/ - Asynchronous TimeWarp (ATW)
  63. https://xinreality.com/wiki/Timewarp - Timewarp
  64. https://developers.meta.com/horizon/blog/asynchronous-spacewarp/ - Asynchronous Spacewarp
  65. https://developers.meta.com/horizon/documentation/native/pc/asynchronous-spacewarp/ - Asynchronous SpaceWarp
  66. https://www.reddit.com/r/oculus/comments/bvcoh8/always_disable_asynchronous_spacewarp/ - Always disable Asynchronous Spacewarp
  67. https://steamcommunity.com/games/250820/announcements/detail/1705071932992003492 - Introducing SteamVR Motion Smoothing
  68. https://forums.flightsimulator.com/t/motion-reprojection-explained/548659 - Motion Reprojection Explained
  69. https://www.meta.com/experiences/ - Official Meta Quest Store
  70. https://developers.meta.com/ - Meta for Developers
  71. https://developers.meta.com/horizon/resources/publish-submit - Submitting your app
  72. https://www.steamvr.com/ - SteamVR
  73. https://store.steampowered.com/app/250820/SteamVR/ - SteamVR on Steam
  74. https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647 - SteamVR Plugin
  75. https://www.meta.com/experiences/steam-link/5841245619310585/ - Steam Link on Meta Quest Store
  76. https://www.playstation.com/en-us/ps-vr2/games/ - PlayStation VR2 games official page
  77. https://developer.apple.com/visionos/ - visionOS developer page
  78. https://www.picoxr.com/global - PICO Global official website
  79. https://developer.picoxr.com/developer-program/?enter_from=picoweb - PICO Developer Program
  80. https://sidequestvr.com/setup-howto - Get SideQuest
  81. https://sidequestvr.com/ - SideQuest
  82. https://sdk.picovr.com/docs/FAQ/chapter_twentyeight.html - Developer Platform FAQ
  83. https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - Fab launched October 2024
  84. https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - In 2025, 1,500 Megascans will be free to all users
  85. https://www.cgtrader.com/ - CGTrader 3D Models Marketplace
  86. https://www.turbosquid.com/ - TurboSquid professional 3D models marketplace
  87. https://www.yobi3d.com - Yobi3D 3D model search engine
  88. https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - OpenXR provides unified API for VR and AR across platforms