
Developer Resource

From VR & AR Wiki
Revision as of 22:51, 13 October 2025 by Xinreality (talk | contribs)


Overview

This article provides comprehensive information about resources for VR and AR development as of 2025, including SDKs, game engines, runtimes, app stores, and development tools for modern virtual reality and augmented reality platforms. The XR development landscape has evolved significantly, with OpenXR emerging as the industry-standard cross-platform API, supported by all major hardware manufacturers and game engines.

The XR Development Landscape

Extended Reality (XR) encompasses Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), representing a paradigm shift in how users interact with digital content. For developers, this landscape is defined by a rich ecosystem of game engines, Software Development Kits (SDKs), and cross-platform standards designed to build immersive experiences.[1]

At the heart of modern XR development are powerful game engines, primarily Unity and Unreal Engine, which provide the rendering, physics, and scripting environments necessary to create interactive 3D worlds.[2] These engines are complemented by platform-specific SDKs that unlock the unique hardware features of devices like the Meta Quest.[3]

Underpinning this entire ecosystem is the OpenXR standard, a crucial initiative by the Khronos Group to create a unified API that allows developers to write code that can run across multiple hardware platforms with minimal modification, thereby reducing fragmentation and simplifying the development process.[4]

How to Get Started

See also: How to Get Started in VR Development

Getting started with XR development in 2025 requires choosing a target platform and appropriate development tools. Modern VR development primarily uses OpenXR as a cross-platform standard, with platform-specific SDKs providing additional features.

Beginner Path

For beginners, the recommended path is to:

  1. Choose a game engine (Unity, Unreal Engine, or Godot)
  2. Select target VR headsets (e.g., Meta Quest 3, Meta Quest 3S, Apple Vision Pro, PSVR2)
  3. Set up the appropriate SDK and development environment
  4. Study platform-specific guidelines and best practices
  5. Build a simple app and iterate

Development Environment Setup

1. Select a Development Environment: Use a game engine such as Unity or Unreal Engine, both of which support XR out of the box. Download the latest versions: Unity 6.x or Unreal Engine 5.4+.

2. Install SDKs: For Meta devices, use the Meta XR SDK; for Apple devices, the visionOS SDK. Adopt OpenXR for cross-platform compatibility.

3. Set Up Hardware: Ensure your development PC meets requirements (e.g., high-end GPU for tethered VR). For standalone devices, develop directly on the headset or use simulators like Meta XR Simulator.

4. Learn Basics: Study motion tracking, anchors, environmental understanding, and depth APIs. Resources include official documentation from Meta, Apple, and Khronos Group.

5. Test and Iterate: Use tools like Meta XR Simulator for testing without hardware. Optimize for performance, considering battery life and refresh rates.

Essential Concepts

  • Understand 6DOF (six degrees of freedom) motion detection for immersive experiences
  • Focus on user comfort to avoid motion sickness, using techniques like asynchronous timewarp or reprojection
  • Integrate haptics and controllers for better interaction
  • For AR, leverage location anchors and scene semantics for real-world integration

Modern SDKs and Development Kits

SDKs (Software Development Kits) are essential tools for building VR and AR applications. As of 2025, the XR development landscape has evolved significantly since the early Oculus Rift and HTC Vive era.

OpenXR

OpenXR is an open, royalty-free standard developed by the Khronos Group that provides a unified API for VR and AR platforms.[5] OpenXR eliminates the need for multiple proprietary APIs, allowing developers to create applications that work across different VR headsets and AR devices.

As of 2025, Meta recommends OpenXR as the primary development path starting with SDK version 74.[6] OpenXR support is built into major game engines and provides cross-platform compatibility.

Core Architecture and Concepts

The OpenXR API provides a set of abstractions for developers to interact with the XR system:[7]

  • XrInstance: Represents the connection between the application and the OpenXR runtime. It is the first object created.
  • XrSystem: Represents the set of XR devices, including the headset and controllers.
  • XrSession: Manages the interaction session between the application and the user. It controls the application's lifecycle, such as when it should render frames.
  • XrSpace: Defines a 3D coordinate system. This is used to track the position and orientation of the headset, controllers, and other tracked objects.
  • XrActions: Provides a high-level, action-based input system. Instead of querying for "button A press," a developer defines an action like "Jump" or "Grab" and maps it to different physical inputs on various controllers.
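The action-based input model can be sketched in miniature. This is a hedged illustration in Python, not the real OpenXR C API; the class and method names are hypothetical stand-ins for the spec's action-set creation and suggested-binding calls, though the input paths shown follow OpenXR's path conventions:

```python
# Illustrative sketch of OpenXR's action-based input model: the app
# defines abstract actions ("jump", "grab") and suggests per-profile
# bindings, so gameplay code never queries a specific physical button.

class ActionSet:
    def __init__(self, name):
        self.name = name
        self.bindings = {}  # action -> {interaction profile: physical input path}

    def suggest_binding(self, action, profile, physical_input):
        self.bindings.setdefault(action, {})[profile] = physical_input

    def resolve(self, action, profile):
        # The runtime picks the binding matching the connected hardware.
        return self.bindings[action].get(profile)

gameplay = ActionSet("gameplay")
gameplay.suggest_binding("jump", "oculus_touch", "/user/hand/right/input/a/click")
gameplay.suggest_binding("jump", "valve_index", "/user/hand/right/input/a/click")
gameplay.suggest_binding("grab", "oculus_touch", "/user/hand/right/input/squeeze/value")
gameplay.suggest_binding("grab", "valve_index", "/user/hand/right/input/squeeze/value")

# Gameplay code asks only about "jump"; the mapping is hardware-specific.
print(gameplay.resolve("jump", "oculus_touch"))  # /user/hand/right/input/a/click
```

The payoff of this indirection is that supporting a new controller means adding bindings, not rewriting game logic.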

Key OpenXR Features

Industry Adoption

OpenXR has achieved widespread industry adoption. All major hardware and software platforms provide conformant OpenXR runtimes, including:[8][9]

OpenXR vs. OpenVR

A common point of confusion is the distinction between OpenXR and OpenVR:[10][11]

  • OpenVR is an older API developed by Valve. It was the primary API for the SteamVR platform and was "open" in the sense that other hardware could create drivers to be compatible with it. However, its development was ultimately controlled by Valve.
  • OpenXR is the modern, multi-company standard managed by the Khronos Group. It is the successor to proprietary APIs like OpenVR and the old Oculus SDK.

Today, the SteamVR platform itself is an OpenXR runtime. This means it can run applications built with the OpenXR API, while also maintaining backward compatibility with older applications built with the OpenVR API.[12]

Cross-Platform Considerations

While OpenXR has successfully standardized the foundational layer of XR development, its "write once, run anywhere" promise is not absolute. For standalone Android-based headsets, developers often still need to create separate application packages (APKs) for different platforms like Meta Quest and Pico, as these platforms may use their own vendor-specific OpenXR loaders.[13]

Furthermore, innovative platform-specific features—such as Meta's advanced Passthrough capabilities, hand tracking enhancements, or spatial anchors—are exposed to developers through vendor-specific extensions to the OpenXR standard.[14] To leverage these powerful features, a developer must write code that specifically targets that vendor's hardware.
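A sketch of how an application might negotiate such vendor extensions at startup: request what you want, enable only what the runtime reports, and gate features on the result. The extension identifiers below are real OpenXR extension names, but the helper function itself is illustrative, not part of any SDK:

```python
# Hedged sketch of vendor-extension negotiation at startup.

def negotiate_extensions(requested, runtime_supported):
    """Return (enabled, missing) given the runtime's supported set."""
    enabled = [ext for ext in requested if ext in runtime_supported]
    missing = [ext for ext in requested if ext not in runtime_supported]
    return enabled, missing

# A Quest runtime might report Meta's passthrough extension; a generic
# PC runtime might not. The supported set would come from enumerating
# the runtime's extension properties.
quest_runtime = {"XR_FB_passthrough", "XR_EXT_hand_tracking",
                 "XR_KHR_composition_layer_depth"}
requested = ["XR_FB_passthrough", "XR_EXT_hand_tracking"]

enabled, missing = negotiate_extensions(requested, quest_runtime)
use_passthrough = "XR_FB_passthrough" in enabled  # feature gate
print(enabled, missing)
```

Structuring the app around such feature gates keeps a single codebase portable while still exploiting vendor hardware where it is available.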

Meta Quest SDK

The Meta Quest SDK (formerly Oculus SDK) supports Meta Quest 2, Meta Quest 3, Meta Quest 3S, and Meta Quest Pro headsets. As of March 2025, Meta transitioned to recommending OpenXR as the primary development approach, while still providing the Meta XR Core SDK for Horizon OS-specific features.[5]

Supported Platforms

Key Features

Apple Vision Pro SDK (visionOS)

The visionOS SDK enables development for Apple Vision Pro, Apple's spatial computing platform launched in February 2024.[16]

Development Tools

Key Features

Requirements

  • Mac with Apple Silicon (M1 or later)[17]
  • macOS Monterey or later
  • Xcode 15.2 or later
  • For Unity: Unity 2022 LTS (2022.3.19f1 or newer), Apple Silicon version only

PlayStation VR2 SDK

PSVR2 development uses the same SDK as PlayStation 5, making porting between platforms more straightforward.[18]

Supported Features

Pico SDK

Pico SDK, developed by ByteDance's Pico division, supports the Pico 4, Pico 4 Pro, and Pico 4 Ultra headsets available in Europe and Asia.[20]

Supported Platforms

Key Features

AR-Specific SDKs

ARKit

ARKit is Apple's augmented reality framework for iOS; the current major version is ARKit 6. Features include:[21]

  • 4K video capture
  • Depth API with LiDAR
  • Motion capture
  • Scene geometry
  • People occlusion

ARCore

ARCore is Google's AR SDK for Android. It supports:[22]

  • Motion tracking
  • Anchors
  • Environmental understanding
  • Depth API
  • Geospatial API
  • Scene semantics

Game Engines

Choosing a Game Engine: Unity vs. Unreal

The choice of a game engine is one of the most fundamental decisions in XR development. Unity and Unreal Engine are the two dominant forces in the industry, each with a distinct set of strengths, weaknesses, and development philosophies.

Unity

Unity remains one of the most popular engines for VR development. As of 2025, Unity provides native OpenXR support through its OpenXR plugin, with the XR Interaction Toolkit supplying a high-level interaction layer on top.[23]

Core Strengths

Unity's core is built around the C# programming language, which is widely regarded as having a gentler learning curve than C++.[24] This accessibility, combined with a user-friendly interface, makes it an attractive option for developers of all experience levels, from indie creators to large studios.[25]

Current Version

Unity 6 (2025)

Asset Ecosystem

A significant accelerator for Unity development is the Unity Asset Store. It offers a vast library of pre-built tools, 3D models, scripts, and plugins that can dramatically reduce development time and cost.[26] This rich ecosystem allows teams to prototype rapidly and bring products to market faster.

VR Features
Key Packages
Package | Purpose | Current Version
XR Interaction Toolkit | High-level interaction system | 3.1.2
OpenXR Plugin | OpenXR runtime support | 1.14+
XR Plugin Management | XR backend management | Latest
AR Foundation | AR development framework | Latest
Meta XR SDK | Meta Quest-specific features | Latest

Unreal Engine

Unreal Engine 5 provides robust VR development capabilities with built-in OpenXR support. As of 2025, Unreal Engine 5.6 is the current major version.[28]

Core Strengths

Unreal Engine's primary advantage is its advanced rendering engine, which delivers exceptional visual quality with minimal setup. Features like Lumen for dynamic global illumination and Nanite for virtualized micropolygon geometry enable developers to create incredibly realistic and detailed worlds, making it a powerhouse for PCVR and high-end simulations.[29]

Current Versions
Scripting Model

Unreal Engine employs a dual-language development model. For maximum performance and low-level system control, developers use C++.[24] Complementing this is the Blueprints system, a powerful node-based interface that allows designers, artists, and programmers to build complex game logic without writing traditional code.[26]

VR Features
VR Development Tools

Godot

Godot Engine has significantly improved XR support with native OpenXR integration in Godot 4.[31] Meta has sponsored improvements to Godot's OpenXR support.[32]

Current Version

Godot 4.3 (2025)

XR Features
Supported Platforms

Comparative Analysis

Unity vs. Unreal Engine for XR Development
Feature | Unity | Unreal Engine
Primary Scripting | C#[24] | C++[24]
Visual Scripting | Bolt (Visual Scripting package)[24] | Blueprints (deeply integrated)[26]
Learning Curve | Gentler, user-friendly interface[25] | Steeper, especially for its custom C++ framework[26]
Graphical Fidelity (Out-of-the-box) | Good, but often requires configuration to achieve high-end results[24] | Excellent, industry-leading visuals with Lumen and Nanite[29]
Asset Ecosystem | Extensive (Unity Asset Store), a major strength for rapid development[26] | Growing, but smaller than Unity's
Community Size | Larger, more beginner-friendly resources[24] | Smaller but strong, particularly for high-end development[24]
Primary Target Platform (XR) | Strongest in standalone and mobile VR (e.g., Meta Quest)[29] | Strongest in PCVR and high-fidelity simulations[29]
Ideal Use Cases | Indie games, mobile/standalone VR/AR, rapid prototyping, projects prioritizing speed-to-market | AAA games, architectural visualization, cinematic experiences, enterprise simulations requiring photorealism[26]

Developing with Unity

Setting up a Unity project for XR development involves a series of specific configuration steps to enable communication with XR hardware and import the necessary SDKs.

Project Setup for XR

Prerequisites

Before creating a project, developers must use the Unity Hub to install a supported version of the Unity Editor (e.g., 2022.3 LTS or newer). During installation, it is crucial to include the Android Build Support module, as this is a requirement for developing applications for Android-based standalone headsets like the Meta Quest.[33]

Creating a Project and Configuring the Build Platform

A new project should be created using the 3D (URP) template. The Universal Render Pipeline provides a modern, performant rendering foundation suitable for the majority of XR applications.[33] Once the project is created, the first step is to set the target build platform. This is done by navigating to `File > Build Settings`. For Meta Quest development, select Meta Quest (in Unity 6.1 and later) or Android (in older versions) and click the "Switch Platform" button.[33]

XR Plug-in Management

Unity communicates with XR runtimes through its XR Plugin Management system. This package must be installed from the Package Manager if it is not already present.

  1. Navigate to `Edit > Project Settings`.
  2. Select the `XR Plug-in Management` tab.
  3. In both the Standalone (PC icon) and Android tabs, check the box for OpenXR. This tells Unity to use the OpenXR API to interface with the headset's runtime.[33]

This step is critical as it enables the core connection between the engine and the XR hardware.

The Meta XR SDK for Unity

For developers targeting Meta Quest devices, the Meta XR SDK is essential. It provides access to the full suite of the platform's features.

Installation and Core Components

The primary package is the Meta XR Core SDK, which is installed from the Unity Package Manager via the `Window > Package Manager` interface.[34] This SDK includes several key components:

  • OVRCameraRig: A pre-configured camera prefab that serves as the XR rig. It replaces the standard Unity camera and automatically handles head and controller tracking, mapping the user's physical movements into the virtual scene.[3]
  • OVRInput: A robust API for handling input from the Touch controllers and hand tracking.[34]
  • Project Setup Tool: A utility that analyzes the project for common configuration errors and provides one-click fixes to apply recommended settings.[34]

Project Setup Tool

After importing the Core SDK, developers should immediately run the Project Setup Tool by navigating to `Meta XR > Tools > Project Setup Tool`. This tool checks for dozens of required settings related to graphics, physics, and build configurations. Clicking the Fix All and Apply All buttons will automatically configure the project according to Meta's best practices.[33][35]

Other Meta SDKs

The Meta ecosystem is composed of several specialized SDKs that build upon the Core SDK:

  • Meta XR Interaction SDK: A high-level framework for creating natural and robust interactions like grabbing, poking, and interacting with UI using both controllers and hands.[36]
  • Meta XR Audio SDK: Provides advanced spatial audio features, including HRTF-based spatialization and room acoustics simulation.[37]
  • Meta XR Platform SDK: Enables integration with Meta's platform services, such as leaderboards, achievements, user profiles, and multiplayer matchmaking.[38]

The Meta XR All-in-One SDK is available on the Asset Store as a convenient package to manage these various SDKs.[39]

Unity's XR Interaction Toolkit (XRI)

The XR Interaction Toolkit (XRI) is Unity's own high-level, component-based framework for building XR interactions. It is designed to be flexible and extensible, providing a solid foundation for VR and AR projects.[40]

Core Concepts

XRI's architecture is built around a few key concepts:

  • Interaction Manager: A singleton component that acts as the central hub, mediating all interactions between Interactors and Interactables in a scene.[40]
  • Interactors: These are components that initiate actions. They represent the user's hands or controllers. Common types include:
    • `XR Ray Interactor`: For pointing at and selecting objects from a distance.[41]
    • `XR Direct Interactor`: For directly touching and grabbing objects that are within arm's reach.[41]
    • `XR Socket Interactor`: A static interactor that objects can be snapped into, useful for puzzles or placing items in specific locations.[41]
  • Interactables: These are components placed on objects in the scene that can be acted upon by Interactors. The most common is the `XR Grab Interactable`, which allows an object to be picked up, held, and thrown.[41]
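The mediation pattern above can be sketched with simplified stand-in classes. This is illustrative Python, not Unity's actual C# types; the real XRI components carry far more state (hover lists, attach transforms, events):

```python
# Minimal model of XRI-style mediation: a central manager decides
# whether an Interactor may select an Interactable.

class GrabInteractable:
    def __init__(self, name):
        self.name = name
        self.held_by = None  # which interactor currently holds it

class DirectInteractor:
    def __init__(self, name, reach=0.5):
        self.name = name
        self.reach = reach  # meters; direct interactors work at arm's reach

class InteractionManager:
    """Central hub: mediates select requests between the two sides."""
    def select(self, interactor, interactable, distance):
        # Reject if the object is already held or out of reach.
        if interactable.held_by is None and distance <= interactor.reach:
            interactable.held_by = interactor
            return True
        return False

manager = InteractionManager()
hand = DirectInteractor("right_hand")
cube = GrabInteractable("cube")
print(manager.select(hand, cube, distance=0.3))  # True: in reach, not held
```

The design point this mirrors is that Interactors and Interactables never talk to each other directly; routing everything through one manager makes rules like "only one holder at a time" enforceable in a single place.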

Locomotion and UI

XRI includes a complete locomotion system that can be added to the XR Origin (the main camera rig). This includes a Teleportation Provider for point-and-click movement and a Continuous Move Provider for smooth, joystick-based locomotion. It also provides tools for interacting with world-space UI canvases, using components like the `Tracked Device Graphic Raycaster` to allow ray interactors to click on buttons and other UI elements.[42]

Developing with Unreal Engine

Unreal Engine offers a powerful, high-fidelity platform for XR development, with a streamlined setup process centered around its VR Template and OpenXR integration.

Project Setup for XR

Prerequisites

Developers must first install Unreal Engine via the Epic Games Launcher. For development targeting standalone Android headsets like the Meta Quest, it is also necessary to install and configure Android Studio and the required Android SDK and NDK components.[43]

Creating a Project and Plugin Configuration

The recommended way to start a new XR project is by using the built-in Virtual Reality template. This can be selected from the "Games" category in the Unreal Project Browser when creating a new project.[43] This template provides a pre-configured project with essential plugins enabled, a basic level, and a functional player pawn with locomotion and interaction systems.[44]

After creation, developers should verify that the necessary plugins are enabled by navigating to `Edit > Plugins`. The most important plugin is OpenXR. Platform-specific plugins like OculusVR (for Meta devices) or SteamVR should also be enabled depending on the target hardware.[45]

The Meta XR Plugin for Unreal

The Meta XR Plugin (often referred to as the OculusVR plugin) is the key to unlocking the full feature set of Meta Quest devices in Unreal Engine. It provides access to platform-specific functionalities not covered by the core OpenXR standard.[43] This includes:

  • Advanced tracking features like Hand Tracking, Body Tracking, and Face/Eye Tracking
  • Mixed Reality features such as Passthrough, Spatial Anchors, and Scene understanding
  • Performance optimization tools like Application SpaceWarp and Fixed Foveated Rendering
  • Platform services integration for leaderboards, achievements, and parties
  • The Movement SDK for more realistic avatar motion and the Voice SDK for voice commands and dictation

Leveraging the VR Template

The standard VR Template is a powerful starting point that encapsulates logic for many common VR features.[46]

Core Components

The template is built around a few key assets:

  • VRPawn: This Blueprint is the user's representation in the virtual world. It contains the camera component, motion controller components for tracking the hands, and all the logic for handling input and movement.[46][44]
  • VRGameMode: This object defines the rules for the level, most importantly specifying that the `VRPawn` should be the default pawn class for the player.[46]

Locomotion Systems

The template includes two primary, comfort-oriented locomotion methods:[46]

  • Teleportation: The user can aim with one controller, which displays an arc and a target location. Releasing the thumbstick instantly moves the pawn to that location. This system relies on a Nav Mesh Bounds Volume placed in the level.
  • Snap Turn: Using the other controller's thumbstick, the user can rotate their view in discrete increments (e.g., 45 degrees). This avoids the smooth rotation that can cause motion sickness for some users.
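Snap turning reduces to simple modular yaw arithmetic; a minimal sketch, using the 45-degree increment mentioned above (the function is illustrative, not an Unreal Blueprint node):

```python
# Snap turn: rotate the pawn's yaw in discrete increments rather than
# smoothly, keeping the result wrapped to [0, 360).

def snap_turn(yaw_degrees, direction, increment=45.0):
    """direction is +1 (turn right) or -1 (turn left)."""
    return (yaw_degrees + direction * increment) % 360.0

yaw = 0.0
yaw = snap_turn(yaw, +1)   # 45.0
yaw = snap_turn(yaw, +1)   # 90.0
yaw = snap_turn(yaw, -1)   # back to 45.0
print(yaw)
```

Because the view changes in one frame, there is no intermediate optic flow for the vestibular system to disagree with, which is why the technique is comfortable.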

Runtimes

Runtimes handle core VR functionality including rendering, tracking, and reprojection.

OpenXR Runtime

OpenXR provides a unified runtime layer that works across multiple VR platforms. As of 2025, most major VR headsets ship with a conformant OpenXR runtime.[47]

Supported by:

SteamVR / OpenVR

SteamVR supports both OpenVR (legacy) and OpenXR runtimes. OpenVR was developed by Valve and has been the default runtime for SteamVR, though OpenXR is increasingly recommended.[10]

Current State (2025)

Features

Meta Quest Runtime

The Meta Quest Runtime handles VR rendering and tracking for Meta Quest devices, now with integrated OpenXR support.

Features

Essential Best Practices for XR Development

Beyond the specifics of any single engine, creating a successful XR application requires adherence to a set of universal principles focused on user comfort, intuitive interaction, and technical performance.

User Experience and Comfort

The primary goal of XR design is to create a sense of presence and immersion. This can be easily broken by experiences that are uncomfortable or unintuitive.

Locomotion Design

Movement in VR is the single greatest challenge for user comfort. The sensory conflict between seeing motion in the headset while the body's vestibular system reports being stationary can lead to Visually induced motion sickness (VIMS), also known as VR sickness.[48]

Comfort-First Techniques
  • Teleportation: This method instantly moves the user from one point to another, completely avoiding the continuous visual flow that causes motion sickness. It is consistently the most comfortable and widely preferred option.[49][50]
  • Snap Turns: Replace smooth, continuous rotation with discrete, instantaneous jumps in orientation. This is significantly more comfortable for many users than smooth turning.[48]
Continuous Locomotion and Comfort Options

While more immersive for some, smooth joystick-based movement is the most likely to cause discomfort. If offered, it should always be an option, not a requirement, and should be accompanied by comfort settings. A common technique is vignetting (also called tunneling), which narrows the user's field of view during movement, reducing peripheral optic flow.[51]
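A minimal sketch of the vignetting idea: the vignette strength ramps with locomotion speed, narrowing the effective field of view while moving and relaxing when the user stops. The speed thresholds here are illustrative example values:

```python
# Comfort vignette: 0.0 = no narrowing, 1.0 = fully narrowed FOV,
# linear ramp between two (illustrative) speed thresholds in m/s.

def vignette_strength(speed, min_speed=0.5, max_speed=3.0):
    if speed <= min_speed:
        return 0.0
    if speed >= max_speed:
        return 1.0
    return (speed - min_speed) / (max_speed - min_speed)

print(vignette_strength(0.0))   # 0.0 while standing still
print(vignette_strength(1.75))  # 0.5 at mid speed
print(vignette_strength(5.0))   # 1.0 at full speed
```

In an engine, this strength value would typically drive a full-screen shader that darkens the periphery, reducing the optic flow that triggers discomfort.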

Environmental Design

The design of the virtual world itself can impact comfort. Developers should:

  • Avoid forcing movement on steep slopes or stairs
  • Keep walls and large objects at a comfortable distance to reduce optic flow
  • Consider using an Independent Visual Background (IVB), such as a static skybox that only moves with head rotation[50]

Interaction Design

Intuitive interaction is key to maintaining presence. The choice of input modality has a significant impact on the user experience.

Controllers vs. Hand Tracking
  • Controllers offer high precision, tactile feedback through buttons and triggers, and haptic feedback, making them ideal for gaming, creative tools, and any application requiring reliable and precise input.[52]
  • Hand tracking uses the headset's cameras to track the user's bare hands, offering a highly natural and intuitive interaction method. However, its performance can be affected by poor lighting, fast movements, and occlusion. It is generally less precise than controllers and is best suited for more casual experiences.[53]
Multimodal Input

The best practice is often to support multiple input modalities. Allowing a user to seamlessly switch between controllers, hands, and even voice commands provides maximum accessibility.[53]

UI/UX in 3D Space

Designing User Interfaces (UI) for a 3D, immersive space presents unique challenges.

Placement and Comfort

UI panels should be placed at a comfortable viewing distance, typically around 1-2 meters from the user, and positioned slightly below their natural line of sight to avoid neck strain.[54]
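The placement guidance above can be expressed as simple vector math; a sketch assuming a 1.5 m distance and a 10-degree downward offset, both example values within the ranges just discussed:

```python
import math

# Place a world-space UI panel a comfortable distance along the user's
# forward gaze, dropped slightly below eye level to avoid neck strain.
# Y is up; yaw 0 looks down +Z. Distance and depression are examples.

def place_panel(head_pos, head_yaw_deg, distance=1.5, depression_deg=10.0):
    """Return an (x, y, z) panel position in front of and below the head."""
    yaw = math.radians(head_yaw_deg)
    x = head_pos[0] + distance * math.sin(yaw)
    z = head_pos[2] + distance * math.cos(yaw)
    y = head_pos[1] - distance * math.tan(math.radians(depression_deg))
    return (x, y, z)

# Head at 1.7 m looking down +Z: panel lands 1.5 m ahead, ~0.26 m lower.
print(place_panel((0.0, 1.7, 0.0), 0.0))
```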

World-Locking vs. Head-Locking

A critical rule is to avoid locking UI elements directly to the user's head (a "head-locked" HUD). This is extremely uncomfortable and can cause motion sickness. Instead, UI should be "world-locked" (anchored to a position in the environment) or, if it must follow the user, it should do so with a gentle, smoothed animation.[54]
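The "gentle, smoothed animation" alternative can be sketched as per-frame exponential smoothing; the 0.1 factor is an illustrative choice, and real implementations usually make it frame-rate independent:

```python
# Lazy follow: each frame the panel moves a fraction of the remaining
# distance toward its target pose, avoiding the rigid motion of a
# head-locked HUD.

def smooth_follow(current, target, smoothing=0.1):
    """Move `current` 10% of the remaining distance toward `target`."""
    return tuple(c + (t - c) * smoothing for c, t in zip(current, target))

panel = (0.0, 0.0, 0.0)
target = (1.0, 0.0, 0.0)
for _ in range(3):
    panel = smooth_follow(panel, target)
print(panel)  # x converges toward 1.0 without snapping: ~0.271 after 3 frames
```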

Feedback

Clear and immediate feedback is essential. Interactive elements should have distinct hover states (e.g., highlighting or scaling up), and actions like button presses should be accompanied by both visual and auditory cues.[54]

Technical Performance and Optimization

Maintaining a consistently high and stable frame rate is the most important technical requirement for a comfortable VR experience.

Understanding VR Performance Metrics

  • Framerate vs. Frame Time: Framerate, measured in Frames Per Second (FPS), is the average number of frames rendered over a second. Frame Time, measured in milliseconds (ms), is the time it takes to render a single frame. Frame time is a more accurate indicator of performance smoothness.[55]
  • Performance Targets: For standalone headsets like the Meta Quest series, applications must achieve a minimum of 72 FPS. For PCVR, the target is typically 90 FPS or higher.[56]
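The framerate/frame-time relationship above is just frame time (ms) = 1000 / FPS, which turns each target framerate into a hard per-frame budget:

```python
# Convert a target framerate into the per-frame time budget and check
# whether a measured frame fits inside it.

def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

def meets_target(frame_time_ms, target_fps):
    """A frame is on time only if it fits inside the budget."""
    return frame_time_ms <= frame_budget_ms(target_fps)

print(round(frame_budget_ms(72), 2))  # 13.89 ms at 72 FPS (standalone)
print(round(frame_budget_ms(90), 2))  # 11.11 ms at 90 FPS (typical PCVR)
print(meets_target(12.0, 72))         # True
print(meets_target(12.0, 90))         # False: misses the 11.11 ms budget
```

This is why frame time is the better metric: an average of 90 FPS can still hide individual frames that blow the 11.11 ms budget and trigger reprojection.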

Profiling and Debugging

The first step in optimization is identifying the bottleneck. Performance issues are typically either CPU-bound (caused by complex game logic, physics, or too many draw calls) or GPU-bound (caused by high resolution, complex shaders, or too much geometry).[57]

Essential profiling tools include:

Core Optimization Strategies

  • Reduce Draw Calls: Combine multiple textures into a single texture atlas and use static batching to draw multiple similar objects in a single call. For standalone Quest, developers should aim for 50-200 draw calls per frame.[56][59]
  • Optimize Geometry: Keep polygon counts within the budget for the target platform (e.g., 750k-1M triangles for Quest 2). Use Level of Detail (LOD) systems.[56]
  • Simplify Lighting and Shaders: Use baked lighting whenever possible. Avoid complex, multi-pass shaders and full-screen post-processing effects on mobile hardware.[35]
  • Use Occlusion Culling: This prevents the engine from rendering objects that are completely hidden from the camera's view.[60]
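One of the geometry strategies above, distance-based LOD selection, can be sketched as a threshold lookup; the distance cutoffs here are illustrative, not engine defaults:

```python
# Distance-based Level of Detail: swap to cheaper meshes as objects
# recede from the camera. Thresholds are illustrative, in meters.

def select_lod(distance, thresholds=(5.0, 15.0, 40.0)):
    """Return LOD index 0 (full detail) .. 3 (cheapest) by distance."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)  # beyond the last threshold: lowest detail

print(select_lod(2.0))    # 0: full-detail mesh up close
print(select_lod(20.0))   # 2: mid-detail at range
print(select_lod(100.0))  # 3: cheapest mesh far away
```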

Understanding Reprojection Technologies

When an application fails to render a new frame in time for the headset's display refresh, VR platforms employ asynchronous reprojection techniques. These systems act as a safety net, but they are not a substitute for good performance.

Key Reprojection Technologies Explained
Technology | Developer | Mechanism | Corrects For | Common Artifacts
Asynchronous Timewarp (ATW) | Meta | Takes the last successfully rendered frame and re-projects it based on the newest head rotation data.[61][62] | Head rotation only; does not account for positional movement or in-scene animation.[63] | Positional judder, animation judder.[64]
Asynchronous SpaceWarp (ASW) | Meta | When an app's framerate drops to half the display's refresh rate, ASW analyzes the motion between two previous frames to synthesize and insert a new, predicted frame.[65] | Head rotation, head position, controller movement, and in-scene animation.[64] | Warping, ghosting, or "bubbling" artifacts, especially around fast-moving objects.[64][66]
Motion Smoothing | Valve | Similar to ASW, it activates when an application cannot maintain framerate; it looks at the last two frames to estimate motion and extrapolates a new frame.[67] | Head rotation, head position, controller movement, and in-scene animation.[67] | Similar to ASW: warping or wobbling artifacts.[68]
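The frame-extrapolation idea behind ASW and Motion Smoothing can be sketched on a single tracked position. Real implementations operate on per-pixel motion vectors derived from whole frames; this only illustrates the principle of continuing observed motion into a synthesized frame:

```python
# When the app delivers frames at half rate, the runtime estimates
# motion from the last two real frames and predicts the next one.

def extrapolate(prev, last):
    """Predict the next sample by continuing the last observed motion."""
    return tuple(l + (l - p) for p, l in zip(prev, last))

# An object moved from x=0.0 to x=0.1 between the last two real frames,
# so the synthesized in-between frame places it at x=0.2.
print(extrapolate((0.0, 1.0, 0.0), (0.1, 1.0, 0.0)))
```

The characteristic warping artifacts in the table arise exactly when this linear prediction is wrong, for example when an object suddenly changes direction.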

App Stores and Distribution

Meta Quest Store

The Meta Quest Store (formerly Oculus Store) is the primary distribution platform for Meta Quest applications on Horizon OS.[69]

Supported Devices

Distribution Methods

  • Official Meta Quest Store - Highly curated marketplace requiring stringent technical and content guidelines. Reserved for polished, high-quality experiences.[59]
  • App Lab - Alternative distribution path allowing developers to publish without full store curation. Apps are not browsable in the main store and can only be accessed via direct URL. Ideal for early access builds, experimental applications, and apps still in development.[70]
  • SideQuest - Third-party platform for sideloading applications

The submission process involves creating an app page, uploading builds (often using the Meta Quest Developer Hub desktop tool), providing store assets and metadata, and undergoing technical review.[58][71]

Steam / SteamVR

For PC-based VR, SteamVR is the dominant platform. It is known for its broad hardware compatibility, supporting not only Valve's own Index headset but also devices from HTC, Meta (via Link or Air Link), Windows Mixed Reality, and others.[72][73]

Supported Headsets

Publishing on Steam is handled through Steamworks, Valve's suite of tools and services for developers. The barrier to entry is lower than the curated Meta Quest Store.[74] The recent release of the official Steam Link app on the Meta Quest Store has further solidified Steam's role as a central hub for PCVR.[75]

PlayStation Store (PSVR2)

PSVR2 games are distributed through the PlayStation Store for PlayStation 5.[76]

Notable Features

  • Integrated with PS5 library
  • Support for PSVR2-exclusive titles
  • Cross-buy support for some titles

Apple Vision Pro App Store

Vision Pro applications are distributed through a dedicated section of the App Store for visionOS.[77]

App Types

Pico Store

The Pico Store distributes content for Pico headsets in Europe and Asia markets.[78] Pico offers a PICO Developer Program which provides qualifying developers with financial support, technical assistance, and marketing resources.[79]

Alternative Distribution: SideQuest

SideQuest is the leading third-party platform and community for the Meta Quest ecosystem. Its primary function is to facilitate the "sideloading" of applications—the process of installing an app directly onto the headset via a PC, bypassing the official store.[80]

SideQuest has become a vital hub for:[81]

  • Early access and experimental games
  • Unofficial ports of classic games to VR
  • Content that may not meet the curation guidelines of the official store
XR Distribution Platform Overview
Platform | Primary Devices | Curation Level | Target Audience | Key Feature
Meta Quest Store | Meta Quest series | High (strict VRCs)[59] | Mainstream consumers | Highest visibility and monetization potential for standalone VR[70]
App Lab | Meta Quest series | Low (basic technical review)[70] | Early adopters, testers, niche communities | Distribute to Quest users via direct link without full store curation[70]
SteamVR | PCVR headsets (Index, Vive, Quest via Link, etc.) | Low (self-publishing model)[74] | PC gamers, VR enthusiasts | Broadest hardware compatibility for PCVR; large existing user base[72]
SideQuest | Meta Quest series | Low (community-driven)[81] | VR enthusiasts, modders, indie game followers | Primary platform for sideloading, experimental content, and early access[80]
Pico Store | Pico headsets (e.g., Pico 4) | High (similar to Quest Store)[82] | Consumers in markets where Pico is prevalent | Growing ecosystem with developer incentive programs[79]

Development Tools and Resources

XR Interaction Toolkits

Toolkit | Engine | Features | Current Version
XR Interaction Toolkit | Unity | Locomotion, UI interaction, object grabbing | 3.1.2
Godot XR Tools | Godot | Teleportation, snap turning, hand interactions | 4.4.0
Unreal VR Template | Unreal Engine | Motion controllers, VR preview | Built-in
Meta XR Interaction SDK | Unity | Hand interactions, controller support | Latest
PICO Unity Integration SDK | Unity | Pico-specific features | Latest

SDKs and Plugins

Meta XR SDKs

Other SDKs

Testing and Debugging Tools

3D Models and Asset Marketplaces

Major 3D Asset Marketplaces

Marketplace | Specialty | Notable Features | URL
Fab | Unified content marketplace | Supports Unity, Unreal Engine, UEFN. Successor to Unreal Engine Marketplace and Sketchfab Store. Free Quixel Megascans through 2024; in 2025, 1,500+ Megascans free to all users.[83][84] | fab.com
CGTrader | General 3D models | Over 2 million 3D models, supports VR/AR content, auction system[85] | cgtrader.com
TurboSquid | Professional 3D models | CheckMate certification, enterprise licensing, owned by Shutterstock[86] | turbosquid.com
Sketchfab | Web-based 3D viewer | Real-time 3D preview, VR/AR optimized, WebGL viewer | sketchfab.com
Unity Asset Store | Unity assets | Engine-specific assets, plugins, templates | assetstore.unity.com
Quixel Megascans | Photorealistic scans | Now part of Fab, 1,500+ free assets in 2025[84] | fab.com
Yobi3D | 3D model search engine | Aggregates models from various sites, supports VR/AR[87] | yobi3d.com

Asset Types for VR Development

Free Asset Resources

Cross-Platform Development

OpenXR for Cross-Platform Development

OpenXR enables developers to create applications that work across multiple VR platforms without requiring separate codebases for each headset.[88]

Supported Platforms via OpenXR

Engine Support

Platform-Specific Considerations

Platform | Input Methods | Unique Features | Distribution
Meta Quest | Controllers, hand tracking | Passthrough MR, spatial anchors, standalone | Meta Quest Store, Steam
Apple Vision Pro | Eye tracking, hand tracking, voice | Mac Virtual Display, spatial video | App Store
PSVR2 | Sense controllers, hand tracking | Eye tracking, HDR OLED | PlayStation Store
PCVR (Steam) | Various controllers | High-end graphics, full body tracking | Steam
Pico | Controllers, hand tracking, body tracking | Mixed reality, face tracking | Pico Store

Best Practices and Guidelines

Performance Optimization

User Comfort

  • Minimize motion sickness caused by artificial locomotion
  • Provide comfort options (teleportation, snap turning)
  • Maintain stable frame rates
  • Design UI within comfortable viewing angles
  • Test with diverse users

Platform-Specific Guidelines

Wrappers and Compatibility Layers

Some community tools exist to help run content across platforms:

  • ReVive - An open-source compatibility layer that lets Rift-exclusive games run on Vive or Index by translating Oculus SDK calls to OpenVR/SteamVR. (It does not bypass Oculus entitlement checks; games must still be owned on Oculus Home.)
  • OpenXR/Runtime Bridges - Modern runtimes like SteamVR now support OpenXR, reducing the need for hacks.
  • Legacy solutions (e.g. LibOVRWrapper) enabled old Rift apps to work with newer runtime DLLs, but these are rarely needed now.

In general, using the official SDK or OpenXR is recommended for compatibility.

Communities and Forums

Developer Communities

Learning Resources

Official Documentation

  • Meta Developers: developers.meta.com/horizon
  • Apple Vision Pro: developer.apple.com/visionos
  • Unity XR: docs.unity3d.com/Manual/XR.html
  • Unreal Engine VR: dev.epicgames.com/documentation/en-us/unreal-engine/virtual-reality-development
  • Godot XR: docs.godotengine.org/en/stable/tutorials/xr
  • OpenXR: khronos.org/openxr
  • SteamVR: partner.steamgames.com/doc/features/steamvr
  • Pico Developer: developer.picoxr.com

See Also

References

  1. https://docs.unity3d.com/6000.2/Documentation/Manual/XR.html - XR - Unity Documentation
  2. https://unity.com/solutions/xr - XR Development in Unity: AR, VR and Spatial Solutions
  3. https://developers.meta.com/horizon/documentation/unity/unity-development-overview/ - Unity Development Overview for Meta Quest
  4. https://www.khronos.org/OpenXR - OpenXR - High-performance access to AR and VR
  5. https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR | Meta Horizon OS Developers
  6. https://en.wikipedia.org/wiki/OpenXR - OpenXR - Wikipedia
  7. https://developers.meta.com/horizon/blog/oculus-all-in-on-openxr-deprecates-proprietary-apis/ - Oculus All-in on OpenXR, Deprecates Proprietary APIs
  8. https://www.autovrse.com/openxr - What is OpenXR?
  10. https://en.wikipedia.org/wiki/OpenVR - OpenVR - Wikipedia
  11. https://forum.dcs.world/topic/318110-confusion-steamvr-vs-openxr-opencomposite-8/ - Confusion SteamVR vs OpenXR / OpenComposite
  12. https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?
  13. https://www.reddit.com/r/virtualreality/comments/wi6w2x/vr_devs_just_how_universal_is_openxr_anyways/ - VR Devs, just how universal is OpenXR anyways?
  14. https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Meta and OpenXR
  15. https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Meta XR Interaction SDK Overview
  16. https://en.wikipedia.org/wiki/VisionOS - visionOS - Wikipedia
  17. https://www.qualium-systems.com/blog/everything-youd-like-to-know-about-visionos-development/ - Everything about visionOS Development
  18. https://www.pushsquare.com/news/2022/09/developers-should-have-an-easier-time-porting-their-games-to-psvr2 - Developers Should Have an Easier Time Porting Games to PSVR2
  19. https://gamerant.com/playstation-vr2-update-new-feature-hand-tracking/ - PlayStation VR2 SDK Update Adds Hand Tracking
  20. https://developer.picoxr.com/ - PICO Developer - Official Developer Portal
  21. https://developer.apple.com/augmented-reality/arkit/ - ARKit - Augmented Reality - Apple Developer
  22. https://developers.google.com/ar - Build new augmented reality experiences - Google AR
  23. https://docs.unity3d.com/Packages/[email protected]/manual/index.html - XR Interaction Toolkit - Unity Documentation
  24. https://animost.com/ideas-inspirations/unity-vs-unreal-engine-for-xr-development/ - Unity vs. Unreal Engine for XR Development
  25. https://daily.dev/blog/unity-vs-unreal-engine-for-vrar-development - Unity vs Unreal Engine for VR/AR Development
  26. https://www.webaroo.us/insights/building-ar-vr/ - Building for AR/VR: Unity vs. Unreal Engine
  27. https://developers.meta.com/horizon/blog/openxr-standard-quest-horizonos-unity-unreal-godot-developer-success/ - Unity's OpenXR Plugin 1.14 Feature Parity
  28. https://www.unrealengine.com/en-US/news/unreal-engine-5-6-is-now-available - Unreal Engine 5.6 Release Announcement
  29. https://www.reddit.com/r/virtualreality/comments/z5i23c/unity_or_unreal_for_vr_dev/ - Unity or Unreal for VR dev? - Reddit Discussion
  30. https://dev.epicgames.com/documentation/en-us/unreal-engine/unreal-engine-5-5-release-notes - Unreal Engine 5.5 Release Notes
  31. https://godotengine.org/article/godot-xr-update-feb-2025/ - Godot XR Update February 2025
  32. https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - Advancing OpenXR Development in Godot
  33. https://developers.meta.com/horizon/documentation/unity/unity-project-setup/ - Set Up Your Unity Project for Meta Quest Development
  34. https://developers.meta.com/horizon/documentation/unity/unity-core-sdk/ - Meta XR Core SDK for Unity
  35. https://developers.meta.com/horizon/documentation/unity/unity-best-practices-intro/ - Best Practices for Unity
  36. https://developers.meta.com/horizon/documentation/unity/unity-isdk-interaction-sdk-overview/ - Interaction SDK
  37. https://developers.meta.com/horizon/documentation/unity/meta-xr-audio-sdk-unity/ - Meta XR Audio SDK Overview
  38. https://developers.meta.com/horizon/downloads/package/meta-xr-platform-sdk/ - Meta XR Platform SDK
  39. https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657 - Meta XR All-in-One SDK
  40. https://docs.unity3d.com/Packages/[email protected]/ - XR Interaction Toolkit 2.0.0
  41. https://learn.unity.com/tutorial/using-interactors-and-interactables-with-the-xr-interaction-toolkit - Using Interactors and Interactables with the XR Interaction Toolkit
  42. https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@latest/ - XR Interaction Toolkit
  43. https://developers.meta.com/horizon/documentation/unreal/unreal-create-and-configure-new-project/ - Create and Configure a New Project in Unreal Engine
  44. https://www.vrwiki.cs.brown.edu/vr-development-software/unreal-engine-5/adding-vr-to-an-existing-ue5-world - Adding VR to an Existing UE5 World
  45. https://dev.epicgames.com/documentation/en-us/unreal-engine/xr-development?application_version=4.27 - XR Development - Unreal Engine Documentation
  46. https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-template-in-unreal-engine - VR Template in Unreal Engine
  47. https://pimax.com/blogs/blogs/steam-vr-vs-openxr-which-runtime-is-best - Steam VR vs OpenXR: Which Runtime is Best?
  48. https://spurqlabs.com/vr-testing-challenges-best-practices-what-developers-should-know/ - VR Testing Challenges & Best Practices
  49. https://faculty.washington.edu/wobbrock/pubs/assets-23.02.pdf - Which VR Locomotion Techniques are Most Accessible?
  50. https://xrdesignhandbook.com/docs/Meta/Locomotion%20Best%20Practices.html - Locomotion Best Practices
  51. https://developers.meta.com/horizon/blog/now-available-vr-locomotion-design-guide/ - VR Locomotion Design Guide
  52. https://developers.meta.com/horizon/design/interactions-input-modalities/ - Interactions and Input Modalities
  53. https://developers.meta.com/horizon/design/hands - Hands - Meta for Developers
  54. https://developers.meta.com/horizon/design/mr-design-guideline/ - Mixed Reality Design Guidelines
  55. https://github.com/authorTom/notes-on-VR-performance - Notes on VR Performance
  56. https://developers.meta.com/horizon/documentation/unity/unity-perf/ - Performance and Profiling for Unity
  57. https://developers.meta.com/horizon/documentation/native/pc/dg-performance-guidelines/ - Performance Guidelines
  58. https://developers.meta.com/horizon/documentation/unity/ts-mqdh/ - Meta Quest Developer Hub
  59. https://developers.meta.com/horizon/blog/down-the-rabbit-hole-w-oculus-quest-developer-best-practices-the-store/ - Developer Best Practices
  60. https://medium.com/@lemapp09/beginning-game-development-vr-performance-optimization-78553530ca83 - VR Performance Optimization
  61. https://developers.meta.com/horizon/blog/asynchronous-timewarp-on-oculus-rift/ - Asynchronous Timewarp on Oculus Rift
  62. https://developers.meta.com/horizon/documentation/native/android/mobile-timewarp-overview/ - Asynchronous TimeWarp (ATW)
  63. https://xinreality.com/wiki/Timewarp - Timewarp
  64. https://developers.meta.com/horizon/blog/asynchronous-spacewarp/ - Asynchronous Spacewarp
  65. https://developers.meta.com/horizon/documentation/native/pc/asynchronous-spacewarp/ - Asynchronous SpaceWarp
  66. https://www.reddit.com/r/oculus/comments/bvcoh8/always_disable_asynchronous_spacewarp/ - Always disable Asynchronous Spacewarp
  67. https://steamcommunity.com/games/250820/announcements/detail/1705071932992003492 - Introducing SteamVR Motion Smoothing
  68. https://forums.flightsimulator.com/t/motion-reprojection-explained/548659 - Motion Reprojection Explained
  69. https://www.meta.com/experiences/ - Official Meta Quest Store
  70. https://developers.meta.com/ - Meta for Developers
  71. https://developers.meta.com/horizon/resources/publish-submit - Submitting your app
  72. https://www.steamvr.com/ - SteamVR
  73. https://store.steampowered.com/app/250820/SteamVR/ - SteamVR on Steam
  74. https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647 - SteamVR Plugin
  75. https://www.meta.com/experiences/steam-link/5841245619310585/ - Steam Link on Meta Quest Store
  76. https://www.playstation.com/en-us/ps-vr2/games/ - PlayStation VR2 games official page
  77. https://developer.apple.com/visionos/ - visionOS developer page
  78. https://www.picoxr.com/global - PICO Global official website
  79. https://developer.picoxr.com/developer-program/?enter_from=picoweb - PICO Developer Program
  80. https://sidequestvr.com/setup-howto - Get SideQuest
  81. https://sidequestvr.com/ - SideQuest
  82. https://sdk.picovr.com/docs/FAQ/chapter_twentyeight.html - Developer Platform FAQ
  83. https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - Fab launched October 2024
  84. https://www.unrealengine.com/en-US/blog/fab-epics-new-unified-content-marketplace-launches-today - In 2025, 1,500 Megascans will be free to all users
  85. https://www.cgtrader.com/ - CGTrader 3D Models Marketplace
  86. https://www.turbosquid.com/ - TurboSquid professional 3D models marketplace
  87. https://www.yobi3d.com - Yobi3D 3D model search engine
  88. https://www.khronos.org/blog/advancing-openxr-development-godot-xr-engine-enhancements - OpenXR provides unified API for VR and AR across platforms