Developer Resource
How to Get Started
- See also: How to Get Started in VR Development
This page serves as a comprehensive starting point for new developers to find resources needed for learning to create virtual reality experiences in 2024-2025.
Introduction
For aspiring VR developers in 2024-2025, the barrier to entry has never been lower. You can start building VR experiences with completely free tools (Unity Personal, Unreal Engine, or Godot), a $299 Meta Quest 3S headset, and free learning resources. The industry has consolidated around the OpenXR standard, meaning you can develop once and deploy to multiple VR platforms. Unity dominates VR development (68% of XR developers use it)[1], but Unreal Engine offers superior graphics for high-end experiences. Mobile VR (Google Cardboard, Gear VR) is completely dead—standalone VR headsets like Meta Quest have replaced it entirely.
The VR development landscape in 2024-2025 is characterized by three major trends: OpenXR standardization enabling true cross-platform development, the dominance of Meta Quest in the consumer market (75% market share with 20+ million Quest 2 users), and the emergence of mixed reality features blurring the lines between VR and AR. Whether you're building games, training simulations, architectural visualizations, or social experiences, the tools and platforms available today make it possible to create professional VR content as an indie developer or small studio.
Typical VR project budgets range from $10,000 for simple experiences to $500,000+ for commercial applications[2], though indie developers can start much smaller with modern tools and asset stores.
1. Understanding Current VR Hardware
The VR hardware market has matured significantly, with clear winners emerging in each category. Standalone VR headsets have become the primary development target, combining ease of use with sufficient performance for most applications. PC VR remains relevant for high-end experiences, while mobile VR has been completely abandoned by major manufacturers.
Standalone VR Headsets
Meta Quest 3S
Meta Quest 3S represents the best entry point for VR development in 2024-2025. Priced at $299 (128GB) or $399 (256GB)[3], it offers the same Snapdragon XR2 Gen 2 processor as its premium sibling while maintaining access to the full Quest ecosystem. With 1832 x 1920 pixels per eye, 90Hz refresh rate (up to 120Hz), and full-color passthrough cameras for mixed reality development, it provides everything needed to learn VR development without breaking the bank. The device works as a standalone headset and can also connect to a PC via Quest Link for testing PCVR applications. The Quest 3S uses Fresnel lenses, which provide good optical quality at a lower cost than pancake lenses.
Meta Quest 3
Meta Quest 3 ($499 for 512GB) offers superior optics with pancake lenses and higher resolution (2064 x 2208 per eye)[4] but uses the same processor, ensuring full compatibility between devices. This makes the Quest 3S ideal for beginners while the Quest 3 suits developers who want better visual fidelity for testing. Both headsets support hand tracking, controller input, Unity and Unreal Engine development, and WebXR-enabled browsers. The Quest 3 features a 110° horizontal field of view compared to Quest 3S's 96° FOV[5].
Meta Quest Pro
Meta Quest Pro ($999, often discounted) targets enterprise and professional developers with advanced features including eye tracking, face tracking, high-resolution color passthrough cameras, improved ergonomics with rear-mounted battery, and included Quest Touch Pro controllers with built-in cameras for improved tracking[6]. The device provides 1800 x 1920 pixels per eye with local dimming for improved contrast. While expensive for beginners, Quest Pro's advanced tracking capabilities make it valuable for developers working on social VR, enterprise training, or experiences requiring facial expressions and eye tracking.
Pico 4 Ultra
Pico 4 Ultra (£529 in UK/EU, approximately $799 USD, not officially available in US) offers an alternative with 12GB RAM (more than Quest 3's 8GB), 2160 x 2160 per eye, and Wi-Fi 7 support[7]. It works standalone or connects to PC via PICO Connect for SteamVR compatibility. The device supports OpenXR, hand tracking, dual 32MP color passthrough cameras, and optional motion trackers for full-body tracking. The Pico 4 Ultra features an iToF depth sensor for enhanced mixed reality capabilities[8]. However, geographic limitations and smaller ecosystem make Meta Quest the safer choice for most developers.
PC VR Headsets
Valve Index
Valve Index ($999 full kit, $499 headset only) remains available but is not recommended for new purchases in 2025. Launched in 2019, its 1440 x 1600 per-eye resolution and $999 price point make it poor value compared to Quest 3, despite its industry-leading 130° FOV and excellent "Knuckles" controllers with finger tracking[9]. The Index supports refresh rates up to 144Hz and uses SteamVR tracking with external base stations (Lighthouse 1.0 or 2.0) for precise room-scale tracking. The controllers feature advanced capacitive sensors for individual finger tracking, force sensors in the grip for variable squeeze detection, and a strap design that lets users fully open their hands without dropping the controllers. A successor ("Deckard") is rumored for late 2025 but unconfirmed.
HTC Vive Pro 2
HTC Vive Pro 2 ($1,199 full kit, $699 headset only) is aimed at enthusiasts and professionals, offering one of the highest resolutions available in a consumer headset. It boasts 2448 x 2448 pixels per eye (5K total resolution), a 120Hz refresh rate, and a wide 120° field of view[10]. Like the Valve Index, it uses SteamVR Tracking with external base stations for precise room-scale experiences. The Vive Pro 2 is compatible with both Valve Index controllers and original Vive controllers, providing flexibility for developers.
HTC Vive Focus Vision
HTC Vive Focus Vision ($999) bridges standalone and PC VR with a hybrid design, offering 2448 x 2448 per eye (5K resolution), 120° FOV, and built-in eye tracking with foveated rendering[11]. The device includes auto-IPD adjustment, hot-swappable batteries, and 26-point precision hand tracking, making it suitable for professional development and enterprise applications. The Focus Vision can operate standalone or connect to PC for hybrid workflows.
PlayStation VR2
PlayStation VR2 ($549, plus $59 PC adapter) gained PC VR compatibility in August 2024, offering 2000 x 2040 per eye on OLED displays with 90Hz/120Hz refresh rates[12]. The device provides access to both PS5 exclusives and SteamVR libraries. On PS5, the PSVR2 supports eye tracking, HDR, headset haptics, and adaptive triggers; none of these features are available when the headset is used with a PC[13]. This makes it best suited for developers targeting both PlayStation and PC platforms simultaneously.
Bigscreen Beyond
Bigscreen Beyond ($999 headset only, requires SteamVR base stations and controllers sold separately) is a unique entry in the PC VR market, focusing on an ultra-lightweight and custom-fit design. Weighing only 127 grams, it is the world's smallest VR headset[14]. Each unit is custom-built for the user based on a 3D face scan, ensuring a perfect fit. It uses micro-OLED displays with a resolution of 2560 x 2560 per eye and supports refresh rates of 75Hz and 90Hz. The Bigscreen Beyond provides approximately 97° field of view and requires SteamVR base stations (1.0 or 2.0) and controllers for tracking.
Enterprise and Professional Headsets
Varjo XR-4 Series
Varjo XR-4 (€6,990) represents the premium end of professional VR/MR, targeting military, aerospace, and industrial training applications. This military-grade headset features resolution that mimics the human eye at 33 PPD (Pixels Per Degree), a 120° x 105° field of view, and an advanced LiDAR sensor for enhanced depth awareness[15]. The device uses inside-out tracking and supports photorealistic mixed reality with 200Hz eye tracking. The Varjo XR-4 is designed for mission-critical applications where visual fidelity is paramount, such as flight simulation, surgical training, and military operations planning.
Mixed Reality Devices
Apple Vision Pro
Apple Vision Pro ($3,499 for 256GB) represents the premium end of mixed reality, offering 23 million total pixels, micro-OLED displays, and best-in-class color passthrough quality[16]. The device runs visionOS with native support for Swift, SwiftUI, RealityKit, and ARKit, plus Unity support and WebXR in Safari. The M2 + R1 chip combination provides exceptional performance with 12ms eye tracking latency.
However, the $3,499 price point severely limits developer adoption, and Vision Pro does not support OpenXR, requiring completely separate development from other VR platforms. Hand and eye tracking serve as primary input (no controllers), making it unsuitable for traditional VR games but excellent for enterprise applications, productivity tools, and spatial computing experiences.
Mobile VR Status
Mobile VR is completely dead as a development platform in 2024-2025. Google Cardboard viewers are no longer officially produced (discontinued March 2021)[17], though third-party versions exist for $5-50. The SDK was open-sourced in November 2019 but receives no active development. Google states plainly: "we're no longer actively developing the Google VR SDK." Samsung Gear VR was discontinued in 2020, and Google Daydream View ended in October 2019.
The mobile VR market has completely shifted to standalone headsets like Meta Quest and Pico, which offer superior 6DoF tracking (versus mobile VR's 3DoF), dedicated VR processors, and no phone requirement at comparable or lower prices. Do not develop for mobile VR platforms—invest in standalone VR instead.
Hardware Comparison Table
Headset | Type | Resolution (per eye) | Refresh Rate | Field of View | Tracking | Price | Best For |
---|---|---|---|---|---|---|---|
Meta Quest 3S | Standalone | 1832 x 1920 | 90-120Hz | 96° | Inside-Out | $299-399 | Beginners, indie developers |
Meta Quest 3 | Standalone | 2064 x 2208 | 90-120Hz | 110° | Inside-Out | $499 | Standalone VR development |
Meta Quest Pro | Standalone | 1800 x 1920 | 90Hz | 106° | Inside-Out | $999 | Enterprise, eye/face tracking |
Pico 4 Ultra | Standalone | 2160 x 2160 | 90Hz | 105° | Inside-Out | ~$799 | International markets, MR |
Valve Index | PC VR | 1440 x 1600 | 144Hz | 130° | External (Base Stations) | $999 | High FOV, finger tracking |
HTC Vive Pro 2 | PC VR | 2448 x 2448 | 120Hz | 120° | External (Base Stations) | $1,199 | High resolution PC VR |
HTC Vive Focus Vision | Hybrid | 2448 x 2448 | 120Hz | 120° | Inside-Out | $999 | Enterprise, hybrid use |
PlayStation VR2 | PC VR/Console | 2000 x 2040 | 90-120Hz | 110° | Inside-Out | $549+$59 | PS5 + PC development |
Bigscreen Beyond | PC VR | 2560 x 2560 | 75-90Hz | 97° | External (Base Stations) | $999 | Lightweight, custom fit |
Apple Vision Pro | Mixed Reality | ~4K equivalent | 100Hz | 120° | Inside-Out | $3,499 | Spatial computing, iOS |
Varjo XR-4 | Professional | 33 PPD | 90Hz | 120° x 105° | Inside-Out | €6,990 | Enterprise, military, training |
Recommended Hardware for Beginners
Start with Meta Quest 3S ($299) for the optimal combination of price, capability, and ecosystem access. The device handles standalone VR development, connects to PC for testing PCVR builds, supports mixed reality features, and targets the largest VR user base. This single headset covers most development needs for beginners and indie developers.
2. Game Engines and Development Software
The game engine choice fundamentally shapes your VR development experience. All major engines support VR development in 2024-2025, but they differ significantly in learning curve, visual capabilities, and pricing models. Unity dominates the VR development landscape, but Unreal Engine and Godot offer compelling alternatives for specific use cases.
Unity: The VR Development Standard
Unity 6 (released October 17, 2024) represents the current Long Term Support release, supported through October 2026[18]. The engine offers the most comprehensive VR platform support and the largest ecosystem of VR-specific assets, tutorials, and community resources. Unity's licensing model changed significantly in 2024: Unity Personal is completely free for projects earning up to $200,000 annually (increased from $100,000), with no mandatory splash screen in Unity 6. Unity Pro costs $2,040/year per seat. The controversial runtime fee was cancelled in September 2024[19].
XR Interaction Toolkit (current version 3.0.8) provides Unity's high-level, component-based interaction system for VR and AR[20]. The toolkit includes:
- Object hover, select, and grab interactions
- Haptic feedback support
- Visual feedback with tint and line rendering
- Canvas UI interaction for 3D menus
- Teleportation and locomotion systems (continuous movement, snap turning)
- Hand tracking support
- Eye gaze support
- Poke interaction for mixed reality applications
The system is built around three core concepts:
- Interactors: Represent the user's hands or controllers, initiating actions like grabbing or pointing
- Interactables: Objects in the scene that can be hovered over, selected, grabbed, or otherwise manipulated
- Interaction Manager: An overarching system that mediates the interactions between Interactors and Interactables
Unity's VR template includes teleportation, snap turn mechanics, grab interactions, VR menu systems, and input mapping preconfigured for rapid prototyping. The XR Device Simulator allows VR development without a physical headset, though testing on actual hardware remains essential. Unity supports Android XR, Meta Quest, PlayStation VR2, Apple iOS and visionOS, and all OpenXR-powered headsets[21].
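As a concrete illustration of these concepts, the following minimal C# sketch (written against the XR Interaction Toolkit 3.x API; class namespaces differ slightly in 2.x) subscribes to the select events the Interaction Manager raises when an Interactor grabs an Interactable. Any names beyond the toolkit's own API are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
// In XR Interaction Toolkit 3.x the interactable classes live in this sub-namespace;
// in 2.x they sit directly under UnityEngine.XR.Interaction.Toolkit.
using UnityEngine.XR.Interaction.Toolkit.Interactables;

// Attach to a GameObject that already has an XRGrabInteractable (plus Rigidbody
// and Collider). The scene's Interaction Manager mediates between this
// interactable and whichever interactor (hand or controller) selects it.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabLogger : MonoBehaviour
{
    XRGrabInteractable grab;

    void OnEnable()
    {
        grab = GetComponent<XRGrabInteractable>();
        // selectEntered fires when an interactor grabs this object; selectExited on release.
        grab.selectEntered.AddListener(OnGrabbed);
        grab.selectExited.AddListener(OnReleased);
    }

    void OnDisable()
    {
        grab.selectEntered.RemoveListener(OnGrabbed);
        grab.selectExited.RemoveListener(OnReleased);
    }

    void OnGrabbed(SelectEnterEventArgs args)
    {
        Debug.Log($"Grabbed by {args.interactorObject.transform.name}");
    }

    void OnReleased(SelectExitEventArgs args)
    {
        Debug.Log("Released");
    }
}
```

The same event-driven pattern extends to hover and activate interactions, haptic feedback triggers, and UI poke events.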
Unreal Engine: High-Fidelity VR Graphics
Unreal Engine 5.5 (latest in late 2024) provides cutting-edge graphics capabilities that make it the preferred choice for AAA VR experiences and architectural visualization[22]. The engine is completely free for game developers earning less than $1 million USD annually. Above that threshold, Epic Games charges a 5% royalty on gross revenue (reduced to 3.5% if games launch simultaneously on Epic Games Store starting January 2025)[23]. No royalty applies to sales made through Epic Games Store. Non-game uses (film, TV, architecture) require a $1,850/year seat-based subscription for companies making over $1 million annually.
Nanite and Lumen represent Unreal's technical advantages: Nanite provides virtualized geometry for highly detailed environments without traditional polygon count limitations, while Lumen delivers dynamic global illumination and reflections with real-time ray tracing. These features enable visual fidelity that exceeds what's achievable in Unity without extensive custom work. However, VR optimization remains crucial—mobile VR platforms may require disabling these features for performance.
The built-in VR Template includes teleportation locomotion, VR spectator Blueprint, common input actions for grabbing and attaching items, and compatibility with HTC Vive, Valve Index, Oculus Rift, Oculus Quest, Windows Mixed Reality, and PlayStation VR[24]. Blueprint Visual Scripting allows rapid prototyping without C++ knowledge, making Unreal accessible to beginners despite its reputation for complexity. Unreal Engine's VR support is built upon the OpenXR standard, ensuring broad compatibility across a wide range of headsets.
For beginners, start with Blueprints rather than jumping into C++. Blueprint's node-based visual scripting removes coding barriers while teaching fundamental programming concepts. Transition to C++ later for performance-critical code once you understand VR development principles.
Godot: Open-Source Alternative
Godot 4.5 (released September 15, 2025) offers a completely free and open-source VR development option with no strings attached. Licensed under MIT, Godot imposes zero royalties or revenue caps, making it ideal for developers who want full ownership without licensing concerns. The engine has gained significant traction in the VR space with comprehensive OpenXR support built directly into Godot 4.0 and later[25].
Recent updates brought:
- Hand tracking via XRHandModifier3D node for cross-platform hand tracking (OpenXR and WebXR)
- Face tracking with Unified Expressions standard support
- Body tracking using Humanoid Skeleton
- Composition layers for floating panels with sharp 2D content
- Scene discovery and spatial anchors for Meta Quest
- Enhanced foveated rendering via Variable Rate Shading
- WebXR improvements including hand tracking and MSAA
Platform support includes Meta Quest (2, 3, Pro), Pico 4, HTC Focus 3 and XR Elite, Magic Leap 2, SteamVR headsets, and Qualcomm Spaces (Lynx R1). GDScript, Godot's Python-like scripting language, offers a gentler learning curve than C# or C++, making it excellent for beginners. The engine also supports C# for developers familiar with that language (though not yet for web exports).
Godot suits indie developers and those philosophically aligned with open-source software, but it lacks Unity's extensive asset store and Unreal's AAA graphics capabilities. For learning VR development principles without financial investment, Godot represents an excellent choice.
WebXR: Browser-Based VR Development
WebXR Device API achieved W3C Candidate Recommendation status on October 1, 2025, replacing the deprecated WebVR API that never achieved full standardization[26]. WebXR provides a unified API for VR and AR experiences that run directly in web browsers without installation, working across desktop VR, mobile VR, and AR platforms. This enables instant access and easy sharing compared to native applications.
A-Frame
A-Frame (version 1.6.0 as of November 2024) offers the most accessible WebXR framework, using HTML-like markup for building 3D/VR experiences. Built on Three.js with Entity-Component-System architecture, A-Frame allows developers to create VR scenes with minimal JavaScript knowledge[27]. The framework provides a large community, extensive component library, and excellent beginner-friendliness. A-Frame's declarative HTML approach means developers can create complex scenes with simple, easy-to-read tags.
Babylon.js
Babylon.js provides a powerful, TypeScript-native real-time 3D game engine with comprehensive WebXR support. The WebXR Experience Helper offers one-stop access to XR functionalities including gaze, teleportation, and AR features. Havok Physics engine integration and an online playground make Babylon.js ideal for developers comfortable with TypeScript who want production-ready capabilities[28]. The engine includes an advanced rendering pipeline with Physically Based Rendering (PBR), physics engine integration, a GUI system, and robust, built-in support for WebXR.
Three.js
Three.js remains the most popular JavaScript 3D library with built-in WebXR support, offering lower-level control and maximum flexibility compared to higher-level frameworks[29]. It is a cross-browser JavaScript library and API used to create and display animated 3D computer graphics in a web browser using WebGL. React developers can use React Three Fiber, a React renderer for Three.js that enables building VR experiences using React components and modern hooks-based APIs.
Browser support spans Chrome, Microsoft Edge (Chromium-based), Firefox, and Safari (iOS) for AR[30]. WebXR enables cross-platform deployment without platform-specific SDKs, though performance and feature sets remain limited compared to native VR applications. Use WebXR for prototypes, web experiences, lightweight applications, or when instant accessibility matters more than maximum performance.
Development Platform Comparison
Platform | Primary Language(s) | Key Strengths | Key Weaknesses | Licensing Model | Ideal For |
---|---|---|---|---|---|
Unity | C# | Huge asset store, strong community, excellent mobile/standalone performance, mature toolset, 68% XR market share | Graphics fidelity can lag behind UE, subscription cost for successful businesses | Free for revenue/funding < $200K. Paid subscription ($2,040/yr/seat) required above that threshold | Standalone VR games, mobile AR, enterprise applications, developers with a C# background |
Unreal Engine | C++, Blueprints (Visual Scripting) | Industry-leading graphical fidelity, powerful built-in tools (Lumen, Nanite), royalty model is friendly to indies, photorealistic rendering | Steeper learning curve, higher system requirements, C++ can be complex for beginners | Free until product grosses $1M, then 5% royalty (3.5% with Epic Store). Separate seat-based license ($1,850/yr) for non-game industries | High-fidelity PC VR games, architectural visualization, film production, projects prioritizing top-tier visuals |
Godot | GDScript, C# | Completely free and open-source (MIT license), no royalties ever, lightweight, comprehensive OpenXR support, Python-like scripting | Smaller asset ecosystem, less mature VR tooling, fewer tutorials compared to Unity/Unreal | Open Source (Free, MIT License) | Indie developers, open-source projects, developers wanting no licensing restrictions |
WebXR (A-Frame, Babylon.js, Three.js) | JavaScript, HTML, TypeScript | Unparalleled accessibility (no downloads), easy to share, leverages existing web development skills, instant updates, works on any WebXR browser | Performance limitations compared to native engines, browser inconsistencies, less mature tooling, limited device features | Open Source (Free) | Marketing experiences, educational content, product configurators, rapid prototyping, developers with web background |
3. Development Resources and Tools
Building VR applications requires more than just a game engine. The modern VR development stack includes official SDKs for platform-specific features, 3D modeling software for asset creation, learning resources for skill development, and audio tools for spatial sound design.
Official SDKs and Cross-Platform Standards
OpenXR
OpenXR represents the most important development in VR platform interoperability. This royalty-free, open standard from the Khronos Group provides cross-platform API access to VR and AR devices, eliminating the need for platform-specific code[31]. OpenXR 1.1 (released April 15, 2024) consolidated widely-used extensions into the core specification, reducing fragmentation and simplifying advanced XR development[32].
Major platform support includes:
- Meta Quest (full support since 2021, all new features delivered via OpenXR extensions)[33]
- Microsoft (Windows Mixed Reality, HoloLens 2)
- Valve SteamVR
- HTC Vive
- Pico (OpenXR 1.1 conformant announced 2024)
- Google Android XR (announced December 2024)[34]
- Varjo
- NVIDIA CloudXR
- Qualcomm Snapdragon Spaces
Apple Vision Pro does not support OpenXR, using proprietary visionOS APIs instead.
Unity, Unreal Engine, and Godot all provide native OpenXR support through plugins or built-in systems. Developers targeting multiple VR platforms should prioritize OpenXR implementation over platform-specific SDKs whenever possible, adding platform-specific features only when necessary for functionality unavailable through OpenXR.
Meta Quest SDK
Meta Quest SDK (Meta XR Core SDK version 74+) provides the primary SDK for Quest development, with full OpenXR support now recommended[35]. The SDK offers:
- VR and mixed reality capabilities
- Unity and Unreal Engine integration
- Meta Spatial SDK for Android-based development
- Hand tracking, eye tracking (Quest Pro), and face tracking (Quest Pro) APIs
- Passthrough API for mixed reality experiences
SteamVR SDK
SteamVR SDK (OpenVR) continues as the underlying API for SteamVR[36]. The Unity plugin (version 2.8.0, updated January 17, 2024) provides SteamVR integration[37], though Valve has transitioned focus to OpenXR. SteamVR Input allows developers to define abstract "actions" like GrabItem or Teleport, with SteamVR handling mapping to physical controller inputs[38].
Learning Resources for VR Development
Unity Learn
Unity Learn provides the most comprehensive free VR learning pathway, with courses including "VR Development Pathway" and "Create with VR"[39]. The XR Device Simulator enables VR development practice without owning a headset, though actual hardware testing remains essential for motion comfort and interaction validation. Unity Learn offers structured paths covering beginner to advanced topics with hands-on projects.
Udemy Courses
For structured paid learning, Udemy offers several highly-rated VR development courses[40]:
- "Multiplayer Virtual Reality (VR) Development With Unity" by Tevfik IRONHEAD (4.4/5 rating, 8,372 learners, updated August 2024) covers multiplayer VR with Oculus Quest, Unity XR Interaction Toolkit, and Photon PUN 2
- "Unity VR/XR Developer: Make Immersive VIRTUAL REALITY Games" by GameDev.tv includes mock headset support for development without VR hardware
- "The Ultimate Guide to VR with Unity: No Code Edition" by GameDevHQ offers VR development without programming requirements
YouTube Channels
YouTube channels provide free ongoing education[41]:
- Valem Tutorials (92.4K subscribers) focuses specifically on Unity VR development with Meta Quest and XR Interaction Toolkit
- Tyriel Wood - VR Tech (136K subscribers) offers high-production VR/AR hardware reviews and technical insights
- ThrillSeeker covers VR hardware, body tracking, and haptic technology
- SideQuest (20.4K subscribers) provides platform-specific tutorials for Quest development and SideQuest distribution
- Mitch's VR Lab provides UI programming examples and principles for Unreal Engine
3D Modeling Software for VR Asset Creation
Every object and environment in a VR world is a 3D model. Developers can source these from asset stores or create them from scratch. VR often demands both higher-detail models (players can inspect objects up close) and careful optimization (to hold frame rate).
Blender
Blender (current version 4.5 LTS, released July 15, 2025) represents the best option for beginners and budget-conscious developers. The software is 100% free and open-source under GNU GPL 3.0 license with zero limitations on commercial use[42]. Blender provides complete 3D modeling, sculpting, rigging, animation, and rendering capabilities comparable to commercial alternatives. The software exports to all VR-compatible formats (FBX, OBJ, GLTF, USD/USDZ) and includes VR viewport support for designing in VR.
Recent versions added:
- Geometry Nodes for procedural modeling
- Improved Cycles rendering engine
- Enhanced sculpting tools
- Python scripting support for automation
- Grease Pencil for 2D animation in 3D space
Autodesk Maya
Maya 2025 serves as the industry standard for AAA game character creation and animation. Pricing is $1,785/year for an annual subscription or $225/month for month-to-month access[43]. Maya Indie ($305/year) offers the full software for individuals and studios earning under $100,000 USD annually. Educational licenses provide free 1-year access for students and educators. Maya excels at character rigging, animation, and complex modeling workflows used in professional game and film production.
Autodesk 3ds Max
3ds Max 2025 costs $1,620-$1,690/year for professional licenses, with 3ds Max Indie at approximately $280/year for under $100,000 annual earnings. The Windows-only software excels at architectural visualization, environment design, and game asset creation with robust polygon, spline, and NURBS modeling. 3ds Max integrates well with Unreal Engine for architectural visualization workflows.
ZBrush
ZBrush 2025 (released September 2024) transitioned to subscription-only pricing in 2024[44]. Current pricing is $49/month or $399/year[45]. The subscription includes both desktop (Windows, macOS) and iPad versions. ZBrush remains the industry leader for digital sculpting with:
- 200+ brushes for detailed organic and hard-surface sculpting
- Multi-resolution mesh editing
- ZModeler for hard surfaces
- Redshift CPU rendering included
- GoZ workflow for seamless transfer to Maya, Blender, and 3ds Max
In-VR Modeling Tools
A newer paradigm involves creating 3D assets from within a VR environment:
- Gravity Sketch allows for intuitive 3D sketching and modeling using motion controllers. It is a professional tool used in industrial design that is now free for individual users[46]. Designers can manipulate virtual surfaces directly in 3D space with gesture-based sculpting.
- Adobe Substance 3D Modeler ($49.99/month as part of Substance 3D Collection) offers VR/desktop modeling and texturing with integration to Adobe Creative Cloud[47]. It works for organic assets and seamlessly integrates with other Substance tools.
- Tilt Brush (now open-source) and Medium (discontinued but available) offer a "virtual clay" sculpting experience, which can feel more natural for organic modeling.
For beginners: Start with Blender exclusively. The free, full-featured software provides everything needed to learn 3D modeling for VR without financial investment. Keep polygon counts optimized for performance, and ensure proper scale and texture resolution for VR viewing distances.
Photogrammetry and 3D Scanning
Photogrammetry converts photographs into 3D models, enabling realistic VR environments from real-world locations, authentic asset textures, architectural visualization, digital twin creation for training, and cultural preservation in VR experiences.
RealityScan
RealityScan 2.0 (rebranded from RealityCapture by Epic Games in 2024) offers professional-grade photogrammetry that's free for students, educators, and individuals/companies with annual gross revenue below $1 million USD[48]. Features include:
- AI-assisted background masking
- Visual quality inspection with heatmaps
- Aerial LiDAR and terrestrial data support
- Orthographic projections for mapping
- Unreal Engine integration
Agisoft Metashape
Agisoft Metashape (formerly Photoscan, from $179 for Standard Edition) provides professional photogrammetry software that processes photos of an object/scene into highly detailed 3D meshes[49]. The software handles DSLR photos, drone imagery, and generates meshes with color/diffuse texture maps for use in game engines. Cross-platform support (Windows, Mac, Linux) and cloud processing options make it versatile for various workflows.
Artec Studio
Artec Studio (subscription-based) provides AI photogrammetry from photos/videos with excellent geometry processing[50]. While newer to the market, it offers an easy-to-use interface for beginners with professional results. The software works well for capturing real-world objects and environments for VR integration.
Polycam
Polycam (iOS, Android, Web) provides accessible photogrammetry with LiDAR scanning support (on compatible iOS devices), Room Mode for floor plans, AI-based 360 capture, and seamless Unity/Unreal integration. Free plan includes 5 photogrammetry captures with up to 100 images each and 12+ export formats. Pro plan costs $17.99/month or $99.99/year.
Meshroom
Meshroom by AliceVision provides 100% free and open-source photogrammetry with node-based workflow, Structure-from-Motion and Multi-View Stereo algorithms, real-time 3D model viewing, HDR image fusion, panorama support, and plugins for Gaussian Splatting and NeRF. An Nvidia GPU with CUDA support is recommended. Take 80+ photos from multiple angles for best results.
Asset Stores and Free Resources
Unity Asset Store
Unity Asset Store offers the largest VR-specific asset collection[51], including:
- Physics-based interaction frameworks
- VR templates and starter projects
- Locomotion systems (teleportation, continuous movement, climbing)
- Hand tracking solutions
- The free SteamVR Plugin (version 2.8.0, January 2024) maintained by Valve
- VRTK (Virtual Reality Toolkit) offering comprehensive VR development frameworks
Unreal Marketplace / Fab
Unreal Marketplace transitioned to Fab platform in 2024, offering VR templates like Ultimate VR Template (UVRT) with physics, locomotion, climbing, and inventory systems[52]. The built-in VR Template in Unreal Engine provides comprehensive starting points free with the engine.
Free 3D Model Platforms
- Sketchfab hosts millions of 3D models with many free CC-licensed options, WebGL viewer, and VR-compatible formats (GLTF, FBX, OBJ)
- TurboSquid offers 29,000+ free models among 1.5+ million total models[53]
- Poly Haven provides 100% free CC0 assets (HDRIs, textures, 3D models) in the public domain with no attribution required
- CGTrader mixes free and paid models with large VR/AR libraries
- Unity and Unreal asset stores both have free sections with VR-ready content
Spatial Audio Tools
Audio is a critical, yet often overlooked, component of immersion. Spatial audio makes sounds appear to originate from specific points in 3D space, enhancing realism and providing crucial navigational cues.
Principles of Spatial Audio
Spatial audio is achieved primarily through binaural rendering using HRTF (Head-Related Transfer Function). An HRTF is a filter that simulates how sound waves are altered by a person's head, torso, and outer ears before reaching the eardrums[54]. By applying HRTFs to a mono sound source and playing it back over headphones, developers can create a convincing illusion that the sound is coming from a specific direction relative to the listener.
Steam Audio
Steam Audio (version 4.6.0, released December 2024) provides the best free spatial audio solution for VR[55]. Developed by Valve, the advanced spatial audio SDK uses physics-based sound propagation and HRTF-based binaural audio. Features include:
- HRTF binaural rendering for 3D positioning
- Physics-based sound propagation
- Occlusion and reverb simulation based on scene geometry
- Low-latency audio for hundreds of sources
- Ambisonics support for 360-degree soundfields
- Baking for design-time optimization
The completely free and open-source SDK includes plugins for Unity, Unreal Engine, FMOD Studio, and Wwise (added in v4.6.0), plus native C++ API[56].
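Steam Audio ships its own Unity components for occlusion and scene-based reverb, but the basic routing of a source through the configured spatializer is plain Unity C#. The sketch below assumes a spatializer plugin (such as Steam Audio) has been installed and selected under Project Settings > Audio; everything else uses standard AudioSource properties.

```csharp
using UnityEngine;

// Minimal sketch: make an AudioSource a fully 3D, HRTF-spatialized sound.
// Assumes a spatializer plugin (e.g., Steam Audio) is selected under
// Project Settings > Audio > Spatializer Plugin.
[RequireComponent(typeof(AudioSource))]
public class SpatializedSource : MonoBehaviour
{
    void Awake()
    {
        var source = GetComponent<AudioSource>();
        source.spatialBlend = 1.0f;   // 1 = fully 3D; position drives panning and attenuation
        source.spatialize = true;     // route through the configured spatializer (HRTF rendering)
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.minDistance = 1.0f;    // full volume within 1 m
        source.maxDistance = 25.0f;   // fades out toward 25 m
        source.loop = true;
        source.Play();
    }
}
```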
Meta XR Audio SDK
Meta XR Audio SDK (formerly Oculus Audio SDK) is Meta's proprietary solution, replacing the older Oculus Spatializer. It provides high-quality HRTF-based spatialization, room acoustics simulation, and ambisonics support for both Unity and Unreal Engine, optimized for the Quest platform[57]. The SDK is free and includes features like early reflections, late reverberation, and automatic room modeling.
FMOD Studio
FMOD Studio by Firelight Technologies offers DAW-like workflow with:
- Timeline-based event system
- Real-time parameter control for dynamic audio
- Adaptive music systems
- 3D spatialization and attenuation
- Built-in event macros and snapshot systems
The indie license is free for projects and companies with a budget under $500K USD[58], with commercial licensing requiring contact with Firelight. FMOD's easier learning curve and intuitive interface make it ideal for indie developers and solo creators.
Wwise
Wwise by Audiokinetic represents the industry standard audio middleware used in AAA productions. The powerful Game Call system, Actor-Mixer hierarchies, advanced profiling tools, comprehensive sound propagation, and extensive documentation make it preferred for large productions. A free tier exists for projects and companies with a budget under $500K CAD; commercial pricing is available on request.
Audacity
Audacity is a simple, powerful, and reliable audio editor that is entirely free and open-source. While it doesn't handle spatial audio directly, it's essential for editing and preparing audio files before importing into game engines or middleware solutions.
For indie developers, start with Steam Audio (free) for spatial audio, adding FMOD (easier) or Wwise (more powerful) as projects grow in complexity.
4. VR Development Best Practices
Creating comfortable, intuitive VR experiences requires following established best practices that prevent motion sickness, optimize performance, and design interactions that feel natural in 3D space.
Performance Targets and Optimization
VR applications should maintain the headset's refresh rate, typically 90 FPS, to prevent motion sickness and ensure comfort[59]. Meta's store qualification requires apps to sustain the display refresh rate: standalone Quest titles must hold at least 72 FPS (90 FPS recommended), while PC VR requirements historically specified 90 FPS on recommended-spec machines with an absolute minimum of 45 FPS using reprojection. Frame time must stay at 11.1 milliseconds or less for 90 FPS (13.9 ms for 72 FPS). PlayStation VR2 targets 90Hz or 120Hz refresh rates. PC VR headsets typically target 90Hz or higher.
Platform | Target FPS | Frame Time | Notes |
---|---|---|---|
Meta Quest 2/3/3S | 90 FPS (72 FPS minimum) | 11.1ms | 120Hz experimental on Quest 3 |
PlayStation VR2 | 90-120 FPS | 8.3-11.1ms | Both 90Hz and 120Hz modes |
PC VR (Index, Vive) | 90-144 FPS | 6.9-11.1ms | Varies by headset |
Apple Vision Pro | 90-100 FPS | 10-11.1ms | Adaptive refresh rate |
Optimization guidelines:
- Limit to 500-1,000 draw calls per frame maximum
- Maximum 1-2 million triangles/vertices per frame for standalone VR
- Use texture compression (ASTC for Quest, BC for PC) and mipmapping universally
- Limit script execution to 1-3ms per frame to avoid CPU bottlenecks
- Identify whether bottlenecks are CPU or GPU-bound and optimize accordingly
- Use Level-of-Detail (LOD) systems for distant objects
- Implement occlusion culling to avoid rendering hidden geometry
- Bake lighting when possible instead of using real-time lights
- Use foveated rendering if supported by the headset
For Unreal Engine VR:
- Enable Mobile Multiview for standalone VR (Quest, Pico)
- Enable Instanced Stereo for PCVR
- Enable Round Robin Occlusion (RRO), which alternates occlusion queries between eyes each frame to reduce their cost
- Use forward shading instead of deferred rendering
- Set MSAA to 4x for optimal quality/performance balance
For Unity VR:
- Use the Universal Render Pipeline (URP), which provides the performance headroom needed for mobile and standalone VR
- The multi-threaded Data-Oriented Technology Stack (DOTS) enables advanced performance optimization for complex scenes
- Enable Single Pass Instanced rendering mode
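To make frame-time budgets visible during on-device testing, a simple watchdog script can complement the engine profiler. The sketch below is illustrative rather than an official tool; it logs any frame that exceeds the 11.1 ms budget for 90 FPS from the table above.

```csharp
using UnityEngine;

// Illustrative frame-budget monitor: warns whenever a frame exceeds the
// 11.1 ms budget required for 90 FPS, so spikes show up during on-device testing.
public class FrameBudgetWatchdog : MonoBehaviour
{
    const float BudgetMs = 11.1f;   // 90 FPS target
    float worstMs;

    void Update()
    {
        float frameMs = Time.unscaledDeltaTime * 1000f;
        if (frameMs > worstMs) worstMs = frameMs;

        if (frameMs > BudgetMs)
            Debug.LogWarning($"Frame took {frameMs:F1} ms (budget {BudgetMs} ms, worst {worstMs:F1} ms)");
    }
}
```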
Motion Sickness Prevention
VR motion sickness occurs primarily due to sensory conflict: eyes perceive movement while the inner ear senses stillness[60]. Additional causes include:
- Low frame rates below 90 FPS
- High latency between head movement and display update (keep under 20 milliseconds)
- Low resolution causing eye strain
- Display flickering affected by refresh rate and luminance
Prevention strategies:
- Maintain consistent 90 FPS minimum without exception
- Keep latency under 20ms between head movement and display update
- Implement field of view (FOV) reduction (vignetting) by darkening peripheral vision during locomotion
- Offer both teleportation and continuous locomotion options when possible
- Always provide a stable reference frame—never move the camera independently of head tracking
- Avoid sharp visual patterns in peripheral vision
- Use the 80/20 composition rule: 80% darker colors, 20% bright colors to reduce visual stress
- Provide comfort options in settings (FOV reduction intensity, turning speed, snap turn increments)
Locomotion considerations:
- Teleportation locomotion eliminates motion sickness by removing continuous movement optical flow, though it introduces higher cognitive workload. Use arc trajectory visualization to show destination clearly.
- Continuous/smooth locomotion feels more natural and performs better for speed and precision tasks but can induce sickness if implemented poorly. Always pair with comfort options.
- Snap turn provides discrete rotation (typically 30-45° increments) to avoid continuous turning discomfort, commonly paired with other locomotion methods (a minimal snap-turn sketch follows this list)
- Room-scale physical walking offers maximum immersion limited by tracking space; use redirected walking techniques for larger virtual spaces
- Static reference frames (virtual cockpit, virtual nose) provide mixed results depending on users—test with your target audience
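A minimal snap-turn sketch in Unity C#, assuming the Input System package and an XR-Origin-style rig where rotating this transform rotates the player. The XR Interaction Toolkit ships a more complete Snap Turn Provider; this only illustrates the discrete-rotation pattern described above.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class SimpleSnapTurn : MonoBehaviour
{
    [SerializeField] InputActionReference turnAction;  // bind to a thumbstick (Vector2) action
    [SerializeField] float increment = 45f;            // 30-45 degrees is typical
    [SerializeField] float deadzone = 0.7f;

    bool ready = true;

    void OnEnable()  { turnAction.action.Enable(); }
    void OnDisable() { turnAction.action.Disable(); }

    void Update()
    {
        float x = turnAction.action.ReadValue<Vector2>().x;

        if (ready && Mathf.Abs(x) > deadzone)
        {
            // Rotate the whole rig in one discrete step to avoid smooth-turn optical flow.
            transform.Rotate(0f, Mathf.Sign(x) * increment, 0f);
            ready = false;
        }
        else if (Mathf.Abs(x) < 0.2f)
        {
            ready = true;  // require the stick to return to center before the next snap
        }
    }
}
```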
UI/UX Design for VR
Meta's official VR design guidelines emphasize prototyping early and testing frequently, because designing on a 2D screen differs radically from the actual VR experience[61]. Additional best practices from industry research[62]:
- Use physical world metaphors to guide design; familiar objects don't require learning new interaction models
- Support intuitive interactions where users can naturally grab objects and press buttons
- Keep UI concise and unobstructive, avoiding coverage of the entire scene
- Diegetic vs. Non-Diegetic Interfaces: Non-diegetic UI (traditional HUD overlays) can break immersion. Diegetic interfaces integrate UI elements into the game world itself (e.g., ammunition counter on weapon, health meter on wrist device). Diegetic interfaces are generally preferred in VR as they enhance presence.
Design guidelines:
- Angular scale measurement using dmm (distance-independent millimeters, where 1 dmm is 1 mm viewed at 1 m) ensures consistent element sizing across varying distances; for example, an element that should appear 20 dmm tall must be 40 mm tall when placed 2 m away
- Safe area guidelines prevent title-text and key elements from being cut off by headset FOV limitations
- Spatial design requires rethinking traditional UI patterns—no conventional anchor points like screen corners exist in VR
- Avoid "head-locked" content where UI elements are fixed to the user's view and follow their head movement—this is unnatural and causes eye strain
- Instead, anchor UI elements in the world or have them follow the user's view with gentle, smoothed animation (lazy follow; see the sketch after this list)
- Keep text and interactive elements within 1-5 meters for comfortable viewing
- Use large, easy-to-read fonts (minimum 0.5-1 degree visual angle)
- Provide clear visual feedback for hovering, selection, and interaction states
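A lazy-follow panel can be sketched in a few lines of Unity C#. The script below is illustrative: it assumes it sits on a world-space UI canvas and that `head` is assigned to the HMD camera transform, and it keeps the panel roughly in front of the user at a comfortable distance instead of head-locking it.

```csharp
using UnityEngine;

// Illustrative "lazy follow" panel: stays roughly in front of the user and
// catches up smoothly instead of being rigidly locked to head movement.
public class LazyFollowPanel : MonoBehaviour
{
    [SerializeField] Transform head;          // assign the XR camera (HMD) transform
    [SerializeField] float distance = 2f;     // keep UI within the comfortable 1-5 m range
    [SerializeField] float followSpeed = 3f;

    void LateUpdate()
    {
        // Target position: in front of the user at eye height, ignoring head pitch.
        Vector3 forwardFlat = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 target = head.position + forwardFlat * distance;

        // Smoothly move and rotate toward the target rather than snapping every frame.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);
        Quaternion faceUser = Quaternion.LookRotation(transform.position - head.position, Vector3.up);
        transform.rotation = Quaternion.Slerp(transform.rotation, faceUser, followSpeed * Time.deltaTime);
    }
}
```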
VR UI/UX Resources:
- Google's Cardboard Design Lab provides fast introduction to VR design principles
- Research VR Podcast covers developing VR industry and cognitive science
- Leap Motion's VR Design Best Practices article offers thoughtful suggestions
- Fuseman's Introduction to VR UI in Unity livestream tutorial
- Unity and Meta developer documentation for platform-specific guidelines
Interaction Design Patterns
Hand Tracking vs Controllers
Meta Quest 2, 3, and Pro include hand and body tracking, toggleable in the Movement Tracking settings; users switch between controllers and hands by clicking the controllers together. HTC Vive Focus 3 requires enabling hand tracking in Settings > Inputs, with a pinch gesture for selection. Valve Index controllers feature advanced capacitive sensors for finger tracking, force sensors in the grip for variable squeeze detection, and a strap design that lets users fully open their hands without dropping the controllers.
Controllers remain dominant for precision tasks and gaming in 2024-2025, while hand tracking gains adoption for social VR, casual interactions, and accessibility. Many apps support both with seamless switching. Hand tracking uses headset cameras to track the position and gestures of the user's real hands[63], allowing for highly intuitive interactions like pinching to select, making a fist to grab, or pushing virtual buttons with a finger.
Grab and Manipulation
Patterns from Unity XR Interaction Toolkit and common industry practices:
- Direct grab: Using sphere trace from controller position for nearby objects
- Distance grab: Using ray-based selection for distant objects (raycasting)
- Attach on grab: Where objects snap to specific attachment points on controllers
- Physics-based throwing: Combining velocity, trajectory, and release timing for realistic object throwing (see the sketch after this list)
- Two-handed interactions: For large objects, weapons, or tools requiring two hands
- Haptic feedback: Vibration feedback confirming interactions
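The physics-based throwing pattern typically works by sampling the hand's velocity over the last few frames and applying the average to the object on release. The sketch below is a hypothetical helper rather than part of any SDK; it assumes some grab system calls OnRelease with the released Rigidbody.

```csharp
using UnityEngine;

// Estimates release velocity from the hand's recent motion and applies it
// to the released Rigidbody. Attach to the hand/controller transform.
public class ThrowVelocityEstimator : MonoBehaviour
{
    const int SampleCount = 8;                      // average the last few physics steps
    readonly Vector3[] samples = new Vector3[SampleCount];
    Vector3 lastPosition;
    int index;

    void FixedUpdate()
    {
        // Record the hand's velocity each physics step.
        samples[index] = (transform.position - lastPosition) / Time.fixedDeltaTime;
        index = (index + 1) % SampleCount;
        lastPosition = transform.position;
    }

    public void OnRelease(Rigidbody rb)
    {
        // Average recent samples to smooth out tracking jitter at the moment of release.
        Vector3 velocity = Vector3.zero;
        foreach (var s in samples) velocity += s;
        velocity /= SampleCount;

        rb.isKinematic = false;
        rb.linearVelocity = velocity;   // Unity 6 name; older versions use Rigidbody.velocity
    }
}
```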
Input Implementation
Modern VR development has moved away from coding for specific buttons on specific controllers. Instead, use abstraction layers:
- SteamVR Input (part of SteamVR ecosystem) allows developers to define abstract "actions" like GrabItem or Teleport. SteamVR then handles mapping these actions to the physical inputs of whatever controller the user has[38].
- The modern industry standard is OpenXR, a royalty-free API that provides a unified interface for engines like Unity and Unreal to communicate with a wide variety of VR/AR hardware[64]. By developing against the OpenXR API, developers can write their interaction logic once and have it work across many different headsets without platform-specific code.
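In Unity, this action-based abstraction looks roughly like the sketch below, which uses the Input System (fed by the OpenXR plugin) to define a named "Grab" action rather than polling a specific button. The binding path shown is illustrative and may vary with the installed controller layouts; in practice most projects define such actions in an input actions asset rather than in code.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Gameplay code reads a named "Grab" action instead of a hardware button,
// so bindings can be remapped per device without touching game logic.
public class GrabAction : MonoBehaviour
{
    InputAction grab;

    void OnEnable()
    {
        grab = new InputAction("Grab", InputActionType.Button,
                               binding: "<XRController>{RightHand}/gripPressed"); // illustrative path
        grab.performed += OnGrab;
        grab.Enable();
    }

    void OnDisable()
    {
        grab.Disable();
        grab.performed -= OnGrab;
    }

    void OnGrab(InputAction.CallbackContext ctx)
    {
        Debug.Log("Grab action performed");
    }
}
```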
5. Programming Languages for VR Development
Each game engine uses specific programming languages, and your language choice should align with your platform choice.
C# for Unity VR Development
Unity exclusively uses C# for scripting, making it the most important programming language for VR developers given Unity's 68% market share among XR developers. The official Unity Learn VR Development Pathway provides free, structured learning. Unity's Visual Scripting offers an alternative for non-programmers, enabling VR development without traditional coding.
Key systems for Unity VR development:
- The XR Plug-in Management system handles device support across platforms
- XR Interaction Toolkit provides high-level components for common VR interactions
- Unity's VR template includes teleport locomotion, snap turn mechanics, grab interactions, VR menu systems, and input mapping preconfigured
C# is a modern, object-oriented language with features like:
- Strong typing for catching errors at compile time
- Garbage collection for automatic memory management
- LINQ for data queries
- Async/await for asynchronous programming
- Extensive .NET Standard library support
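As a small, VR-flavored illustration of strong typing and LINQ together, the hypothetical helper below finds the closest object tagged "Grabbable" to a hand transform; the tag and class names are not part of any SDK.

```csharp
using System.Linq;
using UnityEngine;

// Finds the nearest object tagged "Grabbable" to the assigned hand transform.
public class NearestGrabbableFinder : MonoBehaviour
{
    public Transform hand;

    public Transform FindNearest()
    {
        return GameObject.FindGameObjectsWithTag("Grabbable")
            .Select(go => go.transform)
            .OrderBy(t => Vector3.Distance(t.position, hand.position))
            .FirstOrDefault();   // null if nothing tagged "Grabbable" exists
    }
}
```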
C++ and Blueprints for Unreal Engine VR
Unreal Engine offers two development paths: Blueprint Visual Scripting and C++ programming.
For beginners, start with Blueprints exclusively. Blueprint's node-based visual scripting eliminates code barriers while providing:
- Immediate results and visual feedback
- Easier understanding of coding principles through visual representation
- Full VR Template availability with pre-built systems
- Transition path to C++ after mastering concepts
The VR Template in Blueprints includes:
- VRPawn with locomotion logic
- GrabComponent for interactions
- Teleport and snap turn implementations
- Built-in OpenXR support compatible with Meta Quest, VIVE, Valve Index, and PSVR2
C++ becomes necessary for:
- AAA game development requiring maximum performance
- Better performance optimization and memory management
- Full control over engine features and low-level systems
- Plugin development for extending engine functionality
However, C++ has a steeper learning curve. AAA projects typically use roughly 10% Blueprints for gameplay logic and 90% C++ for performance-critical code. Indies can succeed with primarily Blueprint-based projects, adding C++ only when performance demands it.
JavaScript/TypeScript for WebXR
WebXR development uses standard web technologies: JavaScript or TypeScript with WebGL/WebGL2 rendering. TypeScript adds static typing to JavaScript, catching errors at compile time and improving code maintainability for larger projects.
Framework selection depends on developer background:
- A-Frame: HTML-based syntax for maximum accessibility, best for beginners with web development background
- Babylon.js: TypeScript-native with excellent WebXR Experience Helper, ideal for TypeScript developers
- Three.js: Most flexibility and control with the largest community but requires manual WebXR setup, suited for developers wanting fine-grained control
- React-XR (react-three-fiber): Built on Three.js, offers React components with modern hooks-based API, perfect for React developers
WebXR requires HTTPS (secure context) for security but enables:
- Deployment across desktop and mobile without app installation
- Instant updates without app store approval
- Future-proof compatibility with new hardware automatically supported
- Zero installation barrier for users
GDScript for Godot
GDScript is Godot's Python-like scripting language designed specifically for game development. It offers:
- Easy-to-read syntax similar to Python
- Tight integration with Godot's scene system
- Dynamic typing with optional static typing
- Built-in support for signals (Godot's event system)
Godot also supports C# for developers familiar with that language (same as Unity), though C# support doesn't yet extend to web exports. GDScript's gentler learning curve compared to C# or C++ makes it excellent for beginners.
6. Advanced Topics
Beyond basic VR development, several advanced topics enable more sophisticated experiences.
Cross-Platform VR Development with OpenXR
Developing for multiple VR platforms simultaneously maximizes audience reach. OpenXR provides the foundation for cross-platform VR, enabling single codebase deployment to Meta Quest, SteamVR, Pico, Windows Mixed Reality, HTC Vive, and most modern headsets. Apple Vision Pro remains the notable exception, requiring completely separate development using visionOS, Swift, and proprietary APIs.
Unity with XR Interaction Toolkit represents the most popular cross-platform approach (68% of XR developers use Unity). The framework provides unified VR/AR interactions across Quest, PSVR2, SteamVR, Pico, Windows MR, iOS, and Android. Native OpenXR plugin support enables single codebase building to 25+ platforms.
Cross-Platform Challenges
- Performance differences between Quest (mobile), PC VR, and PSVR2 require:
  - Quality settings and LOD systems with platform-specific presets
  - Adaptive rendering based on device capabilities
  - Separate optimization passes per platform
- Input differences across controllers, hand tracking, and eye tracking require:
  - Using OpenXR's unified action system
  - Abstracting input into gameplay actions rather than specific buttons
  - Testing on multiple devices
- Platform-specific features like passthrough MR (Quest/Pico) and eye tracking (PSVR2/Varjo) require:
  - Designing core experiences that work everywhere
  - Adding platform enhancements as optional features that gracefully degrade
- Store requirements vary significantly between platforms—read each platform's technical requirements
Publishing Platforms and Distribution
Meta Quest Store
Meta Quest Store provides access to the largest VR user base (20+ million Quest 2 users, growing with Quest 3 and 3S)[65].
Submission process:
1. Create developer account on Meta Developer Portal
2. Enable Developer Mode on Quest device
3. Build and package app meeting Quest requirements
4. Upload via Meta Quest Developer Hub
5. Pass Virtual Reality Check (VRC) guidelines
6. Wait 1-5 days for review (30-day waiting period for first titles)
Requirements include:
- Proper controller support and input handling
- Correct application manifest with appropriate permissions
- Correctly signed APK with secure keystore
- 72 Hz minimum performance (90 Hz recommended)
- VRC compliance covering packaging, performance, functionality, and assets
- High-quality branded assets (icons, screenshots, trailers)[66]
App Lab apps merged into main Meta Horizon Store as of August 2024, simplifying distribution.
SideQuest and App Lab
SideQuest serves as a third-party discovery platform and an official App Lab discovery channel for Quest. Its less rigorous approval process compared to the main Quest Store enables experimental content distribution. In 2023, 24 indie games graduated from SideQuest to the official Meta store, demonstrating its role as a proving ground for developers. SideQuest provides:
- Easier approval process for experimental content
- Direct APK sideloading support
- Community feedback before main store launch
- Alternative monetization options
Steam VR
Steam VR offers open PC VR publishing with:
- $100 app deposit fee (recoupable after $1,000 gross revenue)
- Support for Vive, Index, Rift, Windows MR, and Quest via Link
- 30% revenue share (decreasing to 25% after $10M, 20% after $50M)[67]
- No mandatory quality bar makes it more open than Quest Store
- Access to Steam's large PC gaming audience
- Built-in workshop support for user-generated content
Pico Store
Pico Store provides active and growing distribution with:
- Global presence with strong focus in China and Europe
- Developer support through Unity and Unreal SDKs
- OpenXR 1.1 conformance announced 2024
- Growing library of enterprise and consumer applications
PlayStation VR2
PlayStation VR2 publishing remains challenging for indie developers, requiring Sony partnership rather than self-service like Quest or Steam. Best suited for established studios with existing portfolios or publisher relationships. Sony provides developer kits and support but maintains stricter curation than other platforms.
Viveport
Viveport is HTC's app store offering both standalone purchases and a subscription service (Viveport Infinity) for VR titles. It provides an alternative distribution channel for HTC Vive and Focus devices.
Distribution strategy recommendation: Release on Quest first (largest audience), simultaneous Steam release for PC VR players, Pico for international reach, PSVR2 only if budget allows higher production values.
7. Getting Started: Your First VR Project
The path to VR development starts with choosing your technology stack based on your goals, budget, and existing skills. For most beginners, the recommended stack is Meta Quest 3S ($299) + Unity with C# + free learning resources + Blender for 3D assets. This combination minimizes cost while targeting the largest VR user base and leveraging the most extensive learning resources.
Getting Started with Unity
1. Download and install Unity Hub
2. Install Unity 6 (latest LTS version)
3. Create a new project using the VR template (or 3D Core template)
4. Install XR Interaction Toolkit via Package Manager
5. Configure your Quest 3S in Developer Mode (via Meta Quest mobile app)
6. Connect Quest to PC via USB-C cable or Air Link
7. Enable Android build support in Unity
8. Start with the VR template scene and explore interactions
Follow Unity Learn's VR Development Pathway for structured education. Supplement with Valem Tutorials on YouTube for practical examples. Use XR Device Simulator when headset isn't available, but test frequently on actual hardware for comfort and presence validation.
Alternative Paths
- Unreal Engine with Blueprints for superior graphics (ideal for architectural visualization and high-fidelity experiences)
- Godot for completely free open-source development with no licensing concerns whatsoever
- WebXR with A-Frame/Babylon.js for browser-based VR accessible without installation, perfect for web developers
First Project Checklist
Focus first on:
- Maintaining 90 FPS minimum consistently
- Implementing both teleportation and continuous locomotion with comfort options
- Designing UI that doesn't obstruct view and remains readable
- Physically testing interaction distances and reach while in the headset
- Implementing hand presence with clear visual feedback (controller models or hand models)
- Basic spatial audio for immersion
- Simple grabbable objects to learn interaction systems
Start small. Begin with a very simple scene or prototype (e.g., a single room with a cube you can pick up). VR development involves learning multiple disciplines (graphics, UX, code) at once, so focus on one mechanic at a time. Work at it step by step; once you have a few projects under your belt you'll be in a much better position to attack more complex problems.
Motion comfort should never be compromised—always prioritize user comfort over ambitious features. Test your application with multiple people if possible, as motion sensitivity varies significantly between individuals.
Community Resources
Join VR development communities:
- Reddit: r/virtualreality, r/Unity3D, r/unrealengine, r/oculus, r/VRdev
- Discord servers for specific engines and tools (Unity VR Discord, Unreal Slackers, etc.)
- Official forums from Meta, Unity, and Unreal Engine
- Local VR developer meetups and game jams
The VR development community actively helps beginners, and sharing early prototypes yields valuable feedback. Don't be afraid to ask questions—everyone started as a beginner.
Essential Skills Beyond Coding
VR development requires interdisciplinary skills:
- 3D math fundamentals: Vectors, quaternions, matrices for transformations
- User experience design: Understanding human perception and comfort in VR
- Performance profiling: Using engine tools to identify and fix bottlenecks
- 3D modeling basics: Even if using assets, understanding topology, UVs, and LODs helps
- Audio design: Spatial audio significantly impacts immersion
- Project management: Scope control prevents feature creep in solo/small team projects
The most important step is starting. Download Unity or Unreal today, follow the official VR template tutorials, and build something simple—a room with grabbable objects, a teleportation system, or a simple VR experience. Learning VR development requires experiencing VR development. The technology is more accessible than ever in 2024-2025, and the community is welcoming to newcomers.
See Also
- Developer Resource
- VR sickness
- Locomotion in virtual reality
- Unity (game engine)
- Unreal Engine
- OpenXR
References
- ↑ https://erikralston.medium.com/getting-started-with-vr-development-in-2024-20d22f05d5b9 - Getting Started with VR Development in 2024
- ↑ https://www.webmobril.com/vr-game-development-guide-for-2025/ - VR Game Development Guide for 2025
- ↑ https://www.xrtoday.com/mixed-reality/meta-quest-3s-price-features-and-specs/ - Meta Quest 3S: Price, Features, and Specs
- ↑ https://en.wikipedia.org/wiki/Meta_Quest_3 - Meta Quest 3 Wikipedia
- ↑ https://www.cnet.com/tech/gaming/best-vr-headset/ - Best VR Headset of 2024
- ↑ https://www.meta.com/quest/quest-pro/ - Meta Quest Pro
- ↑ https://www.techradar.com/computing/virtual-reality-augmented-reality/pico-4-ultra-vs-meta-quest-3-the-battle-of-the-best-mid-range-vr-headsets - Pico 4 Ultra vs Meta Quest 3
- ↑ https://www.picoxr.com/global/products/pico4-ultra - PICO 4 Ultra
- ↑ https://store.steampowered.com/valveindex - Valve Index VR Kit
- ↑ https://www.vive.com/us/product/vive-pro2/specs/ - VIVE Pro 2 Specs
- ↑ https://www.vive.com/us/product/vive-focus-vision/overview/ - VIVE Focus Vision
- ↑ https://en.wikipedia.org/wiki/PlayStation_VR2 - PlayStation VR2 Wikipedia
- ↑ https://blog.playstation.com/2024/06/03/playstation-vr2-players-can-access-games-on-pc-with-adapter-starting-on-august-7/ - PlayStation VR2 PC Adapter Announcement
- ↑ https://mrtv.co/2025/01/bigscreen-beyond-review-2025-still-worth-it-in-2025/ - Bigscreen Beyond Review 2025
- ↑ https://varjo.com/products - High-Resolution Virtual and Mixed Reality Headsets
- ↑ https://www.apple.com/apple-vision-pro/ - Apple Vision Pro
- ↑ https://www.slashgear.com/1624183/why-google-cardboard-vr-discontinued/ - Why Google Discontinued Cardboard VR
- ↑ https://investors.unity.com/news/news-details/2024/Unity-6-Will-Release-Globally-October-17-2024-Unity-Announces-at-Annual-Unite-Developer-Conference/ - Unity 6 Release Announcement
- ↑ https://unity.com/products/pricing-updates - Unity Pricing Updates
- ↑ https://docs.unity3d.com/Packages/[email protected]/manual/index.html - XR Interaction Toolkit Documentation
- ↑ https://unity.com/solutions/xr - Unity XR Solutions
- ↑ https://en.wikipedia.org/wiki/Unreal_Engine_5 - Unreal Engine 5 Wikipedia
- ↑ https://www.cgchannel.com/2024/10/epic-games-to-cut-royalty-rate-on-unreal-engine-games/ - Epic Games Reduces Unreal Engine Royalty Fee
- ↑ https://dev.epicgames.com/documentation/en-us/unreal-engine/vr-template-in-unreal-engine - VR Template in Unreal Engine
- ↑ https://godotengine.org/article/godot-xr-update-oct-2024/ - Godot XR Update October 2024
- ↑ https://www.w3.org/TR/webxr/ - WebXR Device API W3C Specification
- ↑ https://github.com/aframevr/aframe - A-Frame GitHub Repository
- ↑ https://immersiveweb.dev/ - Immersive Web Developer Resources
- ↑ https://developer.mozilla.org/en-US/docs/Glossary/Three_js - Three.js - MDN Glossary
- ↑ https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API - WebXR Device API - MDN
- ↑ https://www.khronos.org/openxr/ - OpenXR Official Website
- ↑ https://www.khronos.org/news/press/khronos-releases-openxr-1.1-to-further-streamline-cross-platform-xr-development - OpenXR 1.1 Release
- ↑ https://developers.meta.com/horizon/blog/oculus-all-in-on-openxr-deprecates-proprietary-apis/ - Meta All In on OpenXR
- ↑ https://www.khronos.org/news/permalink/android-xr-adopts-openxr - Android XR Adopts OpenXR
- ↑ https://developers.meta.com/horizon/blog/start-building-meta-quest-3s-3-launch-spatial-sdk-2D-mixed-reality/ - Start Building with Meta Quest 3 and 3S
- ↑ https://partner.steamgames.com/doc/features/steamvr - SteamVR Steamworks Documentation
- ↑ https://valvesoftware.github.io/steamvr_unity_plugin/ - SteamVR Unity Plugin
- ↑ https://github.com/ValveSoftware/openvr/wiki/SteamVR-Input - SteamVR Input
- ↑ https://learn.unity.com/pathway/vr-development - Unity Learn VR Development Pathway
- ↑ https://www.udemy.com/course/multiplayer-virtual-reality-vr-development-with-unity/ - Multiplayer Virtual Reality Development With Unity
- ↑ https://videos.feedspot.com/virtual_reality_youtube_channels/ - 100 Virtual Reality YouTubers
- ↑ https://www.blender.org/ - Blender Official Website
- ↑ https://www.autodesk.com/products/maya/overview - Autodesk Maya Pricing
- ↑ https://www.cgchannel.com/2024/09/maxon-releases-zbrush-2025/ - Maxon Releases ZBrush 2025
- ↑ https://www.maxon.net/en/zbrush-plan-options - ZBrush Plan Options
- ↑ https://all3dp.com/2/3d-modeling-vr-software-tool/ - The Best VR 3D Modeling Software
- ↑ https://www.adobe.com/products/substance3d/apps/modeler.html - 3D modeling software for 3D sculpting - Adobe Substance 3D
- ↑ https://www.realityscan.com/en-US/news/realityscan-20-new-release-brings-powerful-new-features-to-a-rebranded-realitycapture - RealityScan 2.0 Release
- ↑ https://www.agisoft.com/ - Agisoft Metashape Photogrammetry Software
- ↑ https://www.3dmag.com/reviews/photogrammetry-software/what-is-the-best-photogrammetry-software-in-2025/ - What is the best photogrammetry software in 2025?
- ↑ https://assetstore.unity.com/popular-assets/vr-assets-and-tools - Unity Asset Store VR Assets
- ↑ https://www.unrealengine.com/marketplace/en-US/ - Unreal Engine Marketplace (now Fab)
- ↑ https://www.turbosquid.com/ - TurboSquid
- ↑ https://developers.google.com/vr/reference/ios-ndk/group/audio - Google VR NDK for iOS: Audio
- ↑ https://valvesoftware.github.io/steam-audio/ - Steam Audio Official Documentation
- ↑ https://github.com/ValveSoftware/steam-audio - Steam Audio GitHub Repository
- ↑ https://developers.meta.com/horizon/documentation/unity/meta-xr-audio-sdk-features/ - Meta XR Audio SDK Features
- ↑ https://www.fmod.com/licensing - FMOD Licensing
- ↑ https://developers.meta.com/horizon/documentation/native/pc/dg-performance-guidelines/ - Meta VR Performance Guidelines
- ↑ https://vr.arvilab.com/blog/combating-vr-sickness-debunking-myths-and-learning-what-really-works - How to Avoid VR Motion Sickness
- ↑ https://developers.meta.com/horizon/design/mr-design-guideline/ - Meta VR Design Guidelines
- ↑ https://merge.rocks/blog/vr-ar-ux-design-immersive-ux-best-practices - VR/AR UX design. Immersive UX best practices
- ↑ https://www.meta.com/quest/quest-3/ - Meta Quest 3: Next-Gen Mixed Reality Headset
- ↑ https://www.khronos.org/OpenXR - OpenXR Overview
- ↑ https://developers.meta.com/horizon/resources/publish-upload-overview/ - Meta Quest Publishing Overview
- ↑ https://developers.meta.com/horizon/resources/asset-guidelines - Branded Asset and Marketing Media Guidelines
- ↑ https://partner.steamgames.com/steamdirect - Steam Direct
SDKs
SDKs (software development kits) are used to build VR apps. An app can implement OVR, OpenVR, or both, which gives it access to native functionality in the corresponding runtime. SDKs do not handle asynchronous timewarp or reprojection; those are handled by the runtime!
- OVR - Made by Oculus VR for the Oculus Rift. The current version (as of 14 May 2016) is 1.3.1 and can access all features of the Oculus Runtime.
- OpenVR - Made by Valve and supports Vive and Rift via the SteamVR Runtime.
Sidenote on SDKs and Unity games: Unity 5.3 currently has optimizations for VR in its native mode. The native mode supports the Rift, Gear VR and PSVR, but not SteamVR. A game compiled with Unity 5.3 can use those optimizations with the Oculus SDK but not with the OpenVR SDK. The OpenVR SDK has its own optimizations, which may or may not result in similar performance. However, the upcoming Unity 5.4 will support SteamVR natively, and performance should be more or less identical. Please note: this is Unity specific, and other engines may have similar or different optimizations for some or all headsets.
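To illustrate the Unity-specific note above, here is a rough sketch of how a Unity 5.4-era project could query which native VR device is loaded. It uses the legacy UnityEngine.VR namespace (renamed UnityEngine.XR in later versions); treat the exact property names as approximate for that era.

```csharp
using UnityEngine;
using UnityEngine.VR;   // legacy namespace; became UnityEngine.XR in Unity 2017+

// Rough sketch (Unity 5.4-era API): logs which device native VR mode has
// loaded, e.g. an Oculus or OpenVR/SteamVR device, or none at all.
public class VrDeviceInfo : MonoBehaviour
{
    private void Start()
    {
        if (VRSettings.enabled && VRDevice.isPresent)
        {
            Debug.Log("Native VR device: " + VRSettings.loadedDeviceName);
        }
        else
        {
            Debug.Log("No VR device active; running in desktop mode.");
        }
    }
}
```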
Runtimes
The two runtimes currently in use are:
- Oculus Runtime is responsible for asynchronous timewarp and handles device detection, display, etc. It (the runtime service) needs to be running for Oculus Home to launch.
- SteamVR Runtime is responsible for reprojection and supports Rift and Vive.
If you launch a game on Steam that supports the Rift natively (meaning it has been compiled against the Oculus SDK), you will get asynchronous timewarp and it will run exactly like the same game from Oculus Home. It won't use the SteamVR Runtime; it will use the Oculus Runtime. The only downside is that launching the game is more cumbersome.
If you buy a game that is compiled against the OpenVR SDK, then it can run on Vive and Rift. However, running on Rift uses both runtimes:
Rendered Image using the OpenVR SDK -> SteamVR Runtime -> Oculus Runtime -> Rift
The Oculus Runtime effectively thinks you are playing a game called "SteamVR", but the content of that "game" is actually the image the SteamVR Runtime received from the OpenVR-compiled game. This seems to work rather well, and personally I have not noticed any additional latency. In theory you would even get asynchronous timewarp, but that would only happen if SteamVR itself isn't responding in time. If an OpenVR game stalls completely, the SteamVR Runtime throws you back into the loading construct and continues to render at 90 fps. Since the SteamVR Runtime has this fail-safe mechanism and all the Oculus Runtime sees is "SteamVR", asynchronous timewarp will most likely never trigger.
App Stores
App Stores are distribution platforms for software.
- Oculus Home needs to be running for the Rift to work. By default it only supports apps from the Oculus Store (a checkbox in the settings of the 2D desktop client enables other sources). It downloads games and runs them, and it also handles the Universal Menu on the Xbox button.
- Steam/SteamVR technically does not need to be running when launching OpenVR games, but it is highly recommended (room setup and configuration are pulled from there). It also handles the overlay menu on the Xbox button; when running on the Rift, the overlay is launched by pressing the select/start button in the Oculus Universal Menu.
Example
Project Cars on Steam supports both OpenVR and OVR. The Oculus Home version supports only OVR as of today.
- Project Cars on Home + Rift -> will run natively
- Project Cars on Steam + Vive -> will run natively
- Project Cars on Steam + Rift + launched as Rift app -> will run natively
- Project Cars on Home + Vive -> won't run
- Project Cars on Steam + Rift + launched as SteamVR app -> will run via SteamVR
Wrappers and Injectors
Oculus Rift apps, or more precisely apps compiled with the OVR DLLs, do not render to the HMD directly. They render their images and send them to the runtime; the DLLs are the interface for that. Wrappers replace the contents of the DLLs with identically named functions that redirect the game's rendered image to a different runtime and, in turn, supply the game with the headset's rotation and position data.
- ReVive replaces the content of the dlls with functions that communicate with the SteamVR Runtime. The game thinks it's connected to a Rift, but it is not. The SteamVR runtime is rendering the game and supports reprojection, but not asynchronous timewarp (remember, this is a feature of the Oculus Runtime).
All Oculus games distributed through Oculus Home implement an entitlement check, meaning the game asks Oculus Home whether the user is allowed to play it. ReVive does not bypass this entitlement check, so games still need to be downloaded/purchased via Oculus Home. Games need to be started through Home once, or the Oculus Service needs to be restarted. Example of the entitlement check failing in Lucky's Tale: instead of the main menu, ReVive users see a worm on a log sticking its tongue out at you.
Sidenote: technically one could even write a wrapper that routes OpenVR calls directly to the Oculus Runtime, bypassing SteamVR altogether, but since SteamVR already handles this, there is little need for it.
- LibOVRWrapper installs DLLs into Windows that old Rift games need in order to communicate with the new runtime. It effectively translates old games to the new runtime. [unsure: Do these old games now also use timewarp, or do they need to specifically request it via an SDK call?]
- LibOVRWrapper + ReVive allows you to run old Rift games on Vive.
Resources
Oculus Sample Framework and Oculus Sample Framework for Unity 5[1] - Experimental sandbox for VR developers and designers.
- See also: Resources Directory
Hardware & User Experience
- User Experience (UX)
Engine-specific
Unreal Engine 4
- Unreal Engine YouTube Channel, full of tutorials, announcements, and stream recordings
- Unreal forums, and the Unreal VR development forum
- C++ (programming language for Unreal Engine 4)
- Blueprints Visual Scripting
- Video tutorials
- Mitch's VR Lab and his VR content example assets
- WTF is? Unreal Engine video tutorial series
- Pong in VR tutorial project, created by /u/Enter_the_Metaverse
- Creating a VR Space Combat Sim Without Code in UE4
Unity
- Unity3d Videos on YouTube, featuring many tutorials and panels
- Unity forums, and the Unity VR forum
- Unity tutorial section, has beginner and intermediate tutorials for using the engine.
- Microsoft Developer Network - Unity: Developing Your First Game with Unity and C#
- Unity manual
- C# (programming language for Unity)
- Unity and C# Video Tutorials by Sebastian Lague
- Unity tutorials for C#
- Unity documentation for C#
- Unity Coding Tips on YouTube
- Intro to C# programming and Scripting for Game in Unity, a $30 video tutorial series over at Udemy
- Introductory C# lessons with interactive code
- A handful of useful C# tutorials at SourceCodester
- Discussion forum for learning C# at Dream.In.Code
3D Models and Art Assets
3D Models
- TurboSquid 3D modelling community
- Sketchfab 3D modelling community
- Blender, a free and open source 3D modelling program
- OpenGameArt has a trove of mixed-quality 3D models free for use.
- NASA's 3D Resources gallery has a number of public domain, space-themed 3D models. The files are in a few different formats, including .FBX, .3DS, .OBJ, and the less familiar .STL (stereolithography), but converters are easily found across the web.
Texture Mapping
- Normal Map Technical Details at the Polycount Wiki
- Texture Maps Explained: Physically-Based Rendering Workflow
Audio
- Oculus's Introduction to VR Audio
- Audacity a free, simple-yet-powerful audio editor
- For stock audio, check out The Free Sound Project, the Free Bluezone SFX list, the #GameAudioGDC Bundle, and the Oculus Audio Pack
Mobile Development
- Nvidia AndroidWorks provides all the SDKs needed to port a Unity or UE4 application to Android
- Android Studio is Android's all-in-one suite for developing Android applications, and is useful for debugging and many other development tasks. It also installs the required SDKs.
- Google VR SDK for Unity (works for both Cardboard and Daydream)
- Google Cardboard Developer Overview, and the official Cardboard developer community on Google+
- Google Daydream Developer Overview
Additional VR/AR Platforms
Web VR
- Google web tools
- Mozilla A-Frame
- Google I/O 2016 Livestream - Building High Performance Daydream Apps
- Projects:
- WorkshopQuest, an online app that teaches core concepts of WebVR dev. From VR.Lab Brussels
Other
- Leap Motion forum, and their development forum
- Leap Motion best practices guide, generally useful for studying user experience
3D models
- Yobi3D - 3D model search engine: offers a 3D viewer to see search results in 3D. Also works with Cardboard.