WebXR
WebXR | |
---|---|
File:WebXR logo.svg | |
Information | |
Type | Virtual Reality and Augmented Reality |
Subtype | Web API |
Creator | Immersive Web Working Group and Immersive Web Community Group |
Developer | W3C |
Manufacturer | N/A |
Operating System | Cross-platform |
Browser | Google Chrome 79+, Microsoft Edge 79+, Firefox, Opera 66+, Samsung Internet 12+, Meta Quest Browser, Safari (visionOS) |
Devices | VR headsets, AR headsets, AR-enabled smartphones |
Accessories | Motion controllers, Gamepads, Hand tracking devices |
Release Date | 2018 (initial specification); latest Candidate Recommendation Draft April 17, 2025 |
Price | Free Web Standard |
Website | https://immersiveweb.dev/ |
WebXR Device API (commonly known as WebXR) is a Web API developed by the World Wide Web Consortium (W3C) that provides interfaces for accessing virtual reality (VR) and augmented reality (AR) devices on the web. WebXR enables developers to create immersive experiences that work across a wide range of hardware platforms, including head-mounted displays, mobile AR devices, and desktop environments with appropriate peripherals, all without requiring plugins or specialized applications.[1]
The API allows web applications to detect compatible VR/AR devices, query their capabilities, render 3D scenes to the devices at the appropriate frame rate, and respond to input from associated controllers. WebXR represents an evolution from the earlier WebVR API, expanding the scope to include augmented reality and other immersive technologies under the "XR" (Extended Reality) umbrella.[2]
History
WebVR: The Predecessor
The development of WebXR began with its predecessor, WebVR. The WebVR API was first conceived in spring 2014 by Vladimir Vukićević from Mozilla. Key contributors to the early API included Brandon Jones from Google, Boris Smus, and other members of the Mozilla team.[3]
On March 1, 2016, the Mozilla VR team and the Google Chrome team announced the WebVR 1.0 release. This early version of the API was implemented in Firefox and Chromium-based browsers, providing basic virtual reality functionality to web applications.
Transition to WebXR
As the technology evolved, developers recognized the need for a more scalable and ergonomic API, even at the cost of breaking backward compatibility with WebVR. Initially referred to as "WebVR 2.0," the new API was officially renamed WebXR to acknowledge its expanded scope covering both VR and AR content.[4]
The transition benefited from:
- Experience gained from the WebVR implementation
- A more mature landscape of immersive computing devices
- The emergence of both mobile and headset AR technologies
- Multiple mature native APIs to draw inspiration from
On September 24, 2018, the Immersive Web Working Group became official, formalizing the development process for WebXR standards.[5] WebXR was designed to completely replace WebVR, with all browsers that initially shipped WebVR committing to adopt WebXR once the API design was finalized.
Recent Developments
The WebXR standard continues to evolve, with key milestones including:
- February 5, 2019 – First Public Working Draft published.[6]
- October 31, 2019 – Google Chrome 79 Beta ships WebXR for VR by default.[7]
- March 2020 – Chrome 81 adds the first stable AR sessions and the WebXR Hit Test Module.[8]
- March 31, 2022 – First Candidate Recommendation snapshot, followed by continuing Candidate Recommendation drafts through 2025.[6]
- December 2019 to 2024 – Adoption in Meta Quest Browser 7.0, Chromium-based Microsoft Edge, and eventually Safari for visionOS.[9]
The editors of the specification currently come from major technology companies including Google and Meta, with additional input from Mozilla, Microsoft, Samsung Electronics, Apple, as well as various startups and invited experts.[3]
Purpose and Goals
The primary goal of WebXR is to enable developers to create and distribute compelling, comfortable, and safe immersive experiences directly via the web.[10] Key advantages include:
- Accessibility: Users can access XR content through a URL click or QR code scan in their browser, removing the need to download and install separate applications from app stores. This significantly lowers the barrier to entry.
- Cross-Platform Compatibility: A single WebXR application can potentially run on a wide variety of hardware, including high-end VR headsets, AR glasses, smartphones, and even standard desktop browsers (for non-immersive "inline" viewing).
- Ease of Development: Leverages existing web technologies (HTML, CSS, JavaScript, WebGL), making it accessible to millions of web developers.
- Instant Updates: Developers can update experiences seamlessly, just like updating any website, without requiring users to update installed apps.
- Integration with web ecosystem: Works with existing web technologies and services.
- Future-Proofing: Experiences built on the standard should continue to work on new hardware as it emerges, without needing major code rewrites.
Technical Overview
Core Concepts
XR Session Modes
WebXR supports different modes of operation:
- `inline` - Renders XR content within an HTML element on a web page
- `immersive-vr` - Provides an exclusive, fully immersive VR experience
- `immersive-ar` - Blends virtual content with the real-world environment[11]
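For illustration, a page can probe which modes are available before offering an "Enter VR" or "Enter AR" button. A minimal sketch using the standard `navigator.xr.isSessionSupported()` call might look like this:

```javascript
// Inline sessions are always available; immersive modes depend on the hardware
// and user agent, so query them before showing an "Enter VR/AR" button.
if (navigator.xr) {
  for (const mode of ['immersive-vr', 'immersive-ar']) {
    navigator.xr.isSessionSupported(mode).then((supported) => {
      console.log(`${mode}: ${supported ? 'available' : 'not available'}`);
    });
  }
}
```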
Reference Spaces
WebXR uses reference spaces to define coordinate systems:
- `viewer` - Coordinates relative to the user's head or device
- `local` - A stationary coordinate system with its origin near the user's starting position
- `local-floor` - Like `local`, but with Y = 0 at floor level
- `bounded-floor` - A floor-relative space with defined boundaries
- `unbounded` - A space for world-scale AR experiences[1]
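As a sketch, an application that prefers a floor-aligned space but degrades gracefully might request reference spaces as follows; this assumes `local-floor` was listed among the session's requested features and that `session` is an active `XRSession`:

```javascript
// Prefer a floor-aligned coordinate system; fall back to 'local' (origin near
// the viewer's initial head position) if floor-level tracking is unavailable.
async function getReferenceSpace(session) {
  try {
    return await session.requestReferenceSpace('local-floor');
  } catch (err) {
    return session.requestReferenceSpace('local');
  }
}
```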
Rendering Process
At its most basic level, WebXR rendering works by:
1. Computing the perspective for each eye's viewpoint
2. Rendering the scene from each eye's position
3. Delivering the combined framebuffer to the XR device for display
The API handles the complex timing and scheduling required for comfortable XR experiences, but does not directly manage 3D assets or perform rendering—that responsibility falls to WebGL or other graphics libraries.[12]
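A minimal sketch of such a per-frame loop is shown below; the WebGL context `gl`, the reference space `xrRefSpace`, and the `drawScene()` helper are placeholders assumed to be set up elsewhere, not part of the API:

```javascript
// One callback per display frame: query the viewer pose, then render the
// scene once per XRView (typically one per eye on a headset).
function onXRFrame(time, frame) {
  const session = frame.session;
  session.requestAnimationFrame(onXRFrame); // schedule the next frame

  const pose = frame.getViewerPose(xrRefSpace);
  if (!pose) return; // tracking can be temporarily lost

  const glLayer = session.renderState.baseLayer;
  gl.bindFramebuffer(gl.FRAMEBUFFER, glLayer.framebuffer);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

  for (const view of pose.views) {
    const viewport = glLayer.getViewport(view);
    gl.viewport(viewport.x, viewport.y, viewport.width, viewport.height);
    // Render with this view's projection matrix and the inverse of its
    // transform (the view matrix).
    drawScene(view.projectionMatrix, view.transform.inverse.matrix);
  }
}
```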
Input Handling
WebXR supports various input mechanisms:
- Motion controllers (through the WebXR Gamepads Module)
- Hand tracking
- Gaze-based input methods
- Session-specific events (select, squeeze, etc.)[13]
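The sketch below illustrates two common patterns, assuming an active `XRSession` named `session` and a reference space `xrRefSpace` (both placeholders):

```javascript
// React to the primary action ("select") from any input source.
session.addEventListener('select', (event) => {
  const source = event.inputSource;
  // Pose of the input's target ray at the moment of selection.
  const rayPose = event.frame.getPose(source.targetRaySpace, xrRefSpace);
  if (rayPose) {
    console.log(`select from ${source.handedness} ${source.targetRayMode} input`);
  }
});

// Enumerate currently connected input sources (controllers, tracked hands, etc.).
for (const source of session.inputSources) {
  if (source.gamepad) {
    // Buttons, triggers, and thumbsticks via the WebXR Gamepads Module.
  }
  if (source.hand) {
    // Per-joint poses via the WebXR Hand Input Module.
  }
}
```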
API Architecture
The WebXR Device API is organized into several key interfaces:
Interface | Description |
---|---|
XRSystem | Main entry point for the WebXR API, accessed via `navigator.xr` |
XRSession | Represents an active XR session, managing the presentation loop |
XRFrame | Provides information about a single frame to be rendered |
XRView | Represents a single view to be rendered (typically one per eye) |
XRViewport | Defines the rectangular area of the output canvas |
XRReferenceSpace | Defines spatial relationship to the user's environment |
XRPose | Contains position and orientation information |
XRInputSource | Represents input devices like controllers or hands |
Modular Structure
The WebXR specification is designed to be modular, with the core WebXR Device API providing fundamental functionality and additional modules extending its capabilities:
Core Module
- **WebXR Device API** - Provides basic session management, device detection, and rendering capabilities[1]
Extension Modules
Module | Purpose | Latest status |
---|---|---|
WebXR Device API | Core sessions, rendering, input | Candidate Recommendation (17 Apr 2025)[6] |
WebXR AR Module | Adds augmented reality support through the "immersive-ar" session mode | Candidate Recommendation (25 Apr 2025)[11] |
WebXR Gamepads Module | Provides interfaces for button, trigger, thumbstick, and touchpad input | Working Draft[13] |
WebXR Hand Input Module | Enables hand tracking functionality | Working Draft (5 Jun 2024)[14] |
WebXR Layers API | Multi-layer composition for performance and UI | Working Draft (4 Apr 2025)[15] |
WebXR Hit Test Module | Ray-casting for real-world surface detection (AR) | Working Draft (11 Jun 2024)[16] |
WebXR Lighting Estimation Module | Allows AR applications to match virtual lighting with real-world conditions | Working Draft[17] |
WebXR Anchors Module | Creating anchors to attach virtual objects to specific points in the real world | Working Draft[18] |
WebXR Depth Sensing Module | Accessing depth information about the real-world environment for occlusion and interaction | Working Draft[19] |
WebXR DOM Overlays Module | Displaying standard HTML content overlaid on the immersive view | Working Draft[20] |
This modular approach allows the WebXR standard to evolve and expand while maintaining compatibility with existing implementations.
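In practice, most modules surface as feature descriptors passed when a session is requested. The following sketch shows an AR session that requires floor-level tracking and opportunistically enables hit testing, a DOM overlay, and hand tracking; the element IDs `enter-ar` and `overlay` are hypothetical:

```javascript
document.getElementById('enter-ar').addEventListener('click', async () => {
  // Immersive sessions must be requested from a user gesture such as a click.
  const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['local-floor'],
    optionalFeatures: ['hit-test', 'dom-overlay', 'hand-tracking'],
    domOverlay: { root: document.getElementById('overlay') },
  });
  // Optional features actually granted appear in session.enabledFeatures.
  console.log(session.enabledFeatures);
});
```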
How it Works
The WebXR Device API acts as a bridge between the web browser and the XR hardware.[12] A typical WebXR application follows these steps:
1. **Device Detection:** The application uses `navigator.xr.isSessionSupported()` to check whether the desired mode (e.g., `immersive-vr`, `immersive-ar`, `inline`) is supported by the available hardware.
2. **Session Request:** If supported, the application requests an `XRSession` using `navigator.xr.requestSession()`, specifying required or optional features. This often requires user permission, especially for immersive modes that access sensors and cameras.
3. **Reference Spaces:** Once a session starts, the application requests reference spaces to establish coordinate systems for tracking.
4. **Render Loop:** The application uses `session.requestAnimationFrame()` to create a render loop. Each frame callback receives a timestamp and an `XRFrame` object, from which the current `XRViewerPose` can be queried.
5. **Pose and Views:** The `XRViewerPose` provides the position and orientation of the viewer (headset) and contains multiple `XRView` objects (usually one per eye in VR). Each `XRView` carries the projection and view matrices needed to render the scene correctly from that eye's perspective.
6. **Rendering:** The application uses WebGL (often via a library) to draw the 3D scene for each `XRView` into an `XRWebGLLayer` associated with the session.
7. **Input Handling:** The API provides information about connected input devices (`XRInputSource`), such as controllers, including their pose and button/trigger states.
8. **AR Compositing:** In AR mode, the browser/runtime typically handles compositing the WebGL-rendered content with the live camera feed; the web application usually does not get direct access to the camera image, for privacy reasons.
A basic VR setup might look like this:
```javascript
if ('xr' in navigator) {
  navigator.xr.requestSession('immersive-vr').then((session) => {
    // Set up WebGL and the rendering loop
    // ...
  });
}
```
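Filling in the elided setup, a more complete sketch of steps 2–4 might look like the following; `canvas` is an existing `<canvas>` element and `onXRFrame` is a frame callback like the one sketched under Rendering Process (both assumptions, not prescribed by the API):

```javascript
async function startVR(canvas) {
  // Step 2: request the session (must be triggered by a user gesture).
  const session = await navigator.xr.requestSession('immersive-vr');

  // Create an XR-compatible WebGL context and use it as the session's base layer.
  const gl = canvas.getContext('webgl', { xrCompatible: true });
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  // Step 3: establish a coordinate system for tracking.
  const xrRefSpace = await session.requestReferenceSpace('local');
  // (xrRefSpace and gl would typically be stored where the frame callback can reach them.)

  // Step 4: start the render loop; onXRFrame re-registers itself each frame.
  session.requestAnimationFrame(onXRFrame);
}
```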
References
- ↑ 1.0 1.1 1.2 World Wide Web Consortium. "WebXR Device API." W3C. https://www.w3.org/TR/webxr/
- ↑ Mozilla Developer Network. "WebXR Device API." MDN Web Docs. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API
- ↑ 3.0 3.1 Wikipedia. "WebXR." https://en.wikipedia.org/wiki/WebXR
- ↑ Immersive Web Working Group. "WebXR Device API Explained." https://immersive-web.github.io/webxr/explainer.html
- ↑ DuHoc. "Webxr: History, Design, Support." https://www.duhoctrungquoc.vn/wiki/en/WebXR
- ↑ 6.0 6.1 6.2 World Wide Web Consortium. "WebXR Device API" Candidate Recommendation Draft history, 17 April 2025.
- ↑ Chromium Blog. "Chrome 79 Beta: Virtual Reality Comes to the Web," 31 Oct 2019.
- ↑ Chromium Blog. "Chrome 81: Near Field Communications, Augmented Reality, and more," Feb 2020.
- ↑ Meta Quest. "Meta Quest Browser – Version history," 2024.
- ↑ Immersive Web. "Immersive Web Developer Home." https://immersiveweb.dev/
- ↑ 11.0 11.1 World Wide Web Consortium. "WebXR Augmented Reality Module - Level 1." W3C. https://www.w3.org/TR/webxr-ar-module-1/
- ↑ 12.0 12.1 Mozilla Developer Network. "Fundamentals of WebXR." MDN Web Docs. https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API/Fundamentals
- ↑ 13.0 13.1 Immersive Web Working Group. "WebXR Gamepads Module - Level 1." https://immersive-web.github.io/webxr-gamepads-module/
- ↑ W3C. "WebXR Hand Input Module." Working Draft, 5 Jun 2024.
- ↑ W3C. "WebXR Layers API Level 1" Working Draft, 4 Apr 2025.
- ↑ W3C. "WebXR Hit Test Module Level 1" Working Draft, 11 Jun 2024.
- ↑ World Wide Web Consortium. "WebXR Lighting Estimation API Level 1." W3C. https://www.w3.org/standards/history/webxr-lighting-estimation-1/
- ↑ W3C. "WebXR Anchors Module." Working Draft, 2024.
- ↑ W3C. "WebXR Depth Sensing Module." Working Draft, 2024.
- ↑ W3C. "WebXR DOM Overlays Module." Working Draft, 2024.