The WebXR Device API has matured from an experimental specification into the backbone of browser-based immersive experiences. In 2026, every major headset ships with a WebXR-capable browser, and mobile AR through Chrome on Android reaches over two billion devices. This article breaks down the current state of the standard: what is stable, what is still in origin trial, and how to build production-grade AR and VR experiences that run without app stores.
The WebXR API Landscape in 2026
WebXR is not a single API but a family of modules. The core specification handles session management, reference spaces, and input sources. On top of that sit feature modules that unlock specific hardware capabilities. Understanding which modules are stable and which remain experimental is critical for planning production deployments.
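A session request makes this modular design concrete: features from stable modules can be declared required, while experimental ones are requested optionally so the session still starts on hardware that lacks them. A minimal sketch (buildSessionInit and startArSession are illustrative names; the navigator.xr calls and feature strings are the standard API):

```javascript
// Build the XRSessionInit descriptor: features the app cannot run without
// go in requiredFeatures; nice-to-haves go in optionalFeatures so the
// session still starts on devices that lack them.
function buildSessionInit() {
  return {
    requiredFeatures: ['local-floor', 'hit-test'],
    optionalFeatures: ['anchors', 'light-estimation', 'hand-tracking'],
  };
}

// Check support first, then request the session (browser-only code).
async function startArSession() {
  if (!('xr' in navigator)) throw new Error('WebXR not available');
  if (!(await navigator.xr.isSessionSupported('immersive-ar'))) {
    throw new Error('immersive-ar sessions not supported on this device');
  }
  return navigator.xr.requestSession('immersive-ar', buildSessionInit());
}
```

If a required feature is unavailable, requestSession rejects outright, so reserve requiredFeatures for capabilities the experience genuinely cannot function without.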
- Core session and input: Fully stable across Chrome, Edge, Wolvic (the successor to Firefox Reality), Meta Quest Browser, and Safari on Vision Pro. The immersive-vr and immersive-ar session types, gamepad-based input, and transient input for screen taps are all production-ready.
- Hit testing: The WebXR Hit Test Module is stable on Chrome Android and Quest. It allows raycasting against real-world surfaces to place virtual objects accurately. Essential for any AR product placement experience.
- Anchors and persistent anchors: The Anchors Module lets you pin virtual content to real-world locations that stay fixed as the device refines its understanding of the world from frame to frame. Persistent anchors extend this across sessions, so users can return and find their placed objects where they left them.
- Light estimation: Available on Chrome Android and Quest, this module provides ambient light intensity and spherical harmonics data, allowing virtual objects to match real-world lighting for more convincing compositing.
- Hand tracking: Stable on Quest browsers and Vision Pro Safari. Provides 25-joint hand skeleton data per hand, enabling gesture-based interaction without controllers.
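The hit-testing and anchoring workflow above can be sketched as follows. This is a minimal illustration, assuming a session created with the hit-test feature; setupHitTest, firstHit, and onXRFrame are hypothetical helper names, while requestHitTestSource, getHitTestResults, and getPose are the module's actual methods:

```javascript
// Results are ordered nearest-first; take the closest surface hit, or
// null when the ray met no real-world surface this frame.
function firstHit(hitTestResults) {
  return hitTestResults.length > 0 ? hitTestResults[0] : null;
}

// Create a hit-test source that casts from the viewer (browser-only code).
async function setupHitTest(session) {
  const viewerSpace = await session.requestReferenceSpace('viewer');
  return session.requestHitTestSource({ space: viewerSpace });
}

// Called every XR frame: position a reticle or model on the nearest surface.
function onXRFrame(frame, hitTestSource, refSpace, placeObjectAt) {
  const hit = firstHit(frame.getHitTestResults(hitTestSource));
  if (hit) {
    const pose = hit.getPose(refSpace);
    if (pose) placeObjectAt(pose.transform);
    // With the Anchors Module, hit.createAnchor() would pin content here
    // so it survives tracking refinements across frames.
  }
}
```
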
Browser Support and Device Matrix
The fragmentation story has improved significantly. Chrome on Android supports immersive-ar sessions with ARCore providing the tracking layer. Meta Quest Browser, built on Chromium, supports both immersive-vr and passthrough AR. Apple's Safari on Vision Pro joined the WebXR ecosystem with solid immersive-vr support and partial immersive-ar features. The notable gap remains Safari on iOS — Apple still routes AR experiences through Quick Look rather than WebXR, meaning iPhone users cannot access WebXR AR sessions through the browser.
For production projects, this means you need a dual strategy: WebXR for Android and headsets, and USDZ Quick Look links for iOS. Libraries like model-viewer abstract this decision, automatically routing users to the appropriate experience based on device capability detection via navigator.xr.isSessionSupported().
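A sketch of that routing decision, with the capability check separated from the pure branching logic so the latter can be tested off-device (chooseArPath and detectArPath are hypothetical names; navigator.xr.isSessionSupported() is the real API):

```javascript
// Pure routing logic: given the two facts that matter, pick a delivery path.
function chooseArPath({ webxrArSupported, isIOS }) {
  if (webxrArSupported) return 'webxr';         // Android Chrome, headsets
  if (isIOS) return 'quick-look';               // USDZ via an <a rel="ar"> link
  return 'model-viewer-fallback';               // inline 3D viewer, no immersion
}

// Browser-side wrapper that feeds real capability data into the logic above.
async function detectArPath() {
  const webxrArSupported =
    'xr' in navigator &&
    (await navigator.xr.isSessionSupported('immersive-ar'));
  const isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent);
  return chooseArPath({ webxrArSupported, isIOS });
}
```

Keeping the branch logic pure makes it straightforward to unit-test every device class without a headset or phone in the loop.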
Building with Three.js, A-Frame, and Babylon.js
You rarely interact with the raw WebXR API directly. The three dominant frameworks each take a different approach to abstraction. Three.js provides a WebXRManager that handles session lifecycle and integrates XR input into its existing scene graph — ideal if you already have a Three.js codebase. A-Frame offers a declarative HTML approach where adding an ar-hit-test component to an entity is sufficient to enable surface-placed AR, making it accessible to developers more comfortable with markup than shader code.
Babylon.js has invested heavily in WebXR with its WebXR Experience Helper, which provides turnkey setup for both AR and VR sessions with teleportation, hand tracking, and near-interaction built in. For enterprise projects that need inspector tooling and a visual editor, Babylon's ecosystem is compelling. All three frameworks render through WebGL 2 or WebGPU, with WebGPU adoption accelerating as Chrome and Edge ship stable implementations.
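Wiring Three.js into an XR session is mostly a matter of handing control to its WebXRManager. A minimal sketch, assuming a standard Three.js install with the examples addons (the ARButton import path follows the three.js addons layout; supportsWebGPU is an illustrative helper, not a three.js API):

```javascript
// Enable WebXR on an existing Three.js renderer (browser-only code).
async function enableXr(renderer, scene, camera) {
  const { ARButton } = await import('three/addons/webxr/ARButton.js');
  renderer.xr.enabled = true; // hand session lifecycle to WebXRManager
  document.body.appendChild(
    ARButton.createButton(renderer, { requiredFeatures: ['hit-test'] })
  );
  // Use setAnimationLoop rather than requestAnimationFrame so the render
  // loop follows the XR device's refresh rate once a session starts.
  renderer.setAnimationLoop(() => renderer.render(scene, camera));
}

// Simple capability probe for the WebGPU rendering path mentioned above.
function supportsWebGPU(nav) {
  return !!nav && 'gpu' in nav;
}
```
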
Performance Considerations for WebXR
Immersive experiences are unforgiving of frame drops. A VR session must sustain 72-90 fps depending on the headset, and AR sessions need a consistent 60 fps on mobile. Key optimisations include:
- Draw call budgets: Aim for under 100 draw calls per frame on mobile AR. Use instanced rendering for repeated geometry and merge static meshes where possible. Three.js BatchedMesh and Babylon's mesh merging utilities help enormously.
- Texture budgets: Limit total GPU texture memory to under 128 MB on mobile. Use KTX2 compressed textures with Basis Universal transcoding. Load textures progressively — low-resolution mipmaps first, then swap in full resolution once the session stabilises.
- Foveated rendering: On Quest and Vision Pro, leverage the browser's built-in foveated rendering to reduce pixel-shading load at the periphery. The compositor handles this automatically, and you can tune the trade-off by setting the fixedFoveation attribute (0 to 1) on the session's XRWebGLLayer.
- Asset streaming: Do not load every model upfront. Stream assets on demand based on user gaze or proximity triggers. Use glTF extensions like EXT_meshopt_compression for compact, fast-decoding mesh delivery.
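Budgets like these are easiest to enforce with a simple accounting pass at build or load time. A back-of-envelope sketch (estimateTextureBytes and withinMobileBudget are hypothetical helpers, assuming 4 bytes per texel for uncompressed RGBA8 and roughly one-third overhead for a full mip chain; KTX2-compressed textures would use far less):

```javascript
// Estimate GPU memory for one texture. A full mip chain adds roughly one
// third on top of the base level (1 + 1/4 + 1/16 + ... ≈ 4/3).
function estimateTextureBytes(width, height, { mipmapped = true, bytesPerTexel = 4 } = {}) {
  const base = width * height * bytesPerTexel;
  return mipmapped ? Math.ceil((base * 4) / 3) : base;
}

// Check a texture set against the 128 MB mobile ceiling from the text.
function withinMobileBudget(textures, budgetBytes = 128 * 1024 * 1024) {
  const total = textures.reduce(
    (sum, t) => sum + estimateTextureBytes(t.width, t.height, t),
    0
  );
  return { total, ok: total <= budgetBytes };
}
```

Running a check like this in CI catches budget regressions before they reach a headset, where the symptom would be dropped frames rather than a clear error.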
WebXR as a Production Platform
WebXR has crossed the threshold from experiment to viable production platform. The combination of broad device reach, stable core APIs, and mature framework support means businesses can deliver immersive experiences without the cost and friction of native app development. Born Digital builds WebXR experiences using Three.js and A-Frame, from product configurators to virtual showrooms. If you are evaluating browser-based AR or VR for your business, our team can advise on the right architecture for your device targets and performance requirements.