WebXR Augmented Reality Module - Level 1

Editor’s Draft

This version:
https://immersive-web.github.io/webxr-ar-module/
Latest published version:
https://www.w3.org/TR/webxr-ar-module-1/
Previous Versions:
Issue Tracking:
GitHub
Inline In Spec
Editors:
(Google)
(Amazon [Microsoft until 2018])
(Mozilla)
Participate:
File an issue (open issues)
Mailing list archive
W3C’s #immersive-web IRC
Unstable API

The API represented in this document is under development and may change at any time.

For additional context on the use of this API please reference the WebXR Augmented Reality Module Explainer.


Abstract

The WebXR Augmented Reality module expands the WebXR Device API with the functionality available on AR hardware.

Status of this document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.

This document was published by the Immersive Web Working Group as an Editors' Draft. This document is intended to become a W3C Recommendation. Feedback and comments on this specification are welcome. Please use GitHub issues. Discussions may also be found in the public-immersive-web@w3.org archives.

Publication as an Editors' Draft does not imply endorsement by the W3C Membership. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 1 March 2019 W3C Process Document.

This WebXR Augmented Reality Module is designed to be implemented in addition to the WebXR Device API. Its functionality was originally part of the WebXR Device API, which has since been divided into a core specification and modules.

1. Introduction

Hardware that enables Virtual Reality (VR) and Augmented Reality (AR) applications is now broadly available to consumers, offering an immersive computing platform with both new opportunities and challenges. The ability to interact directly with immersive hardware is critical to ensuring that the web is well equipped to operate as a first-class citizen in this environment. The WebXR Augmented Reality module expands the functionality available to developers when their code is running on AR hardware.

1.1. Terminology

Augmented Reality describes a class of XR experiences in which virtual content is aligned and composed with the real-world environment before being displayed to users.

XR hardware can be divided into categories based on display technology; this category, together with the session mode, determines how the XR Compositor blends rendered content with the real-world environment (see § 2.2 XREnvironmentBlendMode).

2. WebXR Device API Integration

2.1. XRSessionMode

The WebXR Device API categorizes XRSessions based on their XRSessionMode. This module expands the allowable values of the XRSessionMode enum to include "immersive-ar".

enum XRSessionMode {
  "inline",
  "immersive-vr",
  "immersive-ar"
};

A session mode of "immersive-ar" indicates that the session’s output will be given exclusive access to the immersive XR device display and that content is intended to be blended with the real-world environment.

The definition of immersive session is also expanded to include sessions with a mode of "immersive-ar". On compatible hardware, user agents MAY support "immersive-vr" sessions, "immersive-ar" sessions, or both. Support for the additional "immersive-ar" session mode does not change the requirement that user agents MUST support "inline" sessions.

NOTE: This means that "immersive-ar" sessions support all the features and reference spaces that "immersive-vr" sessions do, since both are immersive sessions.

The following code checks to see if "immersive-ar" sessions are supported.
navigator.xr.supportsSession('immersive-ar').then(() => {
  // 'immersive-ar' sessions are supported.
  // Page should advertise AR support to the user.
});

The following code attempts to retrieve an "immersive-ar" XRSession.
let xrSession;

navigator.xr.requestSession("immersive-ar").then((session) => {
  xrSession = session;
});
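
Because user agents MAY support "immersive-vr" sessions, "immersive-ar" sessions, or both, but MUST support "inline" sessions, a page may wish to fall back to an inline experience when AR is unavailable. The following non-normative sketch shows one such pattern; onSessionStarted() is a hypothetical application callback.
navigator.xr.supportsSession('immersive-ar')
  .then(() => navigator.xr.requestSession('immersive-ar'))
  // Fall back to an 'inline' session if 'immersive-ar' is unsupported
  // or the request is rejected.
  .catch(() => navigator.xr.requestSession('inline'))
  .then((session) => onSessionStarted(session));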

2.2. XREnvironmentBlendMode

When drawing XR content, it is often useful to understand how the rendered pixels will be blended by the XR Compositor with the real-world environment.
enum XREnvironmentBlendMode {
  "opaque",
  "alpha-blend",
  "additive"
};

partial interface XRSession {
  // Attributes
  readonly attribute XREnvironmentBlendMode environmentBlendMode;
};

The environmentBlendMode attribute MUST report the XREnvironmentBlendMode value that matches the blend technique currently being performed by the XR Compositor.
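
For example (non-normatively), a renderer might choose its clear color based on the reported blend technique. The following sketch assumes that session is an active XRSession and that gl is the WebGL context backing its baseLayer.
switch (session.environmentBlendMode) {
  case 'opaque':
    // Nothing shows through; the page is responsible for the full background.
    gl.clearColor(0.2, 0.2, 0.3, 1.0);
    break;
  case 'alpha-blend':
    // Transparent pixels reveal the captured real-world environment.
    gl.clearColor(0.0, 0.0, 0.0, 0.0);
    break;
  case 'additive':
    // Black pixels are effectively invisible on additive (see-through) displays.
    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    break;
}
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);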

2.3. XR Compositor Behaviors

When presenting content to the XR device, the XR Compositor MUST apply the appropriate blend technique to combine virtual pixels with the real-world environment. The appropriate technique is determined based on the XR device's display technology and the session's mode.

NOTE: When using a device that performs alpha-blend environment blending, use of a baseLayer with no alpha channel will result in the real-world environment being completely obscured. It should be assumed that this is intentional on the part of the developer, and the user agent may wish to suspend compositing of the real-world environment as an optimization in such cases.
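
Conversely, content that intends for the real-world environment to show through on an alpha-blend device needs a baseLayer whose color buffer includes an alpha channel. The following non-normative sketch assumes an existing canvas element, an active "immersive-ar" session, and the XRWebGLLayer and updateRenderState() APIs of current WebXR Device API drafts.
// Request a WebGL context that is XR compatible and has an alpha channel.
const gl = canvas.getContext('webgl', { xrCompatible: true, alpha: true });

// Create a layer whose color buffer also has an alpha channel, so that
// pixels left transparent reveal the real-world environment.
session.updateRenderState({
  baseLayer: new XRWebGLLayer(session, gl, { alpha: true })
});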

The XR Compositor MAY make additional color or pixel adjustments to optimize the experience. The timing of composition MUST NOT depend on the blend technique or the source of the real-world environment. The XR Compositor MUST NOT perform occlusion based on pixel depth relative to real-world geometry; rendered content MUST be composited on top of the real-world background.

NOTE: Future modules may enable automatic or manual pixel occlusion with the real-world environment.

The XR Compositor MUST NOT automatically grant the page access to any additional information such as camera intrinsics, media streams, real-world geometry, etc.

NOTE: Developers may request access to an XR Device's camera, should one be exposed through the existing Media Capture and Streams specification (https://www.w3.org/TR/mediacapture-streams/). However, doing so does not provide a mechanism to query the XRRigidTransform between the camera’s location and the native origin of the viewer reference space. It also does not provide a guaranteed way to determine the camera intrinsics necessary to match the view of the real-world environment. As such, implementing effective computer vision algorithms will be significantly hampered. Future modules or specifications may enable such functionality.
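
For illustration, the non-normative sketch below requests camera access through Media Capture and Streams; nothing in the resulting MediaStream relates the camera to the viewer reference space or exposes its intrinsics.
navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' } })
  .then((stream) => {
    // The stream can be displayed or processed, but it carries no
    // XRRigidTransform to the viewer space and no camera intrinsics.
    const video = document.createElement('video');
    video.srcObject = stream;
    video.play();
  });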

3. Security, Privacy, and Comfort Considerations

3.1. Protected functionality

Issue: Add information about new threats and mitigations introduced by functionality in this module.

4. Acknowledgements

The following individuals have contributed to the design of the WebXR Device API specification:

And a special thanks to Vladimir Vukicevic (Unity) for kick-starting this whole adventure!

Index

Terms defined by this specification

Terms defined by reference

References

Normative References

[COMPOSITING-1]
Rik Cabanier; Nikos Andronikos. Compositing and Blending Level 1. 13 January 2015. CR. URL: https://www.w3.org/TR/compositing-1/
[WEBXR]
Brandon Jones; Nell Waliczek. WebXR Device API. 21 May 2019. WD. URL: https://www.w3.org/TR/webxr/

IDL Index

enum XRSessionMode {
  "inline",
  "immersive-vr",
  "immersive-ar"
};

enum XREnvironmentBlendMode {
  "opaque",
  "alpha-blend",
  "additive"
};

partial interface XRSession {
  // Attributes
  readonly attribute XREnvironmentBlendMode environmentBlendMode;
};

Issues Index

add information about new threats and mitigations introduced by functionality in this module