WebXR Test API

Editor’s Draft

This version:
https://immersive-web.github.io/webxr-test-api/
Issue Tracking:
GitHub
Editors:
(Google [Mozilla until 2020])
(Google)
Testing-only API

The API represented in this document is for testing only and should not be exposed to users.


Abstract

The WebXR Test API module provides a mocking interface for Web Platform Tests to be able to test the WebXR Device API.

Status of this document

This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.

This document was published by the Immersive Web Working Group as an Editors' Draft. This document is intended to become a W3C Recommendation. Feedback and comments on this specification are welcome. Please use GitHub issues. Discussions may also be found in the public-immersive-web-wg@w3.org archives.

Publication as an Editors' Draft does not imply endorsement by W3C and its Members. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 03 November 2023 W3C Process Document.

1. Introduction

In order to allow Web Platform Tests for WebXR there are some basic functions which are common across all tests, such as adding a fake test device and specifying poses. Below is an API which attempts to capture the necessary functions, based off what was defined in the spec. Different browser vendors can implement this API in whatever way is most compatible with their browser. For example, some browsers may back the interface with a WebDriver API while others may use HTTP or IPC mechanisms to communicate with an out of process fake backend.

These initialization object and control interfaces do not represent a complete set of WebXR functionality, and are expected to be expanded on as the WebXR spec grows.

2. Conformance

Interfaces and functionality exposed by this specification SHOULD NOT be exposed to typical browsing experiences, and instead SHOULD only be used when running Web Platform Tests.

3. Simulated devices

3.1. Simulated XR Device

This API gives tests the ability to spin up a simulated XR device: an XR device that, from the point of view of the WebXR API, behaves like a normal XR device. These simulated XR devices can be controlled by the associated FakeXRDevice object.

Every simulated XR device may have a native bounds geometry, which is an array of DOMPointReadOnlys used to initialize the native bounds geometry of any XRBoundedReferenceSpaces created for the device. If null, the device is treated as if it is not currently tracking a bounded reference space.

Every simulated XR device may have a floor origin which is an XRRigidTransform used to note the position of the physical floor. If null, the device is treated as if it is unable to identify the physical floor.

Every simulated XR device may have a viewer origin, which is an XRRigidTransform used to set the position and orientation of the viewer. If null, the device is treated as if it has lost tracking.

Every simulated XR device has an emulated position boolean, used to set the emulatedPosition of any XRPoses produced involving the viewer. This is initially false.

Every simulated XR device has a visibility state, which is an XRVisibilityState used to set the visibilityState of any XRSessions associated with the simulated XR device. This is initially "visible". When it is changed, the associated changes must be reflected on the XRSession, including triggering onvisibilitychange events if necessary.

Every simulated XR device has a list of primary views which is a list of views that must be rendered to for an immersive experience. There must be at least one primary view.

Every simulated XR device may have a list of secondary views which is a list of views that may or may not be rendered to. There may be any number of secondary views.

Every view for a simulated XR device has an associated device resolution, which is an instance of FakeXRDeviceResolution. This resolution must be used when constructing XRViewport values for the view, based on the canvas size.

Every view for a simulated XR device may have an associated field of view, which is an instance of FakeXRFieldOfViewInit used to calculate projection matrices using depth values. If the field of view is set, projection matrix values are calculated using the field of view and depthNear and depthFar values.

3.2. Simulated Input Device

This API gives tests the ability to spin up a simulated XR input source: an XR input source that, from the point of view of the WebXR API, behaves like a normal XR input source. These simulated XR input sources can be controlled by the associated FakeXRInputController object.

Every simulated XR input source has a handedness which is an XRHandedness value that MUST be returned for the corresponding XR input source’s handedness attribute.

Every simulated XR input source has a targetRayMode which is an XRTargetRayMode value that MUST be returned for the corresponding XR input source’s targetRayMode attribute.

Every simulated XR input source has a pointerOrigin which is an XRRigidTransform used to note the origin of the targetRaySpace.

A simulated XR input source may have a gripOrigin which is an XRRigidTransform used to note the origin of the gripSpace. If this is null the simulated XR input source is not tracked.

Every simulated XR input source has a profiles array which is an array of DOMStrings which MUST be returned for the corresponding XR input source’s profiles attribute.

Every simulated XR input source has a buttonState array, which is an array of FakeXRButtonStateInits. If a "grip" button is specified, it SHOULD drive the primary squeeze action. If a UA implements the WebXR Gamepads Module, buttonState SHOULD be used to set the state of the corresponding XR input source’s gamepad object, which SHOULD be of type "xr-standard" if enough buttons are specified to support it.

Every simulated XR input source has a connectionState which is a boolean that is initially true and indicates whether the associated XR input source should appear in inputSources. When it is changed the associated changes must be reflected on the XRSession, including triggering the inputsourceschange event if necessary by the next animation frame.

Every simulated XR input source has a primaryActionStarted which is a boolean, initially set to false, that indicates whether or not the primary action of the XR input source has been started.

4. Initialization

4.1. navigator.xr.test

partial interface XRSystem {
    [SameObject] readonly attribute XRTest test;
};

The test attribute’s getter MUST return the XRTest object that is associated with it. This object MAY be lazily created.

4.2. XRTest

The XRTest object is the entry point for all testing.

interface XRTest {
  Promise<FakeXRDevice> simulateDeviceConnection(FakeXRDeviceInit init);
  undefined simulateUserActivation(Function f);
  Promise<undefined> disconnectAllDevices();
};
The simulateDeviceConnection(init) method creates a new simulated XR device.

When this method is invoked, the user agent MUST run the following steps:

  1. Let promise be a new Promise.

  2. Run the following steps in parallel:

    1. Let device be a new simulated XR device.

    2. For each view in init’s views:

      1. Let p be the result of running parse a view on view.

      2. If running parse a view threw an error, reject promise with this error and abort these steps.

      3. Append p to device’s list of primary views.

    3. If init’s secondaryViews is set, for each secondaryView in init’s secondaryViews:

      1. Let s be the result of running parse a view on secondaryView.

      2. If running parse a view threw an error, reject promise with this error and abort these steps.

      3. Append s to device’s list of secondary views.

    4. If init’s boundsCoordinates is set, perform the following steps:

      1. If init’s boundsCoordinates has fewer than 3 elements, reject promise with a TypeError and abort these steps.

      2. Set device’s native bounds geometry to init’s boundsCoordinates.

    5. If init’s floorOrigin is set, set device’s floor origin to init’s floorOrigin.

    6. If init’s viewerOrigin is set, set device’s viewer origin to init’s viewerOrigin.

    7. Let supportedModes be an empty list of XRSessionModes.

    8. Modify supportedModes as follows:

      If init’s supportedModes is present:
      1. Append the contents of init’s supportedModes to supportedModes.

      2. If supportedModes is empty, append "inline" to it.

      Else
      1. Append "inline" to supportedModes.

      2. If init’s supportsImmersive is true, append "immersive-vr" to supportedModes.

    9. If init’s supportedFeatures is set, then for each mode in supportedModes, associate init’s supportedFeatures with mode.

NOTE: each device stores a list of features it is capable of supporting per XRSessionMode. Most tests only test one mode anyway so there isn’t much to be gained by splitting features per mode in FakeXRDeviceInit. Users wishing different modes supporting different features should create multiple devices instead.

    10. Set device’s list of supported modes to supportedModes.

    11. Register device based on the following:

      1. If supportedModes contains "immersive-vr" or "immersive-ar", append device to the xr's list of immersive XR devices.

      2. If supportedModes contains "inline", set the inline XR device to device.

    12. Let d be a new FakeXRDevice object whose device is device.

    13. Resolve promise with d.

  3. Return promise.
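The derivation of supportedModes in the algorithm above can be sketched as a standalone function. This is a non-normative illustration; the function name and plain-object `init` are stand-ins for FakeXRDeviceInit, not part of the API:

```javascript
// Hedged sketch of the supportedModes derivation in simulateDeviceConnection.
// `init` mirrors FakeXRDeviceInit; the helper name is illustrative.
function deriveSupportedModes(init) {
  const supportedModes = [];
  if (init.supportedModes !== undefined) {
    supportedModes.push(...init.supportedModes);
    // An explicitly empty supportedModes list still yields "inline".
    if (supportedModes.length === 0) supportedModes.push("inline");
  } else {
    // Legacy path: "inline" always; "immersive-vr" if supportsImmersive.
    supportedModes.push("inline");
    if (init.supportsImmersive) supportedModes.push("immersive-vr");
  }
  return supportedModes;
}
```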

When simulateUserActivation(f) is called, invoke f as if it had transient activation.

When disconnectAllDevices() is called, remove all simulated XR devices from the xr's list of immersive XR devices as if they were disconnected. If the inline XR device is a simulated XR device, reset it to the default inline XR device.

4.3. FakeXRDeviceInit

dictionary FakeXRDeviceInit {
    required boolean supportsImmersive;
    sequence<XRSessionMode> supportedModes;
    required sequence<FakeXRViewInit> views;
    sequence<FakeXRViewInit> secondaryViews;

    sequence<any> supportedFeatures;
    sequence<FakeXRBoundsPoint> boundsCoordinates;
    FakeXRRigidTransformInit floorOrigin;
    FakeXRRigidTransformInit viewerOrigin;

    // Hit test extensions:
    FakeXRWorldInit world;

    // Depth sensing extensions:
    FakeXRDepthSensingDataInit depthSensingData;
};

dictionary FakeXRViewInit {
  required XREye eye;
  required sequence<float> projectionMatrix;
  required FakeXRDeviceResolution resolution;
  required FakeXRRigidTransformInit viewOffset;
  FakeXRFieldOfViewInit fieldOfView;

  // Raw camera access extensions:
  FakeXRCameraImage cameraImageInit;
};

dictionary FakeXRFieldOfViewInit {
  required float upDegrees;
  required float downDegrees;
  required float leftDegrees;
  required float rightDegrees;
};

dictionary FakeXRDeviceResolution {
    required long width;
    required long height;
};

dictionary FakeXRBoundsPoint {
  double x; double z;
};

dictionary FakeXRRigidTransformInit {
  required sequence<float> position;
  required sequence<float> orientation;
};

The supportsImmersive member is deprecated in favor of supportedModes and will be removed in a future revision of this specification.

To parse a rigid transform given a FakeXRRigidTransformInit init, perform the following steps:
  1. Let p be init’s position.

  2. If p does not have three elements, throw a TypeError.

  3. Let o be init’s orientation.

  4. If o does not have four elements, throw a TypeError.

  5. Let position be a DOMPointInit with x, y and z equal to the three elements of p in order, and w equal to 1.

  6. Let orientation be a DOMPointInit with x, y, z, and w equal to the four elements of o in order.

  7. Construct an XRRigidTransform transform with position position and orientation orientation.

  8. Return transform.
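The steps above can be sketched as a plain function. Since XRRigidTransform is only constructible in a browser, this sketch returns plain DOMPointInit-like objects instead; names are illustrative:

```javascript
// Hedged sketch of "parse a rigid transform". Returns plain {position,
// orientation} objects standing in for an XRRigidTransform.
function parseRigidTransform(init) {
  const p = init.position;
  if (!Array.isArray(p) || p.length !== 3)
    throw new TypeError("position must have exactly 3 elements");
  const o = init.orientation;
  if (!Array.isArray(o) || o.length !== 4)
    throw new TypeError("orientation must have exactly 4 elements");
  // Position gets w = 1 (a point); orientation is a quaternion.
  const position = { x: p[0], y: p[1], z: p[2], w: 1 };
  const orientation = { x: o[0], y: o[1], z: o[2], w: o[3] };
  return { position, orientation };
}
```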

To parse a view given a FakeXRViewInit init, perform the following steps:
  1. Let view be a new view.

  2. Set view’s eye to init’s eye.

  3. If init’s projectionMatrix does not have 16 elements, throw a TypeError.

  4. Set view’s projection matrix to init’s projectionMatrix.

  5. Set view’s view offset to the result of running parse a rigid transform on init’s viewOffset.

  6. Set view’s device resolution to init’s resolution.

  7. If init’s fieldOfView is set, perform the following steps:

    1. Set view’s field of view to init’s fieldOfView.

    2. Set view’s projection matrix to the projection matrix corresponding to this field of view, and depth values equal to depthNear and depthFar of any XRSession associated with the device. If there currently is none, use the default values of near=0.1, far=1000.0.

  8. Return view.
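Step 7.2 leaves the exact projection-matrix formula to the implementation. One common choice, sketched below under that assumption, is the column-major off-axis perspective matrix (OpenGL clip-space conventions) with the default near/far values from the step above; nothing here is normative:

```javascript
// Hedged sketch: deriving a column-major projection matrix from a
// FakeXRFieldOfViewInit, assuming OpenGL-style clip space. The spec does not
// mandate this exact formula.
function projectionFromFieldOfView(fov, depthNear = 0.1, depthFar = 1000.0) {
  const rad = (deg) => (deg * Math.PI) / 180;
  const upTan = Math.tan(rad(fov.upDegrees));
  const downTan = Math.tan(rad(fov.downDegrees));
  const leftTan = Math.tan(rad(fov.leftDegrees));
  const rightTan = Math.tan(rad(fov.rightDegrees));
  const xScale = 2 / (leftTan + rightTan);
  const yScale = 2 / (upTan + downTan);
  const m = new Float32Array(16); // column-major, like XRView.projectionMatrix
  m[0] = xScale;
  m[5] = yScale;
  m[8] = (rightTan - leftTan) * xScale * 0.5;  // horizontal asymmetry offset
  m[9] = (upTan - downTan) * yScale * 0.5;     // vertical asymmetry offset
  m[10] = -(depthFar + depthNear) / (depthFar - depthNear);
  m[11] = -1;
  m[14] = -(2 * depthFar * depthNear) / (depthFar - depthNear);
  return m;
}
```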

4.4. FakeXRRigidTransformInit

The WebXR API never exposes native origins directly, instead exposing transforms between them, so we need to specify a base reference space for FakeXRRigidTransformInits so that we can have consistent numerical values across implementations. When used as an origin, FakeXRRigidTransformInits are in the base reference space where the viewer's native origin is identity at initialization, unless otherwise specified. In this space, the "local" reference space has a native origin of identity. This is an arbitrary choice: changing this reference space doesn’t affect the data returned by the WebXR API, but we must make such a choice so that the tests produce the same results across different UAs. When used as an origin it is logically a transform from the origin’s space to the underlying base reference space described above.

5. Mocking

5.1. FakeXRDevice

interface FakeXRDevice : EventTarget {
  undefined setViews(sequence<FakeXRViewInit> views, optional sequence<FakeXRViewInit> secondaryViews);

  Promise<undefined> disconnect();

  undefined setViewerOrigin(FakeXRRigidTransformInit origin, optional boolean emulatedPosition = false);
  undefined clearViewerOrigin();
  undefined setFloorOrigin(FakeXRRigidTransformInit origin);
  undefined clearFloorOrigin();
  undefined setBoundsGeometry(sequence<FakeXRBoundsPoint> boundsCoordinates);
  undefined simulateResetPose();

  undefined simulateVisibilityChange(XRVisibilityState state);

  FakeXRInputController simulateInputSourceConnection(FakeXRInputSourceInit init);

  // Hit test extensions:
  undefined setWorld(FakeXRWorldInit world);
  undefined clearWorld();

  // Depth sensing extensions:
  undefined setDepthSensingData(FakeXRDepthSensingDataInit depthSensingData);
  undefined clearDepthSensingData();
};

Each FakeXRDevice object has an associated device, which is a simulated XR device that it is able to control.

Operations on the FakeXRDevice's device typically take place on the next animation frame, i.e. they are not immediately observable until a future requestAnimationFrame() callback.

To determine when this frame is, for a given operation, choose a frame based on the following:

If such an operation is triggered within an XR animation frame:
Choose the next XR animation frame, whenever it may occur
If such an operation is triggered outside of an XR animation frame:
Choose either the next or next-to-next XR animation frame. The precise choice is up to the user agent and may be dependent on the exact timing of these events.

NOTE: The reason we defer an extra frame when there are pending animation frame callbacks is to avoid having to deal with potential race conditions when the device is ready to trigger an animation frame callback, but has not yet. In practice, this means that tests should be written so that they wait until they have performed all such operations before calling the next requestAnimationFrame(), and in case they are running outside of an XR animation frame, should always wait two frames before expecting any updates to take effect.

To parse a list of views given a list views, run the following steps:
  1. Let l be an empty list.

  2. For each view in views:

    1. Let v be the result of running parse a view on view.

    2. Append v to l.

  3. Return l.

The setViews(views, secondaryViews) method performs the following steps:
  1. On the next animation frame, run the following steps:

    1. Let p be the result of running parse a list of views on views.

    2. Set device's list of primary views to p.

    3. If secondaryViews is set:

      1. Let s be the result of running parse a list of views on secondaryViews.

      2. Set device's list of secondary views to s.

When the disconnect() method is called, perform the following steps:

  1. Remove device from the xr's list of immersive XR devices as if it were disconnected.

  2. If the inline XR device is device, reset it to the default inline XR device.

The setViewerOrigin(origin, emulatedPosition) performs the following steps:
  1. Let o be the result of running parse a rigid transform on origin.

  2. On the next animation frame, perform the following steps:

    1. Set device's viewer origin to o.

    2. Set device's emulated position boolean to emulatedPosition.

The clearViewerOrigin() method will, on the next animation frame, set device's viewer origin to null.

The simulateVisibilityChange(state) method will, as soon as possible, set device's visibility state to state.

The setFloorOrigin(origin) performs the following steps:
  1. Let o be the result of running parse a rigid transform on origin.

  2. On the next animation frame, set device's floor origin to o.

The clearFloorOrigin() method will, on the next animation frame, set device's floor origin to null.

The setBoundsGeometry(boundsCoordinates) performs the following steps:
  1. If boundsCoordinates has fewer than 3 elements, throw a TypeError.

  2. On the next animation frame, set device's native bounds geometry to boundsCoordinates.

The simulateResetPose() method will, as soon as possible, behave as if the device's viewer's native origin had a discontinuity, triggering appropriate reset events.

The simulateInputSourceConnection(init) method creates a new simulated XR input source.

When this method is invoked, the user agent MUST run the following steps:

  1. Let inputSource be a new simulated XR input source.

  2. Set inputSource’s handedness to init’s handedness.

  3. Set inputSource’s targetRayMode to init’s targetRayMode.

  4. Set inputSource’s profiles to init’s profiles.

  5. If init’s gripOrigin is set, set inputSource’s gripOrigin to the result of running parse a rigid transform on init’s gripOrigin.

  6. Set inputSource’s pointerOrigin to the result of running parse a rigid transform on init’s pointerOrigin.

  7. If init’s supportedButtons is set, set inputSource’s buttonState to the result of running parse supported buttons on init’s supportedButtons.

  8. If init’s selectionClicked is set to true, run simulate a full primary action on inputSource.

  9. If init’s selectionStarted is set to true, run start a primary action on inputSource.

  10. By the next animation frame notify XRSession of the new XR input source.

  11. Let c be a new FakeXRInputController object whose inputSource is inputSource.

  12. Return c.

5.2. FakeXRInputController

dictionary FakeXRInputSourceInit {
  required XRHandedness handedness;
  required XRTargetRayMode targetRayMode;
  required FakeXRRigidTransformInit pointerOrigin;
  required sequence<DOMString> profiles;
  boolean selectionStarted = false;
  boolean selectionClicked = false;
  sequence<FakeXRButtonStateInit> supportedButtons;
  FakeXRRigidTransformInit gripOrigin;
};

interface FakeXRInputController {
  undefined setHandedness(XRHandedness handedness);
  undefined setTargetRayMode(XRTargetRayMode targetRayMode);
  undefined setProfiles(sequence<DOMString> profiles);
  undefined setGripOrigin(FakeXRRigidTransformInit gripOrigin, optional boolean emulatedPosition = false);
  undefined clearGripOrigin();
  undefined setPointerOrigin(FakeXRRigidTransformInit pointerOrigin, optional boolean emulatedPosition = false);

  undefined disconnect();
  undefined reconnect();

  undefined startSelection();
  undefined endSelection();
  undefined simulateSelect();

  undefined setSupportedButtons(sequence<FakeXRButtonStateInit> supportedButtons);
  undefined updateButtonState(FakeXRButtonStateInit buttonState);
};

enum FakeXRButtonType {
  "grip",
  "touchpad",
  "thumbstick",
  "optional-button",
  "optional-thumbstick"
};

dictionary FakeXRButtonStateInit {
  required FakeXRButtonType buttonType;
  required boolean pressed;
  required boolean touched;
  required float pressedValue;
  float xValue = 0.0;
  float yValue = 0.0;
};

Each FakeXRInputController object has an associated inputSource, which is a simulated XR input source that it is able to control.

Since user agents may opt to send input events on a per-frame basis, the results of all FakeXRInputController methods and simulateInputSourceConnection() are not guaranteed to be visible (via, e.g. inputSources or oninputsourceschange events) until the next animation frame.

To start a primary action on a simulated XR input source run the following steps:
  1. If primaryActionStarted is true, abort these steps.

  2. Set primaryActionStarted to true.

  3. By the next animation frame indicate to the XRSession that the corresponding XR input source’s primary action has started.

To stop a primary action on a simulated XR input source run the following steps:
  1. If primaryActionStarted is false, abort these steps.

  2. Set primaryActionStarted to false.

  3. By the next animation frame indicate to the XRSession that the corresponding XR input source’s primary action has stopped.

To simulate a full primary action on a simulated XR input source source, run the following steps:
  1. Let current be the current value of primaryActionStarted.

  2. Run start a primary action on source.

  3. Run stop a primary action on source.

  4. If current is true, run start a primary action on source.

Note: If a gamepad is attached to the simulated XR input source, then running start a primary action or stop a primary action should also ensure that the primary input’s corresponding gamepad button is updated accordingly.

Note: If both start a primary action and stop a primary action are run in the same frame, then by the next animation frame it is expected that onselect and onselectend events will fire.
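The three primary-action algorithms above can be sketched against a plain object standing in for a simulated XR input source. The `events` log stands in for "indicate to the XRSession"; all names are illustrative:

```javascript
// Hedged sketch of start/stop/simulate-a-full primary action. `source` is a
// plain {primaryActionStarted, events} object, not a real input source.
function startPrimaryAction(source) {
  if (source.primaryActionStarted) return; // already started: abort
  source.primaryActionStarted = true;
  source.events.push("selectstart");       // stand-in for notifying XRSession
}

function stopPrimaryAction(source) {
  if (!source.primaryActionStarted) return; // not started: abort
  source.primaryActionStarted = false;
  source.events.push("selectend");
}

function simulateFullPrimaryAction(source) {
  const current = source.primaryActionStarted; // remember prior state
  startPrimaryAction(source);
  stopPrimaryAction(source);
  if (current) startPrimaryAction(source);     // restore an in-progress action
}
```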

To parse supported buttons given a sequence of FakeXRButtonStateInits buttons, run the following steps:
  1. Let l be an empty list of FakeXRButtonStateInits.

  2. For each button in buttons:

    1. If l does not contain a FakeXRButtonStateInit whose buttonType matches button’s buttonType, append button to l.

  3. Return l.
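A minimal sketch of this deduplication, keeping the first entry seen for each buttonType (the function name is illustrative):

```javascript
// Hedged sketch of "parse supported buttons": only the first
// FakeXRButtonStateInit per buttonType is retained.
function parseSupportedButtons(buttons) {
  const l = [];
  for (const button of buttons) {
    if (!l.some((b) => b.buttonType === button.buttonType)) l.push(button);
  }
  return l;
}
```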

The setHandedness(handedness) method will, by the next animation frame, set inputSource's handedness to handedness.

The setTargetRayMode(targetRayMode) method will, by the next animation frame, set inputSource's targetRayMode to targetRayMode.

The setProfiles(profiles) method will, by the next animation frame, set inputSource's profiles to profiles.

The setGripOrigin(gripOrigin) method will, by the next animation frame, set inputSource's gripOrigin to the result of running parse a rigid transform on gripOrigin.

The clearGripOrigin() method will, by the next animation frame, set inputSource's gripOrigin to null.

The setPointerOrigin(pointerOrigin) method will, by the next animation frame, set inputSource's pointerOrigin to the result of running parse a rigid transform on pointerOrigin.

The disconnect() method will run the following steps:
  1. If inputSource's connectionState is false, abort these steps.

  2. Set inputSource's connectionState to false.

  3. By the next animation frame, notify the XRSession that this XR input source has been removed.

The reconnect() method will run the following steps:
  1. If inputSource's connectionState is true, abort these steps.

  2. Set inputSource's connectionState to true.

  3. By the next animation frame, notify the XRSession that this XR input source has been added.

The startSelection() method will run start a primary action on inputSource.

The endSelection() method will run stop a primary action on inputSource.

The simulateSelect() method will run simulate a full primary action on inputSource.

The setSupportedButtons(supportedButtons) will, by the next animation frame, set inputSource's buttonState to the result of running parse supported buttons on supportedButtons.

Note: As user agents may recreate the XRInputSource or gamepad objects on buttons being changed, this method SHOULD NOT be used to simulate changes to button state.

The updateButtonState(buttonState) will run the following steps:
  1. Let validState equal the results of running validate a button state on buttonState.

  2. Let foundState be null.

  3. For every state in inputSource's buttonState array:

    1. If state’s buttonType matches buttonState’s buttonType:

      1. Set foundState to a reference to state.

      2. Break out of this loop.

  4. If foundState is null, throw a NotFoundError.

  5. Update foundState’s attributes in inputSource's buttonState to match those of validState. Note: If buttonType is "grip", then the XR input source's primary squeeze action should be updated.

To validate a button state on a FakeXRButtonStateInit buttonState run the following steps:
  1. Let validState equal buttonState.

  2. If pressed is true and touched is false, throw a TypeError.

  3. If pressedValue is less than 0.0, throw a TypeError.

  4. If pressedValue is greater than 0.0 and touched is false, throw a TypeError.

  5. If buttonType is not one of: "touchpad", "thumbstick", or "optional-thumbstick":

    1. Set validState’s xValue to 0.0.

    2. Set validState’s yValue to 0.0.

  6. Return validState.
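The validation and update steps above can be sketched together, with a plain array standing in for inputSource's buttonState and an ordinary Error standing in for the NotFoundError DOMException; all names are illustrative:

```javascript
// Hedged sketch of "validate a button state" and updateButtonState().
function validateButtonState(buttonState) {
  const validState = { ...buttonState };
  if (validState.pressed && !validState.touched)
    throw new TypeError("a pressed button must also be touched");
  if (validState.pressedValue < 0.0)
    throw new TypeError("pressedValue must not be negative");
  if (validState.pressedValue > 0.0 && !validState.touched)
    throw new TypeError("a button with pressedValue > 0 must be touched");
  const axisTypes = ["touchpad", "thumbstick", "optional-thumbstick"];
  if (!axisTypes.includes(validState.buttonType)) {
    validState.xValue = 0.0; // non-axis buttons carry no axis values
    validState.yValue = 0.0;
  }
  return validState;
}

function updateButtonState(buttonStateArray, buttonState) {
  const validState = validateButtonState(buttonState);
  const foundState = buttonStateArray.find(
    (s) => s.buttonType === buttonState.buttonType);
  if (foundState === undefined)
    throw new Error("NotFoundError: no button with this buttonType");
  Object.assign(foundState, validState);
  return foundState;
}
```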

6. Hit test extensions

The hit test extensions to the test API SHOULD be implemented by all user agents that implement the WebXR Hit Test Module.

dictionary FakeXRWorldInit {
  required sequence<FakeXRRegionInit> hitTestRegions;
};

The FakeXRWorldInit dictionary describes the state of the world that will be used when computing hit test results on a FakeXRDevice.

hitTestRegions contains a collection of FakeXRRegionInits that are used to describe specific regions of the fake world. The order of the regions does not matter.

dictionary FakeXRRegionInit {
  required sequence<FakeXRTriangleInit> faces;
  required FakeXRRegionType type;
};

The FakeXRRegionInit dictionary describes the contents of a specific region of the world.

faces contains a collection of FakeXRTriangleInits that enumerate all the faces contained by the region. The order of the faces does not matter.

type contains the type of the region, which will be used during computation of hit test results.

dictionary FakeXRTriangleInit {
  required sequence<DOMPointInit> vertices;  // size = 3
};

The FakeXRTriangleInit dictionary describes a single face of a region.

vertices contains a collection of DOMPointInits that comprise the face. The face will be considered as solid when computing hit test results and as such, the winding order of the vertices does not matter.
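The spec leaves the hit-test math itself to the UA. One way a UA might test rays against such solid faces is a standard Möller–Trumbore ray/triangle intersection; the sketch below is illustrative, with points as plain `{x, y, z}` objects:

```javascript
// Hedged sketch: Möller-Trumbore ray/triangle intersection, as a UA might use
// against FakeXRTriangleInit faces. Returns distance t along the ray, or null.
function intersectTriangle(origin, dir, [v0, v1, v2]) {
  const sub = (a, b) => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
  const cross = (a, b) => ({
    x: a.y * b.z - a.z * b.y,
    y: a.z * b.x - a.x * b.z,
    z: a.x * b.y - a.y * b.x,
  });
  const dot = (a, b) => a.x * b.x + a.y * b.y + a.z * b.z;

  const e1 = sub(v1, v0), e2 = sub(v2, v0);
  const p = cross(dir, e2);
  const det = dot(e1, p);
  // |det| near zero: ray is parallel to the face. The sign of det is not
  // checked, matching the spec's note that winding order does not matter.
  if (Math.abs(det) < 1e-9) return null;
  const inv = 1 / det;
  const s = sub(origin, v0);
  const u = dot(s, p) * inv;
  if (u < 0 || u > 1) return null;
  const q = cross(s, e1);
  const v = dot(dir, q) * inv;
  if (v < 0 || u + v > 1) return null;
  const t = dot(e2, q) * inv;
  return t >= 0 ? t : null; // only hits in front of the origin count
}
```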

enum FakeXRRegionType {
  "point",
  "plane",
  "mesh"
};

The FakeXRRegionType enum is used to describe the type of a world region.

7. DOM overlay extensions

The DOM Overlay extensions to the test API SHOULD be implemented by all user agents that implement the WebXR DOM Overlay Module.

partial interface FakeXRInputController {
  undefined setOverlayPointerPosition(float x, float y);
};

When setOverlayPointerPosition(x, y) is called, it sets a position within the DOM overlay in DOM coordinates for the next XR animation frame, and is cleared after that frame. It is intended to be used along with a primary action for that frame, simulating that the user is interacting with the DOM overlay. The UA will emit a beforexrselect event at this location before generating XR select events.

8. Anchors extensions

The anchors extensions to the test API SHOULD be implemented by all user agents that implement WebXR Anchors.

dictionary FakeXRAnchorCreationParameters {
  FakeXRRigidTransformInit requestedAnchorOrigin;
  boolean isAttachedToEntity;
};

callback FakeXRAnchorCreationCallback = Promise<boolean> (FakeXRAnchorCreationParameters parameters, FakeXRAnchorController anchorController);

partial interface FakeXRDevice {
  undefined setAnchorCreationCallback(FakeXRAnchorCreationCallback? callback);
};

The FakeXRAnchorCreationCallback callback can be used by the Web Platform Tests to control the result of a call to create an anchor, and to be able to subsequently control the newly created anchor.

The FakeXRDevice interface is extended with an internal anchorCreationCallback, initially set to null. When the device receives a request to create an anchor, it MUST run the determine if the anchor creation succeeded algorithm.

In order to determine if the anchor creation succeeded, the FakeXRDevice device MUST run the following steps:

  1. If the device’s anchorCreationCallback is null, return false and abort these steps.

  2. Let promise be the result of invoking anchorCreationCallback with parameters set so that they reflect the parameters passed to anchor creation request.

  3. React to promise:

    • If promise was fulfilled with value v, then return v and abort these steps.

    • If promise was rejected, then return false and abort these steps.

Web Platform Tests can set the anchor creation callback by calling setAnchorCreationCallback(callback).

The requestedAnchorOrigin attribute represents a transform expressed relative to base reference space used by the device.

The isAttachedToEntity attribute will be set to true if the created anchor should be treated as attached to some entity. If so, the tests could emulate entity changing location by appropriately controlling the anchor via anchorController.

The anchorController parameter passed in to FakeXRAnchorCreationCallback can be used to update the state of the anchor, assuming that the creation request was deemed successful. Tests SHOULD store it and issue commands to it for the entire duration of controlled anchor’s lifetime.

interface FakeXRAnchorController {
  readonly attribute boolean deleted;

  // Controlling anchor state:
  undefined pauseTracking();
  undefined resumeTracking();
  undefined stopTracking();

  // Controlling anchor location:
  undefined setAnchorOrigin(FakeXRRigidTransformInit anchorOrigin);
};

Successfully created anchors can be controlled by the test through the use of FakeXRAnchorController interface.

The FakeXRAnchorController has an associated internal anchor origin, which is a FakeXRRigidTransformInit describing the current state of the anchor’s native origin.

The deleted attribute will be set to true when the application has invoked the delete() method on the anchor; in that case, changes made through the fake anchor controller will be ignored.

The pauseTracking() method can be used by the tests to signal that the controlled anchor is temporarily untracked (i.e. its location will be unknown). Calling this method does not modify the anchor origin of the controller.

The resumeTracking() method can be used by the tests to signal that the controlled anchor should have its tracking resumed, if it was temporarily untracked. Calling this method does not modify the anchor origin of the controller.

The stopTracking() method can be used by the tests to signal that the controlled anchor is no longer tracked and that its tracking will not be resumed. After calling this method, any further calls on the anchor controller will be ignored.

The setAnchorOrigin(anchorOrigin) method can be used to set the controller’s anchor origin. Tests can use this method to simulate updates in anchor pose.
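The anchor lifecycle above can be sketched in plain JavaScript. This is a minimal illustration, not a normative implementation: MockAnchorController is a hypothetical stand-in for FakeXRAnchorController so the behavior can be exercised outside a browser, and anchorCreationCallback shows the FakeXRAnchorCreationCallback shape a test might register via setAnchorCreationCallback().

```javascript
// Hypothetical plain-JS stand-in for FakeXRAnchorController, mirroring the
// IDL above: tracking can be paused, resumed, and permanently stopped, and
// the anchor origin can be updated until tracking stops or the anchor is
// deleted by the application.
class MockAnchorController {
  constructor() {
    this.deleted = false;     // set when the page calls XRAnchor.delete()
    this.stopped = false;     // set by stopTracking(); irreversible
    this.tracked = true;
    this.anchorOrigin = null; // a FakeXRRigidTransformInit
  }
  pauseTracking()  { if (!this.deleted && !this.stopped) this.tracked = false; }
  resumeTracking() { if (!this.deleted && !this.stopped) this.tracked = true; }
  stopTracking()   { this.stopped = true; this.tracked = false; }
  setAnchorOrigin(origin) {
    if (!this.deleted && !this.stopped) this.anchorOrigin = origin;
  }
}

let controlledAnchor = null; // kept for the controlled anchor's lifetime

// Matches the FakeXRAnchorCreationCallback shape: resolving to true signals
// that the anchor creation request succeeded.
async function anchorCreationCallback(parameters, anchorController) {
  // Place the anchor exactly where the page requested it.
  anchorController.setAnchorOrigin(parameters.requestedAnchorOrigin);
  controlledAnchor = anchorController;
  return true;
}
```

A test would then drive the stored controller (pause, resume, move) across animation frames to simulate tracking changes.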

9. Lighting estimation extensions

The lighting estimation extensions to the test API SHOULD be implemented by all user agents that implement WebXR Lighting Estimation.

dictionary FakeXRLightEstimateInit {
  required sequence<float> sphericalHarmonicsCoefficients;
  DOMPointInit primaryLightDirection;
  DOMPointInit primaryLightIntensity;
};

partial interface FakeXRDevice {
  undefined setLightEstimate(FakeXRLightEstimateInit init);
};

The FakeXRDevice is extended with an internal light estimate, which is a FakeXRLightEstimateInit used to supply data for any requested XRLightEstimate.

When the setLightEstimate(init) method is invoked on FakeXRDevice device, run the following steps:
  1. Let c be init’s sphericalHarmonicsCoefficients.

  2. If c does not have 27 elements, throw a TypeError and abort these steps.

  3. Let d be init’s primaryLightDirection.

  4. If d is set and d’s w value does not equal 0, throw a TypeError and abort these steps.

  5. Let i be init’s primaryLightIntensity.

  6. If i is set and i’s w value does not equal 1, throw a TypeError and abort these steps.

  7. Set device’s light estimate to init by the next animation frame.
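The validation in steps 1–6 can be sketched as a standalone function. This is an illustrative sketch, not part of the spec: in a real user agent these checks run inside FakeXRDevice, and the function name validateLightEstimateInit is hypothetical.

```javascript
// Sketch of the setLightEstimate() validation steps. Throws a TypeError for
// inputs the algorithm above rejects; otherwise returns the init dictionary
// (which step 7 would store as the device's light estimate).
function validateLightEstimateInit(init) {
  const c = init.sphericalHarmonicsCoefficients;
  // Step 2: 9 RGB spherical-harmonics coefficients -> 27 floats.
  if (c.length !== 27) {
    throw new TypeError("sphericalHarmonicsCoefficients must have 27 elements");
  }
  const d = init.primaryLightDirection;
  // Step 4: a direction is a vector, so its w component must equal 0.
  // (DOMPointInit defaults w to 1 when omitted, which fails this check.)
  if (d !== undefined && (d.w ?? 1) !== 0) {
    throw new TypeError("primaryLightDirection.w must be 0");
  }
  const i = init.primaryLightIntensity;
  // Step 6: an intensity is point-like, so its w component must equal 1.
  if (i !== undefined && (i.w ?? 1) !== 1) {
    throw new TypeError("primaryLightIntensity.w must be 1");
  }
  return init;
}
```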

10. Depth sensing extensions

The depth sensing extensions to the test API SHOULD be implemented by all user agents that implement the WebXR Depth Sensing Module.

The FakeXRDevice is extended with internal depth sensing data, which is a FakeXRDepthSensingDataInit used to supply data for requests to native depth sensing.

dictionary FakeXRDepthSensingDataInit {
  required ArrayBuffer depthData;
  required FakeXRRigidTransformInit normDepthBufferFromNormView;
  required float rawValueToMeters;
  required unsigned long width;
  required unsigned long height;
};

The FakeXRDepthSensingDataInit dictionary describes the state of the depth sensing data that should be used when returning the latest depth information in the creating a CPU depth information instance and creating a GPU depth information instance algorithms. All keys present in FakeXRDepthSensingDataInit correspond to the data required to be returned by the native depth sensing capabilities of the device.

depthData corresponds to the desired depth buffer that is to be set on the native depth information returned from querying the native device. Not setting the depthData key in the dictionary signals that the returned native depth information should be null.

normDepthBufferFromNormView corresponds to the desired depth coordinates transformation matrix that is to be set on the native depth information returned from querying the native device.

rawValueToMeters corresponds to the desired conversion factor that is to be set on the native depth information returned from querying the native device.

width and height correspond to the desired dimensions of the depth buffer that are to be set on the native depth information returned from querying the native device.

When the setDepthSensingData() method is invoked on FakeXRDevice device with depthSensingData, run the following steps:

  1. If depthSensingData’s depthData is null, throw a TypeError and abort these steps.

  2. Set device’s depth sensing data to depthSensingData.

When the clearDepthSensingData() method is invoked on FakeXRDevice device, run the following steps:

  1. Set device’s depth sensing data to null.
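The two algorithms above amount to simple bookkeeping on the device's internal depth sensing data. As an illustrative sketch (the mockDevice object here is a hypothetical stand-in for FakeXRDevice, not the spec's implementation):

```javascript
// Hypothetical stand-in for the depth sensing slots of FakeXRDevice.
const mockDevice = {
  depthSensingData: null, // the internal depth sensing data

  setDepthSensingData(depthSensingData) {
    // Step 1: depthData is a required key and must not be null.
    if (depthSensingData.depthData == null) {
      throw new TypeError("depthData must be provided");
    }
    // Step 2: store the data for later depth information requests.
    this.depthSensingData = depthSensingData;
  },

  clearDepthSensingData() {
    // With no stored data, subsequent depth information requests yield null.
    this.depthSensingData = null;
  },
};
```

A test might populate it with a FakeXRDepthSensingDataInit-shaped value such as `{ depthData: new ArrayBuffer(160 * 90 * 2), normDepthBufferFromNormView: { position: [0, 0, 0], orientation: [0, 0, 0, 1] }, rawValueToMeters: 0.001, width: 160, height: 90 }` (all values here are illustrative).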

11. Raw camera access extensions

The raw camera access extensions to the test API SHOULD be implemented by all user agents that implement the WebXR Raw Camera Access Module.

The FakeXRViewInit dictionary is extended with a cameraImageInit key of type FakeXRCameraImage. This dictionary carries information about the camera image and is intended to affect the camera image variable in the obtain camera algorithm. If the cameraImageInit key is not present in the FakeXRViewInit dictionary, the obtain camera algorithm should treat this as a null camera image (and thus the algorithm will return null).

dictionary FakeXRCameraImage {
  required long width;
  required long height;

  Uint32Array pixels;
};

The width controls the width of the camera image buffer in the obtain camera algorithm.

The height controls the height of the camera image buffer in the obtain camera algorithm.

The pixels control the camera image contents in the obtain camera algorithm. The pixels will be used to initialize the camera image texture.

The camera image will be initialized as if by a call to gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, pixels), where width, height, and pixels are, respectively, width, height, and pixels. If the pixels key is not present in the dictionary, the behavior is as if the gl.texImage2D() variant that omits the pixels parameter had been called.

Any time a simulated XR device's list of primary views and list of secondary views are set, the user agent MUST verify that the camera images associated with the views present across both of those lists are all equal to each other. Camera images are considered equal when their widths and heights are equal and their pixels are the same instance (if present). If they are not equal, the user agent MUST throw an error from within the algorithm that attempted to set them.
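The equality check can be sketched as follows. This is an illustrative reading of the requirement, not normative text: the helper names cameraImagesEqual and checkViewCameraImages are hypothetical, and the sketch compares only views that carry a cameraImageInit key.

```javascript
// Two FakeXRCameraImage dictionaries are equal when their widths and heights
// match and their pixels are the very same Uint32Array instance (when
// present) -- identity, not element-wise comparison.
function cameraImagesEqual(a, b) {
  if (a === undefined || b === undefined) return a === b;
  return a.width === b.width &&
         a.height === b.height &&
         a.pixels === b.pixels;
}

// A setViews()-style validation would require every camera image across the
// combined primary and secondary view lists to compare equal.
function checkViewCameraImages(views) {
  const withImages = views.filter(v => v.cameraImageInit !== undefined);
  for (const v of withImages.slice(1)) {
    if (!cameraImagesEqual(withImages[0].cameraImageInit, v.cameraImageInit)) {
      throw new TypeError("all views must share the same camera image");
    }
  }
}
```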

Conformance

Document conventions

Conformance requirements are expressed with a combination of descriptive assertions and RFC 2119 terminology. The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in the normative parts of this document are to be interpreted as described in RFC 2119. However, for readability, these words do not appear in all uppercase letters in this specification.

All of the text of this specification is normative except sections explicitly marked as non-normative, examples, and notes. [RFC2119]

Examples in this specification are introduced with the words “for example” or are set apart from the normative text with class="example", like this:

This is an example of an informative example.

Informative notes begin with the word “Note” and are set apart from the normative text with class="note", like this:

Note, this is an informative note.

Conformant Algorithms

Requirements phrased in the imperative as part of algorithms (such as "strip any leading space characters" or "return false and abort these steps") are to be interpreted with the meaning of the key word ("must", "should", "may", etc) used in introducing the algorithm.

Conformance requirements phrased as algorithms or specific steps can be implemented in any manner, so long as the end result is equivalent. In particular, the algorithms defined in this specification are intended to be easy to understand and are not intended to be performant. Implementers are encouraged to optimize.

References

Normative References

[DOM]
Anne van Kesteren. DOM Standard. Living Standard. URL: https://dom.spec.whatwg.org/
[GAMEPAD]
Steve Agoston; Matthew Reynolds. Gamepad. URL: https://w3c.github.io/gamepad/
[GEOMETRY-1]
Simon Pieters; Chris Harrelson. Geometry Interfaces Module Level 1. URL: https://drafts.fxtf.org/geometry/
[HTML]
Anne van Kesteren; et al. HTML Standard. Living Standard. URL: https://html.spec.whatwg.org/multipage/
[INFRA]
Anne van Kesteren; Domenic Denicola. Infra Standard. Living Standard. URL: https://infra.spec.whatwg.org/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://datatracker.ietf.org/doc/html/rfc2119
[WEBIDL]
Edgar Chen; Timothy Gu. Web IDL Standard. Living Standard. URL: https://webidl.spec.whatwg.org/
[WEBXR]
Brandon Jones; Manish Goregaokar; Rik Cabanier. WebXR Device API. URL: https://immersive-web.github.io/webxr/
[WEBXR-GAMEPADS-MODULE-1]
Brandon Jones; Manish Goregaokar; Rik Cabanier. WebXR Gamepads Module - Level 1. URL: https://immersive-web.github.io/webxr-gamepads-module/

IDL Index

partial interface XRSystem {
    [SameObject] readonly attribute XRTest test;
};

interface XRTest {
  Promise<FakeXRDevice> simulateDeviceConnection(FakeXRDeviceInit init);
  undefined simulateUserActivation(Function f);
  Promise<undefined> disconnectAllDevices();
};

dictionary FakeXRDeviceInit {
    required boolean supportsImmersive;
    sequence<XRSessionMode> supportedModes;
    required sequence<FakeXRViewInit> views;
    sequence<FakeXRViewInit> secondaryViews;

    sequence<any> supportedFeatures;
    sequence<FakeXRBoundsPoint> boundsCoordinates;
    FakeXRRigidTransformInit floorOrigin;
    FakeXRRigidTransformInit viewerOrigin;

    // Hit test extensions:
    FakeXRWorldInit world;

    // Depth sensing extensions:
    FakeXRDepthSensingDataInit depthSensingData;
};

dictionary FakeXRViewInit {
  required XREye eye;
  required sequence<float> projectionMatrix;
  required FakeXRDeviceResolution resolution;
  required FakeXRRigidTransformInit viewOffset;
  FakeXRFieldOfViewInit fieldOfView;

  // Raw camera access extensions:
  FakeXRCameraImage cameraImageInit;
};

dictionary FakeXRFieldOfViewInit {
  required float upDegrees;
  required float downDegrees;
  required float leftDegrees;
  required float rightDegrees;
};

dictionary FakeXRDeviceResolution {
    required long width;
    required long height;
};

dictionary FakeXRBoundsPoint {
  double x; double z;
};

dictionary FakeXRRigidTransformInit {
  required sequence<float> position;
  required sequence<float> orientation;
};

interface FakeXRDevice : EventTarget {
  undefined setViews(sequence<FakeXRViewInit> views, optional sequence<FakeXRViewInit> secondaryViews);

  Promise<undefined> disconnect();

  undefined setViewerOrigin(FakeXRRigidTransformInit origin, optional boolean emulatedPosition = false);
  undefined clearViewerOrigin();
  undefined setFloorOrigin(FakeXRRigidTransformInit origin);
  undefined clearFloorOrigin();
  undefined setBoundsGeometry(sequence<FakeXRBoundsPoint> boundsCoordinates);
  undefined simulateResetPose();

  undefined simulateVisibilityChange(XRVisibilityState state);

  FakeXRInputController simulateInputSourceConnection(FakeXRInputSourceInit init);

  // Hit test extensions:
  undefined setWorld(FakeXRWorldInit world);
  undefined clearWorld();

  // Depth sensing extensions:
  undefined setDepthSensingData(FakeXRDepthSensingDataInit depthSensingData);
  undefined clearDepthSensingData();
};

dictionary FakeXRInputSourceInit {
  required XRHandedness handedness;
  required XRTargetRayMode targetRayMode;
  required FakeXRRigidTransformInit pointerOrigin;
  required sequence<DOMString> profiles;
  boolean selectionStarted = false;
  boolean selectionClicked = false;
  sequence<FakeXRButtonStateInit> supportedButtons;
  FakeXRRigidTransformInit gripOrigin;
};

interface FakeXRInputController {
  undefined setHandedness(XRHandedness handedness);
  undefined setTargetRayMode(XRTargetRayMode targetRayMode);
  undefined setProfiles(sequence<DOMString> profiles);
  undefined setGripOrigin(FakeXRRigidTransformInit gripOrigin, optional boolean emulatedPosition = false);
  undefined clearGripOrigin();
  undefined setPointerOrigin(FakeXRRigidTransformInit pointerOrigin, optional boolean emulatedPosition = false);

  undefined disconnect();
  undefined reconnect();

  undefined startSelection();
  undefined endSelection();
  undefined simulateSelect();

  undefined setSupportedButtons(sequence<FakeXRButtonStateInit> supportedButtons);
  undefined updateButtonState(FakeXRButtonStateInit buttonState);
};

enum FakeXRButtonType {
  "grip",
  "touchpad",
  "thumbstick",
  "optional-button",
  "optional-thumbstick"
};

dictionary FakeXRButtonStateInit {
  required FakeXRButtonType buttonType;
  required boolean pressed;
  required boolean touched;
  required float pressedValue;
  float xValue = 0.0;
  float yValue = 0.0;
};

dictionary FakeXRWorldInit {
  required sequence<FakeXRRegionInit> hitTestRegions;
};

dictionary FakeXRRegionInit {
  required sequence<FakeXRTriangleInit> faces;
  required FakeXRRegionType type;
};

dictionary FakeXRTriangleInit {
  required sequence<DOMPointInit> vertices;  // size = 3
};

enum FakeXRRegionType {
  "point",
  "plane",
  "mesh"
};

partial interface FakeXRInputController {
  undefined setOverlayPointerPosition(float x, float y);
};

dictionary FakeXRAnchorCreationParameters {
  FakeXRRigidTransformInit requestedAnchorOrigin;
  boolean isAttachedToEntity;
};

callback FakeXRAnchorCreationCallback = Promise<boolean> (FakeXRAnchorCreationParameters parameters, FakeXRAnchorController anchorController);

partial interface FakeXRDevice {
  undefined setAnchorCreationCallback(FakeXRAnchorCreationCallback? callback);
};

interface FakeXRAnchorController {
  readonly attribute boolean deleted;

  // Controlling anchor state:
  undefined pauseTracking();
  undefined resumeTracking();
  undefined stopTracking();

  // Controlling anchor location:
  undefined setAnchorOrigin(FakeXRRigidTransformInit anchorOrigin);
};

dictionary FakeXRLightEstimateInit {
  required sequence<float> sphericalHarmonicsCoefficients;
  DOMPointInit primaryLightDirection;
  DOMPointInit primaryLightIntensity;
};

partial interface FakeXRDevice {
  undefined setLightEstimate(FakeXRLightEstimateInit init);
};

dictionary FakeXRDepthSensingDataInit {
  required ArrayBuffer depthData;
  required FakeXRRigidTransformInit normDepthBufferFromNormView;
  required float rawValueToMeters;
  required unsigned long width;
  required unsigned long height;
};

dictionary FakeXRCameraImage {
  required long width;
  required long height;

  Uint32Array pixels;
};