This documentation concerns
xr_browser_test.cc and the files that use it or its subclasses.
These files port the framework used by XR instrumentation tests (located in
//chrome/android/javatests/src/org/chromium/chrome/browser/vr/ and documented in
//chrome/android/javatests/src/org/chromium/chrome/browser/vr/*.md) to browser tests, allowing XR features to be tested on desktop platforms.
Both the instrumentation tests and the browser tests have hardware/software restrictions. In the case of browser tests, XR is only supported on Windows 8 and later (or Windows 7 with a non-standard patch applied) with a GPU that supports DirectX 11.1. However, several tests exist that don't actually use XR functionality, and thus don't have these requirements.
Runtime restrictions in browser tests are handled via the macros in
conditional_skipping.h. To add a runtime requirement to a test class, append it to the
runtime_requirements_ vector that each class has. The test setup will automatically skip tests that don't meet all requirements.
One-off skipping within a test can also be done by using the XR_CONDITIONAL_SKIP macro directly in a test.
The bots can be made to ignore these runtime requirement checks via the
--ignore-runtime-requirements argument if we expect the requirements to always be met (and thus want the tests to fail if they aren't). This argument takes a comma-separated list of requirements to ignore, or the wildcard (*) to ignore all requirements. For example,
--ignore-runtime-requirements=DirectX_11.1 would cause a test that requires a DirectX 11.1 device to be run even if a suitable device is not found.
New requirements can be added by adding to the
XrTestRequirement enum in
conditional_skipping.h and adding its associated checking logic in
Instrumentation tests are able to add and remove command line switches on a per-test-case basis using
@CommandLine annotations, but equivalent functionality does not exist in browser tests.
Instead, if different command line flags are needed, a new class will need to be created that extends the correct type of
*BrowserTestBase and overrides the flags that are set in its
The tests are compiled in the
xr_browser_tests target. This is a combination of the
xr_browser_tests_binary target, which is the actual test, and the
xr_browser_tests_runner target, which is a wrapper script that ensures special setup is completed before running the tests.
Once compiled, the tests can be run using the following command line:
```
run_xr_browser_tests.py --enable-gpu --test-launcher-jobs=1 --enable-pixel-output-in-tests
```
Because the "test" is actually a Python wrapper script, you may need to prepend
python to the command on Windows if Python file association is not set up on your machine.
The XR browser tests provide a way to plumb controller and headset data (e.g. currently touched/pressed buttons and poses) from the test through the runtime being tested. Details about what goes on under the hood can be found in
//chrome/browser/vr/test/xr_browser_test_details.md, but below is a quick guide on how to use them.
In order to let a test provide data to a runtime, it must create an instance of
MockXRDeviceHookBase or a subclass of it. This should be created at the beginning of the test, before any attempt to enter VR is made, as there are currently assumptions that prevent switching to or from the mock runtimes once an attempt to start them has been made.
Once it is created, the runtime being tested will call the various functions inherited from
VRTestHook whenever it would normally acquire data from, or submit data to, an actual device. For example,
WaitGetControllerData() will be called every time the runtime would normally check the state of a real controller, and
OnFrameSubmitted() will be called each time the runtime submits a finished frame to the headset.
For real examples of how to use the input capabilities, look at the tests in
There are currently several assumptions that must be adhered to in order for input to work properly in both OpenVR and Windows Mixed Reality (WMR).
OpenVR supports arbitrary controller mappings, but WMR only supports one actual controller type (plus voice input). As a result, WMR will always report a fixed set of buttons and axes when a controller is connected, regardless of which buttons and axes are set as supported, so tests involving input that WMR does not support (e.g. a third touchpad/joystick) must be restricted to OpenVR.