services/device/generic_sensor/README.md

Sensors

Introduction

This document explains how sensor APIs (such as Ambient Light Sensor, Accelerometer, Gyroscope, Magnetometer) are implemented in Chromium.

This directory contains the platform-specific parts of the implementation, which are used, among others, by the Generic Sensor API and the Device Orientation API.

The document describes the Generic Sensor API implementation in both the renderer and the browser process and lists important implementation details, such as how data from a single platform sensor is distributed among multiple JavaScript sensor instances and how sensor configurations are managed.

Background

The Generic Sensor API defines base interfaces that should be implemented by concrete sensors. In most cases, concrete sensors should only define sensor-specific data structures and, if required, sensor configuration options.

The same approach is applied to the implementation in Chromium, which was designed with the following requirements in mind:

  1. Share the crucial parts of functionality between the concrete sensor implementations. Avoid code duplication and thus simplify maintenance and development of new features.
  2. Support simultaneous existence and functioning of multiple JS Sensor instances of the same type that can have different configurations and lifetimes.
  3. Support both “slow” sensors that provide periodic updates (e.g. Ambient Light, Proximity) and “fast” streaming sensors that have low-latency requirements for sensor reading updates (motion sensors).

Note: the implementation is architected in such a way that Blink (i.e. third_party/blink/renderer/modules/sensor) is just a consumer of the data from services/device/generic_sensor like any other. For example, the Blink Device Orientation API consumes sensor data from //services independently from the Blink Generic Sensor implementation. The same applies to device/vr/orientation.

Implementation Design

Main components and APIs

The Generic Sensor API implementation consists of two main components: the sensor module in Blink (//third_party/blink/renderer/modules/sensor/) which contains JS bindings for Generic Sensor API and concrete sensors APIs, and the generic_sensor device service (//services/device/generic_sensor/) - a set of classes running on the service process side that eventually call system APIs to access the actual device sensor data.

The //services side also includes a few other directories:

  • //services/device/public/cpp/generic_sensor contains C++ classes and data structures used by both //services/device/generic_sensor as well as its consumers.
  • //services/device/public/mojom contains the Mojo interfaces used by the Generic Sensor implementation.
    • SensorProvider is a “factory-like” interface that provides data about the sensors present on the device and their capabilities (reporting mode, maximum sampling frequency), and allows users to request a specific sensor or request that a specific sensor be backed by a “virtual sensor” for testing purposes. Note that Blink calls go through WebSensorProvider first.
    • Sensor is an interface wrapping a concrete device sensor.
    • SensorClient is implemented by Blink (and other consumers) to be notified about errors that occur on the platform side and about sensor reading updates for sensors with the ‘onchange’ reporting mode.

The Blink implementation also contains the following directories:

  • third_party/blink/public/mojom/sensor contains the Mojo interfaces that are exposed to Blink users.
    • WebSensorProvider provides an API that is a subset of what SensorProvider exposes. This allows the latter to offer privileged methods that should not be visible or accessible by Blink. The translation between the two Mojo interfaces happens in //content.
    • WebSensorProviderAutomation is used by the Internals code in Blink to communicate with WebTestVirtualSensorProvider and allow content_shell to invoke virtual sensor operations.

Actual sensor data is not passed to consumers (such as Blink) via Mojo calls; a shared memory buffer is used instead. This avoids filling up the Mojo IPC channel with sensor data (for sensors with continuous reporting mode) when the platform sensor has a high sampling frequency, and also avoids adding extra latency.
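The shared buffer is written by one producer and read by many consumers without locks, in the spirit of a one-writer seqlock. The sketch below is a simplified, hypothetical illustration of that pattern; the struct layout, function names, and memory-ordering choices here are assumptions for illustration, not Chromium's actual code:

```cpp
#include <atomic>
#include <cassert>
#include <cstdint>

// Hypothetical, simplified reading slot guarded by a sequence counter.
struct ReadingBuffer {
  std::atomic<uint32_t> seq{0};  // odd while a write is in progress
  double values[4] = {};
  double timestamp = 0.0;
};

// Writer (service side): bump the counter around the update so readers
// can detect a torn read and retry.
inline void WriteReading(ReadingBuffer& buf, const double (&v)[4], double ts) {
  buf.seq.fetch_add(1, std::memory_order_acquire);  // counter becomes odd
  for (int i = 0; i < 4; ++i) buf.values[i] = v[i];
  buf.timestamp = ts;
  buf.seq.fetch_add(1, std::memory_order_release);  // counter becomes even again
}

// Reader (consumer side): retry until a stable, even counter is observed
// before and after copying the data out.
inline bool ReadReading(const ReadingBuffer& buf, double (&v)[4], double& ts) {
  for (int attempt = 0; attempt < 16; ++attempt) {
    uint32_t before = buf.seq.load(std::memory_order_acquire);
    if (before & 1) continue;  // a write is in progress
    for (int i = 0; i < 4; ++i) v[i] = buf.values[i];
    ts = buf.timestamp;
    std::atomic_thread_fence(std::memory_order_acquire);
    if (buf.seq.load(std::memory_order_relaxed) == before) return true;
  }
  return false;  // heavily contended; caller may try again later
}
```

Because the reader never blocks the writer, a high-frequency sensor can keep publishing readings while consumers poll at their own pace.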

A high-level diagram of the Mojo architecture looks like this:

Generic Sensor Framework component diagram

Sensor Fusion

Some sensors provide data that is obtained by combining readings from other sensors (so-called low-level sensors). This process is called sensor fusion. It can be done in hardware or software.

In Chromium, we sometimes perform software sensor fusion when a certain hardware sensor is not available but “fusing” readings from other sensors provides a similar reading. The fusion process involves reading data from one or more sensors and applying a fusion algorithm to derive another reading from them (possibly in a different unit).

The figure below shows an overview of the fusion sensor flow:

Overview of fusion sensor flow

In the code, the main classes are PlatformSensorFusion and PlatformSensorFusionAlgorithm.

PlatformSensorFusion owns a PlatformSensorFusionAlgorithm instance. It inherits from both PlatformSensor as well as PlatformSensor::Client. The former indicates it can be treated by consumers as a regular sensor, while the latter means that it subscribes to updates from low-level sensors (like SensorImpl itself). It is in its implementation of OnSensorReadingChanged() that it invokes its PlatformSensorFusionAlgorithm to fuse data from the underlying sensors.

Once any of the low-level sensors receive a new value, it notifies its clients (including the fusion sensor). The fusion sensor algorithm reads the low-level sensor raw values and outputs a new reading, which is fed to PlatformSensor::UpdateSharedBufferAndNotifyClients() as usual.
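As a concrete illustration of what a fusion algorithm computes, the sketch below derives device tilt angles from raw accelerometer axes, in the spirit of relative_orientation_euler_angles_fusion_algorithm_using_accelerometer. The function and field names, and the exact formula, are illustrative assumptions rather than Chromium's code:

```cpp
#include <cassert>
#include <cmath>

// Euler angles in degrees, following the Device Orientation convention of
// alpha (yaw), beta (front-back tilt), gamma (left-right tilt).
struct EulerAngles {
  double alpha, beta, gamma;
};

// Hypothetical fusion step: estimate tilt from gravity as measured by the
// accelerometer. Yaw cannot be observed from an accelerometer alone, so
// alpha is left at zero (a magnetometer or gyroscope would be needed).
inline EulerAngles FuseTiltFromAccelerometer(double ax, double ay, double az) {
  constexpr double kPi = 3.14159265358979323846;
  constexpr double kRadToDeg = 180.0 / kPi;
  EulerAngles out{};
  out.alpha = 0.0;
  out.beta = std::atan2(ay, az) * kRadToDeg;                    // front-back tilt
  out.gamma = std::atan2(-ax, std::hypot(ay, az)) * kRadToDeg;  // left-right tilt
  return out;
}
```

With the device lying flat (gravity entirely on the z axis), both tilt angles come out as zero; standing the device upright on its bottom edge moves all of gravity onto the y axis and yields a beta of 90 degrees.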

Security and Privacy

Platform sensor readings can expose more information about a device and consequently lead to an increase in the fingerprinting surface exposed by the browser, eavesdropping, and keystroke monitoring.

The security and anti-fingerprinting considerations are also based on existing literature on the topic, especially research on sensors exposed to native applications on mobile devices.

The Generic Sensor implementation in Chromium follows the Mitigation Strategies section of the Generic Sensor API specification. Namely, this means that:

  • The sensor APIs are only exposed to secure contexts (the same also applies to the API exposed by the Device Orientation spec).
  • There is integration with both the Permissions API and the Permissions Policy API.
  • Sensor readings are only available to documents whose visibility state is “visible” (this also applies to the Device Orientation API).
  • Sensor readings are only available for active documents whose origin is same origin-domain with the currently focused area document.

The Chromium implementation also applies additional privacy measures (some of which are making their way back to the specification):

  • Frequency: The maximum sampling frequency is capped at 60Hz for most sensor types. Ambient Light sensors and magnetometers are capped at 10Hz.
  • Accuracy: Readings are quantized, and for some sensor types readings which do not differ by a certain threshold are discarded and never exposed.
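The accuracy mitigation boils down to snapping each reading to a fixed grid. The sketch below shows the general shape of such quantization; the helper name and the example step value are illustrative assumptions (the real per-type logic lives in RoundSensorReading() in platform_sensor_util.cc):

```cpp
#include <cassert>
#include <cmath>

// Quantize a reading to the nearest multiple of a per-sensor-type step,
// so that small variations in the raw value are not observable by pages.
inline double RoundToMultiple(double value, double step) {
  return std::round(value / step) * step;
}
```

For example, with an assumed illuminance step of 50 lux, a raw reading of 127 lux would be exposed as 150 lux, and 24 lux as 0 lux.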

There is no distinction in how the Generic Sensor APIs are exposed to regular and incognito windows.

Classes & APIs

//services main classes

  • SensorImpl: implements the exposed Sensor Mojo interface and forwards IPC calls to the owned PlatformSensor instance. SensorImpl implements the PlatformSensor::Client interface to receive notifications from PlatformSensor.

  • SensorProviderImpl: implements the exposed SensorProvider Mojo interface and forwards IPC calls to either the PlatformSensorProvider singleton instance or one of the VirtualPlatformSensorProvider instances.

  • PlatformSensor: represents a device sensor of a given type. There can be only one PlatformSensor instance of the same type at a time; its ownership is shared between existing SensorImpl instances. PlatformSensor is an abstract class which encapsulates generic functionality and is inherited by the platform-specific implementations (PlatformSensorAndroid, PlatformSensorWin etc).

    VirtualPlatformSensor is a specialized subclass used for testing. Its implementation is platform-agnostic, and ownership and cardinality vary slightly: there is one VirtualPlatformSensor instance of the same type per Mojo connection to SensorProviderImpl, as the VirtualPlatformSensor instances are managed by VirtualPlatformSensorProvider, which exists on a per-Mojo connection (i.e. per-content::WebContents) basis.

  • PlatformSensorProvider: singleton class whose main functionality is to create and track PlatformSensor instances. PlatformSensorProvider is also responsible for creating a shared buffer for sensor readings. Every platform has its own implementation of PlatformSensorProvider (PlatformSensorProviderAndroid, PlatformSensorProviderWin etc).

    VirtualPlatformSensorProvider is a specialized subclass used for tests that manages VirtualPlatformSensor instances, which are OS-independent.

The classes above have the following ownership relationships:

  • SensorProviderImpl owns a single PlatformSensorProvider instance via a std::unique_ptr and multiple VirtualPlatformSensorProvider instances (one per mojo::RemoteId).
  • SensorProviderImpl owns all SensorImpl instances via a mojo::UniqueReceiverSet.
  • PlatformSensor is a ref-counted class, and a SensorImpl has a reference to a PlatformSensor.
  • DeviceService owns a single SensorProviderImpl instance. DeviceService::BindSensorProvider() is responsible for creating a PlatformSensorProvider if one does not exist and passing it to SensorProviderImpl.
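The ref-counted sharing described above can be sketched in miniature: several SensorImpl-like consumers share ownership of one sensor object and register as its clients. All class names here are stand-ins (using std::shared_ptr in place of Chromium's scoped_refptr), not the real declarations:

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Stand-in for PlatformSensor::Client.
class Client {
 public:
  virtual ~Client() = default;
  virtual void OnReadingChanged() = 0;
};

// Stand-in for PlatformSensor: one instance per sensor type, notifying
// every registered client when a new reading arrives.
class Sensor {
 public:
  void AddClient(Client* c) { clients_.push_back(c); }
  void NotifyClients() {
    for (Client* c : clients_) c->OnReadingChanged();
  }

 private:
  std::vector<Client*> clients_;
};

// Stand-in for SensorImpl: holds a shared reference to the sensor and
// subscribes to its notifications.
class SensorImplLike : public Client {
 public:
  explicit SensorImplLike(std::shared_ptr<Sensor> s) : sensor_(std::move(s)) {
    sensor_->AddClient(this);
  }
  void OnReadingChanged() override { ++updates_; }
  int updates() const { return updates_; }

 private:
  std::shared_ptr<Sensor> sensor_;  // shared ownership, like scoped_refptr
  int updates_ = 0;
};
```

The sensor stays alive as long as any consumer holds a reference, which mirrors why a second page opening the same sensor type reuses the existing platform sensor instead of creating a new one.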

//content main classes

  • EmulationHandler is not a sensor-specific class, but it implements several commands from the Emulation DevTools domain as specified in browser_protocol.pdl. Namely, the getOverriddenSensorInformation, setSensorOverrideEnabled, and setSensorOverrideReadings commands are implemented by using a ScopedVirtualSensorForDevTools to invoke virtual sensor operations.

  • FrameSensorProviderProxy: a per-RenderFrameHost implementation of the SensorProvider Mojo interface. Blink Mojo connections are routed to instances of this class, which then forwards the binding request to WebContentsSensorProviderProxy with extra RenderFrameHost information.

  • WebContentsSensorProviderProxy: does not implement any Mojo interface, but communicates with SensorProviderImpl in //services via Mojo. This class provides access to privileged virtual sensor operations and can only be reached by other classes in content itself; Blink can only access FrameSensorProviderProxy and its GetSensor() method.

  • WebTestVirtualSensorProvider: partial implementation of the VirtualSensorProvider Mojo interface that is exposed only to content_shell when it is run in web tests mode. It is used by the InternalsSensor code to provide access to the virtual sensor functions to testdriver.js when web tests are run via content_shell.

FrameSensorProviderProxy and WebContentsSensorProviderProxy exist as separate entities with different granularity solely because of the virtual sensor-related operations used by WebDriver and web tests.

In Blink, SensorProviderProxy exists on a per-DOMWindow basis, but limitations in testdriver and how it communicates with WebDriver require all WebDriver communication to go through the frame that includes testharness.js. In other words, if an iframe tries to create a virtual sensor, the call will ultimately be issued by the top-level frame instead. If the calls to //services are all routed through a per-WebContents object, this does not matter since the same virtual sensors are used by all frames in the tree. This also makes more sense from a testing perspective since the environment behaves more similarly to a real one.

Furthermore, while the Blink issue above could be solved by using a per-Page object rather than a per-WebContents one, we also want the virtual sensor information to persist across navigations, as this allows WebDriver users to make use of the sensor endpoints more effectively: they can set up virtual sensors before loading the page(s) that will be tested.

Blink main classes

  • Sensor: implements bindings for the Sensor IDL interface. All classes that implement concrete sensor interfaces (such as AmbientLightSensor, Gyroscope, Accelerometer) must inherit from it.

  • SensorProviderProxy: owns one side of the WebSensorProvider Mojo interface pipe and manages SensorProxy instances. This class supplements DOMWindow, so Sensor obtains a SensorProviderProxy instance via SensorProviderProxy::From() and uses it to get the SensorProxy instance for a given sensor type.

  • SensorProxy: owns one side of the Sensor Mojo interface and implements the SensorClient Mojo interface. It also defines a SensorProxy::Observer interface that is used to notify Sensor and its subclasses of errors or data updates from the platform side. Sensor and its subclasses interact with the //services side via SensorProxy (and SensorProviderProxy) rather than owning the Mojo pipes themselves.

In a LocalDOMWindow, there is one SensorProxy instance for a given sensor type (ambient light, accelerometer, etc) whose ownership is shared among Sensor instances. SensorProxy instances are created when Sensor::start() is called and are destroyed when there are no more active Sensor instances left.

Testing and virtual sensors

When running web tests and/or doing automation via WebDriver, one cannot depend on the availability of specific hardware sensors, especially when the idea is to test the general mechanism of sensor management.

This is addressed by introducing the concept of virtual sensors: sensors that are not backed by one or more hardware sensors and whose state and readings are entirely controlled by API users.

In the Generic Sensor specification, “virtual sensors” are defined as “device sensors” since its definition of platform sensor is quite abstract and works like a mixture of the PlatformSensor class and Blink’s SensorProxy. In Chromium, the implementation of the specification’s “virtual sensors” works as both a platform sensor and a device sensor from a spec perspective.

VirtualPlatformSensorProvider and VirtualPlatformSensor inherit from PlatformSensorProvider and PlatformSensor respectively, but their cardinality and the way they are managed differ from the other, non-virtual platform sensor classes in //services.

Conceptually, virtual sensors exist on a per-WebContents basis, so that different tabs can run their tests concurrently without one page’s readings interfering with another’s. This is achieved in SensorProviderImpl, which keeps a mapping of mojo::RemoteIds to VirtualPlatformSensorProvider instances. This contrasts with the non-virtual approach, where the idea is that the same readings are shared with all readers, and as such SensorProviderImpl keeps a single PlatformSensorProvider instance.
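The routing shape described above can be sketched as a registry that hands out one shared real provider versus one lazily created virtual provider per connection id. The class and member names below are hypothetical simplifications, not SensorProviderImpl's actual interface:

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <memory>

using RemoteId = uint64_t;  // stand-in for mojo::RemoteId

struct Provider {  // stand-in for (Virtual)PlatformSensorProvider
  bool is_virtual = false;
};

class ProviderRegistry {
 public:
  // The single real provider, shared by every connection.
  Provider& real() { return real_; }

  // One virtual provider per Mojo connection, created on first use, so
  // different WebContents get isolated virtual sensors.
  Provider& VirtualForConnection(RemoteId id) {
    auto& slot = virtual_[id];
    if (!slot) {
      slot = std::make_unique<Provider>();
      slot->is_virtual = true;
    }
    return *slot;
  }

  size_t virtual_count() const { return virtual_.size(); }

 private:
  Provider real_;
  std::map<RemoteId, std::unique_ptr<Provider>> virtual_;
};
```

Two requests on the same connection resolve to the same virtual provider, while requests on different connections get distinct ones, matching the per-WebContents isolation described above.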

It is possible for regular sensors and virtual sensors (even those of the same type) to coexist in the same page, provided that the real sensors are created before SensorProviderImpl::CreateVirtualSensor() is called. After that function is called, all calls to GetSensor() with the sensor type passed to it will result in the creation of a virtual sensor, and not a real one. Existing real sensors will continue to work as usual.

When SensorProviderImpl::RemoveVirtualSensor() is called, all existing virtual sensors of a given type for a given frame will stop and behave as if a corresponding hardware sensor had been disconnected (resulting in an “error” event being fired in Blink on all active sensors of the given type, for example).

The only way to update a virtual sensor's readings is by calling SensorProviderImpl::UpdateVirtualSensor(). In other words, any mechanism to update readings on a periodic basis, for example, needs to be implemented by callers.

It is also important to notice that updating a virtual sensor’s readings does not necessarily mean that the exposed readings will be updated and that e.g. a “reading” event will be fired by Blink: calling UpdateVirtualSensor() is akin to a PlatformSensor class receiving a new reading from the operating system. The new reading will still be passed to PlatformSensor::UpdateSharedBufferAndNotifyClients(), which may discard it depending on the sensor’s reporting mode and the threshold and rounding checks.

Interacting with virtual sensors as a user

The implementation of the CDP commands below, the content_shell version and WebContentsSensorProviderProxy's architecture are documented in more depth in content/browser/generic_sensor's README.md.

API users cannot reach //services classes such as SensorProviderImpl and VirtualPlatformSensor directly. The virtual sensor functionality is exposed to users in a few different manners.

The base of the user-facing API is in CDP (Chrome DevTools Protocol): the getOverriddenSensorInformation, setSensorOverrideEnabled, and setSensorOverrideReadings commands in the Emulation domain are essentially wrappers for the virtual sensor SensorProvider Mojo calls that reach SensorProviderImpl via //content's WebContentsSensorProviderProxy.

On top of the CDP layer sits the ChromeDriver layer: it implements the WebDriver endpoints described in the Generic Sensor spec as a thin wrapper around the CDP calls above and also exposes them via its Python API.

Web tests in WPT can make use of the testdriver.js APIs to manage virtual sensors. The testdriver.js code is essentially a JavaScript wrapper for the WebDriver endpoints described above.

When the web tests in WPT are run in the Chromium CI (i.e. via run_web_tests.py), they follow a different code path compared to WPT’s wpt run (or Chromium’s run_wpt_tests.py), as they are run with content_shell rather than Chromium and ChromeDriver.

To allow the tests to work with content_shell, there is a separate implementation of the testdriver.js API that relies on Blink's Internals JS API. The Internals implementation for the Sensor APIs lives in internals_sensor.cc. It uses a separate Mojo interface to make virtual sensor calls. This Mojo interface is implemented on the //content side by WebTestSensorProviderManager, which is available only when content_shell runs in web tests mode.

Code flow

Low-level sensor

The figure below shows the code flow for a low-level (i.e. non-fusion) sensor:

Low-level sensor flow

Each OS-specific PlatformSensor implementation retrieves sensor readings differently, but they all ultimately call PlatformSensor::UpdateSharedBufferAndNotifyClients(). This function invokes PlatformSensor::UpdateSharedBuffer() in platform_sensor.cc, which checks and transforms a reading before storing it in the shared buffer:

  1. Sensors whose reporting mode is mojom::ReportingMode::ON_CHANGE (i.e. they only send notifications when the reading has changed) first check if the new value is different enough compared to the current value. What is considered different enough (i.e. the threshold check) varies per sensor type (see PlatformSensor::IsSignificantlyDifferent() for the base implementation). The Threshold section has more information on why a threshold is used, and the Used threshold values section lists the actual values.
  2. If the check above passes, the so-called “raw” (unrounded) reading is stored to last_raw_reading_.
  3. The reading is rounded via RoundSensorReading() (in platform_sensor_util.cc) using a per-type algorithm. The Rounding section has more information on why sensor values are rounded.
  4. The rounded reading is stored in the shared buffer and becomes the value that clients can read.
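The four steps above can be condensed into a small pipeline sketch. The class, the scalar reading, and the threshold and step values are all illustrative assumptions; the real code works on multi-axis readings with per-type thresholds:

```cpp
#include <cassert>
#include <cmath>
#include <optional>

// Hypothetical condensed version of the ON_CHANGE update pipeline:
// significance check, keep the raw value, round, expose the rounded value.
class ReadingPipeline {
 public:
  ReadingPipeline(double threshold, double rounding_step)
      : threshold_(threshold), step_(rounding_step) {}

  // Returns the value written to the "shared buffer", or nullopt if the
  // reading was discarded by the threshold check.
  std::optional<double> OnNewReading(double raw) {
    // Step 1: discard readings that are not different enough from the
    // last raw reading.
    if (last_raw_ && std::fabs(raw - *last_raw_) < threshold_)
      return std::nullopt;
    // Step 2: store the unrounded ("raw") reading.
    last_raw_ = raw;
    // Step 3: round to the nearest multiple of the per-type step.
    double rounded = std::round(raw / step_) * step_;
    // Step 4: publish the rounded value for clients to read.
    shared_value_ = rounded;
    return rounded;
  }

  double shared_value() const { return shared_value_; }

 private:
  double threshold_, step_;
  std::optional<double> last_raw_;
  double shared_value_ = 0.0;
};
```

Note that the threshold compares raw values while clients only ever see rounded ones, which is why the raw reading must be kept separately in step 2.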

Fusion sensor

Fusion sensors behave similarly, but with extra steps at the end:

  1. They get notified of a new reading when PlatformSensor::UpdateSharedBufferAndNotifyClients() invokes PlatformSensorFusion::OnSensorReadingChanged().
  2. PlatformSensorFusion::OnSensorReadingChanged() invokes the sensor’s fusion algorithm, which fetches the low-level sensors’ raw readings and fuses them.
  3. It invokes UpdateSharedBufferAndNotifyClients(), which will go through the same threshold check and rounding process described above, but for the fused reading.

The figure below shows an example of the code flow: