1. Introduction
This section is non-normative.
This specification provides two new DOM events for obtaining information about the physical orientation and movement of the hosting device. The information provided by the events is not raw sensor data, but rather high-level data which is agnostic to the underlying source of information. Common sources of information include gyroscopes, compasses and accelerometers.
The deviceorientation event represents the physical orientation of the device, expressed as a series of rotations from a local coordinate frame.
The devicemotion event represents the acceleration of the device, expressed in Cartesian coordinates in a coordinate frame defined in the device. It also supplies the rotation rate of the device about a local coordinate frame. Where practically possible, the event should provide the acceleration of the device’s center of mass.
The following code extracts illustrate basic use of the events.
Registering to receive deviceorientation events:
window. addEventListener( "deviceorientation" , event=> { // process event.alpha, event.beta and event.gamma }); // Alternatively... window. ondeviceorientation= event=> { // process event.alpha, event.beta and event.gamma };
A device lying flat on a horizontal surface with the top of the screen pointing West has the following orientation:

{ alpha: 90, beta: 0, gamma: 0 };

To get the compass heading, one would simply subtract alpha from 360 degrees. As the device is turned on the horizontal surface, the compass heading is (360 - alpha).
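A minimal sketch of this calculation, assuming the event carries absolute orientation data and a non-null alpha:

window.addEventListener("deviceorientation", (event) => {
  // Only meaningful when the data is absolute and alpha is available.
  if (event.absolute && event.alpha !== null) {
    const compassHeading = (360 - event.alpha) % 360; // 0 = North, 90 = East
    // ... use compassHeading ...
  }
});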
When the device is held with its screen vertical and the top of the screen pointing upwards, beta is 90, irrespective of what alpha and gamma are. A device rotated around its y axis so that it rests on its edge, with beta equal to 0 and gamma equal to 90, has the following orientation:

{ alpha: 270 - alpha, beta: 0, gamma: 90 };
Registering to receive devicemotion events:
window. addEventListener( "devicemotion" , ( event) => { // Process event.acceleration, event.accelerationIncludingGravity, // event.rotationRate and event.interval }); // Alternatively... window. ondevicemotion= ( event) => { // Process event.acceleration, event.accelerationIncludingGravity, // event.rotationRate and event.interval };
A device lying flat on a horizontal surface with the screen uppermost has an acceleration of zero and the following value for accelerationIncludingGravity:

{ x: 0, y: 0, z: 9.8 };

A device in free fall, with the screen horizontal and uppermost, has an accelerationIncludingGravity of zero and the following value for acceleration:

{ x: 0, y: 0, z: -9.8 };
A device mounted in a vehicle, with the screen in a vertical plane, the top of the screen uppermost and facing the rear of the vehicle: the vehicle travels at speed v around a right-hand bend of radius r. The device records a positive x component for both acceleration and accelerationIncludingGravity. The device also records a negative value for rotationRate.gamma:

{
  acceleration: { x: v^2/r, y: 0, z: 0 },
  accelerationIncludingGravity: { x: v^2/r, y: 9.8, z: 0 },
  rotationRate: { alpha: 0, beta: 0, gamma: -v/r * 180/pi }
};
2. Scope
This section is non-normative.
Within the scope of this specification are events that represent the physical orientation and motion of the hosting device. Out of scope are utilities for manipulating orientation data, such as transformation libraries, and providing access to raw sensor data or methods for directly interfacing with these sensors.
3. Model
3.1. Device Orientation
This specification expresses a device’s physical orientation as a series of rotations relative to an implementation-defined reference coordinate frame.
The sequence of rotation steps is a set of intrinsic Tait-Bryan angles of type Z - X' - Y'' ([EULERANGLES]) that are applied on the device coordinate system defined in [ACCELEROMETER] and summarized below:
-
x is in the plane of the screen or keyboard and is positive towards the right hand side of the screen or keyboard.
-
y is in the plane of the screen or keyboard and is positive towards the top of the screen or keyboard.
-
z is perpendicular to the screen or keyboard, positive out of the screen or keyboard.
For a mobile device such as a phone or tablet, the device coordinate frame is defined relative to the screen in its standard orientation, typically portrait. This means that slide-out elements such as keyboards are not deployed, and swiveling elements such as displays are folded to their default position.
If the orientation of the screen changes when the device is rotated or a slide-out keyboard is deployed, this does not affect the orientation of the coordinate frame relative to the device.
For a laptop computer, the device coordinate frame is defined relative to the integrated keyboard.
Note: Developers wanting to detect changes in screen orientation can refer to [SCREEN-ORIENTATION].
Rotations use the right-hand convention, such that positive rotation around an axis is clockwise when viewed along the positive direction of the axis.
Note: The coordinate system used by this specification differs from CSS Transforms 2 § 4 The Transform Rendering Model, where the y axis is positive to the bottom and rotations follow the left-hand convention.
Additionally, rotateSelf() and rotate(), specified in [GEOMETRY-1], apply rotations in a Z - Y' - X'' order, which differs from the order specified here.
A rotation represented by alpha, beta and gamma is carried out by the following steps:
-
Rotate the device frame around its z axis by alpha degrees, with alpha in [0, 360).
-
Rotate the device frame around its x axis by beta degrees, with beta in [-180, 180).
-
Rotate the device frame around its y axis by gamma degrees, with gamma in [-90, 90).
Note: This choice of angles follows mathematical convention, but means that alpha is in the opposite sense to a compass heading. It also means that the angles do not match the roll-pitch-yaw convention used in vehicle dynamics.
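As an informal illustration of this rotation order, the three intrinsic rotations can be composed with DOMMatrix from [GEOMETRY-1] by post-multiplying one axis-angle rotation at a time. This sketch ignores the axis-direction and handedness differences with CSS noted above, so it is illustrative only; alpha, beta and gamma are assumed to hold the event values in degrees:

// Post-multiplication composes each rotation about the already-rotated
// (intrinsic) axes, matching the Z - X' - Y'' order described above.
const deviceRotation = new DOMMatrix()
  .rotateAxisAngleSelf(0, 0, 1, alpha)  // Z
  .rotateAxisAngleSelf(1, 0, 0, beta)   // X'
  .rotateAxisAngleSelf(0, 1, 0, gamma); // Y''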
3.1.1. Choice of reference coordinate system
A device’s orientation is always relative to another coordinate system, whose choice influences the kind of information that the orientation conveys as well as the source of the orientation data.
Relative device orientation is measured with an accelerometer and a gyroscope, and the reference coordinate system is arbitrary. Consequently, the orientation data provides information about changes relative to an arbitrary initial orientation of the device.
Note: In native platform terms, this is similar to a relative OrientationSensor on Windows, a game rotation vector sensor on Android, or the xArbitraryZVertical option for Core Motion.
Absolute orientation is measured with an accelerometer, a gyroscope and a magnetometer, and the reference coordinate system is the Earth’s reference coordinate system.
Note: In native platform terms, this is similar to an absolute OrientationSensor on Windows, a rotation vector sensor on Android, or the xMagneticNorthZVertical option for Core Motion.
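A page cannot choose which kind of data the deviceorientation event carries, but it can detect which one it received; a minimal sketch:

window.addEventListener("deviceorientation", (event) => {
  if (event.absolute) {
    // Absolute orientation: angles are relative to the Earth’s reference
    // coordinate system (accelerometer + gyroscope + magnetometer).
  } else {
    // Relative orientation: angles are relative to an arbitrary initial
    // orientation of the device (accelerometer + gyroscope).
  }
});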
3.2. Device Motion
This specification expresses a device’s motion in space by measuring its acceleration and rotation rate, which are obtained from an accelerometer and a gyroscope. The data is provided relative to the device coordinate system summarized in the previous section.
Acceleration is the rate of change of velocity of a device with respect to time. It is expressed in meters per second squared (m/s2).
Linear device acceleration represents the device’s acceleration rate without the contribution of the gravity force. When the device is lying flat on a table, its linear acceleration is 0 m/s2.
Acceleration including gravity includes the effect of the gravity force and represents the device’s proper acceleration ([PROPERACCELERATION]). When the device is in free fall, this acceleration is 0 m/s2. It is less useful in many applications, but is provided as a means of best-effort support by implementations that are unable to provide linear acceleration (due, for example, to the lack of a gyroscope).
Note: In practice, acceleration with gravity represents the raw readings obtained from an accelerometer (Motion Sensors Explainer § accelerometer), or the [G-FORCE], whereas linear acceleration provides the readings of a linear acceleration sensor (Motion Sensors Explainer § linear-acceleration-sensor) and is likely a fusion sensor. [MOTION-SENSORS] and [ACCELEROMETER] both contain a more detailed discussion of the different types of accelerometers and accelerations that can be measured.
The rotation rate measures the rate at which the device rotates about a specified axis in the device coordinate system. As with device orientation, rotations must use the right-hand convention, such that positive rotation around an axis is clockwise when viewed along the positive direction of the axis. The rotation rate is measured in degrees per second (deg/s).
Note: [MOTION-SENSORS] and [GYROSCOPE] both contain a more detailed discussion of gyroscopes, rotation rates and measurements.
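As an informal illustration of the relationship between the two acceleration values, the gravity contribution can be estimated by subtracting one from the other. This sketch assumes both values and their components are available:

window.addEventListener("devicemotion", (event) => {
  const a = event.acceleration;
  const ag = event.accelerationIncludingGravity;
  if (!a || !ag || a.x === null || ag.x === null) return;
  // Roughly { x: 0, y: 0, z: 9.8 } for a device at rest with the screen up.
  const gravity = { x: ag.x - a.x, y: ag.y - a.y, z: ag.z - a.z };
  // gravity can serve as a coarse tilt estimate.
});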
4. Permissions
User permission to receive the events defined in this specification is requested via the DeviceMotionEvent.requestPermission() and DeviceOrientationEvent.requestPermission() methods. The features defined in this specification are powerful features and, as such, this specification defines the following permissions, which are policy-controlled features with the given default allowlists:
-
"accelerometer", whose default allowlist is 'self'.
-
"gyroscope", whose default allowlist is 'self'.
-
"magnetometer", whose default allowlist is 'self'.
-
When providing relative orientation data, the deviceorientation event is only dispatched if the "accelerometer" and "gyroscope" permissions are granted. For the implementation to fall back to absolute orientation data, the "magnetometer" permission must also be granted.
-
The deviceorientationabsolute event is only dispatched if the "accelerometer", "gyroscope", and "magnetometer" permissions are granted.
-
The devicemotion event is only dispatched if the "accelerometer" and "gyroscope" permissions are granted.
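A sketch of how a page might request these permissions from a user gesture; the button element and the handleOrientation function are placeholders for illustration:

button.addEventListener("click", async () => {
  try {
    // Transient activation is required while the permission state is "prompt".
    const state = await DeviceOrientationEvent.requestPermission(/* absolute = */ false);
    if (state === "granted") {
      window.addEventListener("deviceorientation", handleOrientation);
    }
  } catch (e) {
    // A "NotAllowedError" DOMException: no transient activation, or the
    // request was otherwise not allowed.
  }
});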
5. Task Source
The task source for the tasks mentioned in this specification is the device motion and orientation task source.
6. API
6.1. deviceorientation Event
partial interface Window {
  [SecureContext] attribute EventHandler ondeviceorientation;
};

[Exposed=Window, SecureContext]
interface DeviceOrientationEvent : Event {
  constructor(DOMString type, optional DeviceOrientationEventInit eventInitDict = {});
  readonly attribute double? alpha;
  readonly attribute double? beta;
  readonly attribute double? gamma;
  readonly attribute boolean absolute;
  static Promise<PermissionState> requestPermission(optional boolean absolute = false);
};

dictionary DeviceOrientationEventInit : EventInit {
  double? alpha = null;
  double? beta = null;
  double? gamma = null;
  boolean absolute = false;
};
The ondeviceorientation
attribute is an event handler IDL attribute for the ondeviceorientation
event handler, whose event handler event type is deviceorientation
.
The alpha
attribute must return the value it was initialized to. It represents the rotation around the Z axis in the Z - X' - Y'' intrinsic Tait-Bryan angles described in § 3.1 Device Orientation.
The beta attribute must return the value it was initialized to. It represents the rotation around the X' axis (produced after the rotation around the Z axis has been applied) in the Z - X' - Y'' intrinsic Tait-Bryan angles described in § 3.1 Device Orientation.
The gamma
attribute must return the value it was initialized to. It represents the rotation around the Y'' axis (produced after the rotation around the Z and X' axes have been applied in this order) in the Z - X' - Y'' intrinsic Tait-Bryan angles described in § 3.1 Device Orientation.
The absolute
attribute must return the value it was initialized to. It indicates whether relative orientation or absolute orientation data is being provided.
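Because the interface has a constructor, a synthetic event can be created and dispatched, for example in tests; a minimal sketch:

const orientationEvent = new DeviceOrientationEvent("deviceorientation", {
  alpha: 90, beta: 0, gamma: 0, absolute: false,
});
window.dispatchEvent(orientationEvent);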
The requestPermission(absolute) method steps are:
-
Let global be the current global object.
-
Let hasTransientActivation be true if this's relevant global object has transient activation, and false otherwise.
-
Let promise be a new promise in this's relevant Realm.
-
Let permissions be « "accelerometer", "gyroscope" ».
-
If absolute is true, append "magnetometer" to permissions.
-
Run these steps in parallel:
-
For each name of permissions:
-
If name’s permission state is "prompt" and hasTransientActivation is false:
-
Queue a global task on the device motion and orientation task source given global to reject promise with a "NotAllowedError" DOMException.
-
Return.
-
-
-
Let permissionState be "granted".
-
For each name of permissions:
Note: There is no algorithm for requesting multiple permissions at once. However, user agents are encouraged to bundle concurrent requests for different kinds of sensors into a single user-facing permission prompt.
-
If the result of requesting permission to use name is not "granted":
-
Set permissionState to "denied".
-
Break.
-
Queue a global task on the device motion and orientation task source given global to resolve promise with permissionState.
-
-
Return promise.
To fire an orientation event given a DOMString event, a Window window and a boolean absolute, run these steps:
-
Let orientation be null.
-
Let topLevelTraversable be window’s navigable's top-level traversable.
-
Let virtualSensorType be "relative-orientation" if absolute is false, and "absolute-orientation" otherwise.
-
If topLevelTraversable’s virtual sensor mapping contains virtualSensorType:
-
Let virtualSensor be topLevelTraversable’s virtual sensor mapping[virtualSensorType].
-
If virtualSensor’s can provide readings flag is true:
-
Set orientation to the latest readings provided to virtualSensor with the "alpha", "beta", and "gamma" keys.
-
-
-
Otherwise:
-
If absolute is false:
-
Set orientation to the device’s relative orientation in three-dimensional space.
-
-
Otherwise:
-
Set orientation to the device’s absolute orientation in three-dimensional space.
-
-
-
Let permissions be null.
-
If absolute is false:
-
Set permissions to « "accelerometer", "gyroscope" ».
-
-
Otherwise:
-
Set permissions to « "accelerometer", "gyroscope", "magnetometer" ».
-
-
Let environment be window’s relevant settings object.
-
Run these steps in parallel:
-
For each permission in permissions:
-
Let state be the result of getting the current permission state with permission and environment.
-
If state is not "granted", return.
-
-
Queue a global task on the device motion and orientation task source given window to run the following steps:
-
Let z be orientation’s representation as intrinsic Tait-Bryan angles Z - X' - Y'' along the Z axis, or null if the implementation cannot provide an angle value.
-
If z is not null, limit z’s precision to 0.1 degrees.
-
Let x be orientation’s representation as intrinsic Tait-Bryan angles Z - X' - Y'' along the X' axis, or null if the implementation cannot provide an angle value.
-
If x is not null, limit x’s precision to 0.1 degrees.
-
Let y be orientation’s representation as intrinsic Tait-Bryan angles Z - X' - Y'' along the Y'' axis, or null if the implementation cannot provide an angle value.
-
If y is not null, limit y’s precision to 0.1 degrees.
-
Fire an event named event at window, using DeviceOrientationEvent, with the alpha attribute initialized to z, the beta attribute initialized to x, the gamma attribute initialized to y, and the absolute attribute initialized to absolute.
-
-
A significant change in orientation indicates a difference in orientation values compared to the previous ones that warrants the firing of a deviceorientation or deviceorientationabsolute event. The process of determining whether a significant change in orientation has occurred is implementation-defined, though a maximum threshold for change of 1 degree is recommended. Implementations may also consider that it has occurred if they have reason to believe that the page does not have sufficiently fresh data.
Note: Implementations must take § 9 Automation into account to determine whether a significant change in orientation has occurred, so that a virtual sensor reading update causes it to be assessed.
-
Let document be window’s associated Document.
-
If document’s visibility state is not "visible", return.
-
Let absolute be false.
-
Let features be « "accelerometer", "gyroscope" ».
-
If the implementation cannot provide relative orientation or the resulting absolute orientation data is more accurate:
-
Set absolute to true.
-
Append "magnetometer" to features.
-
-
For each feature of features:
-
If document is not allowed to use feature, return.
-
-
Fire an orientation event with deviceorientation, window, and absolute.
If an implementation can never provide orientation information, the event should be fired with the alpha, beta and gamma attributes set to null, and the absolute attribute set to false.
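Listeners should therefore be prepared for null angles; a minimal defensive sketch:

window.addEventListener("deviceorientation", (event) => {
  if (event.alpha === null && event.beta === null && event.gamma === null) {
    // Orientation information is not available on this device.
    return;
  }
  // ... process event.alpha, event.beta and event.gamma ...
});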
6.2. deviceorientationabsolute Event
The deviceorientationabsolute event and its ondeviceorientationabsolute event handler IDL attribute are at risk due to limited implementation experience.

partial interface Window {
  [SecureContext] attribute EventHandler ondeviceorientationabsolute;
};
The ondeviceorientationabsolute
attribute is an event handler IDL attribute for the ondeviceorientationabsolute
event handler, whose event handler event type is deviceorientationabsolute
.
A deviceorientationabsolute event is completely analogous to the deviceorientation event, except that it must always provide absolute orientation data.
-
Fire an orientation event with deviceorientationabsolute, window, and true.
If an implementation can never provide absolute orientation information, the event should be fired with the alpha, beta and gamma attributes set to null, and the absolute attribute set to true.
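A sketch of opting into absolute orientation data; the enableAbsoluteOrientation function name is a placeholder, and the "magnetometer" permission is required in addition to "accelerometer" and "gyroscope":

async function enableAbsoluteOrientation() {
  const state = await DeviceOrientationEvent.requestPermission(true);
  if (state !== "granted") return;
  window.addEventListener("deviceorientationabsolute", (event) => {
    // event.absolute is always true here; alpha is measured against the
    // Earth’s reference coordinate system.
  });
}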
6.3. devicemotion Event
6.3.1. The DeviceMotionEventAcceleration interface
[Exposed=Window, SecureContext]
interface DeviceMotionEventAcceleration {
  readonly attribute double? x;
  readonly attribute double? y;
  readonly attribute double? z;
};
The DeviceMotionEventAcceleration
interface represents the device’s acceleration as described in § 3.2 Device Motion. It has the following associated data:
- x axis acceleration
-
The device’s acceleration rate along the X axis, or null. It is initially null.
- y axis acceleration
-
The device’s acceleration rate along the Y axis, or null. It is initially null.
- z axis acceleration
-
The device’s acceleration rate along the Z axis, or null. It is initially null.
The x getter steps are to return the value of this's x axis acceleration.
The y getter steps are to return the value of this's y axis acceleration.
The z getter steps are to return the value of this's z axis acceleration.
6.3.2. The DeviceMotionEventRotationRate interface
[Exposed=Window, SecureContext]
interface DeviceMotionEventRotationRate {
  readonly attribute double? alpha;
  readonly attribute double? beta;
  readonly attribute double? gamma;
};
The DeviceMotionEventRotationRate
interface represents the device’s rotation rate as described in § 3.2 Device Motion. It has the following associated data:
- x axis rotation rate
-
The device’s rotation rate about the X axis, or null. It is initially null.
- y axis rotation rate
-
The device’s rotation rate about the Y axis, or null. It is initially null.
- z axis rotation rate
-
The device’s rotation rate about the Z axis, or null. It is initially null.
The alpha getter steps are to return the value of this's z axis rotation rate.
The beta getter steps are to return the value of this's x axis rotation rate.
The gamma getter steps are to return the value of this's y axis rotation rate.
6.3.3. The DeviceMotionEvent interface
partial interface Window {
  [SecureContext] attribute EventHandler ondevicemotion;
};

[Exposed=Window, SecureContext]
interface DeviceMotionEvent : Event {
  constructor(DOMString type, optional DeviceMotionEventInit eventInitDict = {});
  readonly attribute DeviceMotionEventAcceleration? acceleration;
  readonly attribute DeviceMotionEventAcceleration? accelerationIncludingGravity;
  readonly attribute DeviceMotionEventRotationRate? rotationRate;
  readonly attribute double interval;
  static Promise<PermissionState> requestPermission();
};

dictionary DeviceMotionEventAccelerationInit {
  double? x = null;
  double? y = null;
  double? z = null;
};

dictionary DeviceMotionEventRotationRateInit {
  double? alpha = null;
  double? beta = null;
  double? gamma = null;
};

dictionary DeviceMotionEventInit : EventInit {
  DeviceMotionEventAccelerationInit acceleration;
  DeviceMotionEventAccelerationInit accelerationIncludingGravity;
  DeviceMotionEventRotationRateInit rotationRate;
  double interval = 0;
};
The ondevicemotion
attribute is an event handler IDL attribute for the ondevicemotion
event handler, whose event handler event type is devicemotion
.
The acceleration
attribute must return the value it was initialized to. When the object is created, this attribute must be initialized to null. It represents the device’s linear acceleration.
The accelerationIncludingGravity
attribute must return the value it was initialized to. When the object is created, this attribute must be initialized to null. It represents the device’s acceleration with gravity.
The rotationRate
attribute must return the value it was initialized to. When the object is created, this attribute must be initialized to null. It represents the device’s rotation rate.
The interval
attribute must return the value it was initialized to. It represents the interval at which data is obtained from the underlying hardware and must be expressed in milliseconds (ms). It is constant to simplify filtering of the data by the Web application.
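Because interval is constant and expressed in milliseconds, it can be used directly when integrating readings over time. A rough sketch that accumulates rotation around the z axis; gyroscope drift accumulates quickly, so this is illustrative only:

let turnedDegrees = 0;
window.addEventListener("devicemotion", (event) => {
  const rate = event.rotationRate;
  if (!rate || rate.alpha === null) return;
  // rotationRate.alpha is in deg/s, interval is in ms.
  turnedDegrees += rate.alpha * (event.interval / 1000);
});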
The requestPermission() method steps are:
-
Let global be the current global object.
-
Let hasTransientActivation be true if this's relevant global object has transient activation, and false otherwise.
-
Let result be a new promise in this's relevant Realm.
-
Run these steps in parallel:
-
Let permissions be « "accelerometer", "gyroscope" ».
-
For each name of permissions:
-
If name’s permission state is "prompt" and hasTransientActivation is false:
-
Queue a global task on the device motion and orientation task source given global to reject result with a "NotAllowedError" DOMException.
-
Return.
-
-
-
Let permissionState be "granted".
-
For each name of permissions:
Note: There is no algorithm for requesting multiple permissions at once. However, user agents are encouraged to bundle concurrent requests for different kinds of sensors into a single user-facing permission prompt.
-
If the result of requesting permission to use name is not "granted":
-
Set permissionState to "denied".
-
Break.
-
Queue a global task on the device motion and orientation task source given global to resolve result with permissionState.
-
-
Return result.
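A usage sketch pairing this method with the devicemotion event; handleMotion is a placeholder handler, and the call is assumed to happen from a user gesture while the permission state is "prompt":

async function enableMotion() {
  const state = await DeviceMotionEvent.requestPermission();
  if (state === "granted") {
    window.addEventListener("devicemotion", handleMotion);
  }
}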
-
Let document be window’s associated Document.
-
If document’s visibility state is not "visible", return.
-
For each policy of « "accelerometer", "gyroscope" »:
-
If document is not allowed to use the policy-controlled feature named policy, return.
-
-
Let topLevelTraversable be window’s navigable's top-level traversable.
-
Let platformLinearAcceleration be null.
-
If topLevelTraversable’s virtual sensor mapping contains "linear-acceleration":
-
Let virtualSensor be topLevelTraversable’s virtual sensor mapping["linear-acceleration"].
-
If virtualSensor’s can provide readings flag is true, then set platformLinearAcceleration to the latest readings provided to virtualSensor.
-
-
Otherwise, if the implementation is able to provide linear acceleration:
-
Set platformLinearAcceleration to the device’s linear acceleration along the X, Y and Z axes.
-
-
Let acceleration be null.
-
If platformLinearAcceleration is not null:
-
Set acceleration to a new DeviceMotionEventAcceleration created in window’s realm.
-
Set acceleration’s x axis acceleration to platformLinearAcceleration’s value along the X axis, or null if it cannot be provided.
-
If acceleration’s x axis acceleration is not null, limit its precision to no more than 0.1 m/s2.
-
Set acceleration’s y axis acceleration to platformLinearAcceleration’s value along the Y axis, or null if it cannot be provided.
-
If acceleration’s y axis acceleration is not null, limit its precision to no more than 0.1 m/s2.
-
Set acceleration’s z axis acceleration to platformLinearAcceleration’s value along the Z axis, or null if it cannot be provided.
-
If acceleration’s z axis acceleration is not null, limit its precision to no more than 0.1 m/s2.
-
-
Let platformAccelerationIncludingGravity be null.
-
If topLevelTraversable’s virtual sensor mapping contains "accelerometer":
-
Let virtualSensor be topLevelTraversable’s virtual sensor mapping["accelerometer"].
-
If virtualSensor’s can provide readings flag is true, then set platformAccelerationIncludingGravity to the latest readings provided to virtualSensor.
-
-
Otherwise, if the implementation is able to provide acceleration with gravity:
-
Set platformAccelerationIncludingGravity to the device’s acceleration with gravity along the X, Y and Z axes.
-
-
Let accelerationIncludingGravity be null.
-
If platformAccelerationIncludingGravity is not null:
-
Set accelerationIncludingGravity to a new DeviceMotionEventAcceleration created in window’s realm.
-
Set accelerationIncludingGravity’s x axis acceleration to platformAccelerationIncludingGravity’s value along the X axis, or null if it cannot be provided.
-
If accelerationIncludingGravity’s x axis acceleration is not null, limit its precision to no more than 0.1 m/s2.
-
Set accelerationIncludingGravity’s y axis acceleration to platformAccelerationIncludingGravity’s value along the Y axis, or null if it cannot be provided.
-
If accelerationIncludingGravity’s y axis acceleration is not null, limit its precision to no more than 0.1 m/s2.
-
Set accelerationIncludingGravity’s z axis acceleration to platformAccelerationIncludingGravity’s value along the Z axis, or null if it cannot be provided.
-
If accelerationIncludingGravity’s z axis acceleration is not null, limit its precision to no more than 0.1 m/s2.
-
-
Let platformRotationRate be null.
-
If topLevelTraversable’s virtual sensor mapping contains "gyroscope":
-
Let virtualSensor be topLevelTraversable’s virtual sensor mapping["gyroscope"].
-
If virtualSensor’s can provide readings flag is true, then set platformRotationRate to the latest readings provided to virtualSensor.
-
-
Otherwise, if the implementation is able to provide rotation rate:
-
Set platformRotationRate to the device’s rotation rate about the X, Y and Z axes.
-
-
Let rotationRate be null.
-
If platformRotationRate is not null:
-
Set rotationRate to a new DeviceMotionEventRotationRate created in window’s realm.
-
Set rotationRate’s x axis rotation rate to platformRotationRate’s value about the X axis, or null if it cannot be provided.
-
If rotationRate’s x axis rotation rate is not null, limit its precision to no more than 0.1 deg/s.
-
Set rotationRate’s y axis rotation rate to platformRotationRate’s value about the Y axis, or null if it cannot be provided.
-
If rotationRate’s y axis rotation rate is not null, limit its precision to no more than 0.1 deg/s.
-
Set rotationRate’s z axis rotation rate to platformRotationRate’s value about the Z axis, or null if it cannot be provided.
-
If rotationRate’s z axis rotation rate is not null, limit its precision to no more than 0.1 deg/s.
-
-
Let environment be window’s relevant settings object.
-
Run these steps in parallel:
-
For each permission in « "accelerometer", "gyroscope" »:
-
Let state be the result of getting the current permission state with permission and environment.
-
If state is not "granted", return.
-
-
Queue a global task on the device motion and orientation task source given window to run the following steps:
-
Fire an event named "devicemotion" at window, using
DeviceMotionEvent
, with theacceleration
attribute initialized to acceleration, theaccelerationIncludingGravity
attribute initialized to accelerationIncludingGravity, therotationRate
attribute initialized to rotationRate, and theinterval
attribute initialized to interval.
-
-
If an implementation can never provide motion information, the event should be fired with the acceleration, accelerationIncludingGravity and rotationRate attributes set to null.
7. Security and privacy considerations
The API defined in this specification can be used to obtain information from hardware sensors such as accelerometers, gyroscopes and magnetometers. The provided data may be considered sensitive and could become the target of attacks by malicious web pages. The calibration of accelerometers, gyroscopes and magnetometers may reveal persistent details about the particular sensor hardware [SENSORID]. The main attack vectors fall into the following categories:
-
Monitoring of a user input [TOUCH]
-
Location tracking [INDOORPOS]
-
User identification [FINGERPRINT]
In light of that, implementations may consider visual indicators to signify the use of sensors by the web page. Additionally, this specification requires users to give express permission for the user agent to provide device motion and/or orientation data via the requestPermission()
API calls.
Furthermore, to minimize privacy risks and the chance of fingerprinting and other attacks, implementations must:
-
fire events only when a navigable's active document's visibility state is "
visible
", -
implement § 4 Permissions so that events are fired on child navigables (including but not restricted to cross-origin ones) only if allowed by the top-level traversable,
-
fire events on a navigable's active windows only when its relevant settings object is a secure context,
-
limit precision of attribute values as described in the previous sections.
Additionally, implementing these items may also have a beneficial impact on the battery life of mobile devices.
Further implementation experience is being gathered to inform the choice of a maximum sampling frequency cap.
8. Accessibility considerations
DeviceOrientation events provide opportunities for novel forms of input, which can open up new kinds of interaction for users. In order to ensure that as many people as possible will be able to interact with the experiences you build, please consider the following:
-
It is important for alternative means of providing input to be available, so that people who cannot make the required gestures have another way to interact. Examples may include people with dexterity-related disabilities, or people who use eye gaze, or head-tracking input.
-
For games, consider supporting game controller, keyboard, or mouse input as alternative interaction methods.
-
For web apps, consider providing UI, such as a button, menu command, and/or keyboard shortcut, to perform the function.
-
-
It is essential that users can undo any accidental input - this may be particularly relevant for people with tremors.
There are two user needs that can arise, which would likely be managed by the user agent, or underlying operating system. However, it can be helpful to bear these considerations in mind, as they represent ways that your content or app may be used.
-
It is important that the user is able to disable the use of gesture or motion-based input. The web app should provide an appropriate accessible means for the user to supply this input, such as a button.
-
For example: whilst the shake-to-undo feature can provide a natural and thoughtful interaction for some, for people with tremors, it may present a barrier. This could be managed by declining permission, or more likely by changing a browser or OS setting, coupled with the web app providing alternative input means.
-
-
It is also important that the device’s orientation can be locked - a primary use case being someone who is interacting with a touch device, such as a phone, non-visually. They may have built up 'muscle memory' as to where elements are on the screen in a given orientation, and having the layout shift would break their ability to navigate. Again, this would most likely be done at the operating system level.
9. Automation
This specification can pose a challenge to test authors, as the events defined here depend on the presence of physical hardware whose readings cannot be easily controlled.
To address this challenge, this document builds upon the [WEBDRIVER2] extension commands and infrastructure laid out by Generic Sensor API § 9 Automation. This was chosen over the option of developing completely new and independent infrastructure with separate extension commands because there is significant overlap between the two specifications: not only does testing the [GENERIC-SENSOR] specification present similar challenges, but many derived APIs (e.g. [GYROSCOPE]) obtain and provide similar information.
This specification only requires implementations to support the Generic Sensor API § 9 Automation section of the [GENERIC-SENSOR] specification, not its interfaces and events.
9.1. Device Orientation Automation
Automation support for the deviceorientation event is built upon virtual sensors that provide orientation readings directly, rather than upon virtual sensors that represent accelerometers, gyroscopes and magnetometers.
Orientation data retrieved from the platform by the user agent comes from accelerometers, gyroscopes and, optionally, magnetometers. Contrary to motion data, however, these lower-level readings must be transformed into Euler angles in the form described in § 3.1 Device Orientation. Furthermore, the platform might provide extra APIs to the user agent that already perform some of those conversions from raw acceleration and rotation data.
Therefore, instead of requiring implementations (and automation users) to provide orientation readings via lower-level virtual sensors which use different units of measurement, this specification defines extra virtual sensor types for relative and absolute orientation data in the format used by this specification.
9.1.1. Parse orientation data reading algorithm
The parse orientation data reading algorithm, given an Object parameters, runs the following steps:
-
Let alpha be the result of invoking get a property from parameters with "alpha".
-
If alpha is not a Number, or its value is NaN, +∞, or −∞, return undefined.
-
If alpha is not in the range [0, 360), then return undefined.
-
Let beta be the result of invoking get a property from parameters with "beta".
-
If beta is not a Number, or its value is NaN, +∞, or −∞, return undefined.
-
If beta is not in the range [-180, 180), then return undefined.
-
Let gamma be the result of invoking get a property from parameters with "gamma".
-
If gamma is not a Number, or its value is NaN, +∞, or −∞, return undefined.
-
If gamma is not in the range [-90, 90), then return undefined.
-
Return a new ordered map «[ "alpha" → alpha, "beta" → beta, "gamma" → gamma ]».
Note: The return value is an ordered map to prevent a dependency on the sensor reading concept from the [GENERIC-SENSOR] specification. They should be interchangeable for the purposes of the algorithm above.
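As an informal illustration, these reading objects show how the algorithm above treats its input (plain object literals, for illustration only):

const valid      = { alpha: 90,  beta: 0, gamma: 0  }; // parsed into an ordered map
const outOfRange = { alpha: 360, beta: 0, gamma: 0  }; // rejected: alpha must be in [0, 360)
const badGamma   = { alpha: 0,   beta: 0, gamma: 90 }; // rejected: gamma must be in [-90, 90)
const notANumber = { alpha: "90", beta: 0, gamma: 0 }; // rejected: alpha is not a Number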
9.1.2. The "absolute-orientation" virtual sensor type
The per-type virtual sensor metadata map must have the following entry:
- Key: "absolute-orientation"
- Value: A virtual sensor metadata whose reading parsing algorithm is parse orientation data reading.
9.1.3. The "relative-orientation" virtual sensor type
The per-type virtual sensor metadata map must have the following entry:
- Key: "relative-orientation"
- Value: A virtual sensor metadata whose reading parsing algorithm is parse orientation data reading.
9.2. Device Motion Automation
The motion data retrieved from the platform by the user agent comes from accelerometers and gyroscopes. This specification defines certain per-type virtual sensor metadata entries that are shared with the [ACCELEROMETER] and [GYROSCOPE] specifications.
Accelerometer virtual sensors are used to provide acceleration with gravity data to the platform. Linear Acceleration virtual sensors are used to provide linear acceleration data to the platform. Gyroscope virtual sensors are used to provide rotation rate data to the platform.
9.2.1. The "accelerometer" virtual sensor type
The per-type virtual sensor metadata map must have the following entry:
- Key: "accelerometer"
- Value: A virtual sensor metadata whose reading parsing algorithm is parse xyz reading.
9.2.2. The "linear-acceleration" virtual sensor type
The per-type virtual sensor metadata map must have the following entry:
- Key: "linear-acceleration"
- Value: A virtual sensor metadata whose reading parsing algorithm is parse xyz reading.
9.2.3. The "gyroscope" virtual sensor type
The per-type virtual sensor metadata map must have the following entry:
- Key: "gyroscope"
- Value: A virtual sensor metadata whose reading parsing algorithm is parse xyz reading.
A. Examples
This section is non-normative.
A.1 Calculating compass heading
This section is non-normative.
The following worked example is intended as an aid to users of the DeviceOrientation event.
The Introduction section provided an example of using the DeviceOrientation event to obtain a compass heading when the device is held with the screen horizontal. This example shows how to determine the compass heading that the user is facing when holding the device with the screen approximately vertical in front of them. An application of this is an augmented-reality system.
More precisely, we wish to determine the compass heading of the horizontal component of a vector which is orthogonal to the device’s screen and pointing out of the back of the screen.
If v represents this vector in the rotated device body frame xyz, then, since it points out of the back of the screen, v is as follows.

\[ v = \begin{pmatrix} 0 \\ 0 \\ -1 \end{pmatrix} \]

The transformation of v due to the rotation about the z axis can be represented by the following rotation matrix.

\[ R_z(\alpha) = \begin{pmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{pmatrix} \]

The transformation of v due to the rotation about the x axis can be represented by the following rotation matrix.

\[ R_x(\beta) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\beta & -\sin\beta \\ 0 & \sin\beta & \cos\beta \end{pmatrix} \]

The transformation of v due to the rotation about the y axis can be represented by the following rotation matrix.

\[ R_y(\gamma) = \begin{pmatrix} \cos\gamma & 0 & \sin\gamma \\ 0 & 1 & 0 \\ -\sin\gamma & 0 & \cos\gamma \end{pmatrix} \]

If R represents the full rotation matrix of the device in the earth frame XYZ, then since the initial body frame is aligned with the earth, R is as follows.

\[ R = R_z(\alpha)\,R_x(\beta)\,R_y(\gamma) \]

If v' represents the vector v in the earth frame XYZ, then since the initial body frame is aligned with the earth, v' is as follows.

\[ v' = R\,v \]

The compass heading θ is given by

\[ \tan\theta = \frac{v'_x}{v'_y} = \frac{-\cos\alpha\,\sin\gamma - \sin\alpha\,\sin\beta\,\cos\gamma}{-\sin\alpha\,\sin\gamma + \cos\alpha\,\sin\beta\,\cos\gamma}, \]

provided that β and γ are not both zero.
The compass heading calculation above can be represented in JavaScript as follows to return the correct compass heading when the provided parameters are defined, not null and represent absolute
values.
var degtorad = Math.PI / 180; // Degree-to-Radian conversion

function compassHeading(alpha, beta, gamma) {
  var _x = beta  ? beta  * degtorad : 0; // beta value
  var _y = gamma ? gamma * degtorad : 0; // gamma value
  var _z = alpha ? alpha * degtorad : 0; // alpha value

  var cX = Math.cos(_x);
  var cY = Math.cos(_y);
  var cZ = Math.cos(_z);
  var sX = Math.sin(_x);
  var sY = Math.sin(_y);
  var sZ = Math.sin(_z);

  // Calculate Vx and Vy components
  var Vx = -cZ * sY - sZ * sX * cY;
  var Vy = -sZ * sY + cZ * sX * cY;

  // Calculate compass heading
  var compassHeading = Math.atan(Vx / Vy);

  // Convert compass heading to use whole unit circle
  if (Vy < 0) {
    compassHeading += Math.PI;
  } else if (Vx < 0) {
    compassHeading += 2 * Math.PI;
  }

  return compassHeading * (180 / Math.PI); // Compass Heading (in degrees)
}
As a consistency check, if we set γ = 0, then

\[ \tan\theta = \frac{-\sin\alpha\,\sin\beta}{\cos\alpha\,\sin\beta} = -\tan\alpha, \qquad \theta = 360 - \alpha, \]

as expected. Alternatively, if we set β = 90, then

\[ \tan\theta = \frac{-\sin(\alpha + \gamma)}{\cos(\alpha + \gamma)} = -\tan(\alpha + \gamma), \qquad \theta = 360 - (\alpha + \gamma), \]

as expected.
A.2 Alternate device orientation representations
This section is non-normative.
Describing orientation using Tait-Bryan angles can have some disadvantages such as introducing gimbal lock [GIMBALLOCK]. Depending on the intended application it can be useful to convert the Device Orientation values to other rotation representations.
The first alternate orientation representation uses rotation matrices. By combining the component rotation matrices provided in the worked example above we can represent the orientation of the device body frame as a combined rotation matrix.
If R represents the rotation matrix of the device in the earth frame XYZ, then since the initial body frame is aligned with the earth, R is the combined matrix R = R_z(α) R_x(β) R_y(γ), whose components are computed below. This calculation can be represented in JavaScript as follows when the provided parameters are defined, not null and represent absolute values.
var degtorad = Math.PI / 180; // Degree-to-Radian conversion

function getRotationMatrix(alpha, beta, gamma) {
  var _x = beta  ? beta  * degtorad : 0; // beta value
  var _y = gamma ? gamma * degtorad : 0; // gamma value
  var _z = alpha ? alpha * degtorad : 0; // alpha value

  var cX = Math.cos(_x);
  var cY = Math.cos(_y);
  var cZ = Math.cos(_z);
  var sX = Math.sin(_x);
  var sY = Math.sin(_y);
  var sZ = Math.sin(_z);

  //
  // ZXY rotation matrix construction.
  //

  var m11 = cZ * cY - sZ * sX * sY;
  var m12 = -cX * sZ;
  var m13 = cY * sZ * sX + cZ * sY;

  var m21 = cY * sZ + cZ * sX * sY;
  var m22 = cZ * cX;
  var m23 = sZ * sY - cZ * cY * sX;

  var m31 = -cX * sY;
  var m32 = sX;
  var m33 = cX * cY;

  return [
    m11, m12, m13,
    m21, m22, m23,
    m31, m32, m33
  ];
};
Another alternate representation of device orientation data is as Quaternions. [QUATERNIONS]
If q represents the unit quaternion of the device in the earth frame XYZ, then since the initial body frame is aligned with the earth, q is the product of the quaternions representing the individual z, x and y rotations, q = q_z(α) q_x(β) q_y(γ). This calculation can be represented in JavaScript as follows when the provided parameters represent absolute values and those parameters are not null.
var degtorad = Math.PI / 180; // Degree-to-Radian conversion

function getQuaternion(alpha, beta, gamma) {
  var _x = beta  ? beta  * degtorad : 0; // beta value
  var _y = gamma ? gamma * degtorad : 0; // gamma value
  var _z = alpha ? alpha * degtorad : 0; // alpha value

  var cX = Math.cos(_x / 2);
  var cY = Math.cos(_y / 2);
  var cZ = Math.cos(_z / 2);
  var sX = Math.sin(_x / 2);
  var sY = Math.sin(_y / 2);
  var sZ = Math.sin(_z / 2);

  //
  // ZXY quaternion construction.
  //

  var w = cX * cY * cZ - sX * sY * sZ;
  var x = sX * cY * cZ - cX * sY * sZ;
  var y = cX * sY * cZ + sX * cY * sZ;
  var z = cX * cY * sZ + sX * sY * cZ;

  return [w, x, y, z];
}
We can check that a unit quaternion has been constructed correctly using Lagrange’s four-square theorem: \( w^2 + x^2 + y^2 + z^2 = 1 \), as expected.
Acknowledgments
The Device Orientation and Motion specification, originally published as a Candidate Recommendation in August 2016 under the title DeviceOrientation Event Specification, was initially developed by the Geolocation Working Group. After the group was closed in 2017, the specification was temporarily retired. Revitalized in 2019 by the Devices and Sensors Working Group, this document has undergone significant enhancements including improvements in interoperability, test automation, privacy, and editorial content (see § 10 Changes section).
In 2024, the Devices and Sensors Working Group partnered with the Web Applications Working Group, making this a joint deliverable and continuing the advancement of the specification. The initial design discussions are not preserved in this GitHub repository, but can be explored through the Geolocation Working Group’s mailing list archives.
The W3C acknowledges Lars Erik Bolstad, Dean Jackson, Claes Nilsson, George Percivall, Doug Turner, Matt Womer, and Chris Dumez for their contributions.
10. Changes
This section summarizes substantial changes and notable editorial improvements to guide review. Full details are available from the commit log. Changes since the Candidate Recommendation 2016-08-18:
-
Add Permissions Policy integration, which supersedes the previous requirement of only firing events on iframes that were same-origin with the top-level frame
-
Add note to implementers about bundling permission requests
-
Export powerful features Accelerometer, Gyroscope and Magnetometer
-
Add Permissions API integration, start requiring requestPermission() usage
-
editorial: Define API section more normatively and with more dfns
-
editorial: Reorder acceleration explanation in Device Motion Model section
-
editorial: Update explanations of the device rotation and motion references
-
editorial: Use more precise event handling terms, modernize others
-
editorial: Refer to [SCREEN-ORIENTATION] instead of the orientation change event
-
editorial: Reword requirements in "Security and privacy considerations"
-
Mark use cases and requirements and examples sections non-normative
-
Remove the oncompassneedscalibration event
-
Update references to "triggered by user activation", now referred to as "transient activation"
-
Align with DOM phrasing on firing events
-
Add a note about acceleration properties for DeviceMotionEvent
-
Add a note explaining how the coordinate system differs from the CSS coordinate system
-
Require no more precise than 0.1 degrees, 0.1 degrees per second, 0.1 meters per second squared to mitigate a passive fingerprinting issue
-
Update constructor definition in IDL with the Web IDL
-
Add explicit [Exposed] to interfaces
-
Update IDL dictionaries with new dictionary defaulting setup
-
Note the deviceorientationabsolute event and its ondeviceorientationabsolute event handler IDL attribute have limited implementation experience
-
Add requestPermission() API static operation to both DeviceOrientationEvent and DeviceMotionEvent
-
Add [SecureContext] to event handlers ondeviceorientation, ondevicemotion and ondeviceorientationabsolute
-
Restrict all interfaces to secure contexts
-
Remove [NoInterfaceObject] from DeviceAcceleration and DeviceRotationRate
-
Make security and privacy considerations normative
-
Add the ondeviceorientationabsolute event handler attribute into the IDL block (was only in prose)
-
Remove '?' from dictionary members of DeviceMotionEventInit
-
Use [Exposed=Window] extended attribute