Update (November 22, 2018)
For the Web Audio API, the Autoplay Policy will launch in M71.
Update (May 15, 2018)
The Autoplay Policy launched in M66 Stable for <video> and <audio> and is effectively blocking roughly half of unwanted media autoplays in Chrome.
For the Web Audio API, the autoplay policy will launch in M70. This affects web games, some WebRTC applications, and other web pages using audio features. Developers will need to update their code to take advantage of the policy. More detail can be found in the Web Audio API section below.
Summary
This policy controls when video and audio are allowed to autoplay, and is designed to meet three primary goals.
Under the new policy, media content will be allowed to autoplay under the following conditions:
- Muted autoplay is always allowed.
- Autoplay with sound is allowed if the user has interacted with the domain (a click or a tap); on desktop, if the user's Media Engagement Index threshold has been crossed (the user has previously played video with sound on the site); or, on mobile, if the user has added the site to their home screen.
- Top frames can delegate autoplay permission to their IFrames to allow autoplay with sound.
By default, embedded IFrames will only be able to play muted or silent videos. However, if site owners wish for IFrames on their site to be able to play unmuted content, they may pass the autoplay permission to the IFrame using allow="autoplay". This attribute allows any video contained in the IFrame to play as if it were hosted on the site. A more detailed design and rationale is described in the design documentation.
Autoplay blocking
Around the same time, we will be making two additional changes related to autoplay that will make muted autoplay more reliable. These two changes will make it possible for sites and advertisers to use muted videos instead of animated GIFs, which in most cases will reduce overall bandwidth consumption.
Developer Recommendations: <video> and <audio>
Check the promise returned by play() to determine whether playback started; if autoplay was prevented, show a play button instead:
var promise = document.querySelector('video').play();
if (promise !== undefined) {
  promise.then(_ => {
    // Autoplay started!
  }).catch(error => {
    // Autoplay was prevented.
    // Show a "Play" button so that user can start playback.
  });
}
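If unmuted playback is rejected, one possible fallback (a sketch, not part of the policy text above; the unmute button is hypothetical) is to retry muted, since muted autoplay is always allowed, and let the user opt back into sound:
// Sketch: fall back to muted autoplay when unmuted autoplay is blocked.
const video = document.querySelector('video');
const unmuteButton = document.querySelector('#unmute'); // hypothetical unmute button
video.play().catch(() => {
  // Unmuted autoplay was prevented; retry muted, which is always allowed.
  video.muted = true;
  video.play();
  // Let the user restore sound with an explicit gesture.
  unmuteButton.hidden = false;
  unmuteButton.addEventListener('click', () => {
    video.muted = false;
    unmuteButton.hidden = true;
  });
});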
Developer Recommendations: Web Audio API
The Web Audio API will be included in the Autoplay policy with M70 (October 2018). Generally, in Chrome developers can no longer assume that audio is allowed to play when a user first arrives at a site; they should assume that playback may be blocked until the user first interacts with the site through a user activation (a click or a tap). Any attempt to create an AudioContext before that time may result in a suspended AudioContext that will have to be explicitly switched to running after a user activation.
Developers who write games, WebRTC applications, or other websites that use the Web Audio API should call context.resume() after the first user gesture (e.g. a click or a tap). For example:
// Resume playback once the user has interacted with the page.
document.querySelector('button').addEventListener('click', function() {
  context.resume().then(() => {
    console.log('Playback resumed successfully');
  });
});
Web Audio API developers can detect whether or not autoplay is allowed by creating a new AudioContext and then checking its state to see whether it is running (allowed) or suspended (blocked).
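For example, a minimal sketch of that check (variable names are illustrative):
// Create an AudioContext and inspect its state to see whether audio may play.
const context = new AudioContext();
if (context.state === 'suspended') {
  // Autoplay is blocked; wait for a user gesture before starting audio.
} else {
  // state is 'running': audio is allowed to play right away.
}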
Depending upon the site, it may make sense to add additional user interface elements (such as a ‘play’ button in front of a game, or an ‘unmute’ button in some other cases) to explicitly capture a user gesture. This can either be done prior to creating the AudioContext, or afterwards with a call to resume() upon click.
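As a sketch of the "capture a gesture first, then create the AudioContext" approach (the button id and start logic are assumptions for illustration):
// Create the AudioContext only after an explicit user gesture,
// so it should start in the 'running' state.
let audioContext = null;
document.querySelector('#start-game').addEventListener('click', () => {
  if (!audioContext) {
    audioContext = new AudioContext();
  }
  // ... set up and start the game's audio here ...
});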
IFrame embedded content
Embedded content in a cross-origin IFrame needs to have permission to autoplay delegated to it; otherwise, the AudioContext will never be allowed to run.
Developers that host IFrames with content inside them (e.g. game hosting sites) can enable audio for that content without requiring the underlying content to change any code, by doing the following:
- If the content is in a cross-origin IFrame, ensure that the IFrame includes the attribute allow="autoplay".
- Ensure that before the embedded content loads and runs, the site captures a user gesture (e.g. prompt for a click or a tap).
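As a rough sketch of both steps (the embed URL and button id are hypothetical), the hosting page can wait for a click and only then inject the IFrame with autoplay delegated:
// Capture a user gesture on the hosting page, then load the cross-origin
// IFrame with autoplay permission delegated to it.
document.querySelector('#load-game').addEventListener('click', () => {
  const frame = document.createElement('iframe');
  frame.src = 'https://games.example.com/embed'; // hypothetical embed URL
  frame.setAttribute('allow', 'autoplay');       // delegate autoplay to the IFrame
  document.body.appendChild(frame);
});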
Developers can find more details about specific code changes and debugging tips in the resources listed under More information below.
Web Audio API FAQs
Why is the Web Audio API part of the autoplay policy?
Users don’t like to click on a link and have sound played automatically that they weren’t expecting. The Web Audio API produces sound, so it must be included in the autoplay policy to ensure consistency across all web experiences.
Wait, didn’t you launch the autoplay policy for Web Audio API in M66?
Yes, briefly, but we reverted the change about a week later. We’re always working to improve things for users and developers, but in this case we did not do an effective job of communicating the change to developers using the Web Audio API. We are moving the launch to October 2018 to give those developers more time to prepare. If you develop web games, WebRTC applications, or other web experiences with sound, please see the developer recommendations.
Release Schedule
September 2017: New autoplay policies announced; begin collecting Media Engagement Index (MEI) data in M62 Canary and Dev
December 2017: Site muting available in M64 Beta; autoplay policies available in M65 Canary and Dev
January 2018: Site muting available in M64 Stable
April 2018: Autoplay policies are enforced for <video> and <audio> in M66 Stable
October 2018: Autoplay policies will be enforced for the Web Audio API in M70 Stable
General FAQs
More information
Autoplay policy summary presentation
Media engagement index (MEI) design document
Autoplay Policy Changes (developers.google.com)
DOMException: The play() request was interrupted (developers.google.com)