This directory contains the main-thread animation engine. It implements the Web Animations timing model that drives CSS Animations and Transitions, and exposes the Web Animations API (element.animate()) to JavaScript.


As of 2018, Blink animations are maintained by the cc/ team.

Specifications Implemented

Integration with Chromium

The Blink animation engine interacts with Blink/Chrome in the following ways:

  • Chromium's Compositor

    Chromium's compositor has a separate, more lightweight animation engine that runs independently of the main thread. Blink's animation engine delegates animations to the compositor where possible for better performance and power utilization.

    Compositable animations

    A subset of style properties (currently transform, opacity, filter, and backdrop-filter) can be mutated on the compositor thread. Animations that mutate only these properties are candidates for being accelerated and run on the compositor thread, which ensures they are isolated from Blink's main thread work.

    Whether or not an animation can be accelerated is determined by CheckCanStartAnimationOnCompositor(), which looks at several aspects such as the composite mode, other animations affecting the same property, and whether the target element can be promoted and mutated on the compositor. Reasons for not compositing animations are captured in FailureCodes.
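The property check can be pictured with a small sketch. This is a hypothetical helper, not Blink's actual implementation (which lives in CheckCanStartAnimationOnCompositor() and considers far more than the property list): an animation is a compositing candidate only if every property it animates is in the compositable set.

```javascript
// Hypothetical sketch of the property part of the compositability check —
// illustrative only, not Blink's code.
const COMPOSITABLE = new Set(['transform', 'opacity', 'filter', 'backdrop-filter']);

// An animation qualifies only if every animated property is compositable.
function canRunOnCompositor(keyframes) {
  return keyframes
      .flatMap(frame => Object.keys(frame))
      .every(prop => COMPOSITABLE.has(prop));
}

canRunOnCompositor([{opacity: 0}, {opacity: 1}]);        // true: compositable
canRunOnCompositor([{width: '0px'}, {width: '100px'}]);  // false: width needs main-thread style
```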

    Lifetime of a compositor animation

    Animations that can be accelerated get added to the PendingAnimations list. The pending list is updated as part of the document lifecycle and ensures each pending animation gets a corresponding cc::AnimationPlayer representing the animation on the compositor. The player is initialized with the appropriate timing values and corresponding effects.

    Note that changing an animation's playback rate, start time, or effect simply adds the animation back onto the pending list, causing the compositor animation to be cancelled and a new one to be started. See Animation::PreCommit() for more details.

    An accelerated animation still runs on the main thread, ensuring that its effective output is reflected in the element's style. So while the compositor animation updates the visuals, the main thread animation updates the computed style. There is special-case logic to ensure that updates from such accelerated animations do not cause spurious commits from main to compositor (see CompositedLayerMapping::UpdateGraphicsLayerGeometry(), or FragmentPaintPropertyTreeBuilder::UpdateTransform(), FragmentPaintPropertyTreeBuilder::UpdateEffect(), and FragmentPaintPropertyTreeBuilder::UpdateFilter() for BlinkGenPropertyTrees mode).

    A compositor animation reports its playback state changes (e.g., start, finish, abort) to its Blink counterpart via the CompositorAnimationDelegate interface. The Blink animation uses the start event callback to obtain an accurate start time for the animation, which is important to ensure its output accurately reflects the compositor animation's output.

  • JavaScript

    EffectInput contains the helper functions used to process the keyframes argument, which can take either object or array form.
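As a rough illustration of the two shapes, the object ("property-indexed") form can be normalized into the array form. This is a sketch only — the helper name and logic are illustrative; the real normalization in EffectInput also handles offsets, easing, composite modes, and error cases:

```javascript
// Sketch: convert the object keyframe form, e.g. {opacity: [0, 1]}, into the
// array form, e.g. [{opacity: 0}, {opacity: 1}]. Illustrative, not Blink's code.
function toArrayForm(objectForm) {
  const entries = Object.entries(objectForm)
      .map(([prop, values]) => [prop, [].concat(values)]);
  const frameCount = Math.max(...entries.map(([, values]) => values.length));
  return Array.from({length: frameCount}, (_, i) =>
      Object.fromEntries(entries
          .filter(([, values]) => i < values.length)
          .map(([prop, values]) => [prop, values[i]])));
}

toArrayForm({opacity: [0, 1]});
// → [{opacity: 0}, {opacity: 1}]
```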

    PlayStateUpdateScope: whenever the animation engine is mutated from the JS level, a PlayStateUpdateScope is created, and its destructor contains the logic that handles everything. It records the old and new state of the animation, checks the difference, and mutates the animation's properties; at the end it calls SetOutdatedAnimation() to inform the animation timeline that this animation's time state has been dirtied.

    There are a couple of other integration points that are less critical to everyday browsing:

  • DevTools

    The animations timeline uses InspectorAnimationAgent to track all active animations. This class has interfaces for pausing, adjusting DocumentTimeline playback rate, and seeking animations.

    InspectorAnimationAgent clones the inspected animation in order to avoid firing animation events, and suppresses the effects of the original animation. From this point on, modifications can be made to the cloned animation without having any effect on the underlying animation or its listeners.

  • SVG

    The element.animate() API supports targeting SVG attributes in its keyframes. This is an experimental implementation guarded by the WebAnimationsSVG flag and not exposed on the web.

    This feature should provide a high fidelity alternative to our SMIL implementation.


Animation Timing Model

The animation engine is built around the timing model described in the Web Animations spec.

This describes a hierarchy of entities:

  • DocumentTimeline: Represents the wall clock time.
    • Animation: Represents an individual animation and when it started playing.
      • AnimationEffect: Represents the effect an animation has during the animation (e.g. updating an element's color property).

Time trickles down from the DocumentTimeline and is transformed at each stage to produce some progress fraction that can be used to apply the effects of the animations.

For example:

// Page was loaded at 2:00:00PM, the time is currently 2:00:10PM.
// document.timeline.currentTime is currently 10000 (10 seconds).

let animation = element.animate([
    {transform: 'none'},
    {transform: 'rotate(200deg)'},
  ], {
    duration: 20000,  // 20 seconds
  });

animation.startTime = 6000;  // 6 seconds
  • DocumentTimeline notifies that the time is 10 seconds.
    • Animation computes that its currentTime is 4 seconds due to its startTime being at 6 seconds.
      • AnimationEffect has a duration of 20 seconds and computes that it has a progress of 20% from the parent animation being 4 seconds into the animation.

        The effect is animating an element from transform: none to transform: rotate(200deg) so it computes the current effect to be transform: rotate(40deg).
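The trickle-down in the example reduces to simple arithmetic (values taken from the example above, all times in milliseconds):

```javascript
const timelineTime = 10000;  // DocumentTimeline.currentTime
const startTime = 6000;      // animation.startTime
const duration = 20000;      // effect duration

const currentTime = timelineTime - startTime;  // the Animation's currentTime: 4000
const progress = currentTime / duration;       // the effect's progress: 0.2
const angle = progress * 200;                  // lerp between 0deg and 200deg: 40
console.log(currentTime, progress, angle);     // 4000 0.2 40
```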

Lifecycle of an Animation


  1. An Animation is created via CSS1 or element.animate().
  2. At the start of the next frame the Animation and its AnimationEffect are updated with the currentTime of the DocumentTimeline.
  3. The AnimationEffect gets sampled with its computed localTime, pushes a SampledEffect into its target element's EffectStack and marks the element's style as dirty to ensure it gets updated later in the document lifecycle.
  4. During the next style resolve on the target element all the SampledEffects in its EffectStack are incorporated into building the element's ComputedStyle.

One key takeaway here is that timing updates are done in a separate phase from effect application. Effect application must occur during style resolution, which is a highly complex process with a well defined place in the document lifecycle. Updates to animation timing will request style updates rather than invoke them directly.

1 CSS animations and transitions are actually created/destroyed during style resolve (step 4). There is special logic for forcing these animations to have their timing updated and their effects included in the same style resolve. An unfortunate side effect of this is that style resolution can cause style to get dirtied; this is currently a code health bug.


Currently all animations use KeyframeEffect for their AnimationEffect. The generic AnimationEffect from which it inherits is an extension point in Web Animations where other kinds of animation effects can be defined later by other specs (for example, JavaScript callback based effects).

Structure of a KeyframeEffect

  • KeyframeEffect represents the effect an animation has (without any details of when it started or whether it's playing) and is comprised of three things:
    • Some Timing information (inherited from AnimationEffect). Example:

        duration: 4000,
        easing: 'ease-in-out',
        iterations: 8,
        direction: 'alternate',

      This is used to compute the percentage progress of the effect given the duration of time that the animation has been playing for.

    • The DOM Element that is being animated.

    • A KeyframeEffectModel that holds a sequence of keyframes to specify the properties being animated and what values they pass through. Example:

        {backgroundColor: 'red', transform: 'rotate(0deg)'},
        {backgroundColor: 'yellow'},
        {backgroundColor: 'lime'},
        {backgroundColor: 'blue'},
        {backgroundColor: 'red', transform: 'rotate(360deg)'},

      These keyframes are used to compute:

      • A PropertySpecificKeyframe map that simply breaks up the input multi-property keyframes into per-property keyframe lists.
      • An InterpolationEffect which holds a set of Interpolations, each one representing the animated values between adjacent pairs of PropertySpecificKeyframes and where in the percentage progress they are active. In the example keyframes above, the Interpolations generated would include, among the 5 different property-specific keyframe pairs, one for backgroundColor: 'red' to backgroundColor: 'yellow' that applies from 0% to 25% and one for transform: 'rotate(0deg)' to transform: 'rotate(360deg)' that applies from 0% to 100%.
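The split into per-property keyframe lists can be sketched as follows. The helper is hypothetical; Blink's real PropertySpecificKeyframe handling also deals with explicit offsets, easing, and composite modes:

```javascript
// Break multi-property keyframes into per-property lists, distributing
// offsets evenly — a sketch of the PropertySpecificKeyframe map, not Blink's code.
function perPropertyKeyframes(keyframes) {
  const map = {};
  keyframes.forEach((frame, i) => {
    const offset = i / (keyframes.length - 1);
    for (const [prop, value] of Object.entries(frame)) {
      (map[prop] = map[prop] || []).push({offset, value});
    }
  });
  return map;
}

const map = perPropertyKeyframes([
  {backgroundColor: 'red', transform: 'rotate(0deg)'},
  {backgroundColor: 'yellow'},
  {backgroundColor: 'lime'},
  {backgroundColor: 'blue'},
  {backgroundColor: 'red', transform: 'rotate(360deg)'},
]);
// map.backgroundColor has 5 keyframes at offsets 0, 0.25, 0.5, 0.75, 1;
// map.transform has 2 keyframes at offsets 0 and 1.
```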

Lifecycle of an Interpolation

Interpolation is the data structure that style resolution uses to resolve what animated value to apply to an animated element's ComputedStyle.

  1. Interpolations are lazily instantiated prior to sampling.
  2. KeyframeEffectModels are sampled every frame (or as necessary) for a stack of Interpolations to apply to the associated Element and stashed away in the Element's ElementAnimations' EffectStack's SampledEffects.
  3. During style resolution on the target Element, all the Interpolations are collected and organized by category according to whether it's a transition or not (transitions in Blink are suppressed in the presence of non-transition animations on the same property) and whether it affects custom properties or not (animated custom properties are animation-tainted and affect the processing of animation properties).
  4. TODO(alancutter): Describe what happens in processing a stack of interpolations.
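The transition-suppression rule in step 3 can be sketched as follows (the data shape and field names are illustrative, not Blink's internal representation):

```javascript
// A transition on a property is suppressed when a non-transition animation
// animates the same property. Sketch only.
function activeInterpolations(interpolations) {
  const animatedProps = new Set(
      interpolations.filter(i => !i.isTransition).map(i => i.property));
  return interpolations.filter(
      i => !i.isTransition || !animatedProps.has(i.property));
}

const active = activeInterpolations([
  {property: 'opacity', isTransition: true},   // suppressed: animation below
  {property: 'opacity', isTransition: false},
  {property: 'width', isTransition: true},     // kept: no competing animation
]);
// → the opacity animation and the width transition
```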

Testing pointers

Test new animation features using end-to-end web-platform-tests to ensure cross-browser interoperability. Use unit testing when access to Chrome internals is required. Test Chrome-specific features, such as compositing of animations, using web tests or unit tests.

End to end testing

Features in the Web Animations spec are tested in web-animations. Writing web platform tests has pointers for how to get started. If Chrome does not correctly implement the spec, add a corresponding -expected.txt file with your test listing the expected failure in Chrome.

Web tests are located in third_party/blink/web_tests. These should be written when end-to-end testing is needed but the test either covers Chrome-specific (i.e. non-standardized) features such as compositing, or requires access to Chrome internals not easily tested by web-platform-tests.

Unit testing

Unit tests for animations range from extending Test, where you manually construct an instance of your object, to extending RenderingTest, where you can load HTML, enable compositing if necessary, and run assertions about the state.

Ongoing work

Properties And Values API

TODO: Summarize properties and values API.

Web Animations API

TODO: Summarize Web Animations API.

Animation Worklet

AnimationWorklet is a new primitive for creating high performance procedural animations on the web. It is being incubated as part of the CSS Houdini task force, and if successful will be transferred to that task force for full standardization.

A WorkletAnimation behaves like, and exposes the same animation interface as, other web animations, but it allows the animation itself to be highly customized in JavaScript by providing an animate callback. These animations run inside an isolated worklet global scope.
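The programming model can be sketched as follows. In a real page the animator class is passed to registerAnimator() inside the worklet global scope and the engine invokes it each frame; here the callback is invoked by hand to show its shape (the class name and the effect stand-in are illustrative):

```javascript
// An animator maps input time to the effect's localTime in its animate()
// callback — here, running the effect at half speed.
class HalfSpeedAnimator {
  animate(currentTime, effect) {
    effect.localTime = currentTime / 2;
  }
}

// Stand-in for the effect proxy the worklet would supply.
const effect = {localTime: 0};
new HalfSpeedAnimator().animate(1000, effect);
// effect.localTime is now 500
```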