media/

Welcome to Chromium Media! This directory primarily contains a collection of components related to media capture and playback. Feel free to reach out to the media-dev@chromium.org mailing list with questions.

As a top-level component, media/ may be depended on by almost every other Chromium component except base/. Certain sub-components may not work properly in sandboxed processes.

Directory Breakdown

  • audio/ - Code for audio input and output. Includes platform-specific input and output implementations. Due to its use of platform APIs, it cannot normally be used from within a sandboxed process.

  • base/ - Contains miscellaneous enums, utility classes, and shuttling primitives used throughout media/ and beyond, e.g. AudioBus, AudioCodec, and VideoFrame, just to name a few. Can be used in any process. A minimal sketch of the buffer layout behind AudioBus appears after this list.

  • blink/ - Code for interfacing with the Blink rendering engine for MediaStreams as well as <video> and <audio> playback. Used only in the same process as Blink; typically the render process.

  • capture/ - Contains content (as in the content layer) capturing and platform specific video capture implementations.

  • cast/ - Contains the tab casting implementation; not to be confused with the Chromecast code, which lives in the top-level chromecast/ directory.

  • cdm/ - Contains code related to the Content Decryption Module (CDM) used for playback of content via Encrypted Media Extensions (EME).

  • device_monitors/ - Contains code for monitoring device changes, e.g. webcam and microphone plug and unplug events.

  • ffmpeg/ - Contains binding code and helper methods necessary to use the ffmpeg library located in //third_party/ffmpeg.

  • filters/ - Contains data sources, decoders, demuxers, parsers, and rendering algorithms used for media playback.

  • formats/ - Contains parsers used by Media Source Extensions (MSE).

  • gpu/ - Contains the platform hardware encoder and decoder implementations.

  • midi/ - Contains the WebMIDI API implementation.

  • mojo/ - Contains mojo services for media. Typically used for providing out of process media functionality to a sandboxed process.

  • muxers/ - Code for muxing content for the Media Recorder API.

  • remoting/ - Code for transmitting muxed packets to a remote endpoint for playback.

  • renderers/ - Code for rendering audio and video to an output sink.

  • test/ - Code and data for testing the media playback pipeline.

  • tools/ - Standalone media test tools.

  • video/ - Abstract hardware video decoder interfaces and tooling.
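
To make the shuttling primitives in base/ concrete, here is a minimal, self-contained sketch of the planar-buffer idea behind media::AudioBus. SimpleAudioBus and its methods are hypothetical simplifications for illustration only; the real class (in media/base/) has a richer API.

```cpp
#include <vector>

// Hypothetical simplification of media::AudioBus: per-channel (planar)
// float sample storage, which is the layout most audio processing in
// media/ consumes.
class SimpleAudioBus {
 public:
  SimpleAudioBus(int channels, int frames)
      : frames_(frames), data_(channels, std::vector<float>(frames, 0.0f)) {}

  int channels() const { return static_cast<int>(data_.size()); }
  int frames() const { return frames_; }

  // Each channel is a contiguous array of samples.
  float* channel(int ch) { return data_[ch].data(); }

 private:
  int frames_;
  std::vector<std::vector<float>> data_;
};

int main() {
  SimpleAudioBus bus(/*channels=*/2, /*frames=*/480);  // 10 ms at 48 kHz.
  bus.channel(0)[0] = 1.0f;  // Write a sample to the left channel.
  return 0;
}
```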

Capture

TODO(miu, chfemer): Fill in this section.

mojo

TODO(xhwang): Fill in this section.

MIDI

TODO(toyoshim): Fill in this section.

Playback

Media playback encompasses a large swath of technologies, so by necessity this section provides only a brief outline. Inside this directory you'll find components for media demuxing, software and hardware video decoding, audio output, and audio and video rendering.

Under the playback heading, media/ contains the implementations of the components required for HTML media elements and their extensions (MSE and EME).

As a case study we'll consider the playback of a video through the <video> tag.

<video> (and <audio>) playback starts in blink::HTMLMediaElement in third_party/WebKit/ and reaches media::WebMediaPlayerImpl in media/blink after a brief hop through content::MediaFactory. Each blink::HTMLMediaElement owns a media::WebMediaPlayerImpl, which handles operations such as play, pause, seek, and volume changes.
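
As a rough sketch of that ownership, the element owns a player object and forwards user-facing operations to it. The classes below are simplified stand-ins for blink::HTMLMediaElement and media::WebMediaPlayerImpl, not their real interfaces:

```cpp
#include <memory>
#include <utility>

class WebMediaPlayer {  // Stand-in for media::WebMediaPlayerImpl.
 public:
  virtual ~WebMediaPlayer() = default;
  virtual void Play() = 0;
  virtual void Pause() = 0;
  virtual void Seek(double seconds) = 0;
  virtual void SetVolume(double volume) = 0;
};

class HTMLMediaElement {  // Stand-in for blink::HTMLMediaElement.
 public:
  explicit HTMLMediaElement(std::unique_ptr<WebMediaPlayer> player)
      : player_(std::move(player)) {}

  // DOM-facing operations forward to the owned player.
  void play() { player_->Play(); }
  void pause() { player_->Pause(); }
  void setCurrentTime(double t) { player_->Seek(t); }

 private:
  std::unique_ptr<WebMediaPlayer> player_;
};
```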

media::WebMediaPlayerImpl handles or delegates media loading over the network as well as demuxer and pipeline initialization. It owns a media::PipelineController, which coordinates a media::DataSource, media::Demuxer, and media::Renderer during playback.
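
A hedged sketch of that coordination role follows; every name is an illustrative stand-in for media::PipelineController and the interfaces it manages. The real controller runs these phases asynchronously via callbacks and also manages seeks, suspends, and resumes as explicit state transitions.

```cpp
// All types below are simplified stand-ins, not the real media/ classes.
struct DataSourceLike { bool Initialize() { return true; } };
struct DemuxerLike    { bool Initialize(DataSourceLike*) { return true; } };
struct RendererLike   { bool Initialize(DemuxerLike*) { return true; } };

class PipelineControllerLike {
 public:
  // Sequences the three startup phases: data source, then demuxer, then
  // renderer. Each later phase depends on the one before it.
  bool Start(DataSourceLike* source, DemuxerLike* demuxer,
             RendererLike* renderer) {
    return source->Initialize() && demuxer->Initialize(source) &&
           renderer->Initialize(demuxer);
  }
};
```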

During normal playback, the media::Demuxer owned by WebMediaPlayerImpl is either a media::FFmpegDemuxer or a media::ChunkDemuxer. The FFmpeg variant is used for standard src= playback, where WebMediaPlayerImpl is responsible for loading bytes over the network. media::ChunkDemuxer is used with Media Source Extensions (MSE), where JavaScript code provides the muxed bytes.
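
Conceptually, the choice boils down to the load type, roughly the decision made when WebMediaPlayerImpl loads a resource. The sketch below uses hypothetical stand-in types to illustrate it:

```cpp
#include <memory>

// Illustrative stand-ins, not the real media/ demuxer classes.
struct DemuxerLike {
  virtual ~DemuxerLike() = default;
};
struct FFmpegDemuxerLike : DemuxerLike {};  // Parses bytes the player fetched.
struct ChunkDemuxerLike : DemuxerLike {};   // Receives bytes appended by JS.

enum class LoadType { kSrcUrl, kMediaSource };

std::unique_ptr<DemuxerLike> CreateDemuxer(LoadType type) {
  if (type == LoadType::kMediaSource)
    return std::make_unique<ChunkDemuxerLike>();  // MSE: JS supplies the bytes.
  return std::make_unique<FFmpegDemuxerLike>();   // src=: player loads them.
}
```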

The media::Renderer is typically media::RendererImpl, which owns and coordinates media::AudioRenderer and media::VideoRenderer instances. Each of these in turn owns a set of media::AudioDecoder or media::VideoDecoder implementations. Each renderer issues async reads to a media::DemuxerStream exposed by the media::Demuxer, and media::DecoderStream routes the returned buffers to the right decoder. Decoding is also async, so decoded frames are delivered to each renderer at some later time.
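
The read/decode handoff is callback-based end to end. Below is a minimal sketch of that flow using simplified stand-ins for media::DemuxerStream and media::VideoDecoder; the real interfaces complete these calls asynchronously on other threads and sequences.

```cpp
#include <cstdint>
#include <functional>
#include <vector>

using EncodedBuffer = std::vector<uint8_t>;

class DemuxerStreamLike {  // Stand-in for media::DemuxerStream.
 public:
  using ReadCB = std::function<void(EncodedBuffer)>;
  // Real reads complete asynchronously once a packet has been demuxed.
  void Read(ReadCB cb) { cb(EncodedBuffer{0x00, 0x01}); }
};

class VideoDecoderLike {  // Stand-in for media::VideoDecoder.
 public:
  using OutputCB = std::function<void(int frame_id)>;  // Decoded frame handle.
  // Real decodes are also async: frames surface some time after the read.
  void Decode(const EncodedBuffer& buffer, OutputCB output_cb) {
    (void)buffer;  // A real decoder would consume the encoded bytes here.
    output_cb(next_frame_id_++);
  }

 private:
  int next_frame_id_ = 0;
};
```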

The media/ library contains hardware decoder implementations in media/gpu for all supported Chromium platforms, as well as software decoding implementations in media/filters backed by FFmpeg and libvpx. Decoders are attempted in the order provided via the media::RendererFactory; the first one which reports success will be used for playback (typically the hardware decoder for video).
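
The fallback policy amounts to trying decoders in priority order and keeping the first that initializes successfully. A simplified, synchronous sketch follows; the real selection logic is asynchronous, and all types below are illustrative stand-ins:

```cpp
#include <memory>
#include <vector>

struct Config {};  // Stand-in for stream configuration (codec, profile, ...).

struct DecoderLike {
  virtual ~DecoderLike() = default;
  virtual bool Initialize(const Config& config) = 0;  // True on success.
};

// First decoder that initializes wins; hardware decoders are typically
// listed ahead of the FFmpeg/libvpx software fallbacks.
DecoderLike* SelectDecoder(
    const std::vector<std::unique_ptr<DecoderLike>>& decoders,
    const Config& config) {
  for (const auto& decoder : decoders) {
    if (decoder->Initialize(config))
      return decoder.get();
  }
  return nullptr;  // No decoder supports this stream.
}
```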

Each renderer manages timing and rendering of audio and video via the event-driven media::AudioRendererSink and media::VideoRendererSink interfaces, respectively. These interfaces both accept a callback that they issue periodically when new audio or video frames are required.
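
The pull model can be summarized in a few lines. In this hypothetical sketch, the sink invokes a renderer-provided callback whenever the output device needs more data, zero-filling any shortfall; the real sink interfaces are richer than this:

```cpp
#include <functional>
#include <utility>

class AudioSinkLike {  // Stand-in for media::AudioRendererSink.
 public:
  // Callback fills `destination` and returns how many frames it produced.
  using RenderCallback = std::function<int(float* destination, int frames)>;
  explicit AudioSinkLike(RenderCallback cb) : callback_(std::move(cb)) {}

  // Invoked whenever the output device needs more data.
  void OnMoreData(float* destination, int frames) {
    int filled = callback_(destination, frames);
    // Zero-fill any shortfall so the device never plays stale memory.
    for (int i = filled; i < frames; ++i) destination[i] = 0.0f;
  }

 private:
  RenderCallback callback_;
};
```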

On the audio side, again in the normal case, the media::AudioRendererSink is driven via a base::SyncSocket and shared memory segment owned by the browser process. This socket is ticked periodically by a platform-level implementation of media::AudioOutputStream within media/audio.
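
The shape of that handshake, reduced to a single process: one end writes a "tick" over a socket, the other wakes and fills a shared buffer. This is a POSIX-only illustration using a plain socketpair() and a local array as stand-ins for base::SyncSocket and the shared memory segment; it is not Chromium code.

```cpp
#include <sys/socket.h>
#include <unistd.h>

#include <cstdint>
#include <cstdio>

int main() {
  int fds[2];
  socketpair(AF_UNIX, SOCK_STREAM, 0, fds);  // Stand-in for base::SyncSocket.
  float shared_buffer[480] = {};             // Stand-in for shared memory.

  // "Browser" side: request one buffer of audio by writing a tick.
  uint32_t tick = 1;
  write(fds[0], &tick, sizeof(tick));

  // "Renderer" side: wake on the tick, then fill the shared buffer.
  uint32_t received = 0;
  read(fds[1], &received, sizeof(received));
  for (float& sample : shared_buffer) sample = 0.0f;  // Render silence.

  printf("filled buffer for tick %u\n", received);
  close(fds[0]);
  close(fds[1]);
  return 0;
}
```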

On the video side, the media::VideoRendererSink is driven by async callbacks issued by the compositor to media::VideoFrameCompositor. The media::VideoRenderer talks to the media::AudioRenderer through a media::TimeSource to coordinate audio and video sync.
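
The core of A/V sync is then just frame selection against the audio clock: given the current media time from the media::TimeSource, pick the newest video frame whose timestamp is not in the future. A simplified sketch with hypothetical types:

```cpp
#include <vector>

struct VideoFrameLike {  // Stand-in for a decoded media::VideoFrame.
  double timestamp_seconds;
};

// Pick the latest frame whose timestamp is at or before the current media
// time; frames "in the future" wait, and frames long past get dropped.
const VideoFrameLike* ChooseFrame(const std::vector<VideoFrameLike>& queue,
                                  double media_time_seconds) {
  const VideoFrameLike* best = nullptr;
  for (const auto& frame : queue) {
    if (frame.timestamp_seconds <= media_time_seconds)
      best = &frame;
  }
  return best;
}
```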

With that we've covered the basic flow of a typical playback. When debugging issues, it's helpful to review the internal logs at chrome://media-internals. The internals page contains information about active media::WebMediaPlayerImpl, media::AudioInputController, media::AudioOutputController, and media::AudioOutputStream instances.