The cast_streaming component provides a platform-agnostic wrapper around all Cast mirroring and Cast remoting receiver functionality. The Cast Streaming transport/protocol is implemented by the Openscreen library, and media rendering is handled by the Chromium media stack, so this component acts as an intermediary between the two. It is currently used both by Fuchsia WebEngine and by the //components/cast_receiver Chromecast implementation, which can be used to run a Cast Streaming receiver on Linux.
This component does NOT provide any Cast sender support, although some code in //media/cast/openscreen is shared with the sender.
To use the cast_streaming component in a //content embedder, the following GN args must be set:
- enable_cast_receiver = true
- cast_streaming_enable_remoting = true (only needed for remoting support)
Starting the cast_streaming component in the browser process has two parts:
- Calling SetNetworkContextGetter() to set the network context to use for the streaming session.
- Creating a ReceiverSession object and calling ReceiverSession::StartStreamingAsync() on it.

Additionally, in the ContentRendererClient for the service, a new instance of cast_streaming::ResourceProvider must be returned from the CreateCastStreamingResourceProvider() function overload. A rough sketch of this wiring is shown below.
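The sketch below shows how an embedder might wire up these integration points. Parameter lists and return types are simplified assumptions, and every helper marked as hypothetical is placeholder embedder code rather than part of the cast_streaming API; the authoritative signatures live in the component's public headers.

```cpp
// Sketch of the embedder-side integration points named above. Signatures are
// simplified/assumed; "hypothetical" helpers are embedder placeholders.
#include <memory>

// --- Browser process ---
void StartCastStreamingReceiver() {
  // Set the network context used for the streaming session (assumed to take
  // a getter callback yielding the embedder's network context).
  cast_streaming::SetNetworkContextGetter(
      MakeEmbedderNetworkContextGetter());  // Hypothetical helper.

  // Create a ReceiverSession and start streaming. Construction arguments
  // (AV constraints, message port, etc.) are elided here.
  std::unique_ptr<cast_streaming::ReceiverSession> session =
      MakeEmbedderReceiverSession();  // Hypothetical helper.
  session->StartStreamingAsync(/* arguments elided */);
}

// --- Renderer process ---
class EmbedderContentRendererClient : public content::ContentRendererClient {
 public:
  // Hand the component its renderer-side hooks (return type assumed).
  std::unique_ptr<cast_streaming::ResourceProvider>
  CreateCastStreamingResourceProvider() override {
    return MakeCastStreamingResourceProvider();  // Hypothetical helper.
  }
};
```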
The remainder of integration is already taken care of in the //content layer.
The code for this component can roughly be broken down into a few parts, as reflected by the component’s directory structure:
This section of code is responsible for sending frame data from Openscreen to the media pipeline. It is located in /browser/frame, /renderer/frame, and /common/frame.
This section of code is responsible for sending media::Renderer commands to the embedder-specific Renderer on top of which cast_streaming is running. It is located in /browser/control, /renderer/control, and /common/control. Selection of this Renderer works as in vanilla Chromium, through the MediaFactory; no cast_streaming-specific changes are required.
A subset of Control responsible for translating control commands to and from the cast_streaming media remoting protocol, which is proto-based. Code is located within the /control directories, e.g. /browser/control/remoting.
An alternative implementation of the renderer-process side of the Frame pathway. This section is minimal, as its implementation has largely been integrated into the standard Frame flow to avoid code duplication.
In the diagrams below, note the following:
The startup process for the cast_streaming component can most easily be visualized by splitting up the Browser and Renderer processes. Very few communications are made between the two, so they can largely be viewed independently.
On the browser side:
1. ReceiverSessionImpl creates and starts a CastStreamingSession, which in turn creates a PlaybackCommandDispatcher and starts an openscreen::cast::ReceiverSession.
2. The openscreen::cast::ReceiverSession negotiates a session and passes it to the CastStreamingSession. If it is a remoting session, the component waits for the "real" configs to be sent from the sender side.
3. CastStreamingSession::StartStreamingSession() then creates the remainder of the requisite objects.
4. Playback is started by calling StartPlayingFrom() on the associated media::Renderer instance in the renderer process. In the case of remoting, more frames must also be requested from the remote renderer running on the streaming sender.

The files involved in this section are located mainly at the top level of each directory (i.e. //components/cast_streaming/browser or //components/cast_streaming/renderer), and make calls into frame and control as necessary.
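As a conceptual illustration of the ordering constraint above, in particular the remoting case waiting for the "real" configs before the pipeline is built, the following sketch uses hypothetical stand-in types rather than the component's real classes.

```cpp
// Conceptual sketch of the startup ordering described above. All types here
// are hypothetical stand-ins, not the component's real classes.
struct StreamConfigs {};  // Stand-in for the negotiated A/V configs.

class SessionStartup {
 public:
  // Called when Openscreen finishes negotiating a session.
  void OnNegotiated(bool is_remoting, StreamConfigs negotiated) {
    if (is_remoting) {
      // Remoting: the negotiated configs are placeholders; wait for the
      // "real" configs from the sender before building the pipeline.
      awaiting_remote_configs_ = true;
      return;
    }
    StartStreamingSession(negotiated);
  }

  // Called (remoting only) once the sender provides the real configs.
  void OnRemoteConfigsReceived(StreamConfigs real_configs) {
    if (awaiting_remote_configs_) {
      awaiting_remote_configs_ = false;
      StartStreamingSession(real_configs);
    }
  }

 private:
  void StartStreamingSession(StreamConfigs configs) {
    // Create the remaining objects, then start playback in the renderer
    // process (StartPlayingFrom() on the associated media::Renderer).
  }

  bool awaiting_remote_configs_ = false;
};
```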
Startup of the Control and Frame pathways in the Renderer process is entirely separate, so the two can be examined independently:
For the former of these, the setup is triggered by the MediaFactory, which owns the root-level singleton objects responsible for enabling the flow. In order to create a media::Renderer mojo connection between the browser and renderer processes, ResourceProviderImpl is used as an intermediary to avoid timing issues. The receiver side of this pipe is passed to the PlaybackCommandForwardingRenderer during its creation, which acts on these commands by forwarding them to the underlying embedder-specific media::Renderer instance, as well as forwarding back any RendererClient events. A simplified sketch of this forwarding pattern follows.
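The interfaces in the sketch below are hypothetical reductions of media::Renderer / media::RendererClient (a handful of methods, no mojo plumbing), so they do not match the real PlaybackCommandForwardingRenderer; they only illustrate the forwarding idea.

```cpp
// Simplified, hypothetical illustration of command forwarding: every command
// received over the (elided) mojo channel is passed to the wrapped,
// embedder-specific renderer, and client events flow back the other way.
#include <cstdint>
#include <memory>
#include <utility>

// Hypothetical reductions of media::Renderer / media::RendererClient.
class SimpleRendererClient {
 public:
  virtual ~SimpleRendererClient() = default;
  virtual void OnEnded() = 0;
  virtual void OnError() = 0;
};

class SimpleRenderer {
 public:
  virtual ~SimpleRenderer() = default;
  virtual void StartPlayingFrom(int64_t time_us) = 0;
  virtual void SetPlaybackRate(double rate) = 0;
  virtual void SetVolume(float volume) = 0;
  virtual void Flush() = 0;
};

// Wraps the embedder-specific renderer and forwards commands to it, while
// relaying client events back toward the browser process.
class CommandForwardingRenderer final : public SimpleRenderer,
                                        public SimpleRendererClient {
 public:
  explicit CommandForwardingRenderer(std::unique_ptr<SimpleRenderer> real)
      : real_renderer_(std::move(real)) {}

  // Commands received from the browser process are forwarded as-is.
  void StartPlayingFrom(int64_t time_us) override {
    real_renderer_->StartPlayingFrom(time_us);
  }
  void SetPlaybackRate(double rate) override {
    real_renderer_->SetPlaybackRate(rate);
  }
  void SetVolume(float volume) override { real_renderer_->SetVolume(volume); }
  void Flush() override { real_renderer_->Flush(); }

  // Client events from the underlying renderer are forwarded back (stubs
  // here; in the real component they travel over mojo to the browser).
  void OnEnded() override { /* notify the browser-side dispatcher */ }
  void OnError() override { /* notify the browser-side dispatcher */ }

 private:
  std::unique_ptr<SimpleRenderer> real_renderer_;
};
```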
Creation of the Frame channel is slightly more complex. First, an override from the embedder-specific ContentBrowserClient triggers usage of the FrameInjectingDemuxer. The renderer process being "ready" is signified by both of the following:
- The DemuxerConnector's mojo channel being bound.
- The FrameInjectingDemuxer being initialized by the media pipeline.

At that point, the browser process will send the OnStreamsInitialized() call, which provides the connection information needed to create all remaining objects and begin pulling frames from the browser process.
At a high level, sending frame data to the media pipeline in the renderer process works as follows:
1. Upon a DemuxerStream::Read() call, the FrameInjectingDemuxerStream triggers a GetBuffer() mojo call.
2. The DemuxerStreamDataProvider receives this call, and makes a request to the StreamConsumer to get a frame.
3. In the case of remoting, additional frames may first need to be requested from the sender; this is handled by the RpcDemuxerStreamHandler.
4. Upon receiving a DecoderBuffer, the StreamConsumer writes the data() field to a data pipe and returns the remainder to the DemuxerStreamDataProvider.
5. The DemuxerStreamDataProvider receives and then sends this frame data to the Renderer, where it gets combined back with its data and provided to the FrameInjectingDemuxerStream.

A conceptual sketch of the split and recombination in steps 4 and 5 is given below.
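The sketch below illustrates the idea of sending the payload bytes through a data pipe while the remaining buffer fields travel as a separate message, then recombining them on the renderer side. All types are hypothetical stand-ins; the real code uses media::DecoderBuffer and mojo data pipes.

```cpp
// Conceptual sketch of splitting a buffer into "payload over a pipe" plus
// "metadata over a message", and recombining it on the other side.
#include <cstddef>
#include <cstdint>
#include <deque>
#include <vector>

struct Buffer {
  int64_t pts_us = 0;          // Presentation timestamp.
  bool is_key_frame = false;   // Frame metadata.
  std::vector<uint8_t> data;   // Encoded payload bytes.
};

struct BufferMetadata {
  int64_t pts_us = 0;
  bool is_key_frame = false;
  size_t data_size = 0;  // How many bytes to read back from the pipe.
};

// Stand-in for a mojo data pipe carrying only raw bytes.
using BytePipe = std::deque<uint8_t>;

// Browser side: write the payload to the pipe, return only the metadata.
BufferMetadata SplitAndSend(const Buffer& buffer, BytePipe& pipe) {
  pipe.insert(pipe.end(), buffer.data.begin(), buffer.data.end());
  return {buffer.pts_us, buffer.is_key_frame, buffer.data.size()};
}

// Renderer side: read the payload back out and reattach the metadata.
Buffer ReceiveAndCombine(const BufferMetadata& metadata, BytePipe& pipe) {
  Buffer buffer;
  buffer.pts_us = metadata.pts_us;
  buffer.is_key_frame = metadata.is_key_frame;
  auto end = pipe.begin() + static_cast<std::ptrdiff_t>(metadata.data_size);
  buffer.data.assign(pipe.begin(), end);
  pipe.erase(pipe.begin(), end);
  return buffer;
}
```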
A change to the AudioDecoderConfig / VideoDecoderConfig can be triggered in two ways: by the remote sender during an ongoing remoting session, or through re-negotiation of the streaming session itself.

Changing the config during an ongoing remoting session occurs in a number of steps:
1. New AudioDecoderConfig / VideoDecoderConfig are sent from the remote sender.
2. This triggers a new StartStreamingSession() call.
3. In theory this is accompanied by a StartPlayingFrom() call from the remote sender, but in practice that will only be sent intermittently and cannot be relied on.
4. The DemuxerStreamDataProvider will send the new config to the Renderer process.
5. The FrameInjectingDemuxerStream will receive a new StreamConsumer associated with this new stream, and may then return the config as part of the next (or ongoing) Read() call.

A config change can also result from re-negotiation of the streaming session. This situation occurs either when an ongoing mirroring session re-configures itself (e.g. as the result of Chrome changing the quality of an ongoing session) or when the user changes between mirroring and remoting.
This scenario proceeds largely as in the remoting case above, except that instead of a new config being received through the ReadUntil() call, the OnNegotiated() function immediately provides the new config and "resets the state" of the pipeline.
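In both cases the renderer side ends the same way: the FrameInjectingDemuxerStream picks up the new stream and surfaces the config change through its Read() path. The sketch below illustrates that hand-off with hypothetical simplified types, not the real media::DemuxerStream API.

```cpp
// Conceptual sketch: when a new stream/config arrives, any pending Read()
// completes with a "config changed" result carrying the new config.
#include <functional>
#include <optional>
#include <utility>

struct DecoderConfig {};  // Stand-in for Audio/VideoDecoderConfig.
struct EncodedFrame {};   // Stand-in for a frame payload.

enum class ReadStatus { kOk, kConfigChanged };

struct ReadResult {
  ReadStatus status;
  std::optional<EncodedFrame> frame;    // Set when status == kOk.
  std::optional<DecoderConfig> config;  // Set when status == kConfigChanged.
};

class InjectingStream {
 public:
  using ReadCallback = std::function<void(ReadResult)>;

  void Read(ReadCallback callback) {
    if (pending_config_) {
      // A config change is already queued; surface it on the next Read().
      callback({ReadStatus::kConfigChanged, std::nullopt,
                std::exchange(pending_config_, std::nullopt)});
      return;
    }
    pending_read_ = std::move(callback);
  }

  // Called when the browser process hands over a new stream/config.
  void OnNewStream(DecoderConfig new_config) {
    pending_config_ = std::move(new_config);
    if (pending_read_) {
      // Surface the config change as the result of the ongoing Read().
      ReadCallback callback = std::move(pending_read_);
      pending_read_ = nullptr;
      callback({ReadStatus::kConfigChanged, std::nullopt,
                std::exchange(pending_config_, std::nullopt)});
    }
  }

 private:
  ReadCallback pending_read_;
  std::optional<DecoderConfig> pending_config_;
};
```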
TODO(crbug.com/40765922): Add these details
In the Renderer process, the same media::DemuxerStream and media::Renderer instances are used, even when the stream is re-initialized. Rather than re-creating the entire pipeline, Flush() and StartPlayingFrom() commands are sent and the same instances are used.
Preloading was a concept added relatively late in the development of the cast_streaming component to solve a number of edge cases:
- The first frame of playback does not always have pts = 0 ms. In such cases, naively calling StartPlayingFrom(0 ms), as has historically been relied upon in WebEngine, does not work.
- A StartPlayingFrom() command is not always sent.

In order to account for such cases, a StartPlayingFrom() must be "injected in", for which the timestamp of the first frame is required; a sketch of this appears below. It is also true that this approach decreases the playback delay between the sender and receiver, but that is more of a happy coincidence than the original goal of this workflow.
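The sketch below illustrates the injection idea: hold off until the first frame arrives, read its timestamp, and issue StartPlayingFrom() from that value. The types and callbacks are hypothetical stand-ins, not the component's real classes.

```cpp
// Conceptual sketch of preloading: playback is not started until the first
// frame arrives, at which point StartPlayingFrom() is "injected" using that
// frame's timestamp.
#include <cstdint>
#include <functional>
#include <utility>

struct Frame {
  int64_t pts_us = 0;  // Presentation timestamp of the frame.
};

class PreloadingController {
 public:
  // |start_playing_from| stands in for issuing StartPlayingFrom() on the
  // renderer; |deliver_frame| stands in for pushing the frame downstream.
  PreloadingController(std::function<void(int64_t)> start_playing_from,
                       std::function<void(Frame)> deliver_frame)
      : start_playing_from_(std::move(start_playing_from)),
        deliver_frame_(std::move(deliver_frame)) {}

  void OnFrameReceived(Frame frame) {
    if (!playback_started_) {
      // First frame: its pts may not be 0 ms, and the sender may never send
      // an explicit StartPlayingFrom(), so inject one here.
      playback_started_ = true;
      start_playing_from_(frame.pts_us);
    }
    deliver_frame_(std::move(frame));
  }

 private:
  std::function<void(int64_t)> start_playing_from_;
  std::function<void(Frame)> deliver_frame_;
  bool playback_started_ = false;
};
```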