Intent to Implement: Presentation API

Anton Vayvod

Sep 9, 2014, 8:52:13 AM
to blink-dev
Contact emails

Spec

Summary
The goal of the Presentation API is to enable web content to access external presentation-type displays and use them for presenting web content.

Motivation
We are working on the specification with the Second Screen W3C community group [1] led by Intel; we want to implement this API so that potential users can experiment and help us iterate on the spec. The API will allow developers to build websites instead of native apps to present content on remote screens. We have a list of use cases at [2].

Compatibility Risk: Unknown

* Mozilla is pretty active in the community group and has just posted its Intent to Implement [3]. We are in constant touch with them to make sure the spec is interoperable between Chrome and Firefox.
* The IE team is watching the spec but hasn't confirmed any plans to implement.
* Members of the Safari team have joined the W3C Community Group, but have not participated in discussions as of yet.

The spec is still being shaped by the CG, so we expect a few changes and plan to advance the implementation together with the specification. The formal process of transitioning the community group to a working group has already been kicked off.

Ongoing technical constraints
None.

Will this feature be supported on all five Blink platforms (Windows, Mac, Linux, Chrome OS and Android)?
Yes.

OWP launch tracking bug?

Link to entry on the feature dashboard

Requesting approval to ship?
No.

Rick Byers

Sep 9, 2014, 9:54:44 AM
to ava...@chromium.org, blink-dev
Cool!  This looks reasonable to me, and aligned with Blink's top goal of making the web better for mobile.  From the state of the spec and the Mozilla discussion, it definitely seems like experimenting with implementations is the right next step to move this forward.

When implementing this in Blink, you'll presumably also be adding support for specific display technologies to Chromium (e.g. Chromecast?), right?  Can you talk briefly about your plans here?  In particular, to be really convinced the API is right, I think we'd want to have working (and perf-stressed) examples of both local rendering and remote rendering.

Rick

Anton Vayvod

Sep 9, 2014, 2:04:25 PM
to Rick Byers, blink-dev
Thanks!

For now, we plan to support Chromecast for both remote and local rendering (polyfilling as much of the Cast Web SDK [1] as possible on top of the API) and also protocols that are supported natively by the OS (e.g. via the Android Presentation API [2]) for local rendering.
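
To sketch what that polyfill might look like (a toy illustration only: castStartSession and the namespace below are made-up names rather than the real Cast Web SDK surface, and I'm using the requestSession shape from the current CG draft, which may still change):

// Toy sketch only, NOT the real Cast Web SDK surface. Assumes the
// current CG draft shape (navigator.presentation.requestSession);
// names may change as the spec evolves.
function castStartSession(receiverAppUrl) {  // hypothetical helper
  // Ask the UA to show its screen picker and launch the receiver page.
  var session = navigator.presentation.requestSession(receiverAppUrl);
  session.onstatechange = function() {
    if (session.state === 'connected') {
      // Cast-style namespaced messages can be layered on top of the
      // generic message channel the session provides.
      session.postMessage(JSON.stringify({
        namespace: 'urn:x-cast:com.example',  // made-up namespace
        data: {type: 'LOAD', url: 'https://example.com/video.mp4'}
      }));
    }
  };
  return session;
}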

Rick Byers

Sep 9, 2014, 2:50:51 PM
to ava...@chromium.org, blink-dev
Sounds great, thanks!

Rick

Rich Tibbett

Sep 10, 2014, 5:29:03 AM
to ava...@chromium.org, blink-dev
What is the benefit of this approach over, say, implementing an
AirPlay-like mirroring/streaming solution? There, the only aspect of
mirroring/sharing exposed to web pages is an x-webkit-airplay
attribute on media elements.

For Chromecast-like use cases (i.e. whenever a data comms channel is
required toward a local device/service) we could provide a more
generic network communication primitive. Web browsers could discover
suitable local devices and provide endpoint control URLs that web
developers could use to pass e.g. JSON messages via XMLHttpRequest to
those services. Chromecast provides such an HTTP-level interface that
we could harness here:
https://developers.google.com/cast/docs/reference/messages. There is a
proposal that supports communicating in exactly this way with e.g. a
Chromecast device at
https://dvcs.w3.org/hg/dap/raw-file/tip/discovery-api/Overview.html.
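
Roughly, the flow I have in mind (illustrative only: the exact
getNetworkServices signature has varied between draft revisions, and
the service type string and payload below are guesses):

// Illustrative only: the exact getNetworkServices shape varied across
// draft revisions (earlier ones used callbacks rather than a Promise),
// and the service type string and payload below are guesses.
navigator.getNetworkServices('upnp:urn:dial-multiscreen-org:service:dial:1')
  .then(function(services) {
    if (services.length === 0) return;
    // Each discovered service exposes a control URL the page can use.
    var xhr = new XMLHttpRequest();
    xhr.open('POST', services[0].url + '/apps/Example');  // made-up path
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify({type: 'LAUNCH', v: '1.0'}));  // made-up payload
  });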

Providing a more generic local service discovery mechanism and a
better network communication abstraction point (i.e. HTTP) would
allow web developers to communicate with a wider range of local
services and devices than just those a web browser supports natively.
It would also mean implementations would not have to implement and
maintain multiple messaging protocol stacks, since communication with
UPnP/DLNA/Chromecast/NextBigServiceHere services could be left to web
developers and suitable JavaScript libraries, e.g.
https://richtr.github.io/plug.play.js/.

Why would we not discover and provide HTTP URLs toward local services
and re-use existing platform APIs like XMLHttpRequest and WebSockets
for this type of network communication?

- Rich

Anton Vayvod

Sep 10, 2014, 9:28:11 AM
to Rich Tibbett, blink-dev
Hi Rich,

comments inline:

On Wed, Sep 10, 2014 at 10:28 AM, Rich Tibbett <ri...@opera.com> wrote:
> What is the benefit of this approach over, say, implementing an
> AirPlay-like mirroring/streaming solution? There, the only aspect of
> mirroring/sharing exposed to web pages is an x-webkit-airplay
> attribute on media elements.

With the Presentation API one can stream a presentation or a game. If remote rendering is possible, one can achieve a much better user experience in many respects.
The page also has better control over the playback/presentation than an x-webkit-airplay-like approach allows.
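
As a rough sketch of what that control looks like from the page's side (using the requestSession shape from the current draft; names will likely change, and the message format below is made up):

// Sketch under the current draft (navigator.presentation.requestSession);
// names will likely change, and the message format is made up.
var session = navigator.presentation.requestSession('https://example.com/player.html');
session.onstatechange = function() {
  if (session.state === 'connected') {
    // Unlike x-webkit-airplay, the page keeps a two-way channel to the
    // remote page and can drive playback however it likes.
    session.postMessage(JSON.stringify({cmd: 'play', src: 'movie.webm'}));
  }
};
session.onmessage = function(event) {
  console.log('receiver says: ' + event.data);  // e.g. playback progress
};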

> For Chromecast-like use cases (i.e. whenever a data comms channel is
> required toward a local device/service) we could provide a more
> generic network communication primitive. Web browsers could discover
> suitable local devices and provide endpoint control URLs that web
> developers could use to pass e.g. JSON messages via XMLHttpRequest to
> those services. Chromecast provides such an HTTP-level interface that
> we could harness here:
> https://developers.google.com/cast/docs/reference/messages. There is a
> proposal that supports communicating in exactly this way with e.g. a
> Chromecast device at
> https://dvcs.w3.org/hg/dap/raw-file/tip/discovery-api/Overview.html.

> Providing a more generic local service discovery mechanism and a
> better network communication abstraction point (i.e. HTTP) would
> allow web developers to communicate with a wider range of local
> services and devices than just those a web browser supports natively.
> It would also mean implementations would not have to implement and
> maintain multiple messaging protocol stacks, since communication with
> UPnP/DLNA/Chromecast/NextBigServiceHere services could be left to web
> developers and suitable JavaScript libraries, e.g.
> https://richtr.github.io/plug.play.js/.

I believe that the right abstraction point for web developers is above the network communication protocols.
To support multiple messaging stacks, the UA might have an extension framework that allows third-party components to deal with the low-level protocol details, similar to what the Android MediaRouteProvider API [1] does today.

> Why would we not discover and provide HTTP URLs toward local services
> and re-use existing platform APIs like XMLHttpRequest and WebSockets
> for this type of network communication?

This would probably widen the scope of the API (e.g. one could connect to a local service running on a thermostat, lights, etc.) while reducing the number of actual TV-like devices supported, since many existing (and, I believe, some future) screens don't have the ability to run a local HTTP server. The Presentation API focuses more narrowly on the problem of viewing web content on a remote screen while trying to achieve better compatibility with a wider range of screens.

Anton Vayvod

Sep 10, 2014, 3:03:25 PM
to Rich Tibbett, blink-dev, mfo...@chromium.org
+Mark Foltz

Rich Tibbett

Sep 11, 2014, 5:10:49 AM
to ava...@chromium.org, blink-dev, mark a. foltz
On Wed, Sep 10, 2014 at 3:27 PM, Anton Vayvod <ava...@chromium.org> wrote:
>
> On Wed, Sep 10, 2014 at 10:28 AM, Rich Tibbett <ri...@opera.com> wrote:
>>
>> What is the benefit of this approach over, say, implementing an
>> AirPlay-like mirroring/streaming solution? There, the only aspect of
>> mirroring/sharing exposed to web pages is an x-webkit-airplay
>> attribute on media elements.
>
>
> With the Presentation API one can stream a presentation or a game. If
> remote rendering is possible, one can achieve a much better user
> experience in many respects.

Second-screen sharing implemented in OS and browser UI, which a user
can start/stop by selecting a discovered second screen, provides the
very best permissions model for this kind of sharing. It reminds me of
<input type=file> in that, in many respects, the user is not even
aware this is a permissions dialog.

There is a definite need to be able to connect and transfer UDP
streams toward second screens. What is less clear is whether hiding
all HTTP-based communication toward second-screen devices within
higher-level interfaces is the most robust solution we could come up
with. For example, in your model I will not be able to stream content
to AirPlay devices. With a more generic model I would (albeit with
media content URLs rather than screen streaming).

> The Presentation API focuses more narrowly on the problem of viewing
> web content on a remote screen while trying to achieve better
> compatibility with a wider range of screens.

This objective would be better explored as a browser feature rather
than a JavaScript API, IMO. Both Firefox [1] and Safari [2] are
currently solving the use cases of the Presentation API without
introducing new APIs.

Mozilla supports Chromecast devices as part of their implemented
screen-sharing feature [1]. The browser can then relay certain actions
from web pages to second screens (e.g. pause/play, stop, seek, volume,
mute) without JavaScript interaction.

Could we expose that kind of browser feature first before we start
introducing APIs?

[1] https://blog.mozilla.org/futurereleases/2014/09/05/road-test-sending-video-to-chromecast-and-roku-in-firefox-for-android-beta/

[2] https://www.apple.com/appletv/airplay/

Rottsches, Dominik

Sep 11, 2014, 8:38:50 AM
to ri...@opera.com, mfo...@chromium.org, ava...@chromium.org, blin...@chromium.org
Hi Rich,

On Thu, 2014-09-11 at 11:10 +0200, Rich Tibbett wrote:
> There is a definite need to be able to connect and transfer UDP
> streams toward second screens. What is less clear is whether hiding
> all HTTP-based communication toward second-screen devices within
> higher-level interfaces is the most robust solution we could come up
> with. For example, in your model I will not be able to stream content
> to AirPlay devices. With a more generic model I would (albeit with
> media content URLs rather than screen streaming).

One clarification, if I understand your assumption correctly:

On the implementation side, the Presentation API can be implemented in
two ways: one is to render the page remotely, on a device that can
render HTML itself; the other is to render the page locally in the
browser, offscreen, and then stream it out via AirPlay or other sinks
like Miracast where the remote end does not support HTML rendering. In
that sense, streaming content to AirPlay is certainly possible. The
Presentation API hides where the secondary user agent is running.

Dominik



Rottsches, Dominik

Sep 11, 2014, 8:47:12 AM
to ri...@opera.com, mfo...@chromium.org, ava...@chromium.org, blin...@chromium.org
Rich, another note regarding your question,

On Thu, 2014-09-11 at 11:10 +0200, Rich Tibbett wrote:
> Mozilla supports Chromecast devices as part of their implemented
> screen-sharing feature [1]. The browser can then relay certain actions
> from web pages to second screens (e.g. pause/play, stop, seek, volume,
> mute) without JavaScript interaction.
>
> Could we expose that kind of browser feature first before we start
> introducing APIs?

I would agree that use cases like casting a video that's embedded in a
page, as the Mozilla blog post explains, would not necessarily require
an API. Similarly for just mirroring your local page to a remote
display.

With the API we extend the flexibility for web developers and allow
control over how the local page and the secondary page interact,
enabling use cases like presenting slides while having your speaker
notes on the local device, or gaming scenarios where you use the local
device as a controller while the remote side shows the game screen,
and so on.
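
As a sketch of the remote end for the game case (assuming the draft's
onpresent event on the receiving page; the exact name and shape are
still in flux, and movePlayer is of course a made-up game function):

// Receiving-page sketch, assuming the draft's onpresent event; the
// exact event name and shape are still in flux, and movePlayer is a
// made-up game function.
navigator.presentation.onpresent = function(event) {
  var session = event.session;  // channel back to the controller page
  session.onmessage = function(msg) {
    var input = JSON.parse(msg.data);  // e.g. {pad: 'left'} from the phone
    movePlayer(input.pad);
  };
  session.postMessage(JSON.stringify({ready: true}));
};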

Dominik

Rich Tibbett

Sep 11, 2014, 10:04:42 AM
to Rottsches, Dominik, mfo...@chromium.org, ava...@chromium.org, blin...@chromium.org
If you can pass the current web page's URL to a remote device for
rendering, you could then establish a P2P (encrypted) data channel
between the sender and receiver using WebRTC. You would just pass e.g.
a unique session ID in the page's URL parameters.
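
Roughly like this (everything named here, the signaling server and the
URL layout, is hypothetical, and the offer/answer exchange is not
shown):

// Everything named here (the signaling server, the URL layout) is
// hypothetical; the offer/answer and ICE exchange are not shown.
var sessionId = crypto.getRandomValues(new Uint32Array(4)).join('-');
var receiverUrl = 'https://example.com/receiver.html?session=' + sessionId;
// ...hand receiverUrl to the second screen by whatever means the UA offers...

// Both ends then rendezvous on a signaling server keyed by sessionId
// and negotiate a peer connection (vendor-prefixed in today's browsers).
var pc = new RTCPeerConnection();
var channel = pc.createDataChannel('control');
channel.onopen = function() {
  channel.send(JSON.stringify({cmd: 'hello'}));
};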

That does still require a way for the user to pass that URL to a
second screen, but it is unclear why we need a JavaScript API instead
of just a feature in the UI that can do that.


Rich Tibbett

Sep 11, 2014, 10:14:27 AM
to Rottsches, Dominik, mfo...@chromium.org, ava...@chromium.org, blin...@chromium.org
I was alluding to the fact that AirPlay is not likely to be supported
via the Presentation API in Chromium. Of course, I would like to be
proved wrong, or to have a way to get at such services even if they
are not officially supported.


Anton Vayvod

Sep 11, 2014, 12:19:04 PM
to Rich Tibbett, blink-dev, mark a. foltz
We do, on Android: one has been able to Cast a video to a Chromecast since Chrome 35 [1].

The functionality is not good enough for sites that want a customized playback experience. And it's only for video.

Anton Vayvod

Sep 11, 2014, 12:38:19 PM
to Rich Tibbett, Rottsches, Dominik, mfo...@chromium.org, blin...@chromium.org
Chromecast, like other existing devices, uses a custom local messaging protocol. One can bypass it by using a server to pair and pass messages between a sender and a receiver, but that is a more complex approach.


> That does still require a way for the user to pass that URL to a
> second screen, but it is unclear why we need a JavaScript API instead
> of just a feature in the UI that can do that.

How would the website specify a custom URL for the UA to use? As I see it, either an API, a tag, or an attribute is needed.
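
Purely as illustration (neither of these exists in any spec today):

// Both options, purely hypothetical; neither exists in any spec today.
// 1. A declarative attribute on a media element, e.g.:
//    <video src="movie.webm" presentation-src="https://example.com/player.html">
// 2. An imperative call, which is the direction the CG draft takes:
navigator.presentation.requestSession('https://example.com/player.html');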

Jonas Sicking

Sep 12, 2014, 2:24:54 AM
to Rich Tibbett, blink-dev, mark a. foltz, ava...@chromium.org

On Sep 11, 2014 2:10 AM, "Rich Tibbett" <ri...@opera.com> wrote:
>
> On Wed, Sep 10, 2014 at 3:27 PM, Anton Vayvod <ava...@chromium.org> wrote:
> >
> > On Wed, Sep 10, 2014 at 10:28 AM, Rich Tibbett <ri...@opera.com> wrote:
> >>
> >> What is the benefit of this approach over, say, implementing an
> >> AirPlay-like mirroring/streaming solution? There, the only aspect of
> >> mirroring/sharing exposed to web pages is an x-webkit-airplay
> >> attribute on media elements.
> >
> >
> > With the Presentation API one can stream a presentation or a game.
> > If remote rendering is possible, one can achieve a much better user
> > experience in many respects.
>
> Second-screen sharing implemented in OS and browser UI, which a user
> can start/stop by selecting a discovered second screen, provides the
> very best permissions model for this kind of sharing. It reminds me
> of <input type=file> in that, in many respects, the user is not even
> aware this is a permissions dialog.

I know that at some point we talked about exposing the ability for a page to detect that it was being mirrored to a TV, and then letting it call something like startSession, which would, automatically and transparently to the user, without any permission prompts, display a separate URL on the mirroring target.

It doesn't appear that we added that to the spec, though. It is something that would provide the flow you are talking about.

It seems worth exploring, since I think it would fit very naturally into the current spec model.

/ Jonas

mfo...@chromium.org

Feb 9, 2015, 1:49:55 PM
to blin...@chromium.org, ava...@chromium.org
[Cross-posted from chromium-dev@ as an FYI for blink-dev@]

Design Doc posted: Media Router & Web Presentation API

This project will enable Chrome to take advantage of external screens for display of local & Web content and implement the Web Presentation API.


Prototyping is near completion and we plan on upstreaming our work in Q1-Q2 of this year to ship in Q3.

m.

Anton Vayvod

Jun 18, 2015, 9:05:56 AM
to mark a. foltz, blink-dev
We just switched the API status from "test" to "experimental" since the implementation on desktop is pretty much there (and the Android version is under review).