Re: Second Screen Presentation Community Group Proposed

On 11/7/2013 12:38 AM, JC Verdié wrote:
>
>
> Wayne Carr wrote:
>> I wanted to make clear what's discussed below (or in the Web & TV IG) is
>> a much larger scope than what this proposed CG is for, so would be a
>> different effort rather than this CG.  This CG has a narrow scope to
>> solve a particular problem.  Narrow to keep it simpler for participants
>> to get their needed internal approvals (e.g. for licensing). I've also
>> cc'd the folks proposing this.  (Searching, I don't see any discussion of
>> Miracast, WiDi, etc. in the Web & TV IG -- that's what this CG is all
>> about: enabling use of those without direct low-level support of any of
>> them.)
>>
>> This CG is aimed at a device that has some way to connect to secondary
>> displays (really secondary because there's no UI and no code execution -
>> just display).  Common ways are Miracast, AirPlay, WiDi, a video port, or
>> HDMI.  So something where it is easy for a native app to display on the
>> second display - and a thin abstract Web API on top of that to let web
>> pages use it.
>>
>> Here's an example of how it could work.
>>
>> - Web page asks the browser to display a URL on a secondary display.
>>
>> - If the browser knows how to display on another display (HDMI, video
>> port, Miracast, WiDi, AirPlay - not in the spec, it's an implementation
>> detail), it asks the user to approve using the display and to choose
>> which one.
>>
>> - The requesting page gets back either access to the second browsing
>> context (like window.open) or just a way to exchange messages with the
>> other page.  (The CG could possibly define a few likely commonly used
>> messages - e.g. ones mirroring some of the HTMLMediaElement APIs like
>> play and pause - for displaying video or slides or other cases where
>> that type of message makes sense.)
>>
>> - Not in the spec, but the browser on the initiating device could create
>> the second browsing context and use OS facilities to display it, with
>> Miracast (or others) converting it to video and sending it to a second
>> display screen.  All the implementation is on the initiating device;
>> the second screen just supports Miracast in this example - it is just
>> receiving and playing video.
>>
>> - Also not in the spec, but the browser could support other ways of
>> attaching to secondary displays just for displaying video.  With a
>> Chromecast-like system, the browser could tell the other device to load
>> and display the URL.  It could then translate the CG-standardized media
>> control messages into whatever that device uses to do things like pause,
>> etc.  All the spec would know about that is that there would be an
>> alternative that only permits some defined subset of messages for
>> control (messages that browsers could likely convert for use in
>> Chromecast-like systems).  So a page could indicate it would restrict
>> itself just to CG-defined messages and not send arbitrary messages,
>> allowing the browser to translate those for particular underlying
>> systems like Chromecast.
>>
>> Summary: a simple API to sit on top of some existing ways to display on
>> secondary displays.  The page asks to display a URL on a second display.
>> The browser checks with the user.  At a minimum, the initiating page can
>> exchange arbitrary messages with the second page.  There may be some
>> defined messages related to media control, and there may sometimes be
>> access to the second page's DOM.  The initiating page doesn't know
>> whether the second page is being processed locally with video sent out,
>> or remotely with the browser knowing how to talk to something remote.
>> That's in the implementation.
>>
>> With how the content winds up on the other screen left out of the spec,
>> that would allow experimentation and standardization elsewhere for
>> various ways to do that.  So this is trying to enable the easy things
>> like Miracast or HDMI support (without the details of those), while
>> leaving anything more complicated in implementation or standardization
>> to some other group, or to this one under some future charter.
>
> AFAIU it could also be solved in a simple and (relatively) 
> straightforward way using two browsers, with DIAL-based communication 
> to initiate the "discussion", then continuing with a REST discussion, or 
> initiating a WebSocket if needed.
>
> Simple, but clearly there is value in simplifying this from within the 
> browser as the group proposes.
>
> Unless I missed something in the group presentation, we'll need an 
> actual browser on both screens anyway. (If they plan to "stream" the 
> content of the "remote popup", we're heading to a dead end IMHO.)

On the second display device, there doesn't need to be any software at 
all related to Web pages.

Here's a primary case that motivates this.

- user has a slide show on a tablet that supports Miracast
- room has a big TV that supports Miracast
- Page in the browser on the tablet shows a remote-control-like page and 
a small version of the slides
- Page on tablet asks for use of the second display to show a web page 
containing the slides
- Browser on tablet asks the user; the user says OK and chooses the 
display
- Browser on tablet creates a second browsing context (with the 
characteristics of the second display) for that second page and uses the 
OS to hook up output using Miracast (so the browser on the tablet is 
handling both pages)
- Browser on tablet then does the usual for both browsing contexts, as 
if the page had just opened a tab or popup
- The OS Miracast support translates what is rendered for the second 
browsing context to video and sends it to the second display as video
- User uses the Web page on their tablet to control the slide show that 
gets seen on the second display -- but all the web-page-related work is 
on the tablet.
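To make the walkthrough concrete, the tablet-side setup could be sketched as two in-process pieces: the controller page the user taps and the slide page whose rendered output the OS streams to the TV as video. This is only an illustration of the message flow; every name here is invented, since the CG had not defined any API at the time, and message delivery is a plain function call standing in for the browser routing messages between the two browsing contexts.

```javascript
// Second browsing context: the slide page, rendered locally on the
// tablet; the OS would encode its output and send it to the TV via
// Miracast.  (Hypothetical sketch - no real API is used here.)
const slidePage = {
  current: 0,
  slides: ["Intro", "Demo", "Questions"],
  // Handle media-control-style messages from the controller page.
  onmessage(msg) {
    if (msg.type === "next" && this.current < this.slides.length - 1) {
      this.current++;
    }
    if (msg.type === "prev" && this.current > 0) {
      this.current--;
    }
  },
};

// Controller page: the remote-control-like page shown on the tablet.
// In a real implementation the browser would route this message between
// the two browsing contexts; here it is a direct call for the demo.
function control(msg) {
  slidePage.onmessage(msg);
}

control({ type: "next" });
control({ type: "next" });
console.log(slidePage.slides[slidePage.current]); // "Questions"
```

Note that both objects live in the tablet's browser: the TV never runs any of this code, it only plays the video stream.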

There are other possible use cases, but that's the one that drove 
creation of this group.  Being able to use Miracast or Widi or HDMI 
connected second displays.
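The page-facing surface described in the quoted proposal (ask the browser to show a URL on a second display, get back at least a way to exchange messages) might look roughly like the sketch below. `requestSecondaryDisplay`, the session object, and the message shapes are all hypothetical stand-ins - the CG had not defined an API - and a stub plays the browser's role so the flow is self-contained.

```javascript
// Stub standing in for the browser: a real user agent would prompt the
// user, pick a display (Miracast, WiDi, HDMI... an implementation
// detail, not part of the spec), and load `url` in a second browsing
// context.  Here it just acknowledges media-control messages.
function requestSecondaryDisplay(url, onMessage) {
  return {
    postMessage(msg) {
      // Pretend the second page acknowledged a CG-defined control message.
      if (msg.type === "play" || msg.type === "pause") {
        onMessage({ type: "ack", command: msg.type });
      }
    },
  };
}

// Initiating page: request the second display, then drive the slide or
// video page with (possibly CG-standardized) media-control messages.
const received = [];
const session = requestSecondaryDisplay(
  "https://example.com/slides.html",
  (msg) => received.push(msg)
);
session.postMessage({ type: "play" });
session.postMessage({ type: "pause" });

console.log(received.map((m) => m.command).join(",")); // "play,pause"
```

The initiating page never learns whether the second page was rendered locally and streamed out, or loaded remotely by a Chromecast-like receiver; that distinction stays entirely in the implementation.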



>
>
> Regards,
> JC

Received on Thursday, 7 November 2013 21:56:59 UTC