Re: Second Screen Presentation Community Group Proposed

I wanted to make clear that what's discussed below (and in the Web & TV 
IG) has a much larger scope than what this proposed CG is for, so it 
would be a different effort rather than this CG.  This CG has a narrow 
scope to solve a particular problem.  Narrow to keep it simpler for 
participants to get their needed internal approvals (e.g. for 
licensing).  I've also cc'd the folks proposing this.  (Searching, I 
don't see any discussion of Miracast, WiDi, etc. in the Web & TV IG -- 
that's what this CG is all about: enabling use of those without direct 
low-level support for any of them.)

This CG is aimed at a device that has some way to connect to secondary 
displays (truly secondary because there's no UI and no code execution 
-- just display).  Common ways are Miracast, AirPlay, WiDi, a video 
port, or HDMI.  So it's a setting where it's easy for a native app to 
display on the second display -- and the goal is a thin, abstract Web 
API on top of that to let web pages use it.

Here's an example of how it could work.

- The Web page asks the browser to display a URL on a secondary display.

- If the browser knows how to display on another display (via HDMI, a 
video port, Miracast, WiDi, AirPlay -- not in the spec; it's an 
implementation detail), it asks the user to approve using a display 
and to choose which one.

- The requesting page gets back either access to the second browsing 
context (like window.open) or just a way to exchange messages with the 
other page.  (The CG could possibly define a few likely commonly used 
messages -- e.g. some mirroring HTMLMediaElement APIs like play and 
pause -- just for displaying video or slides or other cases where that 
type of message makes sense.)

- Not in the spec, but the browser on the initiating device could 
create the second browsing context and use OS facilities to display 
it, with Miracast (or another mechanism) converting it to video and 
sending it to a second display screen.  All the implementation is on 
the initiating device.  The second screen just supports Miracast in 
this example -- it is just receiving and playing video.

- Also not in the spec, but the browser could support other ways of 
attaching to secondary displays just for displaying video.  For 
example, with a Chromecast-like system, the browser could tell the 
other device to load and display the URL.  It could then translate the 
CG-standardized media control messages into whatever that device uses 
to do things like pause, etc.  All the spec would know about this is 
that there would be an alternative mode that only permits some defined 
subset of messages for control (which browsers could likely convert 
for use in Chromecast-like systems).  So a page could indicate it will 
restrict itself to just CG-defined messages and not send arbitrary 
messages, allowing the browser to translate those for particular 
underlying systems like Chromecast.
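To make the flow above concrete, here is a rough sketch of what the 
page-side code could look like.  None of these names (`requestDisplay`, 
the message shapes) are from the draft spec -- they are invented for 
illustration, and the stub at the top only simulates what a real 
browser implementation would provide.

```javascript
// Hypothetical stand-in for the browser capability described above.
// A real browser would implement this natively; this stub just
// simulates a second browsing context that acknowledges each message.
const secondScreen = {
  requestDisplay(url) {
    // A real browser would ask the user to approve and pick a display here.
    const channel = {
      url,
      onmessage: null,
      postMessage(msg) {
        // Simulate the page on the second display replying with an ack.
        if (channel.onmessage) {
          channel.onmessage({ data: JSON.stringify({ ack: JSON.parse(msg).cmd }) });
        }
      },
    };
    return Promise.resolve(channel);
  },
};

// Page-side flow: request a display, then exchange messages with the
// page shown on it (the "at a minimum" messaging case from the list).
async function presentSlides() {
  const session = await secondScreen.requestDisplay("https://example.com/slides.html");
  const acks = [];
  session.onmessage = (event) => acks.push(JSON.parse(event.data).ack);
  session.postMessage(JSON.stringify({ cmd: "next-slide" }));
  session.postMessage(JSON.stringify({ cmd: "pause" }));
  return acks;
}
```

In the Miracast case described above, all of this would run on the 
initiating device; the second screen would only receive video.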

Summary: a simple API to sit on top of some existing ways to display 
on secondary displays.  The page asks to display a URL on a second 
display.  The browser checks with the user.  At a minimum, the 
initiating page can exchange arbitrary messages with the second page.  
There may be some defined messages related to media control.  There 
may sometimes be access to the second page's DOM.  The initiating page 
doesn't know whether the second page is being processed locally with 
the video sent out, or remotely with the browser knowing how to talk 
to something remote.  That's in the implementation.
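The "defined messages related to media control" could be as small a 
vocabulary as a few JSON-encoded commands mirroring HTMLMediaElement 
operations.  This is only a sketch of one possible encoding -- nothing 
here is from the CG charter or the draft spec.

```javascript
// Hypothetical message vocabulary mirroring a few HTMLMediaElement
// operations.  A page restricted to these (the Chromecast-like case)
// would never send arbitrary messages, so the browser could translate
// them for a particular underlying system.
const MEDIA_COMMANDS = new Set(["play", "pause", "seek"]);

// Sending side: build a control message, rejecting anything outside
// the defined subset.
function mediaMessage(cmd, params = {}) {
  if (!MEDIA_COMMANDS.has(cmd)) {
    throw new Error("not a CG-defined media command: " + cmd);
  }
  return JSON.stringify({ type: "media", cmd, ...params });
}

// Receiving side: translate a message into a call on a media element.
// Returns false for messages it doesn't recognize as media control.
function applyMediaMessage(json, mediaEl) {
  const msg = JSON.parse(json);
  if (msg.type !== "media") return false;
  if (msg.cmd === "play") mediaEl.play();
  else if (msg.cmd === "pause") mediaEl.pause();
  else if (msg.cmd === "seek") mediaEl.currentTime = msg.time;
  return true;
}
```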

With how the content winds up on the other screen left out of the 
spec, that would allow experimentation and standardization elsewhere 
for the various ways to do that.  So the aim is to enable the easy 
cases like Miracast or HDMI support (without the details of those), 
while leaving anything more complicated in implementation or 
standardization to some other group, or to this one under some future 
charter.






On 11/6/2013 7:05 AM, Ng, Sheau (NBCUniversal) wrote:
> +1
>
> There’s nothing to stop a TV from having the ability of a tablet, 
> possibly with different UI, and vice versa. It’ll be useful for a 
> group within the W3C community to tackle the issues without getting 
> caught up in “Who's on first, What’s on second”.
>
> -- 
> Sheau Ng | NBCUniversal | P: +1.609.759.0819
>
>
> From: <Vickers>, Mark Vickers <Mark_Vickers@cable.comcast.com 
> <mailto:Mark_Vickers@cable.comcast.com>>
> Date: Wednesday, November 6, 2013 at 9:25 AM
> To: "Bassbouss, Louay" <louay.bassbouss@fokus.fraunhofer.de 
> <mailto:louay.bassbouss@fokus.fraunhofer.de>>
> Cc: Wayne Carr <wayne.carr@linux.intel.com 
> <mailto:wayne.carr@linux.intel.com>>, "public-web-and-tv@w3.org 
> <mailto:public-web-and-tv@w3.org>" <public-web-and-tv@w3.org 
> <mailto:public-web-and-tv@w3.org>>
> Subject: Re: Second Screen Presentation Community Group Proposed
> Resent-From: <public-web-and-tv@w3.org <mailto:public-web-and-tv@w3.org>>
> Resent-Date: Wednesday, November 6, 2013 at 9:26 AM
>
> We use multi screen also.
>
> On Nov 6, 2013, at 3:02 AM, "Bassbouss, Louay" 
> <louay.bassbouss@fokus.fraunhofer.de 
> <mailto:louay.bassbouss@fokus.fraunhofer.de>> wrote:
>
>> I prefer the term „Multi-Screen“ or “Multiscreen”, as it is context 
>> independent (what is first and what is second).
>>
>> Best Regards,
>>
>> | Dipl.-Ing. Louay Bassbouss
>>
>> | Project Manager
>>
>> | Future Applications and Media
>>
>> |
>>
>> | Fraunhofer Institute for Open Communication Systems
>>
>> | Kaiserin-Augusta-Allee 31 | 10589 Berlin | Germany
>>
>> | Phone 49 30 - 3463 - 7275
>>
>> | louay.bassbouss@fokus.fraunhofer.de 
>> <mailto:louay.bassbouss@fokus.fraunhofer.de>
>>
>> | www.fokus.fraunhofer.de <http://www.fokus.fraunhofer.de/>
>>
>> *From:*Wayne Carr [mailto:wayne.carr@linux.intel.com]
>> *Sent:* Wednesday, 6 November 2013 08:44
>> *To:* public-web-and-tv@w3.org <mailto:public-web-and-tv@w3.org>
>> *Subject:* Re: Second Screen Presentation Community Group Proposed
>>
>>
>> > I would support such a group and would like to participate, but the 
>> > vocable "second screen" is now a misnomer as the tablet is becoming 
>> > the first screen.
>> > How about "companion screen" instead?  In any case, good idea, as 
>> > the demands of the non-traditional screening devices require new 
>> > APIs.  It would probably
>> > be interesting to link this work to some of the webrtc work as well 
>> > as to some of the "acceleration" mechanisms proposed in the IETF, 
>> > which, while not directly
>> > related to "how the device is connected", is inherently linked 
>> > to user experience.
>> >
>> >  Marie-Jose Montpetit mariejo@mit.edu 
>> <mailto:mariejo@mit.edu?Subject=Re%3A%20Second%20Screen%20Presentation%20Community%20Group%20Proposed&In-Reply-To=%3C10CA5F6D-15C1-455D-AF7C-71E807940242%40mit.edu%3E&References=%3C10CA5F6D-15C1-455D-AF7C-71E807940242%40mit.edu%3E> 
>>
>>
>>
>> The folks proposing this CG do mean the tablet as the "first screen" 
>> and some other display as the second screen.  They don't mean 
>> "second screen" in the sense of a TV being the primary screen and a 
>> tablet having auxiliary, related content.  They mean a tablet with a 
>> monitor attached, maybe through Miracast, as the "second screen": 
>> the tablet generates a slide show that Miracast converts to video 
>> and shows on the second screen, so everyone in a room can see it.
>>
>> This is intended to expose multiple displays attached to a device so 
>> a Web page could use these other screens.  But, the name is going to 
>> confuse people.
>>
>> An implementation could be the UA on the tablet rendering both 
>> what's seen on the tablet and the page that is seen on the TV, and 
>> using however the second display is attached to show the content on 
>> that second display.  Initially, it would just try to provide access 
>> to second displays attached by HDMI, Miracast, etc.  So a very 
>> simple API.
>>
>> They wrote a draft spec that is like opening an HTML page in a 
>> popup, except the page shows up full screen on a second display 
>> device.  Otherwise, the content can be manipulated as if it were a 
>> window on the same device.
>>
>> The charter has a description of what they're trying to enable.
>> https://github.com/webscreens/presentation-api/wiki/Second-Screen-Community-Group-Charter
>>
>> Here's their draft of one way the API could be defined.
>> http://webscreens.github.io/presentation-api/
>>
>>
>>

Received on Wednesday, 6 November 2013 19:31:42 UTC