Re: Second Screen Presentation Community Group Proposed

We use multi-screen also.

On Nov 6, 2013, at 3:02 AM, "Bassbouss, Louay" <louay.bassbouss@fokus.fraunhofer.de> wrote:

I prefer the term „Multi-Screen“ or “Multiscreen” :) It is context-independent (what is first and what is second).

Best Regards,
| Dipl.-Ing. Louay Bassbouss
| Project Manager
| Future Applications and Media
|
| Fraunhofer Institute for Open Communication Systems
| Kaiserin-Augusta-Allee 31 | 10589 Berlin | Germany
| Phone +49 30 3463-7275
| louay.bassbouss@fokus.fraunhofer.de
| www.fokus.fraunhofer.de

From: Wayne Carr <wayne.carr@linux.intel.com>
Sent: Wednesday, 6 November 2013 08:44
To: public-web-and-tv@w3.org
Subject: Re: Second Screen Presentation Community Group Proposed


>  I would support such a group and would like to participate, but the vocable "second screen" is now a misnomer, as the tablet is becoming the first screen.
> How about "companion screen" instead? In any case, good idea, as the demands of non-traditional screening devices require new APIs. It would probably
> be interesting to link this work to some of the WebRTC work, as well as to some of the "acceleration" mechanisms proposed in the IETF, which, while not directly
>  related to "how the device is connected", are inherently linked to user experience.
>
>  Marie-Jose Montpetit mariejo@mit.edu


The folks proposing this CG do mean the tablet as the "first screen" and some other display as the second screen. They don't mean "second screen" in the sense of a TV being the primary screen and a tablet carrying auxiliary, related content. They mean that a monitor attached to the tablet, perhaps through Miracast, is the "second screen": the tablet generates a slide show and sends it as video over Miracast to that second screen, so everyone in the room can see it.

This is intended to expose multiple displays attached to a device so a Web page could use these other screens. But the name is going to confuse people.

An implementation could have the UA on the tablet render both what's seen on the tablet and the page shown on the TV, using whatever mechanism attaches the second display to deliver the content to it. Initially, it would just provide access to second displays attached by HDMI, Miracast, etc., so a very simple API.

They wrote a draft spec that works like opening an HTML page in a popup, except that the page shows up full screen on a second display device. Otherwise, the content can be manipulated as if it were a window on the same device.
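To make that popup analogy concrete, here is a minimal sketch of how a page on the tablet might drive a slide deck on the second display. This is only an illustration: the `navigator.presentation` shape, `requestSession`, and the session interface are assumptions loosely modelled on the CG draft linked below, not its actual interface.

```typescript
// Sketch only: the names here approximate the CG draft's idea, they are
// not a shipped API. `requestSession` and the session shape are assumptions.
interface PresentationSessionLike {
  postMessage(data: string): void;                       // channel to the presented page
  onmessage: ((event: { data: string }) => void) | null; // messages coming back
  close(): void;
}

interface NavigatorPresentationLike {
  requestSession(url: string): PresentationSessionLike;
}

function showSlides(): void {
  // Feature-detect; `presentation` is not a standard Navigator property here.
  const presentation = (navigator as Navigator & {
    presentation?: NavigatorPresentationLike;
  }).presentation;

  if (!presentation) {
    // No second display or no UA support: fall back to an ordinary popup
    // on the same device, which is exactly the analogy the proposal draws.
    window.open("slides.html", "slides");
    return;
  }

  // Ask the UA to load slides.html full screen on an attached display
  // (HDMI, Miracast, ...). Which physical display is used is up to the
  // UA and the user, not the page.
  const session = presentation.requestSession("slides.html");
  session.onmessage = (event) => {
    console.log("from the second screen:", event.data);
  };
  session.postMessage("nextSlide"); // drive the deck from the tablet
}
```

The fallback to window.open reflects the point above: if no second display is available, the same content can still be shown as an ordinary window on the first screen.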

The charter has a description of what they're trying to enable.
https://github.com/webscreens/presentation-api/wiki/Second-Screen-Community-Group-Charter

Here's their draft of one way the API could be defined.
http://webscreens.github.io/presentation-api/

Received on Wednesday, 6 November 2013 14:26:29 UTC