- From: Jean-Claude Dufourd <jean-claude.dufourd@telecom-paristech.fr>
- Date: Mon, 11 Apr 2011 09:36:07 +0200
- To: Olivier Carmona <ocarmona@awox.com>
- CC: Giuseppe Pascale <giuseppep@opera.com>, "public-web-and-tv@w3.org" <public-web-and-tv@w3.org>, Matt Hammond <matt.hammond@rd.bbc.co.uk>
Dear Olivier,

It is clear to me too that UPnP will be central to the work of the HN TF. I listed the other technologies as a way to make sure our work is future-proof. For example, we implemented a widget+discovery+communication module with the GPAC player, based on UPnP. A description, videos and software are available at
http://jcdufourd.wp.institut-telecom.fr/category/widgets-video/ and
http://jcdufourd.wp.institut-telecom.fr/2011/02/11/how-to-reproduce-mpeg-u-widget-demos/

I believe our task is to create an interface (programmatic and/or declarative) from the document to service discovery and service communication protocols. As such, I proposed document-oriented use cases.

Best regards
JC

On 9/4/11 07:45 , Olivier Carmona wrote:
> Dear Jean-Claude,
>
> The list of technologies below certainly sounds interesting - even though those are rather WAN- than LAN-oriented. I would like to come back to the goal of the TF. Is the final goal to connect to the vast number of already connected devices (paraphrasing the current requirement draft), or to create a universal technology without any link to those devices?
>
> The question is not about "being attractive market-wise"; the question for every standard is how big the gap to reach the market is, especially if the solution has to be adopted by consumer electronics vendors.
>
> That said, most of what I am reading here relates closely to CEA-2014 (aka Web4CE / CE-HTML), which includes discovery, HTML presentation, etc. Now that all TV manufacturers in Europe have made the step of including a CE-HTML browser to play back HbbTV content, is the aim of the TF "to boil the ocean" by creating a totally different standard?
>
> Sorry for my fairly naïve and abrupt questions.
>
> With my best regards,
> Olivier Carmona
>
>
> -----Original Message-----
> From: Jean-Claude Dufourd [mailto:jean-claude.dufourd@telecom-paristech.fr]
> Sent: Friday 8 April 2011 17:40
> To: Giuseppe Pascale
> Cc: public-web-and-tv@w3.org; Matt Hammond
> Subject: Re: [HOME_NETWORK_TF] Some use cases and requirements for broadcast TV applications
>
> Thanks to Matt for starting the discussion on use cases.
>
> I propose more "technical" use cases below. My point is that there are so many different ways of implementing Matt's excellent use cases that a more technical point of view, even if less attractive marketing-wise, may avoid misunderstandings.
>
> UC1: a document as host for discovered content: e.g. the document displays content provided by a local, discovered device or service. This is part of Matt's use case 2.
> This UC1 asks the question: do we need a way to point to discovered content?
>
> UC2: a document as an interface to a service: the document provides a remote user interface for a device (light switch, hi-fi volume control, radio station chooser, etc.) or a service on a device (remote control of the media player software on a computer). This is one way of doing Matt's use case 3.
> This UC2 asks the questions: how does a document discover a service, how does it match the service interface, and how does it send and receive messages from the service?
>
> UC3: a document as provider of a service: rendering a document on my HbbTV set provides an EPG service on the network, which other documents rendered on other devices (e.g. a tablet) can discover and communicate with to get the EPG information. Another way of looking at this use case is: two documents discovering and communicating with each other. I think this is Matt's use case 4.
> This UC3 asks the questions: how does a document expose a service and its interface, and how does it respond to requests on this service?
>
> UC4: a document moving across devices: e.g. I start using my phone as the user interface to a device, then my tablet becomes available, so I move the interface document to the tablet.
> This UC4 asks the questions: how do we move a document and its current "state" to another device? What is the "state"?
>
> UC5: a document spawning other documents on other devices and communicating with them: e.g. the HbbTV set receives and renders a document implementing some voting; the document discovers multiple phones in the home, proposes to activate voting interfaces on each of the viewers' phones, communicates with the voting interfaces, collates the votes and sends them back to the channel's voting service. It is a variant of Matt's use case 4.
> This UC5 asks the question: how does a document discover devices running an appropriate user agent (hint: it could be a special service)?
>
> I am not so sure about the next one:
>
> UC6: a document displaying two media from two different origins, one of which is local, where the two media have to be synchronized. This is Matt's use case 1. It is a refinement of my UC2 in the sense that there is a need for communicating a time offset: if the two media are live but take different technological paths (with different buffering times), there is a need for a reference to be passed.
>
> I am not so sure about it, since in some video standards it is possible to provide absolute clock references, which would solve the problem "below" our scope.
>
> Best regards
> JC
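
To make UC2 and UC3 a little more concrete, here is a rough JavaScript sketch of the kind of document-level interface these use cases call for. Every name in it (hn, discoverServices, connect, postMessage, exposeService, onrequest) is purely illustrative and assumed for the sake of the example, not an existing API or a concrete proposal, and lookUpProgrammes() stands for whatever EPG code the page already has.

// UC2: the document acts as a remote user interface for a discovered service.
// "hn" is a hypothetical home-networking object exposed to the page.
hn.discoverServices("urn:example:service:RenderingControl:1", function (services) {
  if (services.length === 0) return;
  var volumeService = services[0];

  // Open a message channel to the service and wire it to the page's UI.
  var channel = volumeService.connect();
  channel.onmessage = function (event) {
    document.getElementById("volume").value = event.data.volume;
  };
  document.getElementById("volume").onchange = function () {
    channel.postMessage({ action: "setVolume", volume: this.value });
  };
});

// UC3: the same document could expose a service of its own (e.g. an EPG)
// that documents rendered on other devices can discover and query.
var epg = hn.exposeService("urn:example:service:EPG:1");
epg.onrequest = function (request) {
  // lookUpProgrammes() is assumed to be code the page already contains.
  request.respond({ programmes: lookUpProgrammes(request.data.channel) });
};

The only point of the sketch is that discovery, the message channel and the exposure of a service need to be reachable from the document; the underlying protocol (UPnP or anything else) stays below this interface.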
>
> On 8/4/11 15:03 , Giuseppe Pascale wrote:
>> Hi Matt,
>> thanks for your contribution. This is very useful, since one of the goals of the TF is to provide use cases. In fact, I just added a "Use Cases" section to the requirement document DRAFT:
>> http://www.w3.org/2011/webtv/wiki/Home_Network_TF_Requirements#Use_cases_.28Informative.29
>>
>> I have some comments (see inline). I would ask people to provide their comments or additional use cases as well. Use cases that receive no major objection will be added to the requirement document draft.
>>
>> So my comments:
>>
>> On Thu, 07 Apr 2011 13:15:46 +0200, Matt Hammond <matt.hammond@rd.bbc.co.uk> wrote:
>>> 1) A web or native application that provides time-synchronised content on a "companion" device
>>>
>>> Example: BBC Research "Autumnwatch Companion" [1]
>>>
>> I will remove this example from the "official" requirement document, unless people think it is good to have links to real-life examples.
>>
>>> A user watches a television programme (either live or playing back a recording).
>>>
>> Is "television programme" a generic enough term, or should we use some other term like "piece of content", "video content", etc.?
>>
>>> A laptop/tablet/other "companion" device displays a slideshow of complementary content. The slides change to match the different segments of the television programme. If the programme is being watched as a recording or on-demand, the user can jump to a different point in the slideshow and the television will also seek to the corresponding segment of the programme.
>>>
>> I would also include a use case where the user can go back/forward in the slideshow, disabling synchronization, and re-enable synchronization at any time when he wants to go "back on track".
>>
>>> Depending on the degree of timing synchronisation accuracy achievable, another application is to play alternative personalised audio. Examples include: a director's commentary or alternative languages. This might be streamed from the broadcaster's servers directly to the companion device and played through headphones in synchrony with the programme showing on the TV.
>>>
>> Maybe this can be described as a separate use case, since it probably requires a bit more accuracy than a slideshow.
>>
>>> The "companion" may use the API served by the TV to:
>>>
>> I would rephrase as "The "companion" may use an HN protocol to".
>>
>>> * identify which programme is being played
>>> * read or write the time-index currently being played
>>> * know if the user has stopped watching the programme or skipped forwards or backwards using a different control
>>>
>>> The client may not have been the one who initiated the programme viewing/streaming/playback.
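
Purely as an illustration of what the companion side of this use case might look like in a web page, a minimal JavaScript sketch follows; tv, getCurrentProgramme, seek and the "timeupdate" event are invented names for a hypothetical TV-side API, and loadSlideshowFor()/showSlideForTime() stand for the page's own slideshow code.

// Hypothetical: "tv" is a handle to the television obtained through discovery.
tv.getCurrentProgramme(function (programme) {
  loadSlideshowFor(programme.id);      // assumed to be the page's own code
});

// Keep the slideshow in step with the TV's playback position.
tv.addEventListener("timeupdate", function (event) {
  showSlideForTime(event.position);    // assumed to be the page's own code
});

// When the user jumps in the slideshow, seek the TV (recordings/on-demand only).
function onSlideSelected(slideStartTime) {
  tv.seek(slideStartTime);
}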
>>>
>>> 2) Integration of television viewing into websites
>>>
>>> A broadcaster (or third party) web page is able to know what channel and programme you are currently watching, and provide easy links through to web pages or other content relating to that programme. When reading a web page about a specific programme or series, the web page is able to detect if your TV can access that programme through an on-demand service or a recording. If it can, the web page can offer to play it on the TV. The web page can also offer to schedule a recording on your TV.
>>>
>>> Javascript in the web page running within the browser may use the API served by the TV to:
>>>
>>> * identify which programme is being played
>>> * be able to discover what content is available through on-demand services and the broadcast programme guide
>>> * discover and play programmes that have been recorded (or that could be streamed from another device)
>>> * schedule recordings
>>>
>> Is this use case to be interpreted from the point of view of the "companion" device? I.e. are you describing a user browsing a website with his laptop and discovering in the home network any related piece of content?
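
Again purely as an illustration, the "detect and offer to play" part of this use case could look like the JavaScript sketch below; tv, findContent, play and scheduleRecording are hypothetical names, the programme identifier is invented, and offerPlayButton()/offerRecordButton() stand for the page's own UI code.

// Hypothetical API: ask the TV whether a given programme is reachable,
// either as a recording or through an on-demand service.
var programmeId = "example-programme-id";   // invented identifier, for illustration only
tv.findContent({ programme: programmeId }, function (results) {
  if (results.length > 0) {
    offerPlayButton(function () {           // assumed to be the page's own UI code
      tv.play(results[0]);
    });
  } else {
    offerRecordButton(function () {         // assumed to be the page's own UI code
      tv.scheduleRecording(programmeId);
    });
  }
});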
>>>
>>> 3) Alternative remote controls
>>>
>>> A dedicated physical device, web page or application on a mobile device could act as an alternative remote control device. The interface might provide alternative, enhanced means of browsing available content and programme schedules and the ability to control the TV. The interface might be a dedicated simplified device with physical buttons representing only the most common tasks, for users with physical disabilities or cognitive impairments.
>>>
>>> Such a remote control may wish to:
>>>
>>> * toggle the TV between "on" and "standby"
>>> * be able to discover what channels and programmes are available through on-demand services and the broadcast programme guide
>>> * access basic programme metadata (title, description, genre, etc.)
>>> * change channel
>>> * change volume
>>> * enable subtitles, audio description services, etc.
>>> * book, play and delete recordings
>>> * seek and pause playback
>>> * play programmes from on-demand services
>>> * play other media the TV can access on the home network
>>> * activate/de-activate interactive services
>>>
>> This use case looks fine but is probably a bit too generic. E.g. if the outcome of the discussion is a JS API, I don't see how a "simplified device with physical buttons" would benefit from it. If we were defining a new HN protocol, that would make sense, but I'm not sure we will reach that level (this is up to the TF participants to discuss, of course). So I would rephrase it a little to focus only on the "web page" case and leave out, for the moment, the native application and the physical device.
>>
>>
>>> 4) Enabling on-screen applications to interact with client devices in the home network
>>>
>>> A website or native application on a client device can communicate with an interactive widget, application or service on the TV. For example: a game-show/quiz may enable users to "play along", using their own mobile phones, in time with the broadcast programme. Scores could be collated and compared on the TV screen.
>>>
>>> Multiple client devices may wish to communicate with the API simultaneously.
>>>
>> I would rephrase as "Multiple client devices may wish to communicate with the TV simultaneously."
>>
>>> A client may need to:
>>> * activate/de-activate a particular interactive widget, service or application on the TV
>>> * send data to and receive data from the widget, service or application
>>>
>>>
>> Did you also imagine gaming in this use case? Where by "game" I don't mean a quiz but console-like gaming.
>>
>>
>>> 5) Integrating social media with viewing
>>>
>>> A social media web page or application is able to know what channel and programme you are currently watching and attach that contextual information to your social media postings. A different user may receive this recommendation and use the web page or application to request that the programme be displayed on their TV. This enables direct recommendation of programmes or collation of messages relating to a particular programme.
>>>
>>> The application or web page may use the API served by the TV to:
>>>
>>> * identify which programme is being played
>>> * change channel or play a recording or on-demand programme
>>>
>>>
>>>
>>> Matt
>>>
>>> [1] http://www.bbc.co.uk/blogs/researchanddevelopment/2010/11/the-autumnwatch-tv-companion-e.shtml
>>>
>>> --
>>> | Matt Hammond
>>> | Research Engineer, BBC R&D, Centre House, London
>>> | http://www.bbc.co.uk/rd/
>>>
>>
>
> --
> JC Dufourd
> Directeur d'Etudes/Professor
> Groupe Multimedia/Multimedia Group
> Traitement du Signal et Images/Signal and Image Processing
> Telecom ParisTech, 37-39 rue Dareau, 75014 Paris, France
> Tel: +33145817733 - Mob: +33677843843 - Fax: +33145817144
>

--
JC Dufourd
Directeur d'Etudes/Professor
Groupe Multimedia/Multimedia Group
Traitement du Signal et Images/Signal and Image Processing
Telecom ParisTech, 37-39 rue Dareau, 75014 Paris, France
Tel: +33145817733 - Mob: +33677843843 - Fax: +33145817144
Received on Monday, 11 April 2011 07:36:34 UTC