RE: [HOME_NETWORK_TF] Some use cases and requirements for broadcast TV applications



> -----Original Message-----
> From: Jean-Claude Dufourd [mailto:jean-claude.dufourd@telecom-paristech.fr]
> Sent: Monday, April 11, 2011 1:45 AM
> To: Bob Lund
> Cc: Giuseppe Pascale; public-web-and-tv@w3.org; Matt Hammond
> Subject: Re: [HOME_NETWORK_TF] Some use cases and requirements for
> broadcast TV applications
>
> On 8/4/11 22:37 , Bob Lund wrote:
> > I have some comments on the technical use cases, inline below.
> >
> >> -----Original Message-----
> >> From: public-web-and-tv-request@w3.org [mailto:public-web-and-tv-request@w3.org]
> >> On Behalf Of Jean-Claude Dufourd
> >> Sent: Friday, April 08, 2011 9:40 AM
> >> To: Giuseppe Pascale
> >> Cc: public-web-and-tv@w3.org; Matt Hammond
> >> Subject: Re: [HOME_NETWORK_TF] Some use cases and requirements for
> >> broadcast TV applications
> >>
> >> Thanks to Matt for starting the discussion on use cases.
> >>
> >> I propose more "technical" use cases below. My point is that there
> >> are so many different ways of implementing Matt's excellent use
> >> cases that my point of view, even if less attractive marketing-wise,
> >> may avoid misunderstandings.
> >>
> >> UC1: a document as host for discovered content: e.g. the document
> >> displays content provided by a local, discovered device or service.
> >> This is part of Matt's n°2 use case.
> >> This UC1 asks the question: do we need a way to point to discovered
> >> content?
> >>
> >> UC2: a document as an interface to a service: the document provides a
> >> remote user interface for a device (light switch, hifi volume
> >> control, radio station chooser, etc) or a service on a device (remote
> >> control on the media player software on a computer). This is one way
> >> of doing Matt's n°3 use case.
> >> This UC2 asks the questions: how does a document discover a service,
> >> how does it match the service interface, and how does it send and
> >> receive messages from the service?
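For UC2's last question, I imagine the exchange could be as simple as JSON request/response messages correlated by an id. A rough sketch in JavaScript, where every field and function name is invented for illustration, not taken from any existing spec:

```javascript
// Rough sketch of UC2: a document builds request messages for a
// discovered service and matches replies back to pending requests
// by a correlation id. All field names are invented for illustration.

let nextId = 0;

// Build a request for a service action, e.g. "setVolume".
function makeRequest(action, params) {
  nextId += 1;
  return { id: nextId, type: 'request', action: action, params: params };
}

// The service's reply echoes the request id so the caller can
// correlate it with the pending request.
function makeResponse(request, result) {
  return { id: request.id, type: 'response', result: result };
}

// Document side: find the pending request a reply belongs to.
function matchResponse(pending, response) {
  return pending.find(function (req) { return req.id === response.id; }) || null;
}
```

The transport underneath (UPnP action calls, a WebSocket, or something new) is exactly the open question; this only pins down a possible message shape.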
> >>
> >> UC3: a document as provider of a service: rendering a document on my
> >> HbbTV set provides an EPG service on the network, which other
> >> documents rendered on other devices (e.g. a tablet) can discover and
> >> communicate with to get the EPG information.
> > Agreed as long as the EPG is exposed on the home network just like
> > any other service.
>
> JCD: Absolutely.
>
> >> Another way of looking at this use case
> >> is: two documents discovering and communicating with each other. I
> >> think this is Matt's n°4.
> > Disagree with the generalization. The term "document" is being used
> > in a manner synonymous with a web page. I think a general mechanism
> > to allow web pages to discover and communicate with one another is
> > out of scope.
> JCD: If a document can provide a service (UC3) and another document
> (UC2) can use a service, this is a consequence.
>
>
> >
> >> This UC3 asks the questions: how does a document expose a service
> >> and its interface, and how does it respond to requests on this
> >> service?
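One low-tech answer to UC3's question is for the exposing document to keep a dispatch table of named methods and answer requests against it. A sketch with invented names, using the EPG example (the transport by which requests reach the document is again left open):

```javascript
// Sketch of UC3: a document exposing an EPG "service" as a
// method-dispatch table. How requests arrive from other documents
// is not specified here; all names are illustrative only.

const epg = [
  { channel: 'BBC One', title: 'Autumnwatch', start: '20:00' },
  { channel: 'BBC Two', title: 'Newsnight', start: '22:30' },
];

const epgService = {
  handlers: {
    listChannels: function () {
      return epg.map(function (p) { return p.channel; });
    },
    nowShowing: function (params) {
      return epg.find(function (p) { return p.channel === params.channel; }) || null;
    },
  },
  // Respond to one request received over some discovery/messaging layer.
  handle: function (request) {
    const fn = this.handlers[request.method];
    if (!fn) return { error: 'unknown method: ' + request.method };
    return { result: fn(request.params || {}) };
  },
};
```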
> >>
> >> UC4: a document moving across devices: e.g. I start using my phone
> >> for the user interface to a device, then my tablet becomes
> >> available, so I move the interface document to the tablet.
> >> This UC4 asks the questions: how do we move a document and its
> >> current "state" to another device? What is the "state"?
> > I think this is out of scope. This can be solved at the application
> > level and does not seem to fit into the general model of discovering
> > and communicating with services on the HN.
> JCD: I really do not understand your point of view. You agree to UC5
> below, but UC4 can be seen as a simpler variant: a document can spawn
> "itself" on another device, pass itself state information, and stop on
> the initial device.
> Best regards
> JC
>

UC4 seems like a web server function: the UI from server X moves from client A to client B. This could be done by a web server with HTML5, within a home network or across the Internet; nothing new is required, I think. UC5 adds the element of device A discovering a service on device B, and of device A creating a UI presented on device B.
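To make that concrete, here is a rough sketch of UC4 done entirely at the application level: device A parks its serialized UI state with the server, and device B fetches it and resumes. A plain in-memory object stands in for the web server's storage, and all names and the state shape are illustrative assumptions:

```javascript
// Sketch of UC4 as a server-side handoff: device A saves its UI
// state, device B restores it. An in-memory object stands in for
// the web server; field names are illustrative only.

const serverStore = {};

// Device A: snapshot the interface state and park it on the "server".
function saveState(sessionId, state) {
  serverStore[sessionId] = JSON.stringify(state);
}

// Device B: fetch the parked state and resume where A left off.
function restoreState(sessionId) {
  const raw = serverStore[sessionId];
  return raw ? JSON.parse(raw) : null;
}
```

The open question JCD raises (what counts as the "state") is exactly what the application would have to decide when choosing what to serialize.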

Bob

> >> UC5: a document spawning other documents on other devices and
> >> communicating with them: e.g. the HbbTV set receives and renders a
> >> document implementing some voting; the document discovers multiple
> >> phones in the home, proposes to activate voting interfaces on each of
> >> the viewers' phones, communicates with the voting interfaces,
> >> collates votes and sends them back to the channel's voting service.
> >> It is a variant on Matt's n°4.
> > If the discovered phones host services to do this, then I agree this
> > is in scope.
> >
> >> This UC5 asks the question: how does a document discover devices
> >> running an appropriate user agent (hint: it could be a special
> >> service)?
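The collation step in UC5 is straightforward once the vote messages arrive. A sketch of the TV-side tally, with an invented message shape and the assumption of one vote per discovered phone:

```javascript
// Sketch of UC5's collation step: the TV document gathers one vote
// message per discovered phone and tallies them before sending the
// totals back to the channel's voting service. Message fields are
// illustrative assumptions.

function collateVotes(messages) {
  const tally = {};
  const seen = new Set();
  for (const msg of messages) {
    if (seen.has(msg.deviceId)) continue; // one vote per phone
    seen.add(msg.deviceId);
    tally[msg.choice] = (tally[msg.choice] || 0) + 1;
  }
  return tally;
}
```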
> >>
> >> I am not so sure about the next one:
> >>
> >> UC6: a document displaying two media from two different origins, one
> >> of which is local, where the two media have to be synchronized. This
> >> is Matt's n°1. It is a refinement of my 2) in the sense that there
> >> is a need for communicating a time offset: if the two media are both
> >> live but take different technological paths (with different
> >> buffering times), there is a need for a reference to be passed.
> >>
> >> I am not so sure, since in some video standards it is possible to
> >> provide absolute clock references, which would solve the problem
> >> "below" our scope.
> > I think this is out of scope. The media synchronization problem is a
> > rendering issue.
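For reference, if each stream can report the wall-clock reference time at which a given media time was captured, the offset the renderer needs is a simple difference. A sketch with invented names, times in seconds:

```javascript
// Sketch of UC6: two live streams arrive over different paths with
// different buffering delay. If each can report the reference-clock
// time at which a given media time was captured, the offset to apply
// is computable. Names are illustrative; units are seconds.

// A stream's mapping: "media time m was captured at reference time r".
function streamOffset(mediaTime, referenceTime) {
  return referenceTime - mediaTime;
}

// Offset to add to stream B's media clock so it lines up with stream A.
function syncOffset(a, b) {
  return streamOffset(a.mediaTime, a.referenceTime) -
         streamOffset(b.mediaTime, b.referenceTime);
}
```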
> >
> > Regards,
> > Bob Lund
> >
> >> Best regards
> >> JC
> >>
> >> On 8/4/11 15:03 , Giuseppe Pascale wrote:
> >>> Hi Matt,
> >>> thanks for your contribution.
> >>> This is very useful, since one of the goals of the TF is to provide
> >>> use cases.
> >>> In fact, I just added a Use Cases section to the Requirement
> >>> Document DRAFT:
> >>>
> >>> http://www.w3.org/2011/webtv/wiki/Home_Network_TF_Requirements#Use_cases_.28Informative.29
> >>>
> >>>
> >>> I have some comments (see inline). I would ask people to provide
> >>> their comments or additional use cases as well.
> >>> Use cases that receive no major objection will be added to the
> >>> Requirement document draft.
> >>>
> >>> So my comments:
> >>>
> >>> On Thu, 07 Apr 2011 13:15:46 +0200, Matt Hammond
> >>> <matt.hammond@rd.bbc.co.uk>  wrote:
> >>>> 1) A web or native application that provides time-synchronised
> >>>> content on a "companion" device
> >>>>
> >>>> Example: BBC Research "Autumnwatch Companion" [1]
> >>>>
> >>> I will remove this example from the "official" requirement document,
> >>> unless people think it is good to have links to real-life examples.
> >>>
> >>>> A user watches a television programme (either live or playing
> >>>> back a recording).
> >>> Is "television programme" a generic enough term or should we use
> some
> >>> other term like "piece of content", "video content", etc?
> >>>
> >>>> A laptop/tablet/other "companion" device displays a slideshow of
> >>>> complimentary content. The slides change to match the different
> >>>> segments of the television programme. If the programme is being
> >>>> watched as a recording or on-demand, the user can jump to a
> different
> >>>> point in the slideshow and the television will also seek to the
> >>>> corresponding segment of the programme.
> >>>>
> >>> I would also include a use case where the user can go back/forward
> >>> in the slideshow, disabling synchronization, and can re-enable
> >>> synchronization at any time to go "back on track".
> >>>
> >>>> Depending on the degree of timing synchronisation accuracy
> >>>> achievable, another application is to play alternative
> >>>> personalised audio.
> >>>> Examples include: a director's commentary or alternative languages.
> >>>> This might be streamed from the broadcaster's servers directly to
> >>>> the companion device and played through headphones in synchrony
> >>>> with the programme showing on the TV.
> >>>>
> >>> Maybe this can be described as a separate use case, since it
> >>> probably requires a bit more accuracy than a slideshow.
> >>>
> >>>> The "companion" may use the API served by the TV to:
> >>>>
> >>> I would rephrase as "The "companion" may use an HN protocol to"
> >>>
> >>>>    * identify which programme is being played
> >>>>    * read or write the time-index currently being played
> >>>>    * know if the user has stopped watching the programme or
> >>>>      skipped forwards or backwards using a different control
> >>>>
> >>>> The client may not have been the one who initiated the programme
> >>>> viewing/streaming/playback.
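As an aside, the three bullets above suggest the companion mainly needs to poll a small status record and detect seeks or stops. A sketch of that check, with invented names (nothing here is a real TV API):

```javascript
// Sketch of the companion-side state the bullets above imply: a
// playback status the TV could expose, and a check for whether the
// user has jumped elsewhere. All names are illustrative assumptions.

function makeStatus(programmeId, timeIndex, playing) {
  return { programmeId: programmeId, timeIndex: timeIndex, playing: playing };
}

// Decide whether the companion must resynchronise its slideshow:
// true if the programme changed, playback stopped or started, or the
// time index moved by more than normal playback progress (a seek).
function needsResync(prev, curr, elapsed, tolerance) {
  if (prev.programmeId !== curr.programmeId) return true;
  if (prev.playing !== curr.playing) return true;
  const expected = prev.timeIndex + (prev.playing ? elapsed : 0);
  return Math.abs(curr.timeIndex - expected) > tolerance;
}
```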
> >>>>
> >>>>
> >>>>
> >>>> 2) Integration of television viewing into websites
> >>>>
> >>>> A broadcaster (or third party) web page is able to know what
> >>>> channel and programme you are currently watching, and provide easy
> >>>> links
> >>>> through to web pages or other content relating to that programme.
> >>>> When reading a web page about a specific programme or series, the
> >>>> web page is able to detect if your TV can access that programme
> >>>> through an on-demand service or a recording. If it can, the web
> >>>> page can offer to play it on the TV. The web page can also offer
> >>>> to schedule a recording on your TV.
> >>>>
> >>>> JavaScript in the web page running within the browser may use the
> >>>> API served by the TV to:
> >>>>
> >>>>    * identify which programme is being played
> >>>>    * be able to discover what content is available through
> >>>>      on-demand services and the broadcast programme guide
> >>>>    * discover and play programmes that have been recorded
> >>>>      (or that could be streamed from another device)
> >>>>    * schedule recordings
> >>>>
> >>>>
> >>> Is this use case to be interpreted from the PoV of the "companion"
> >>> device?
> >>> I.e. are you describing a user browsing a website with his laptop
> >>> and discovering any related piece of content in the home network?
> >>>
> >>>> 3) Alternative remote controls
> >>>>
> >>>> A dedicated physical device, web page or application on a mobile
> >>>> device could act as an alternative remote control device. The
> >>>> interface might provide alternative, enhanced means of browsing
> >>>> available content and programme schedules and the ability to
> >>>> control the TV. The interface might be a dedicated simplified
> >>>> device with
> >>>> physical buttons representing only the most common tasks for users
> >>>> with physical disabilities or cognitive impairments.
> >>>>
> >>>> Such a remote control may wish to:
> >>>>
> >>>>    * toggle the TV between "on" and "standby"
> >>>>    * be able to discover what channels and programmes are available
> >>>>      through on-demand services and the broadcast programme guide
> >>>>    * access basic programme metadata (title, description, genre,
> >>>>      etc.)
> >>>>    * change channel
> >>>>    * change volume
> >>>>    * enable subtitles, audio description services etc
> >>>>    * book, play and delete recordings
> >>>>    * seek and pause playback
> >>>>    * play programmes from on-demand services
> >>>>    * play other media the TV can access on the home network
> >>>>    * activate/de-activate interactive services
> >>>>
> >>>>
> >>> This Use Case looks fine but is probably a bit too generic.
> >>> E.g. if the outcome of the discussion is a JS API, I don't see how a
> >>> "simplified device with physical buttons" would benefit from it.
> >>> If we were defining a new HN protocol, that would make sense, but
> >>> I'm not sure we will reach that level (this is up for the TF
> >>> participants to discuss, of course). So I would rephrase it a
> >>> little to focus only on the "web page" case and leave out, for the
> >>> moment, the native application and physical device.
> >>>
> >>>
> >>>
> >>>> 4) Enabling on-screen applications to interact with client
> >>>> devices in the home network
> >>>>
> >>>> A website or native application on a client device can communicate
> >>>> with an interactive widget, application or service on the TV. For
> >>>> example: a game-show/quiz may enable users to "play along", using
> >>>> their own mobile phones in time with the broadcast programme.
> >>>> Scores could be collated and compared on the TV screen.
> >>>>
> >>>> Multiple client devices may wish to communicate with the API
> >>>> simultaneously.
> >>>>
> >>> I would rephrase in "Multiple client devices may wish to communicate
> >>> with the TV simultaneously.
> >>>
> >>>> A client may need to:
> >>>>    * activate/de-activate a particular interactive widget, service
> >>>>      or application on the TV
> >>>>    * send data to and receive data from the widget, service or
> >>>>      application
> >>>>
> >>>>
> >>> Did you imagine also gaming in this use case? Where by "game" I
> >>> don't mean a quiz but console-like gaming.
> >>>
> >>>
> >>>> 5) Integrating social media with viewing
> >>>>
> >>>> A social media web page or application is able to know what channel
> >>>> and programme you are currently watching and attach that contextual
> >>>> information to your social media postings. A different user may
> >>>> receive this recommendation and use the web page or application to
> >>>> request that the programme be displayed on their TV. This enables
> >>>> direct recommendation of programmes or collation of messages
> >>>> relating to a particular programme.
> >>>>
> >>>> The application or web page may use the API served by the TV to:
> >>>>
> >>>>    * identify which programme is being played
> >>>>    * change channel or play a recording or on-demand programme
> >>>>
> >>>>
> >>>>
> >>>>
> >>>>
> >>>> Matt
> >>>>
> >>>> [1]
> >>>> http://www.bbc.co.uk/blogs/researchanddevelopment/2010/11/the-autumnwatch-tv-companion-e.shtml
> >>>>
> >>>> --
> >>>> | Matt Hammond
> >>>> | Research Engineer, BBC R&D, Centre House, London
> >>>> | http://www.bbc.co.uk/rd/

> >>>>
> >>>
> >>
> >> --
> >> JC Dufourd
> >> Directeur d'Etudes/Professor
> >> Groupe Multimedia/Multimedia Group
> >> Traitement du Signal et Images/Signal and Image Processing
> >> Telecom ParisTech, 37-39 rue Dareau, 75014 Paris, France
> >> Tel: +33145817733 - Mob: +33677843843 - Fax: +33145817144
> >>
>
>
> --
> JC Dufourd
> Directeur d'Etudes/Professor
> Groupe Multimedia/Multimedia Group
> Traitement du Signal et Images/Signal and Image Processing
> Telecom ParisTech, 37-39 rue Dareau, 75014 Paris, France
> Tel: +33145817733 - Mob: +33677843843 - Fax: +33145817144

Received on Monday, 11 April 2011 19:53:43 UTC