- From: Matt Hammond <matt.hammond@rd.bbc.co.uk>
- Date: Fri, 08 Apr 2011 16:29:50 +0100
- To: "Giuseppe Pascale" <giuseppep@opera.com>, "public-web-and-tv@w3.org" <public-web-and-tv@w3.org>
Hi Giuseppe,

Glad this is useful. Some thoughts and clarifications in response to
your comments are inline below:

regards

Matt

On Fri, 08 Apr 2011 14:03:51 +0100, Giuseppe Pascale
<giuseppep@opera.com> wrote:

>> A user watches a television programme (either live or playing back
>> a recording).
>
> Is "television programme" a generic enough term or should we use some
> other term like "piece of content", "video content", etc?

A more generic term would make sense. But even a word such as 'video'
or 'content' could carry assumptions. Perhaps we should preface its
first use with a non-exhaustive list of what could constitute
'content'? For example:

 * audio or video stored on device(s) in the HN
 * television or radio programmes ...
   * downloaded or streamed via IP
   * received from live broadcasts (terrestrial, satellite, cable etc)
   * recorded from live broadcasts

>> 1) A web or native application that provides time-synchronised
>> content on a "companion" device
...
>> A laptop/tablet/other "companion" device displays a slideshow of
>> complementary content. The slides change to match the different
>> segments of the television programme. If the programme is being
>> watched as a recording or on-demand, the user can jump to a
>> different point in the slideshow and the television will also seek
>> to the corresponding segment of the programme.
>>
> I would also include a use case where the user can go back/fwd in the
> slide show, disabling synchronization, and re-enable synchronization
> at any time when he wants to go "back on track".

+1

Aside: I omitted to explicitly state that 'slides' are not by
implication static. They could be implemented as individual web pages
or JS-driven changes to a single page. The content itself could be a
mix of static and interactive content, as you would expect for
anything web based.

>> Depending on the degree of timing synchronisation accuracy
>> achievable, another application is to play alternative personalised
>> audio. Examples include: a director's commentary or alternative
>> languages. This might be streamed from the broadcaster's servers
>> directly to the companion device and played through headphones in
>> synchrony with the programme showing on the TV.
>>
> Maybe this can be described as a separate use case since it probably
> requires a bit more accuracy than a slide show.

+1. It does indeed!

In case it is of interest: we have found that it was not sufficient to
just query the playback time index (e.g. seconds since start of
programme). To get close to lip-sync accuracy in our own experiments,
we needed to implement simple NTP-style time clock synchronisation
between the devices and timestamp the query responses with that
synchronised clock. (A rough sketch of the idea is included further
below.)

>> 2) Integration of television viewing into websites
...
> Is this use case to be interpreted from the PoV of the "companion"
> device?
> I.e. are you describing a user browsing with his laptop a website and
> discovering in the home network any related piece of content?

Yes, the experience as described is about giving a web-based
experience on a "companion" device the ability to query and control
the TV.

The particular use case we are thinking of requires the ability to
recognise that a specific piece of content is available to the TV
device. This might be because it is currently being broadcast, or
because it features in the TV programme guide data for the next few
days, or because it has been recorded onto a device on the home
network.
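To make the earlier synchronisation accuracy point concrete, here is a
very rough TypeScript sketch of the kind of NTP-style offset
estimation I mean. The queryTv function and the field names are purely
hypothetical stand-ins for whatever transport and message format might
eventually be agreed:

  interface SyncResponse {
    serverReceiveTime: number;  // TV clock (ms) when the query arrived
    serverSendTime: number;     // TV clock (ms) when the reply was sent
    playbackPosition: number;   // seconds since start of the programme
  }

  // queryTv is supplied by the caller and stands in for whatever
  // request/response mechanism is used to reach the TV.
  async function currentPosition(
    queryTv: () => Promise<SyncResponse>
  ): Promise<number> {
    const t0 = Date.now();               // companion clock: request sent
    const reply = await queryTv();
    const t3 = Date.now();               // companion clock: reply received
    const t1 = reply.serverReceiveTime;  // TV clock
    const t2 = reply.serverSendTime;     // TV clock

    // Standard NTP-style estimate of how far ahead the TV clock is of
    // the companion clock (assumes roughly symmetric network delay).
    const offset = ((t1 - t0) + (t2 - t3)) / 2;

    // Project the reported position forward to "now", assuming normal
    // 1x playback since the TV stamped its reply.
    const elapsed = (Date.now() + offset - t2) / 1000;
    return reply.playbackPosition + elapsed;
  }

The companion would then schedule its audio (or slide changes) against
the projected position rather than against the raw reply.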
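On the recognition capability just described: one way to picture the
kind of JS API being discussed is something like the sketch below.
None of these interfaces exist today; every name is invented purely
for illustration, and content identification might equally be done
with something like TV-Anytime CRIDs or another identifier scheme:

  interface ContentRef {
    id: string;     // some agreed content identifier (e.g. a CRID)
    title: string;
  }

  interface HomeTv {
    nowPlaying(): Promise<ContentRef | null>;  // what is on screen now
    upcoming(): Promise<ContentRef[]>;         // from the programme guide
    recordings(): Promise<ContentRef[]>;       // recorded in the HN
    play(id: string): Promise<void>;
    record(id: string): Promise<void>;
  }

  // A broadcaster's page could then decide what to offer the user:
  async function offerActions(tv: HomeTv, episodeId: string) {
    const onAir = await tv.nowPlaying();
    if (onAir && onAir.id === episodeId) {
      // feature links to web pages for the episode being watched
    } else if ((await tv.recordings()).some(r => r.id === episodeId)) {
      // offer to instruct the TV to play it: tv.play(episodeId)
    } else if ((await tv.upcoming()).some(r => r.id === episodeId)) {
      // offer to record it when broadcast: tv.record(episodeId)
    }
  }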
A specific example might be a particular series of cookery programmes:
a broadcaster may maintain web pages relating to each show and the
recipes. If the home page can discover and recognise that a particular
episode is currently being watched on the user's TV, then it can
prominently feature links to web pages for that episode. If there is a
recording of a particular episode available on the home network, then
the web pages can offer to let the user instruct the TV to play that
episode. It might also offer to instruct a TV with recording
capabilities to record the rest of the series.

>> 3) Alternative remote controls
...
> This Use Case looks fine but is probably a bit too generic.
> E.g. if the outcome of the discussion is a JS API, I don't see how a
> "simplified device with physical buttons" would benefit from it.
> If we were defining a new HN protocol, that would make sense, but I'm
> not sure we will reach that level (this is up to the TF participants
> to discuss, of course).
> So I would rephrase a little bit to focus only on the "web page" case
> and leave out for the moment the native application and physical
> device.

I agree, a focus on only JS APIs will not benefit those classes of
device. This was included because I believe that existing protocols
are not sufficient. But as you correctly say - this is for the TF as a
whole to determine in the ongoing discussions.

>> 4) Enabling on-screen applications to interact with client devices
>> in the home network
...
> Did you imagine also gaming in this use case? Where by "game" I don't
> mean a quiz but Console-like gaming.

We had not considered this space, but I see no reason to exclude it.

Our own experiments have communicated higher-level concepts than
button pushes between the client and the TV screen. In the quiz
example - the client could be retrieving quiz questions and waiting to
be told to present the question to the user, then relaying back the
choice that was made.

--
| Matt Hammond
| Research Engineer, BBC R&D, Centre House, London
| http://www.bbc.co.uk/rd/
Received on Friday, 8 April 2011 15:30:29 UTC