
Re: Drafting Comments on 'Triggered Interactive Overlay' use case

From: Yosuke Funahashi <yosuke@funahashi.cc>
Date: Wed, 11 Jun 2014 05:45:06 +0900
Message-ID: <53976E52.7090605@funahashi.cc>
To: "HU, BIN" <bh526r@att.com>, "public-web-and-tv@w3.org" <public-web-and-tv@w3.org>
Hi Bin,

Thanks a lot for getting back to me. Your comment helped me better understand 
your points.

Kaz, could you please give us your comments from the MMI viewpoint, which deals 
with more generic and abstract interfaces?


On 6/9/14, 8:09 AM, HU, BIN wrote:
> Yosuke,
> Thank you very much for the quick analysis and comments. Let me further explain what the use case is so that you may have more information.
> While the Media Resources In-band Tracks CG can enable binding the video and a triggered interactive overlay service built on MPEG2-TS, that scope is too narrow. The intention of our use case is to address triggered interactive overlay services built on a variety of streams, including:
> - In-band scenario: events, and contents or content sources, are encoded within in-band tracks of video streams. This also includes:
>    * MPEG-TS
>    * Other standard video encodings
>    * Perhaps proprietary encodings of video streams
>       . For special / proprietary encodings, the implementation of the API needs to hook into the proprietary plug-ins. However, the API exposed to webapps should behave the same as for MPEG-TS or other standard encodings.
> - Out-band scenario: events, and contents or content sources, are pre-provisioned within the platform, and the API needs to trigger the events with the content / content sources based on triggers from the platform.
>    * The API is implemented in the same way as for special / proprietary encodings.
> So in summary, we need a generic, perhaps also more declarative, API that enables webapps to:
> - receive the specific events (and the details of the event definitions)
> - receive the details of the data elements from the event triggering
> so that, upon receiving the event and the related data elements, webapps can act accordingly to achieve those use case scenarios and requirements.
> The implementation of the generic API depends on whether the events/contents are in-band or out-band, whether standard encoding is used or special/proprietary encoding is needed, and whether plug-ins are available for out-band and/or special/proprietary encoding.
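Such a generic API might be sketched as follows. This is only an illustration of the idea above, not a proposed design: every name here (TriggerSource, addTriggerListener, the event fields) is hypothetical. The point is that in-band (MPEG-TS, other, or proprietary) and out-band implementations would all deliver the same normalized event shape to webapps.

```javascript
// Hypothetical sketch of a generic trigger API. Webapps subscribe to
// normalized "trigger" events regardless of whether the underlying
// source is in-band (standard or proprietary encoding, possibly via a
// plug-in) or out-band (pre-provisioned on the platform).
class TriggerSource {
  constructor() {
    this.listeners = [];
  }

  // Webapps register a handler; each event carries the data elements
  // (id, origin, content source, ...) needed to act on the trigger.
  addTriggerListener(fn) {
    this.listeners.push(fn);
  }

  // The platform-side implementation (standard decoder, proprietary
  // plug-in, or out-band hook) calls this with a normalized event.
  dispatch(event) {
    for (const fn of this.listeners) fn(event);
  }
}

// Example: an in-band MPEG-TS implementation and an out-band platform
// implementation would both deliver this same event shape.
const source = new TriggerSource();
const received = [];
source.addTriggerListener((e) => received.push(e));
source.dispatch({
  id: 'promo-42',
  origin: 'in-band',
  contentUrl: 'https://example.com/overlay.html',
});
```

The webapp-facing surface stays identical in both scenarios; only the code behind `dispatch` differs per platform.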
> Hope it helps; let me know what you think.
> Thank you
> Bin
> -----Original Message-----
> From: Yosuke Funahashi [mailto:yosuke@funahashi.cc]
> Sent: Sunday, June 08, 2014 9:01 AM
> To: public-web-and-tv@w3.org
> Subject: Drafting Comments on 'Triggered Interactive Overlay' use case
> Hi Bin and Kaz,
> Regarding the 'Triggered Interactive Overlay' use case Bin proposed, in the
> previous call we agreed to quickly check it (or do a straw-man gap analysis on it)
> against ongoing work in the Media Resource In-band Tracks CG and the MMI WG
> before digging further into this use case in the IG.
> Here are my draft comments on the UC's five derived requirements from the CG's
> point of view. I don't think I fully understand the use case, so, Bin, could you
> check my comments and give me feedback, please? Kaz, could you please make
> comments from the MMI viewpoint?
> - - - - - - - - - - - - - - - - -
> 1. Triggered interactive overlay must have the ability to bind to the video it
> is associated with, including live, time-shifted programming or linear commercial.
> [YF] The CG's APIs to MPEG2-TS [1] enable web apps to access TS descriptors,
> through which you can get event ids that VCRs use for their recording timers.
> You can use this id to bind the video and triggered interactive overlay if your
> service is built on MPEG2-TS.
> Hybrid TV systems use AIT to manage the life cycle of applications, which is
> also accessible through the CG's APIs. However, each hybrid TV standard has
> different life cycle models and bindings between apps and channels and/or
> programs. We may need to define generic APIs if we go with AIT.
> If you need higher-level APIs, CRID or TV-Anytime may be a better solution from
> an architectural viewpoint, but it requires you to operate a TV-Anytime system.
> Regarding linear commercials, I think there is no global standard for them in the
> broadcasting industry; each broadcaster or service provider implements the
> binding between apps and commercials by defining triggers in private data. So if
> we pursue this, we need to define generic APIs for the binding.
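As a sketch of that event-id binding, assuming the user agent surfaces the relevant TS descriptor data as cues on a metadata text track (as the CG's APIs propose) — the JSON payload and the "eventId" field name here are purely hypothetical:

```javascript
// Hypothetical sketch: an overlay is bound to a programme by the event
// id carried in TS descriptors (the id VCRs use for recording timers).
// We assume the UA delivers it as a JSON cue on a metadata text track.
function overlayBoundToCurrentEvent(cueText, overlayEventId) {
  const data = JSON.parse(cueText);
  return data.eventId === overlayEventId;
}

// Requirement 2 follows from the same check: while the bound programme
// is on air, the overlay stays displayable; when the event id changes
// (the programme ends or the user switches channel), it is torn down.
const stays = overlayBoundToCurrentEvent('{"eventId": 1234}', 1234);
const tornDown = !overlayBoundToCurrentEvent('{"eventId": 1235}', 1234);
```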
> 2. Triggered interactive overlay must have the ability to be kept alive /
> displayable to the TV viewer within the bounds of the video content length it is
> associated with, i.e. overlay display goes away when content it is bound to ends
> or user switches channel to different video content.
> [YF] Ditto.
> 3. Triggered interactive overlay must have the ability to be displayed to the TV
> viewer over key event(s) within the video content
> [YF] I think you can implement this by writing JS scripts with existing web
> standards. Bin, do you mean we should define more declarative APIs to achieve this?
> 4. Triggered interactive overlay must have the ability to display or tear down
> based on channel, and / or time, and / or date, or other criteria
> [YF] I think you can implement this by writing JS scripts with existing web
> standards and the CG's APIs.
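For requirement 4, a plain-JS sketch of what such a script might do — deciding display or tear-down from channel, time, and date criteria. All field names are illustrative assumptions, not part of any proposed API:

```javascript
// Hypothetical sketch of requirement 4: display or tear down the
// overlay based on channel and/or a time/date window. "criteria" comes
// from the trigger; "state" reflects the current viewing context.
function overlayVisible(criteria, state) {
  if (criteria.channel !== undefined && criteria.channel !== state.channel) {
    return false; // wrong channel: tear down
  }
  if (criteria.notBefore !== undefined && state.now < criteria.notBefore) {
    return false; // window not yet open
  }
  if (criteria.notAfter !== undefined && state.now > criteria.notAfter) {
    return false; // window closed: tear down
  }
  return true;
}

// Displayed on channel 7 inside the window, torn down afterwards.
const window7 = { channel: 7, notBefore: 100, notAfter: 200 };
const during = overlayVisible(window7, { channel: 7, now: 150 });
const after = overlayVisible(window7, { channel: 7, now: 250 });
```

A script like this would run on a timer and on channel-change events, toggling the overlay element's visibility from the return value.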
> 5. Triggered interactive overlay must have the ability to be kept alive/
> displayable based on industry wide signaling technology.
> [YF] I think the CG's APIs satisfy this requirement: The CG's API is an attempt
> to define how user agents should expose in-band tracks as HTML5 media element
> video, audio and text tracks so that Web applications can access the in-band
> track information, through the media element, in an interoperable manner across
> user agent implementations.
> - - - - - - - - - - - - - - - - -
> I'm looking forward to hearing from you.
> Best regards,
> Yosuke
> [1] http://www.w3.org/community/inbandtracks/wiki/Main_Page

Yosuke Funahashi
co-Chair, W3C Web and TV IG
Chair, W3C Web and Broadcasting BG
Researcher, Keio Research Institute at SFC
Special Adviser, Tomo-Digi Corporation
Received on Tuesday, 10 June 2014 20:45:38 UTC
