Minutes from Media Timed Events Task Force call, 15 July 2019

Dear all,

The minutes from the last Media Timed Events Task Force / WICG DataCue call are available [1], and copied below.

The next call is planned for Monday 19th August.

Kind regards,

Chris (Co-chair, W3C Media & Entertainment Interest Group)

[1] https://www.w3.org/2019/07/15-me-minutes.html


W3C
- DRAFT -
MEIG Media Timed Events TF
15 Jul 2019
Agenda

Attendees

Present
    Chris_Needham, Kaz_Ashimura, Rob_Smith, Andreas_Tai, Nigel_Megitt

Regrets

Chair
    Chris

Scribe
    kaz

Contents

Topics
    Agenda
    Use Case and Requirements document
    OGC event
    DataCue explainer
    Changing cue start and end times
    Next steps
    AOB
    Next call

Summary of Action Items
Summary of Resolutions
<scribe> scribenick: kaz

Chris: note that people are encouraged to join the WICG for the follow-up discussion

<scribe> Agenda: https://lists.w3.org/Archives/Public/public-web-and-tv/2019Jul/0014.html

# Agenda

Chris: mentions the preliminary agenda topics
.... 2. Review open issues and PRs on the use cases and requirements document [4]
.... 3. Review WICG explainer document [5]
.... 4. Start to capture a list of API issues and decisions to be resolved
.... 5. Next steps
.... also OGC work

# Use Case and Requirements document
<cpn> https://github.com/w3c/me-media-timed-events

Chris: We have a number of open issues. I don't want to go through them all today.

https://github.com/w3c/me-media-timed-events/issues MTE issues

https://github.com/w3c/me-media-timed-events/pulls PRs

Chris: I'd like to resolve PR 49, which clarifies the terminology.

https://github.com/w3c/me-media-timed-events/pull/49 PR 49

Chris: Started from an initial comment around "timed event",
.... and it wasn't clear what an "event" is: a DOM event or something else?
.... Different usages exist, so it's not entirely clear in the document.
.... We used the term event, coming from emsg as a carrier for timed information.
.... So I suggested an alternative: "timed metadata", which is used elsewhere,
.... e.g., in the Apple HLS documentation and the DASH-IF doc.
.... Nigel pointed out that it isn't a good fit for one of the use cases in the document,
.... where we have messages that are more like notification commands,
.... so not really metadata.
.... I'm OK with using "metadata" but aware it doesn't fit all our use cases

<cpn> https://pr-preview.s3.amazonaws.com/w3c/me-media-timed-events/pull/49.html

Chris: Are we OK to use timed metadata, or should we change it?

Andreas: I haven't had time to follow this TF's work and this issue, but it is of interest.
.... In general, I'm also reluctant to use "metadata" in this context.
.... In TTML, metadata is something that's additional data or annotations about something else.
.... But here the media timed events have some application behaviour that's driven by the data.
.... So my impression is that the original wording "media timed event" is quite good :)

Chris: Thanks. That's a possible option, to go back to the initial term.

Andreas: Nigel, do you also have an opinion?

Nigel: The term metadata gets misused, mostly. It had an original meaning: data about the structure of data.
.... Given it's widely misused, people won't be misled by its use here as data about the video.
.... However, the distinction is between something just giving information that could be safely ignored, or information that's required for processing or presentation.
.... The control message example here is essential for the media player to work, these messages can't be ignored.
.... In general what this document describes is a mix of informative extra data and other, required things.

Chris: Interesting, I'm not familiar with the distinction there between essential and non-essential.

Nigel: We think a DataCue is a solution here, so it could be misleading to call them metadata cues,
.... as we already have the "metadata" kind in TextTrack.

Chris: Yes, although the DataCue proposal extends the TextTrack with kind = metadata,
.... which is where we're identifying the gap.

Nigel: So from that point of view, it would be misleading not to call it metadata.

Chris: The DASH-IF document is called "application events and timed metadata processing model",
.... which seems good, as it covers both. Maybe we could follow this.

https://dashif-documents.azurewebsites.net/Events/master/event.html DASH-IF document

Andreas: WebVTT has metadata cues, this isn't subtitles.

Chris: What we're proposing with DataCue is a new way of handling metadata cues.
.... With VTTCue you serialise the data to a string,
.... but with DataCue we can convey the information in its more native form.
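
A minimal sketch of the contrast described above, assuming the DataCue shape proposed in the WICG explainer (a constructor taking a structured value and a type string). The element lookup, payload, and "com.example.ad-break" type string are illustrative only, not part of any spec.

```ts
// Assumes a <video> element exists on the page.
const video = document.querySelector('video') as HTMLVideoElement;

// Today, with VTTCue, structured data has to be serialised into the cue text
// and parsed back by the application when the cue fires.
const vttTrack = video.addTextTrack('metadata', 'ad-breaks (VTT)');
vttTrack.addCue(new VTTCue(10, 10.1, JSON.stringify({ adBreakId: 'break-1', duration: 30 })));

// With the proposed DataCue, the payload stays in its native, structured form.
// Assumption: a DataCue(startTime, endTime, value, type) constructor as sketched
// in the WICG explainer; feature-detect, since it is not widely implemented.
const DataCueCtor = (window as any).DataCue;
if (DataCueCtor) {
  const dataTrack = video.addTextTrack('metadata', 'ad-breaks (DataCue)');
  dataTrack.addCue(
    new DataCueCtor(10, 10.1, { adBreakId: 'break-1', duration: 30 }, 'com.example.ad-break')
  );
}
```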

Rob: IIUC, if an event is missed, it could cause the media playback to break?

Nigel: [explains the DASH manifest refresh use case]

Chris: I don't think there is a single term to describe both.
.... Nigel's proposal is to call it timed data, which is more general, could cover both aspects.
.... I'm reluctant to change the document, as I want to maintain commonality with usage elsewhere.
.... I'd need to go through each usage.
.... Possibly we could add something like application events for the control messages.

Andreas: I don't have a strong view. What it does is more important. Terminology is always difficult.

Chris: Indeed. I'd like to move to the next stage. For conversations with implementers, we need an explainer.
.... That still needs work; we can copy from the IG note, but it explains the API from a user's point of view.

Rob: Another question: is this becoming a design issue?
.... I hadn't considered metadata as critical to playback; do we need to flag this in the design?

Chris: Everything would be delivered, so I don't think the case where items are optional,
.... i.e. present but not surfaced to the application, should arise.
.... Nigel, do we need to discuss more?

Nigel: I'm kind of happy to leave it to the editor, you've had some useful input.

Chris: I will give it more thought, could go back to the original wording.

# OGC event

Rob: A couple of weeks ago there was an OGC technical committee meeting in Belgium.
.... I was invited because I'm a member of the Spatial Data on the Web group, a joint W3C and OGC group.
.... I've presented to them before, but this was the first time in person.
.... I gave three presentations and a short introduction.
.... Went well, there's interest in what we're doing.
.... There's an opportunity for them to make a contribution.
.... We had a breakout on video metadata search.
.... There's a GitHub issue with the conclusions, you can have a look:

<RobSmith> https://github.com/w3c/sdw/issues/1130

Rob: OGC have an innovation program, similar to WICG.
.... They investigate new problems, develop solutions, etc.
.... OGC members sponsor efforts: flood monitoring, traffic, etc.

Chris: Were there any specific suggestions, or requirements for us to take into account here?

Rob: Not at this stage, but I've made them aware that this activity is going on,
.... and that there's an opportunity to contribute, and now is a good time.

Kaz: OGC people attended the WoT workshop in Munich,
.... there was some discussion of collaboration with W3C for IoT use cases.

https://www.w3.org/WoT/ws-2019/minutes.html draft minutes from the 2nd WoT workshop

Kaz: The use cases on issue 1130 are very interesting from the ME viewpoint and the WoT viewpoint,
.... further collaboration among OGC, ME and WoT is encouraged.

Rob: Please respond to the GitHub issue; the SDW chairs would be interested in the collaboration, and the group is now rechartering.

# DataCue explainer

https://github.com/WICG/datacue/blob/master/explainer.md DataCue Explainer

Chris: [talks through the API example (emsg event with an ad insertion cue)]
.... The application has to interpret the data in the ArrayBuffer.
.... There is a library that parses the SCTE-35 message, it's quite a lot of code.
.... You identify the type of the message using inBandMetadataTrackDispatchType.
.... It's a field in TextTrack in HTML, for the purpose of identifying the stream for inband cues.
.... For emsg, there is a scheme id and value,
.... with "schemeIdUri" defined as "urn:scte:scte35:2013:bin" here.
.... Proposal is similar to the HbbTV model. You have a TextTrack per type of timed metadata cue.
.... How would this be used for application-generated events?
.... In that case, you can simply create a TextTrack with kind=metadata.
.... No need to set inBandMetadataTrackDispatchType, as the events aren't sourced by the UA.
.... inBandMetadataTrackDispatchType is inconsistently implemented in browsers; I don't think it's in Firefox at all.
.... I think Safari has it for HLS timed metadata. Not sure about Chrome. [A code sketch of this cue handling pattern appears at the end of this section.]
.... There is an issue relating to the detail in HTML about ids:

https://github.com/WICG/datacue/issues/12 WICG Issue 12

Chris: We may need to extend this in HTML.
.... It links to the Media Fragment URI spec. I couldn't see how the id field works in that case.
.... I can write a track element and give it an id, so it's unclear how the id behaves in the two cases:
.... if I create a track element, or if I create a TextTrack object via the API.
.... This seems to be underspecified to me. Not sure how the spec corresponds with what browsers actually do.
.... If you have time, please look at the issue.
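
A sketch of the two patterns described above, assuming the TextTrack-based model in the DataCue explainer: a UA-exposed in-band metadata track identified by inBandMetadataTrackDispatchType for DASH emsg carrying binary SCTE-35, and an application-generated metadata track. The parseScte35 helper is a placeholder for a real SCTE-35 library, and the cue.value/cue.data access is browser-dependent, not a settled API.

```ts
// Assumes a <video> element playing a DASH stream with emsg boxes.
const video = document.querySelector('video') as HTMLVideoElement;

// 1. In-band events: the UA exposes a metadata TextTrack per event stream.
//    The stream is identified via inBandMetadataTrackDispatchType, here the
//    emsg scheme id for binary SCTE-35 messages.
video.textTracks.addEventListener('addtrack', (event) => {
  const track = (event as TrackEvent).track as TextTrack;
  if (track.kind !== 'metadata') return;
  if (track.inBandMetadataTrackDispatchType !== 'urn:scte:scte35:2013:bin') return;

  track.mode = 'hidden'; // receive cue events without rendering anything
  track.addEventListener('cuechange', () => {
    for (const cue of Array.from(track.activeCues ?? [])) {
      // The payload arrives as an ArrayBuffer that the application must parse
      // itself; exactly which property holds it is browser-dependent.
      const payload = (cue as any).value ?? (cue as any).data;
      console.log('SCTE-35 splice at', cue.startTime, parseScte35(payload));
    }
  });
});

// 2. Application-generated events: create a metadata track directly.
//    No inBandMetadataTrackDispatchType is needed, because these cues are not
//    sourced by the UA.
const appTrack = video.addTextTrack('metadata', 'app-events');
appTrack.addCue(new VTTCue(30, 30.5, JSON.stringify({ action: 'show-overlay' })));

// Placeholder only: real SCTE-35 parsing is a substantial amount of code.
function parseScte35(payload: unknown): unknown {
  return payload;
}
```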

# Changing cue start and end times

Chris: Rob and I had a discussion on the expected behavior when changing cue start and end times.

https://github.com/WICG/datacue/issues/9 WICG Issue 9

Chris: I have done some experiments on how text track cues behave in practice when you adjust their start and end times, will write up the results.
.... I found some differences between browsers.
.... For example, if you have a cue, and the mediaElement's currentTime is in the middle of the cue,
.... then you increase the cue endTime, in one browser you get a cue onenter event, in another you don't.
.... Also, if you have a cue, and the mediaElement's currentTime is in the middle of the cue,
.... then you change the cue endTime such that the mediaElement's currentTime is now outside the cue,
.... in one browser the cue still remains in the activeCues list, and in another it gets removed.
.... Not sure if it's a spec issue or an implementation issue; we should check the "time marches on" steps. [A sketch of this experiment appears at the end of this section.]

Nigel: When it was moved, was the onexit handler fired?

Chris: I need to double check my results on that. I'll write it down and share with you.
.... Is there web platform test support for the TextTrack API? I found some VTTCue tests, but not for TextTrack more generally.

Nigel: I only know about VTTCue, don't know about the rest.
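
A minimal sketch of the kind of experiment described above: seek into the middle of an active cue, change its endTime so currentTime falls outside it, then check whether the cue leaves activeCues and whether onexit fires. The timings, track label, and 500 ms delay are arbitrary; as noted, results differ between browsers.

```ts
// Assumes a <video> element with media longer than 10 seconds.
const video = document.querySelector('video') as HTMLVideoElement;
const track = video.addTextTrack('metadata', 'cue-timing-test');
track.mode = 'hidden';

const cue = new VTTCue(5, 10, 'test cue');
cue.onenter = () => console.log('enter at', video.currentTime);
cue.onexit = () => console.log('exit at', video.currentTime);
track.addCue(cue);

video.currentTime = 7; // seek into the middle of the cue
video.addEventListener('seeked', () => {
  console.log('active before change:', track.activeCues?.length);
  cue.endTime = 6; // currentTime (7) is now outside the cue
  // Give the "time marches on" steps a chance to run before inspecting state.
  setTimeout(() => {
    console.log('active after change:', track.activeCues?.length);
  }, 500);
}, { once: true });
```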

# Next steps

Chris: I want to get implementer feedback as soon as we can,
.... to understand people's interests, where we have consensus, and get input.
.... We'd mainly use the explainer for that.
.... My plan is to send out a Doodle poll; you're all welcome to join.
.... Want to open up a conversation, see that we're going in the right direction.
.... Want to do that during the next week or so.

# AOB

Chris: Anything else for today?

(none)

# Next call

Chris: August 19th

[adjourned]

Summary of Action Items
Summary of Resolutions
[End of minutes]
Minutes formatted by David Booth's scribe.perl version 1.152 (CVS log)
$Date: 2019/07/17 07:24:39 $
