- From: Kazuyuki Ashimura <ashimura@w3.org>
- Date: Mon, 8 Jan 2018 17:54:02 +0900
- To: public-web-and-tv@w3.org
Available at: https://www.w3.org/2018/01/04-me-minutes.html
and also as text below.

Thanks,

Kazuyuki

---

   [1]W3C

      [1] http://www.w3.org/

                               - DRAFT -

                    Media and Entertainment IG Call

04 Jan 2018

   [2]Agenda

      [2] https://lists.w3.org/Archives/Member/member-web-and-tv/2017Dec/0003.html

Attendees

   Present
          Anders_Klemets, Chris_Needham, Chris_O'Brien, Diby_Roy,
          Francois_Daoust, John_Luther, Kaz_Ashimura, Mark_Vickers,
          Nell_Waliczek, Peter_tho_Pesch, Stefan_Pham, Stephan_Steglich,
          Tatsuya_Igarashi, Chris_Wilson, David_Dorwin, John_Pallett

   Regrets

   Chair
          Chris, Igarashi, Mark

   Scribe
          kaz

Contents

     * [3]Topics
         1. [4]Introduction
         2. [5]360-degree video - Stefan
     * [6]Summary of Action Items
     * [7]Summary of Resolutions
     __________________________________________________________

Introduction

   Chris: [explains the IG's work to the attendees]
   ... mentions previous calls on the TV Control API, Presentation API, etc.
   ... our topic is web support for A/V media in general
   ... and support for 360-degree video during this call today
   ... our goal for this call is to identify gaps in the web platform APIs needed to better support 360 video on the web
   ... there are different approaches for 360 support
   ... several libraries are available
   ... but do we see any particular gaps where W3C should provide better support?
   ... also, I recognise that there is the WebVR CG, so we should talk about how this IG can usefully contribute to the work happening there
   ... Stefan and Stephan would like to explain their 360 video approach at Fraunhofer FOKUS
   ... and then follow this with open discussion on any gaps identified
   ... does anybody have other specific topics for today?

   (none)

360-degree video - Stefan

   <cpn> Slides: [8]https://www.w3.org/2011/webtv/wiki/images/f/fb/2018-01-04-Fraunhofer-FOKUS-360-for-W3C-M%26E.pdf

      [8] https://www.w3.org/2011/webtv/wiki/images/f/fb/2018-01-04-Fraunhofer-FOKUS-360-for-W3C-M&E.pdf

   Stefan: 360-degree video playout at Fraunhofer FOKUS
   ... how our solution works
   ... [360-degree client side processing]
   ... using WebGL APIs
   ... this shows the client side processing; can't use canvas with protected content
   ... on the server side there's an API for fetching segments
   ... (segments are fetched on request)
   ... then play segments using the MSE API
   ... [360-degree server side pre-rendering]
   ... explains the graph
   ... [360-degree server side processing]
   ... download segments using XMLHttpRequest
   ... Fetch API/Streaming and WebSocket
   ... playback using a video element with MSE
   ... [360-degree server side processing]
   ... overall picture
   ... will provide the slides later
   ... [improvements for W3C APIs]
   ... for MSE
   ... there's an internal buffer managed by the underlying media player
   ... which causes additional delays
   ... some side effects as well, e.g. old frames are played after clearing the SourceBuffer
   ... a second improvement is multiple SourceBuffers attached to a single MediaSource
   ... and switching between them
   ... and the third improvement is replacing segments in a single SourceBuffer
   ... currently we buffer segments in JavaScript and copy them just-in-time into the SourceBuffer
   ... we also see a gap with WebVR/EME
   ... WebGL and Canvas for video transformation don't work with protected content
   ... there's a need for a secure media path for transformation
   ... possibly this could mean extensions to those APIs
   ... that was our proposal
   ... there is a link including demos: [9]https://www.fokus.fraunhofer.de/go/360
   ... any questions?

      [9] https://www.fokus.fraunhofer.de/go/360

   Diby: one video, and switching it based on location?

   Stefan: yes

   Chris: I would like to hear comments from the browser vendors.
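   [An illustrative sketch of the segment-download-and-append path described
   above may help readers following along. This is not code from the
   Fraunhofer FOKUS implementation; the codec string and segment URLs are
   placeholders.]

    // Minimal sketch of the MSE playback path described in the slides:
    // media segments are fetched over HTTP, held in a JavaScript-side queue,
    // and copied just-in-time into a single SourceBuffer.
    const video = document.querySelector('video') as HTMLVideoElement;
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    const MIME = 'video/mp4; codecs="avc1.640028"'; // placeholder codec string
    const queue: ArrayBuffer[] = [];                // JS-side segment buffer

    mediaSource.addEventListener('sourceopen', () => {
      const sourceBuffer = mediaSource.addSourceBuffer(MIME);

      // Copy the next queued segment as soon as the previous append completes.
      // Keeping segments in JavaScript until just before they are needed keeps
      // the SourceBuffer (and the player's internal buffer) as small as possible.
      const maybeAppend = () => {
        if (!sourceBuffer.updating && queue.length > 0) {
          sourceBuffer.appendBuffer(queue.shift()!);
        }
      };
      sourceBuffer.addEventListener('updateend', maybeAppend);

      // Download the initialization segment and a few media segments
      // (placeholder URLs) using the Fetch API.
      const download = async (url: string) => {
        const response = await fetch(url);
        queue.push(await response.arrayBuffer());
        maybeAppend();
      };

      download('init.mp4');
      ['segment-1.m4s', 'segment-2.m4s', 'segment-3.m4s'].forEach(download);

      // Playback may still require a user gesture depending on autoplay policy.
      video.play();
    });

   [Holding segments in JavaScript and appending them just-in-time, as above,
   appears to be the current workaround for the second and third improvements
   Stefan lists: content that has not yet been appended can be discarded or
   replaced without going through SourceBuffer operations.]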
   Nell: there have been some changes in the WebVR CG
   ... we're renaming the group to the Immersive Web CG
   ... there are a handful of approaches
   ... the need for support for EME video is recognized
   ... maybe a layer-based model?

   David: you have to have WebVR support for secure textures as well
   ... currently you can't use canvas for EME-protected video
   ... so you must rely on WebGL
   ... in some OpenGL implementations you can't read data back, which causes crashes

   John_Pallett: geometry is a fundamental part
   ... are any groups defining what the projections are?

   Stefan: MPEG and ISOBMFF have specified projections

   David: projection information is needed in the metadata; I can point to some work from Google
   ... equirectangular and cube map are already obsolete; we need a more extensible approach for native browser support
   ... content creators are experimenting with different kinds of projections, not just equirectangular and cube map

   <ddorwin> The metadata work I referenced: [10]https://github.com/google/spatial-media/blob/master/docs/spherical-video-v2-rfc.md

     [10] https://github.com/google/spatial-media/blob/master/docs/spherical-video-v2-rfc.md

   John_Pallett: converting content from one projection to another causes distortion

   David: this can't be done on-the-fly, and is lossy

   John_Pallett: it's more than just a transformation (transcoding) loss

   Chris: there was a workshop on WebVR in Brussels in December
   ... Ada from Samsung mentioned looking to standardise support for 360 video
   ... their solution is to extend the video element with attributes
   ... comments on this approach?

   David: that solution was proposed due to the lack of metadata
   ... it's prone to breakage; we would want to push for a metadata-based solution
   ... I pasted a link above on metadata
   ... OMAF?

   <DibyRoy> OMAF (Omnidirectional Media Application Format)

   <cpn> [11]https://mpeg.chiariglione.org/standards/mpeg-a/omnidirectional-media-application-format

     [11] https://mpeg.chiariglione.org/standards/mpeg-a/omnidirectional-media-application-format

   Peter: relationship with the VR Industry Forum?

   Stefan: some of the members are also part of the forum

   Peter: ok

   Chris: there are a few things to follow up on. Is this something we should discuss within W3C?
   ... e.g., the HTML Media Extensions WG
   ... there are also the Web Platform Incubator CG, WebVR CG, etc.
   ... what is the next step?
   ... where is the right place?

   Nell: transition from the WebVR CG to the WebXR work
   ... the Immersive Web CG

   David: that's the right place for the WebVR-specific bits
   ... we're incubating proposals for MSE/EME as well in WICG

   Nell: I can take an action to let you know about the new discussion

   Chris: thank you very much
   ... any other questions/comments?

   Mark: can you tell us more about the low-latency issues in MSE at WICG?

   David: I'm not much involved in that
   ... I can try to find out about it

   Mark: great

   David: browsers implement buffering to smooth video playback
   ... the heuristics are not standardized yet

   Mark: explains the purpose/goal of the Media and Entertainment IG's work
   ... collaboration with related W3C groups, etc.
   ... the audience for our IG is more general; how do we link out to work going on elsewhere, to drive people to it?
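   [To illustrate the WebGL/EME gap raised by Stefan and David above: a
   360-degree player typically uploads each decoded video frame into a WebGL
   texture and applies the projection in shaders. The sketch below shows only
   that per-frame upload step (shader and geometry setup omitted); it is an
   illustration, not code from any of the implementations discussed. For
   EME-protected video this upload is generally not available, which is why a
   secure media path for transformation was requested.]

    // Per-frame upload of decoded video into a WebGL texture, the client-side
    // rendering step used for 360-degree playout. The textured sphere/cube
    // and the shaders that apply the projection are omitted for brevity.
    const video = document.querySelector('video') as HTMLVideoElement;
    const canvas = document.querySelector('canvas') as HTMLCanvasElement;
    const gl = canvas.getContext('webgl')!;

    const texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    // Video frames are usually not power-of-two sized, so disable mipmaps
    // and clamp texture coordinates.
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    function uploadFrame() {
      if (video.readyState >= HTMLMediaElement.HAVE_CURRENT_DATA) {
        gl.bindTexture(gl.TEXTURE_2D, texture);
        // Copies the current video frame into the texture. With EME-protected
        // content there is currently no comparable (secure) path, so this
        // step fails or yields no usable pixels.
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
      }
      // ...draw the projection geometry with this texture here...
      requestAnimationFrame(uploadFrame);
    }
    requestAnimationFrame(uploadFrame);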
   Chris: to summarise, we have several action items
   ... 1. Nell to send information to the Interest Group on how to initiate new work in the Immersive Web CG
   ... 2. David to let the Interest Group know about any work happening on MSE, e.g., at WICG
   ... 3. When we have the details from Nell, Stephan can raise the WebXR-specific issues there, e.g., protected video support
   ... 4. When we have the details from David on MSE, we can raise the more general MSE-related issues there
   ... any other specific issues?
   ... by the way, a question on the style of the discussion: WebEx calls, or GitHub with the Immersive Web CG?

   Nell: both. We have a regular biweekly call on Tuesdays at 1pm Pacific time, and a GitHub repo
   ... logistically, the easiest way may be issues on GitHub

   Chris: ok

   Nell: WebVR and EME
   ... and projections, including context information
   ... new discussions in a few weeks, after the issues have been raised

   Chris: great
   ... thanks all for joining the call
   ... the next call will be at the beginning of February; we'll discuss and announce the topic soon

   [adjourned]

Summary of Action Items

Summary of Resolutions

   [End of minutes]
   __________________________________________________________

   Minutes formatted by David Booth's [12]scribe.perl version 1.147
   ([13]CVS log)
   $Date: 2018/01/08 08:45:43 $

     [12] http://dev.w3.org/cvsweb/~checkout~/2002/scribe/scribedoc.htm
     [13] http://dev.w3.org/cvsweb/2002/scribe/
Received on Monday, 8 January 2018 08:55:14 UTC