Minutes of Silver XR meeting

== Summary ==

  * No meeting next week because of the XR Access Symposium
  * We want to do more outreach to other groups working in this area and
    share our User Needs document
  * First draft of Functional Outcomes. We disagreed about the
    editorial format, but not the concepts
      o 1) Auditory information, including speech and key sound
        effects, is translated into alternative formats (e.g. captions)
        so media can be consumed when sound is unavailable or limited
      o 2) Auditory meta-information, including sound directionality, is
        conveyed to the viewer so that contextual information is
        available when sound is unavailable or limited
      o 3) Captions and caption meta-data are capable of being presented
        through alternative methods (e.g. second screen) to make the
        information more accessible when visual access is unavailable
      o 4) Customisation of caption style and position is available to
        support users who would benefit from tailored presentation options
      o 5) The amount of time that a given caption (and associated
        meta-data) spends on screen can be personalised in order to give
        additional time to locate the sound that is being presented
  * ChrisP provided links to some research from the BBC on visual
    presentation of captions. He will invite a BBC expert to our
    next meeting on 27 July.

== Log of Meeting ==

(scribe didn't engage the Minutes bot)

Meeting: Silver XR Subgroup
Chair: MikeCrabb

Topic: No meeting next week

MC: Last week we discussed the XR Access Symposium
... I am registered for the symposium and the day 2 session
https://xraccess.org/symposium/
MC: No meeting next week so that people can attend the Symposium
... Bill Davidson-Curtis is part of the Guidelines & Policies group. He
encouraged us to attend.
<michaelcrabb> https://www.rit.edu/directory/wadnet-wendy-dannels
... Also Wendy Dannels at RIT
... she is the other lead for Guidelines & Policies

Topic: Adding people to this meeting

MC: Any progress on making contact with Immersive Captioning?
JS: We also need to include Jason White of the Research Questions Task
Force (RQTF)
<CharlesHall> I am inviting other industry people to join the Inclusive 
Design for the Immersive Web Community Group – we are getting ready to 
register to hold a meeting at TPAC

Topic: Functional Outcomes
<michaelcrabb> 1) Auditory information, including speech and key sound 
effects, is translated into alternative formats (e.g. captions) so 
media can be consumed when sound is unavailable or limited
<michaelcrabb> 2) Auditory meta-information, including sound 
directionality, is conveyed to the viewer so that contextual information 
is available when sound is unavailable or limited
<michaelcrabb> 3) Captions and caption meta-data are capable of being 
presented through alternative methods (e.g. second screen) to make the 
information more accessible when visual access is unavailable
<michaelcrabb> 4) Customisation of caption style and position is 
available to support users who would benefit from tailored presentation 
options
<michaelcrabb> 5) The amount of time that a given caption (and 
associated meta-data) spends on screen can be personalised in order to 
give additional time to locate the sound that is being presented
<CharlesHall> this editorial format seems closer to success criteria or 
acceptance criteria, that {x} must {y}.
<michaelcrabb> 
https://docs.google.com/document/d/1gfYAiV2Z-FA_kEHYlLV32J8ClNEGPxRgSIohu3gUHEA/edit#
<michaelcrabb> When sound is unavailable or limited, auditory 
information, including speech and key sound effects, is translated into 
alternative formats so that users can understand content.
<CharlesHall> the draft definition of Functional Outcome that is in the 
Functional Needs work: A statement that describes that a singular 
objective of a user has been met – usually in the context of a task or 
overall goal – and that may need to name or cite a functional need.
<michaelcrabb> 
https://w3c.github.io/silver/subgroups/xr/captioning/index.html
<michaelcrabb> User Needs: 
https://w3c.github.io/silver/subgroups/xr/captioning/xr-captioning-user-needs.html
JS: What about Minimize Photosensitive Seizure Triggers and Usage with 
Limited Cognition?
MC: Limited Cognition is covered by #5, and Photosensitive Seizure 
Triggers by #4.
MC: What isn't included is Privacy. That is important when 
auto-captioning speech to text
... Our group is now using Microsoft Teams, where everything said is 
captured in a searchable transcript. Handy, but it raises privacy 
concerns.
  1) We need captions
  2) We need meta-data for sound effects
  3) Second screen
  4) Customization of captions
  5) Amount of time on screen
MC: I think we need to wait on Functional Outcomes until we get a 
recommendation about the editorial format
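
To make outcomes 4 and 5 concrete, here is a minimal sketch, assuming a
hypothetical preference model (every name below is invented for
illustration and is not part of the draft):

    // Hypothetical caption preference model for outcomes 4 and 5.
    // All names are illustrative, not from the draft document.
    interface CaptionPreferences {
      fontScale: number;          // multiplier on the default text size
      textColor: string;          // e.g. "#ffffff"
      backgroundColor: string;    // e.g. "rgba(0, 0, 0, 0.75)"
      position: "head-locked" | "world-locked" | "second-screen";
      extraDisplayTimeMs: number; // outcome 5: extra on-screen time to
                                  // help locate the sound source
    }

    // Outcome 5: extend a caption's default duration by the user's
    // personalised setting.
    function displayDuration(
      defaultMs: number,
      prefs: CaptionPreferences
    ): number {
      return defaultMs + Math.max(0, prefs.extraDisplayTimeMs);
    }

A structure like this would let outcome 4's style and position choices
and outcome 5's timing choice travel together as one user profile.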

Topic: Outreach to BBC Blue Room
CP: I have reached out to Spencer Marsden from the BBC Blue Room. He 
creates XR content for the Blue Room. Captioning is part of what they 
do.
[discussion of some contacts with Blue Room]
JS: Suggest that you share the User Needs document with them and ask for 
their input.
CP: I will invite him to the meeting on the 27th
<Crispy__> 
https://www.bbc.co.uk/rd/blog/2017-03-subtitles-360-video-virtual-reality
<Crispy__> 
https://www.bbc.co.uk/rd/blog/2014-10-tvx2014-short-paper-enhancing-subtitles
<Crispy__> 
https://www.bbc.co.uk/rd/blog/2018-01-accessibility-object-based-media
MC: There is a lot of work there on positioning subtitles
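
As a purely illustrative sketch of that positioning problem (the
function and its parameters below are invented, not from the BBC work):
given the viewer's yaw and a sound source's bearing in a 360-degree
scene, a caption renderer could decide whether to attach a left/right
indicator when the source is outside the current field of view:

    // Hypothetical helper: should a caption carry a direction arrow?
    // Assumes yaw in degrees (0-360), increasing to the viewer's right.
    function directionIndicator(
      viewerYawDeg: number,  // where the viewer is looking
      sourceYawDeg: number,  // where the sound comes from
      fovDeg: number = 90    // assumed horizontal field of view
    ): "none" | "left" | "right" {
      // Signed shortest angular difference, in [-180, 180).
      const delta = ((sourceYawDeg - viewerYawDeg + 540) % 360) - 180;
      if (Math.abs(delta) <= fovDeg / 2) return "none"; // source visible
      return delta < 0 ? "left" : "right";
    }

The same idea relates to outcome 2: conveying sound directionality
visually when the source is off-screen.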

Received on Monday, 13 July 2020 14:20:49 UTC