Re: Minutes of Mobile Accessibility Task Force teleconference of 28 May 2015

Late regrets as I was out of the office yesterday. 

Henny

—
Henny Swan
User Experience and Design Lead
The Paciello Group
Twitter: iheni
Skype: ihenix





> On 28 May 2015, at 20:46, Jeanne Spellman <jeanne@w3.org> wrote:
> 
> Minutes: http://www.w3.org/2015/05/28-mobile-a11y-minutes.html
> 
> Text of minutes:
> 
>   [1]W3C
> 
>      [1] http://www.w3.org/
> 
>                               - DRAFT -
> 
>             Mobile Accessibility Task Force Teleconference
> 
> 28 May 2015
> 
>   See also: [2]IRC log
> 
>      [2] http://www.w3.org/2015/05/28-mobile-a11y-irc
> 
> Attendees
> 
>   Present
>          jeanne, marcjohlic, MikeShebanek, Detlev, kim, JonAvila
> 
>   Regrets
>          Kathy, Jan, Michael_Pluke
> 
>   Chair
>          Kimberly_Patch
> 
>   Scribe
>          jeanne
> 
> Contents
> 
>     * [3]Topics
>         1. [4]Best Practices - Understandable
>         2. [5]3.6 Provide instructions for custom touchscreen and
>            device manipulation gestures
>     * [6]Summary of Action Items
>     __________________________________________________________
> 
>   <trackbot> Date: 28 May 2015
> 
>   <scribe> scribe: jeanne
> 
> Best Practices - Understandable
> 
>   [7]https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/Understandable
>   _Techniques
> 
>      [7] https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/Understandable_Techniques
> 
>   <Kim> [8]http://w3c.github.io/Mobile-A11y-TF-Note/
> 
>      [8] http://w3c.github.io/Mobile-A11y-TF-Note/
> 
>   <jon_avila> Response from Lisa Seeman: This is an early draft
>   of some of what you are talking about:
>   [9]https://w3c.github.io/coga/issue-papers/links-buttons.html
>   (We already have some changes to it, but I have not uploaded
>   them yet.)
> 
>      [9] https://w3c.github.io/coga/issue-papers/links-buttons.html
> 
> 3.6 Provide instructions for custom touchscreen and device
> manipulation gestures
> 
>   Detlev: There are hints in iOS that allow you to supplement
>   instructions - at least for labels.
>   ... it may be iOS only
> 
>   Marc: It is iOS specific - we have been adding info to the
>   label of the element, so that it will work on both platforms.
>   It adds to the text being read.
> 
>   Jon: The hint is misunderstood. It is supposed to describe the
>   result of the action.
>   ... adjustable controls will automatically give directions from
>   iOS Voiceover
> 
>   Mike: I think that the hint should be more than the result, it
>   should tell the user how to perform the action.
> 
>   <jon_avila>
>   [10]https://developer.apple.com/library/ios/documentation/UserE
>   xperience/Conceptual/iPhoneAccessibility/Making_Application_Acc
>   essible/Making_Application_Accessible.html#//apple_ref/doc/uid/
>   TP40008785-CH102-SW6
> 
>     [10] https://developer.apple.com/library/ios/documentation/UserExperience/Conceptual/iPhoneAccessibility/Making_Application_Accessible/Making_Application_Accessible.html#//apple_ref/doc/uid/TP40008785-CH102-SW6
> 
>   Jon: That is contrary to Apple documentation -- perhaps we
>   should write a Technique to describe the best practice
> 
>   Mike: We should go back to Apple to ask them to change that
>   documentation. I believe it is native.
> 
>   Jon: We are talking about custom instructions.
> 
>   Jeanne: Lisa's note is interesting. I think we need to say that
>   user agents must give instructions for actions that they
>   understand. The author is responsible for custom
>   widgets/gestures.
> 
>   Mike: The use case I see a lot is that often there is the
>   ability to swipe an object from off screen to on screen. The
>   problem is that there is not an icon to indicate that the
>   material is available.
> 
>   Jon: It fits into the Help success criterion in WCAG; sometimes
>   there are instructions when you launch an app to show the
>   complex swipes.
> 
>   Kim: Some apps have overlays that show you the gestures that
>   can be used in the app. Then there is a button that shows how
>   to turn it on and off. I have seen camera apps that use complex
>   gestures
> 
>   Mike: It is difficult to make accessible to a screenreader user
>   -- to link the gesture with the instructions
> 
>   Detlev: There are some examples of using ARIA for this with
>   delayed text@@
> 
>   Jon: ARIA describedby can be used. The ARIA tooltip is
>   difficult to get working when there is no hover, but
>   describedby will always be spoken even if it is offscreen.
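> 
>   A minimal sketch of the aria-describedby approach Jon
>   describes, assuming a custom swipeable card widget on the web;
>   the element ids and instruction text are illustrative, not
>   from the call:
> 
>     // Sketch: attach gesture instructions to a custom widget via
>     // aria-describedby. The hint element is moved offscreen so it
>     // is not shown visually but is still announced on focus.
>     const hint = document.createElement('span');
>     hint.id = 'deck-hint';
>     hint.textContent =
>       'Swipe left or right with two fingers to move between cards.';
>     hint.style.cssText =
>       'position:absolute; left:-10000px; width:1px; height:1px; overflow:hidden;';
>     document.body.appendChild(hint);
> 
>     const deck = document.getElementById('card-deck'); // assumed widget
>     if (deck) {
>       deck.setAttribute('aria-describedby', 'deck-hint');
>     }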
> 
>   Kim: There are some apps that use different voices. It seems
>   like the audio equivalent of a different visual layer.
> 
>   Jon: That is difficult to make work -- CSS doesn't work. The
>   user would have to set up the different voices
> 
>   Detlev: If there are hints, you could choose not to render the
>   hints. Like there is a technique to read short link text or
>   longer link text.
> 
>   Jon: This is a good technique that could be used in other web
>   apps.
> 
>   Jeanne: This seems like a user agent issue
> 
>   Jon: JAWS has different verbosity settings. There is scripting
>   to turn on and off JAWS hints. There is no programmatic way to
>   communicate that to a webapp.
>   ... there was a proposal from Freedom Scientific to W3C to add
>   a help attribute, but it never went anywhere.
> 
>   <jon_avila>
>   [11]https://lists.w3.org/Archives/Public/w3c-wai-gl/2003JulSep/
>   0346.html
> 
>     [11] https://lists.w3.org/Archives/Public/w3c-wai-gl/2003JulSep/0346.html
> 
>   Technique for adding hints to iOS applications
> 
>   Technique for turning hints on and off
> 
>   scribe: these should also have an optional visual layer, as
>   they are not only for audio.
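> 
>   One possible shape for the hints on/off technique, sketched as
>   a web example and assuming hints live in elements carrying a
>   data-hint-for attribute (an illustrative convention, not a
>   standard); the visible hint text doubles as the optional
>   visual layer:
> 
>     // Sketch: a user-controlled hint toggle. Each hint element is
>     // shown or hidden visually and wired to its target control via
>     // aria-describedby only while hints are enabled.
>     function setHintsEnabled(enabled: boolean): void {
>       document.querySelectorAll<HTMLElement>('[data-hint-for]')
>         .forEach(hint => {
>           const target = document.getElementById(hint.dataset.hintFor ?? '');
>           hint.hidden = !enabled;               // optional visual layer
>           if (!target) return;
>           if (enabled) {
>             target.setAttribute('aria-describedby', hint.id);
>           } else {
>             target.removeAttribute('aria-describedby');
>           }
>         });
>     }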
> 
>   This is also a good use case. Reminder that we have a use case
>   wiki page, please add this to it.
> 
>   Detlev: sometimes hints should be different depending on
>   whether or not you are using a screenreader, which could lead
>   to forking. It's probably not doable.
> 
>   Kim: Why can't the user specify that they want hints for
>   screenreader or switch use?
> 
>   Detlev: I don't think it is doable, because it requires the
>   author to write different types of hints.
>   ... It is hypothetical at the moment.
> 
>   Kim: It's good for testing.
> 
>   Jeanne: There are serious privacy issues with using sniffers.
> 
>   Jon: you could tell what the user is doing by looking at
>   whether they are using keyboard events.
> 
>   <Kim>
>   [12]http://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/Mobile_Accessi
>   bility_Use_Cases
> 
>     [12] http://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/Mobile_Accessibility_Use_Cases
> 
>   Jon: if we put it in terms of behaviours instead of
>   disabilities, that is more helpful to the user and helps with
>   the privacy issues.
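> 
>   A rough sketch of that behaviour-based approach, assuming the
>   page simply watches for Tab key use and then surfaces its
>   keyboard and hint affordances, rather than trying to detect
>   any assistive technology; the class name is illustrative:
> 
>     // Sketch: infer keyboard use from behaviour (Tab key) instead
>     // of sniffing for assistive technology, then reveal keyboard
>     // affordances via a body class that CSS can hook into.
>     window.addEventListener('keydown', (event: KeyboardEvent) => {
>       if (event.key === 'Tab') {
>         document.body.classList.add('keyboard-user');
>       }
>     });
>     window.addEventListener('mousedown', () => {
>       document.body.classList.remove('keyboard-user');
>     });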
> 
>   Jeanne: I would like to see a visual indicator that there is
>   information off screen.
> 
>   Kim: iOS has elegant dots and a number showing how many
>   screens there are.
> 
>   Mike: Some designers will reveal a corner or sliver of
>   something off screen to give people an idea there is more
>   material available to swipe.
> 
>   Detlev: an animated affordance hint that indicates there is
>   more material. I am not sure we want to recommend it because it
>   is visual only, and not available in other modes.
> 
>   Kim: or if you aren't paying attention
> 
>   Detlev: I don't think we want to ban it
> 
>   Jeanne: it could be an example in a technique - as long as it
>   is available in other modalities
> 
>   Jon: another one is the labels that shake when an error is made
>   -- the shake doesn't give you any information about the error.
> 
>   Kim: It seems like there are two techniques here -- the
>   information itself vs. an indication that information is
>   available.
> 
>   Jon: the shaking could be put under existing SC, as new
>   failure conditions
>   ... could we use 1.3.3 Sensory Characteristics for instructions
>   such as shake or other failures
>   ... we have Predictable, but we don't have affordances or
>   discoverability
> 
>   Detlev: a small tab on the left - if you have no way of
>   focusing it with the screenreader, you would have to make it
>   focusable. It would be an elegant solution that would be
>   available to other modalities.
> 
>   Kim: We should indicate 1.3.3 here
> 
>   Jon: 2.1.1 keyboard -- if there is a visual indicator, then it
>   has to be focusable.
> 
>   Detlev: and the swipe must be keyboard accessible, but we have
>   already discussed this.
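> 
>   A sketch of the focusable indicator being discussed, assuming
>   a small visible tab implemented as a native button so it sits
>   in the tab order and is keyboard operable (2.1.1) without
>   extra key handling; ids and text are illustrative:
> 
>     // Sketch: a visible, focusable affordance for off-screen
>     // content. Activating the button slides the panel into view
>     // and reports its state via aria-expanded.
>     const panel = document.getElementById('offscreen-panel');
>     if (panel) {
>       const tab = document.createElement('button');
>       tab.textContent = 'Show more cards';
>       tab.setAttribute('aria-expanded', 'false');
>       tab.addEventListener('click', () => {
>         const open = panel.classList.toggle('is-open');
>         tab.setAttribute('aria-expanded', String(open));
>       });
>       panel.insertAdjacentElement('beforebegin', tab);
>     }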
> 
>   Kim: What about Form Datatypes?
> 
>   Detlev: Don't use placeholder text alone, with no label.
>   ... is there a current WCAG technique on Placeholder Text?
> 
>   Jon: Not in current WCAG, but it is in HTML
> 
>   Detlev: Placeholder text will be read when you enter the field,
>   but if anything is entered, then the placeholder text will not
>   be read after that, unless it is marked up with an ARIA label.
>   I don't know how to do that with native apps.
> 
>   Jon: There are ways to do it, like labelFor. Talkback announces
>   it after the field rather than before the field. It will remain
>   even after something is typed.
> 
>   Kim: there are a lot of programs that have a Back button even
>   in the app, so users can go back and see that placeholder text.
>   There are a lot of good reasons to go back, for speech users
>   who accidentally trip an unexpected command. Like undo, but for
>   the visual view.
>   ... that could sidestep placeholder text.
> 
>   Technique for Placeholder text and correctly labeling it if it
>   is used.
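> 
>   A minimal sketch of what that placeholder technique could look
>   like on the web, assuming the recommendation is that
>   placeholder text only gives a format example and never stands
>   in for the label; field names and strings are illustrative:
> 
>     // Sketch: the accessible name comes from a real <label>, so it
>     // is still announced after the user has typed something; the
>     // placeholder is only a supplementary format hint.
>     const label = document.createElement('label');
>     label.htmlFor = 'search';
>     label.textContent = 'Search';
> 
>     const input = document.createElement('input');
>     input.id = 'search';
>     input.type = 'search';
>     input.placeholder = 'e.g. accessible gestures';
> 
>     document.body.append(label, input);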
> 
>   Detlev: Need a Technique for ways of marking up groups of
>   labels, because fieldset does not work in some mobile
>   environments on iOS. It makes elements hard to use.
> 
>   Jeanne: I think this is a bug to file against iOS.
> 
>   Jon: Already done. But there is also an ARIA technique to
>   handle this.
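> 
>   A sketch of that ARIA grouping technique, assuming role="group"
>   with aria-labelledby as the stand-in for fieldset/legend; the
>   markup it builds is illustrative:
> 
>     // Sketch: group related fields with role="group" and
>     // aria-labelledby as an alternative to <fieldset>/<legend>.
>     const group = document.createElement('div');
>     group.setAttribute('role', 'group');
>     group.setAttribute('aria-labelledby', 'shipping-heading');
> 
>     const heading = document.createElement('h3');
>     heading.id = 'shipping-heading';
>     heading.textContent = 'Shipping address';
>     group.appendChild(heading);
> 
>     document.body.appendChild(group);
>     // ...labelled address inputs would then be appended to group.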
> 
> Summary of Action Items
> 
>   [End of minutes]
> 
> 
> 

Received on Friday, 29 May 2015 10:12:20 UTC