Re: Seeking thoughts on real world application of SC 1.4.2 Audio Control on iOS

Gregg,

>>
>> Just two quick points...
>>
>>  1. I was ONLY speaking about whether and how close we were to a
>>     situation in which iOS apps (only) might be able to leverage/rely
>>     on built-in iOS and VoiceOver behavior to fulfill 1.4.2.
>>
>
> iOS / VoiceOver does not meet 1.4.2.

I'm not disagreeing with this - hence the "how close" comment.  Note 
also the terminology: Firefox itself doesn't meet 1.4.4, but a web app 
that works with Firefox's built-in zooming feature does.  That's the 
possible parallel I was exploring.

>
>>  2. I'm pretty sure that the volume lowering functionality is an
>>     OS/VoiceOver-level function, ...
>>
> Correct.  But it doesn’t meet 1.4.2.

Again, no disagreement; the functionality is a step in the right 
direction, but not sufficient to cover all of 1.4.2 (today).

>> ...and NOT something that is explicitly built into the 
>> podcast/music-playing app.  Given how closely integrated VoiceOver is 
>> to the platform, and its privileged status as a special application, 
>> this is certainly the way I would implement that functionality.  And 
>> perhaps a future version will extend the functionality so that 
>> complete muting is an option.
> That still wouldn’t necessarily do it for all iOS apps.  It would 
> only take care of it for the VoiceOver use on the platform.  Other 
> apps that self-serve voice.

Ahhh!  Now we are getting somewhere interesting.  Understanding SC 1.4.2 
<http://www.w3.org/TR/UNDERSTANDING-WCAG20/visual-audio-contrast-dis-audio.html> 
identifies only one disability use case, set forth in its first 
sentence: "Individuals who use screen reading software can find it hard 
to hear the speech output if there is other audio playing at the same 
time".  VoiceOver is the only screen reader (that I'm aware of) for iOS.  
So long as the audio-generating app is compatible with VoiceOver and its 
volume-lowering (and potential future muting) functionality, the app 
should be able to rely on that VoiceOver/platform functionality to 
address that accessibility issue, right?  (Acknowledging that, today, 
that functionality is insufficient to address all of 1.4.2.)

Certainly a self-voicing app that also plays its own non-speech audio 
would have the responsibility of lowering or muting that non-speech 
audio while its self-voicing functionality is speaking.  Is that what 
you were referring to?  Or something else?  But even then, such an app 
might not meet the letter of WCAG (e.g. if the self-voicing app didn't 
provide a feature for turning its speech volume down or off 
independently of the OS volume, since even the app's built-in speech 
ISN'T VoiceOver's speech and could therefore interfere with VoiceOver 
speech).

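To make that concrete, here is a minimal sketch (my own illustration 
with hypothetical names, not anything WCAG or Apple prescribes) of what 
that responsibility could look like in an iOS app: duck the app's own 
non-speech audio while its built-in speech is playing, and offer an 
in-app volume/mute control that is independent of the OS volume:

    import AVFoundation
    import UIKit

    // Sketch only; the class and method names here are hypothetical.
    final class BackgroundAudioController {
        private let musicPlayer: AVAudioPlayer   // the app's own non-speech audio

        init(musicURL: URL) throws {
            try AVAudioSession.sharedInstance()
                .setCategory(.playback, mode: .default, options: [])
            musicPlayer = try AVAudioPlayer(contentsOf: musicURL)
        }

        // An in-app volume control, independent of the hardware/OS volume;
        // setting 0 mutes, which is the kind of mechanism SC 1.4.2 asks for.
        func setMusicVolume(_ level: Float) {
            musicPlayer.volume = max(0, min(1, level))
        }

        // Duck the non-speech audio while the app's own ("self-voicing")
        // speech is being spoken, then restore it afterwards.
        func selfVoicingWillSpeak() { musicPlayer.volume = 0.2 }
        func selfVoicingDidFinishSpeaking() { musicPlayer.volume = 1.0 }

        // When VoiceOver is running, the app could additionally defer to
        // the platform's own volume-lowering behavior discussed above.
        var voiceOverIsRunning: Bool { UIAccessibility.isVoiceOverRunning }
    }
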
> <snip>
>>
>> One of the things I'm most concerned about is actually in the other 
>> direction - characteristics of a platform that make it difficult or 
>> perhaps even impossible for an application on that platform to meet 
>> all of WCAG A/AA.
>>
> Interesting -- what are some things you see that might do this?

Well, this is where we start approaching the situation of "closed 
functionality".  Certainly at the extreme end is a platform for which 
there is no AT and no (or insufficient) built-in accessibility features 
to enable folks with various disabilities to have access.  But at the 
other end...

Had AT not developed, over the past decades, analysis techniques for 
discerning things like columns of text and headings in the command-line 
/ text-application world of DOS/UNIX/Linux, then I would suggest that 
those platforms lacked the ability to meet things like 1.3.1 and 1.4.2.  
They lacked a means to convey structure and relationships.  They lacked 
a way to convey anything that is programmatically determined.

Certainly an individual application that has such structure internally 
might be able to first define and then convey that information to AT.  
That is pretty much exactly what we did with the Java platform running 
on Windows - conveying structure, relationships, and object information 
via the Java Access Bridge to AT that added support for that bridge 
(since at the time we started that work, Windows lacked sufficient 
support in the OS for such things).

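To ground that in the platform this thread is about (an illustration of 
mine, not the Java Access Bridge itself), here is roughly what 
"conveying structure and relationships" looks like for a native iOS app 
today, using UIKit's accessibility API so that the information is 
programmatically determinable by VoiceOver:

    import UIKit

    // Illustration only: a hypothetical view that exposes its internal
    // structure to VoiceOver through UIKit's accessibility properties.
    final class SettingsRowView: UIView {
        private let titleLabel = UILabel()
        private let valueLabel = UILabel()

        func configure(title: String, value: String) {
            titleLabel.text = title
            valueLabel.text = value
            // (layout code omitted)

            // Mark the title as a heading so AT can convey document structure.
            titleLabel.accessibilityTraits.insert(.header)

            // Expose the row as one element whose label combines title and
            // value, conveying their relationship to the screen reader.
            isAccessibilityElement = true
            accessibilityLabel = "\(title), \(value)"
        }
    }
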
But we are then going beyond the realm of "telling an app developer how 
to meet SC x" and into the realm of "developing enabling mechanisms to 
compensate for a lack in the platform so that, in the future, app 
developers will be able to meet SC x".  And there is little an app 
developer can do should someone want to procure an app on such a 
platform ahead of such development - except, perhaps, write the 
self-voicing, self-brailling, self-voice-recognizing, self-... etc. app.

In fact, that becomes another WCAG2ICT question...  Exactly what AT 
features must a non-Web app build into itself in order to satisfy all 
of the SCs?  We have our Closed Functionality list, but we have 
essentially said it is beyond the scope of WCAG2ICT to define this.  
So... if the platform lacks, say, a screen reader, then how can an app 
ever be "accessibility supported"?  Seems to me it can't, and it falls 
to whatever regulatory structure incorporates WCAG for non-Web software 
to describe that case.  Full stop.


>> If an agency decides to deploy such a platform, there may be little 
>> to nothing an application vendor can do to offer an application that 
>> meets all of WCAG A/AA on that platform.
>>
> Well, they can always just build all of the accessibility in.  And for 
> some platforms that is needed (but that is not a good situation or 
> platform).
>>
>>   And unlike the web (where an agency might install a web user agent 
>> from "Jim's Storm Door and browser company" that doesn't magnify 
>> content or expose a DOM or work with any AT), platform-specific 
>> binaries don't have the ability to claim "accessibility supported" by 
>> noting other platforms that their app is accessible on.  Because 
>> "platform-specific" means there is only one platform that can be used 
>> to evaluate it...
>>
> OK.  Can you give me some examples of this?  There are lots of 
> platforms that don't provide some or many access features.  Apps need 
> to provide them directly on these platforms (there are tens of 
> thousands of kiosks that do this).
>
> This is best thought of as: "the app is responsible to provide all 
> access to the app -- but if the operating system (or browser or other 
> intermediate platform) can be relied upon to provide some of the 
> access features for the app -- then the app can use those to meet its 
> requirements.  But otherwise it is responsible."


Yes, but...  As I noted above, handling such a situation seems to be 
beyond the scope of WCAG itself.  WCAG was written for an environment 
in which there were AT, user agents, and platforms with accessibility 
services, etc.  When WCAG is applied to native, non-Web apps running on 
a platform without these things, it seems to me that WCAG itself falls 
silent on some key issues, and we have to look to other parts of 
whatever regulatory framework is incorporating and applying WCAG to 
handle the situation.


Peter
-- 
Oracle <http://www.oracle.com>
Peter Korn | Accessibility Principal
Phone: +1 650 5069522
500 Oracle Parkway | Redwood City, CA 94065
Green Oracle <http://www.oracle.com/commitment> Oracle is committed to 
developing practices and products that help protect the environment

Received on Thursday, 23 May 2013 17:50:17 UTC