FWIW, the Firefox OS screen reader, which also powers parts of the
Firefox for Android accessibility support, is written in JavaScript, but
it hooks into the platform-independent version of the Gecko
accessibility API, on which the IA2 and ATK support is based.
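
Conceptually it works something like the sketch below (the names here are
purely illustrative, not actual Gecko internals):

    // Illustrative only -- not real Gecko API names.
    // The script subscribes to the platform-independent accessibility event
    // stream and turns events into speech or braille output.
    AccessibilityEventManager.addEventListener('focus', function (event) {
      var acc = event.accessible;                // cross-platform accessible object
      Speech.speak(acc.name + ', ' + acc.role);  // e.g. "Submit, button"
    });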
Marco
On 07.11.2014 20:45, James Craig wrote:
> Do you recall who mentioned that? It doesn’t sound familiar.
>
> Possible mentions that you might be recalling:
>
> 1. Native (non-JavaScript) screen readers like VoiceOver sometimes
> operate on views by getting or setting the value directly. So the
> “select text” intention could come in the form of something like
> setSelectionForRange (a rough sketch follows after this list).
>
> 2. Someone mentioned ChromeVox, which, to my knowledge, is the only
> screen reader that relies entirely on the web browser (this may no
> longer be an accurate statement) rather than on a platform
> accessibility API. Initially it was dependent on the DOM, but I think
> it now has more hooks into browser internals, not just the client-side
> DOM.
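
A rough sketch of that contrast, purely for illustration (setSelectionForRange
and the 'intention' event below are hypothetical names, nothing spec'd):

    // A native screen reader drives the view directly through the platform
    // accessibility API -- conceptually a value set, with no DOM-level event:
    //   textView.setSelectionForRange(start, end);
    // Surfaced as an editing intention instead, the same user action would
    // reach the page as one high-level event describing what the user meant:
    var host = document.querySelector('[contenteditable]');   // editing host
    host.addEventListener('intention', function (e) {         // hypothetical event
      if (e.commandType === 'selectText') {
        // e.targetRange would describe the requested selection
      }
    });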
>
>
> On Nov 7, 2014, at 11:05 AM, Ben Peters <Ben.Peters@microsoft.com> wrote:
>
>> You mentioned at TPAC that a good use case for Intentions would be
>> building a JavaScript screen reader that could listen to and fire
>> Intentions. Could one of you send me a use case for that? Example use
>> cases can be found on the Editing Explainer
>> (http://w3c.github.io/editing-explainer/#use-cases). Thanks!
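
A minimal sketch of such a screen reader, for illustration (the 'intention'
event name, its fields, and the speak()/describeRange() helpers are all
hypothetical, not an existing API):

    // Listen: announce editing intentions as they happen (speak() and
    // describeRange() are hypothetical helpers).
    document.addEventListener('intention', function (e) {
      var d = e.detail;
      speak(d.commandType + ' ' + describeRange(d.targetRange));
    });

    // Fire: translate a screen-reader command such as "select next word" into
    // an intention rather than mutating the DOM selection directly.
    // CustomEvent's detail carries the fields only to keep the sketch
    // self-contained; a real Intention event would presumably expose them directly.
    function selectNextWord(targetRange) {
      document.activeElement.dispatchEvent(new CustomEvent('intention', {
        bubbles: true,
        detail: { commandType: 'selectText', targetRange: targetRange }
      }));
    }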
>>
>> Ben