W3C home > Mailing lists > Public > public-webapi@w3.org > March 2006

Re: in defence of listener discovery (ISSUE-32, ACTION-84)

From: Maciej Stachowiak <mjs@apple.com>
Date: Thu, 23 Mar 2006 16:37:15 -0800
Message-Id: <18B8B9C8-83F3-4D13-92EA-3B77E87A1D15@apple.com>
Cc: Al Gilman <Alfred.S.Gilman@IEEE.org>, WebAPI WG <public-webapi@w3.org>, wai-liaison@w3.org
To: Jonas Sicking <jonas@sicking.cc>

On Mar 23, 2006, at 3:31 PM, Jonas Sicking wrote:

> Al Gilman wrote:
>> Question:
>> Why does the accessibility community want the hasEventListener?
>> Actually, as I read the current Note, it is willTrigger and not
>> hasEventListener that is the key method to enable the functionality
>> needed for UI adaptability.
> Actually, my main question is not why these functions are needed,
> but why they are needed in the DOM API. I'm thinking from the
> perspective of a browser here, so let me know if these APIs are
> intended for UAs other than browsers.

I agree with what Jonas said in his message. What are the assistive
technologies that would rely on the DOM for this? At least for
Safari/WebKit, our integration with accessibility is all done through
internal APIs.

I also think that knowing "will an event listener fire" is not the
most important thing for assistive software to know; more important
is some concept of "role". For instance, if an element has a
mousedown listener, is that because it is a button, a drag source, an
editable area, a toggle, or something that makes a sound on click in
a capturing listener but passes the click through? My experience with
Safari's accessibility implementation is that adding willTriggerNS
and hasListenerNS DOM APIs would not help much.
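To make the concern concrete, here is a minimal sketch (plain Node.js, with a hypothetical hasListener stand-in for the proposed hasEventListenerNS-style check; the names and the toy event target are my own, not from the spec). Two elements register identical mousedown listeners for entirely different purposes, and a listener-presence query cannot tell them apart:

```javascript
// Toy event target: just enough to illustrate the point.
function makeTarget() {
  const listeners = {};
  return {
    addEventListener(type, fn) {
      if (!listeners[type]) listeners[type] = [];
      listeners[type].push(fn);
    },
    // Rough stand-in for a hasEventListenerNS/willTriggerNS-style query.
    hasListener(type) {
      return (listeners[type] || []).length > 0;
    },
  };
}

const button = makeTarget();     // mousedown activates an action
const dragSource = makeTarget(); // mousedown begins a drag
button.addEventListener('mousedown', () => { /* activate */ });
dragSource.addEventListener('mousedown', () => { /* begin drag */ });

// Both report true; the listener's purpose (the element's "role")
// remains opaque to assistive software.
console.log(button.hasListener('mousedown'));     // true
console.log(dragSource.hasListener('mousedown')); // true
```

The query answers "is anything listening?" but not "what does this element do?", which is the information a screen reader would actually need.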

> So basically, my question is: Who is the intended audience of these
> APIs, and how is it expected that that audience will use them?


Received on Friday, 24 March 2006 00:37:35 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:16:20 UTC