- From: Jim Allan <jimallan@tsbvi.edu>
- Date: Thu, 1 Apr 2010 11:03:34 -0500
- To: "'James Craig'" <jcraig@apple.com>, "'WAI-UA list'" <w3c-wai-ua@w3.org>
- Cc: "'David Poehlman'" <poehlman1@comcast.net>
James,

My concern is that the article mentioned including a JavaScript on a (every) page to sniff user actions to see if a touch-based device is in use. It seems to me that the browser should know what kind of interface is in use on the platform. There should not be a need for JavaScript on every page to check for interface/object behavior. When there is something that an author has to do on every page, perhaps there is a requirement that can be placed on the UA. If it is an accessibility issue, then it should go in UAAG20. If it is not an accessibility issue, then ???

My thinking: there is a large overlap/blur between what the author can do with JavaScript and what the browser does natively. That puts a larger and larger burden on the author to do the right thing, and it also has a greater chance of breaking accessibility.

The article made it sound like the browser on a touch device is providing incorrect interface information to the user... it is displaying a select box such that the user is not aware of all the options (scrolling) that are available. Because of this, the author needs to write code to check for touch behavior and substitute interface rendering and behavior to ensure the user gets all of the information needed for proper interaction. There should be something the browser can do so that not every page needs to check whether there is a touch interface, which would hopefully nudge things closer to being a bit more accessible.

Or I have missed the point of the article and your response completely. This is deeply complex and easily confusing (at least to me). Any clarification/explanation appreciated.

Jim

> -----Original Message-----
> From: w3c-wai-ua-request@w3.org [mailto:w3c-wai-ua-request@w3.org] On
> Behalf Of James Craig
> Sent: Wednesday, March 31, 2010 5:29 PM
> To: WAI-UA list
> Cc: Jim Allan; David Poehlman
> Subject: Device-independence in DOM3 and ARIA2 (Was: browser detection
> of touch browsing)
>
> On Mar 31, 2010, at 12:52 PM, David Poehlman wrote:
>
> > it might be worth running by apple.
>
> There is a joint DOM 3.0 and ARIA 2.0 issue [1] to address
> device-independence. The email thread subject is "Deprecation of
> DOMAttrModified" and it was posted to the following lists in the
> beginning of February 2010: w3c-wai-pf, www-dom, public-hypertext-cg.
> Please add to the discussion as you see fit.
>
> Device-independence for ARIA widgets relies on the AT being able to
> convey user intent through the UA rather than through direct user
> action. For example, if VoiceOver is running on the iPhone and the
> user lands on an ARIA treeitem which allows both a default action
> (activation or selection) and a secondary action (such as expansion or
> collapse), the user can trigger the default action with a standard
> activation event (DOMActivate is being deprecated, but a standard
> 'click' event encompasses the default action). The only currently
> standardized option for device-independent access to the secondary
> action relies on DOM mutation events, which are problematic for many
> reasons and are being deprecated. There are several mutation
> replacement proposals being discussed in the DOM3 WG, and we have made
> them aware of the accessibility implications that will be required for
> true device-independence. The larger issue is that, even once we get
> this DOM mutation replacement in browsers, web developers will still
> be required to make a shift in direction towards a more locale-,
> platform-, and device-independent style of coding.
>
> The above explanation is my understanding of what we need for
> device-independence, but I'm not sure exactly what Jim requested in
> his email. Jim, you may have intended something different by, "the
> browser should communicate or translate the [events] so the
> functionality of the site is not impaired." If so, please elaborate.
>
> Thanks,
> James Craig
>
> 1. http://www.w3.org/WAI/PF/Group/track/issues/352 (probably a
> member-only link)
>
> On Mar 31, 2010, at 2:48 PM, Jim Allan wrote:
>
> > This is an interesting article. Speaks to device independence.
> > Perhaps we need something in GL 4. Alastair talks about a JavaScript
> > to do the sniffing. Though it seems the browser should communicate
> > or translate the user keypress/movement/etc to the JavaScript or
> > server so the functionality of the site is not impaired.
> >
> > http://alastairc.ac/2010/03/detecting-touch-based-browsing/
> >
> > Thoughts?
> >
> > Jim Allan, Accessibility Coordinator & Webmaster
> > Texas School for the Blind and Visually Impaired
> > 1100 W. 45th St., Austin, Texas 78756
> > voice 512.206.9315 fax: 512.206.9264 http://www.tsbvi.edu/
> > "We shape our tools and thereafter our tools shape us." McLuhan, 1964
> >
> > --
> > Jonnie Appleseed
> > with his
> > Hands-On Technolog(eye)s
> > reducing technology's disabilities
> > one byte at a time
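For readers unfamiliar with the techniques under discussion, two brief sketches follow. The first shows the kind of per-page "sniffing" script the linked article discusses; the exact approach in the article may differ, and the helper name `detectTouchBrowsing` and the checks used here are illustrative assumptions, not code from the article.

```javascript
// Illustrative sketch only: a per-page check for touch-based browsing.
function detectTouchBrowsing(callback) {
  // Static capability check: does the browser expose touch events at all?
  if ('ontouchstart' in window) {
    callback(true);
    return;
  }
  // Behavioral check: wait for the first actual touch interaction.
  function onFirstTouch() {
    document.removeEventListener('touchstart', onFirstTouch, false);
    callback(true);
  }
  document.addEventListener('touchstart', onFirstTouch, false);
}

// Usage: flag the document so touch-friendly rendering can be substituted.
detectTouchBrowsing(function (isTouch) {
  if (isTouch) {
    document.documentElement.className += ' touch';
  }
});
```

The second is a rough rendering of the treeitem example from James's message, as such a widget might be authored today: the default action is covered by a standard 'click', and the secondary action (expand/collapse) is reachable from the keyboard. The markup and the `wireTreeItem` helper are assumptions for illustration; this covers direct input only and does not address the AT-mediated, device-independent triggering that the DOM mutation replacement work is meant to enable.

```javascript
// Illustrative sketch only. Assumed markup:
//   <li role="treeitem" aria-expanded="false" tabindex="0">Folder ...</li>
function wireTreeItem(item) {
  function toggleExpanded() {
    var expanded = item.getAttribute('aria-expanded') === 'true';
    item.setAttribute('aria-expanded', expanded ? 'false' : 'true');
  }
  function select() {
    item.setAttribute('aria-selected', 'true');
  }

  // Default action: a standard 'click' covers mouse, touch taps, and
  // AT-generated activation.
  item.addEventListener('click', select, false);

  // Secondary action: keyboard access to expansion/collapse.
  item.addEventListener('keydown', function (event) {
    if (event.keyCode === 39) {        // Right arrow: expand
      if (item.getAttribute('aria-expanded') === 'false') toggleExpanded();
    } else if (event.keyCode === 37) { // Left arrow: collapse
      if (item.getAttribute('aria-expanded') === 'true') toggleExpanded();
    } else if (event.keyCode === 13) { // Enter: default action
      select();
    }
  }, false);
}
```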
Received on Thursday, 1 April 2010 16:04:12 UTC