- From: Leonard R. Kasday <kasday@acm.org>
- Date: Fri, 14 Jul 2000 10:55:21 -0400
- To: w3c-wai-er-ig@w3.org, w3c-wai-au@w3.org
Here's a draft of the minutes. Please look it over, especially the things you said, and post corrections to the lists.

Len

p.s. For those of you on both ER and AU, sorry for the duplicate email.

==========================================

Minutes, ER/AU Meeting 7/11/2000

Present
KD: Karl Dubost
LK: Leonard Kasday (scribe)
JT: Jutta Treviranus (chairing this joint meeting)
MC: Michael Cooper
HB: Harvey Bingham
DB: Dick Brown
CR: Chris Ridpath
HS: Heather Swayne
WL: William Loughborough
CM: Charles McCathieNevile
FB: Fred Barnet

----------------------------------

1. INTRODUCTIONS

KD: Has translated for WAI. Concerned about the quality of tools and software in the world. New job still to be defined; will try to improve implementation of recommendations in software, documentation, etc. Would prefer to have some comments from us: what do we think about quality assurance, what's our experience, how do we achieve it?
JT: Karl is the new W3C conformance manager.
JT: I'm chair of Authoring Tools.
MC: I'm project manager for Bobby at CAST. Mainly with the ER group at WAI.
HB: On a number of WAI groups; consulting on digital talking books.
CR: Working on A-Prompt, editing AERT.
DB: Program manager, accessibility.
HS: Member of the accessibility and disability group at Microsoft.
FB: From the HTML Writers Guild.
WL: Smith-Kettlewell; part of ER and AU.
WC: With ER, staff contact and editor; also editor of GL.
BM: Brian Methany, CAST, Bobby.
LK: Institute on Disabilities/UAP at Temple; chair of the Evaluation and Repair Interest Group.
CM: Charles McCathieNevile, staff contact for AU.
JT: Anything to add to the agenda?
WC: Add to the agenda: integrating AU and ER.
JT: Let's explain tools for ER.

2. ROLE OF THE CONFORMANCE MANAGER

LK: Do we give up/down judgments on tools, or just guidelines?
KD: Will not give advice on commercial products at first, certainly, because that is a big job, very difficult, and perhaps not the best first approach. Easier to try to convince people on recommendations and specs. Recommendations and specs are very important. Important for WAI, because it influences HTML writing, which is much larger than accessibility; it could help convince people to do a better job with recommendations in general. Will set up a mailing list for all groups to have the right way to do my job.
JT: Will you set up a listserv?
KD: Will try to have a mailing list.
WL: Start at home: the best thing is to get the W3C and WAI web presence to make sure that all of the W3C site is conforming.
HB: Good learning experience for those responsible for recommendations.
JT: W3C and WAI should be the first sites. Start at home.
KD: Sure.
JT: And move on to Amaya.
LK: Include tools like change control tools.
KD: Also, hearing impairment. Do you mean tools on the site?
CM: Yes, we have tools like change access control, mailing list tools, and other internal tools. Accessibility is being worked on. One of my roles is to interface with the system team, so send to me or sysrec@w3c. There are a couple of areas where conformance runs beyond W3C pages: we should be an exemplar, not run behind. Lots of calls for "how do we do this?" This is what we're specifying in a contract; how do we make sure it's being tested? You can't just look at the code with Emacs. So pressure is coming from people asking: what are the tools, where's the software, how do we know what's good? In AU we want to be able to say: here's what we know about the various tools. We've done that by informal assessments; no checkpoint-by-checkpoint comparisons, and we haven't said "doesn't pass". But we are in a position to make decisions on particular checkpoints. We need a reproducible process so that I and a developer get the same answer, one that anyone would have come up with. People are asking the questions.
We're not sure how far we should go in answering, but we are doing a disservice by not answering.
JT: In other words, we need objective conformance testing; for AU that's a big body of work. The amount of work to make a tool is beyond what we can do. One of the hopes was that you would be able to help with the effort or the process. In terms of the time you have, how much would you be able to devote to a tool, or to the architecture of a tool?
KD: Don't know my time allotments, but will know in a few days or weeks. Will try to be available to all groups and help everyone.
CM: We could look at how many activities there are across W3C; could expect 4% of your time. But what is the W3C approach to this? Have you looked at the AU conformance work, i.e. our tests?
KD: No, not yet. Will look.
LK: Can integrate objective and subjective testing.
KD: To be fully conformant, it's not just the technical specs; you need to manage it.
CM: You can write syntactically valid HTML which is bad HTML. E.g. fonts, presentational tricks, etc., all wrong. It will all parse, but it goes against what W3C HTML really says. So it's wrong. So it's not only accessibility that you can't test automatically.
KD: So are you saying recommendations need to include WAI?
CM: Yes, in part. E.g. in SVG there is a halfway approach: conformant SVG must meet authoring level A. That kind of spec in W3C in general is helpful. HTML is not correct just because it has the right pointy brackets. So it's not just accessibility that needs human checking.
LK: Can you give examples not having to do with accessibility?
CM: Paragraphs with bullets instead of a list don't really meet the spec. Or using BLOCKQUOTE to indent.
KD: In addition to accessibility, there's usability for people without disabilities. Maybe we should have recommendations that it be usable. If you make a document with white characters it's generally inaccessible.
LK: What about white box vs. black box testing? In white box testing you know what code is running and do tests on the code. In black box testing you just have a user operate it and observe what happens. So e.g. if you are testing the HTML code, that's white box testing; if you're just using a browser and screen reader, that's black box testing.
DB: Both are done at Microsoft. Hard to do black box testing; it depends on the assistive technology.
CM: The only way to be sure is complete black box testing, so you subset to the most efficient point; sometimes you can do a code test. Would like to automate the test as well, to drive the browser and read the results. The line is whether you're looking at the underlying code or looking through a browser. Usability testing is important and expensive. So how do we do things from the start to see if things are right? Also, W3C doesn't certify anything. If anyone stands up and claims conformance, other people can say the opposite, so people are reluctant to claim it if they are not supported. So they ask if they can hang back and wait.
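/* Illustration added for clarity (not an ER or AU tool): a minimal sketch of the "white box" style of check discussed above -- inspecting the HTML source itself rather than observing the page through a browser and screen reader ("black box"). The WhiteBoxChecker class, the sample markup, and the messages are invented for this example. It flags an IMG with no ALT text, which a machine can check, and a BLOCKQUOTE that may be misused for indentation, which still needs human judgment -- echoing the point that markup can parse and still be wrong. A black box test of the same page would instead run it through a browser and screen reader and observe what a user actually gets. Python sketch: */

from html.parser import HTMLParser

class WhiteBoxChecker(HTMLParser):
    """Hypothetical checker: scans HTML source for two issues raised above."""

    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        line = self.getpos()[0]
        if tag == "img" and "alt" not in attrs:
            # Machine-checkable: image with no ALT text.
            self.findings.append("line %d: IMG without ALT text" % line)
        if tag == "blockquote":
            # Parses fine, but may be markup misuse; flag for a human to judge.
            self.findings.append(
                "line %d: BLOCKQUOTE found -- confirm it marks a quotation, "
                "not indentation (manual check)" % line)

SAMPLE = """<html><body>
<img src="logo.gif">
<blockquote>Indented text that is not actually a quotation.</blockquote>
</body></html>"""

checker = WhiteBoxChecker()
checker.feed(SAMPLE)
for finding in checker.findings:
    print(finding)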
SUMMARY

JT: So, to summarize: 1. we would like WAI conformance in everything in W3C; 2. every evaluation has automatic and human-judgment parts -- the challenge is how to do conformance by machine; 3. and how to do objective, repeatable human judgment. Part of this is compromise, because of all the differences in browsers. And representative sampling in white box testing.
LK: What do you mean by sampling? E.g. do you mean instead of looking at all the ALT text, just use a statistical sample?
JT: Or sample browsers instead of testing all of them.
KD: Do we need test suites to make better tools?
CM: Will start to develop a test suite for authoring tools. Fairly large piece of material. Another big part: building processes; the development groups have more experience there. Need a process: these are the steps to go through so that when they do testing they've got it right. Important here in Australia. Done in ER?
WC: EO has looked into it some. Have a doc on where to begin. Is it GL or ER? The next version of GL will say that the minimum requirements are easily testable.
WL: EO is the link to the outside for everyone.
CM: What's your experience in developing a process where tools come out accessible? (question to Dick and Heather)
HS: We're learning how to do that properly, e.g. we didn't do it the first time in IE4. How do we test and agree the product is ready to go... is that what you're asking?
CM: No -- what process do you need to be doing? E.g. for lots of stuff, like everything under MSN.
DB: Talking about sites?
CM: Both. Easy to see in sites.
DB: The process is further along for applications, because we've had our own guidelines that predate WAI. Still developing a systematic process; still pretty young.
HS: Agreed. In applications we have checks and balances: program manager / developer / testers. We teach all of them how to conform.
DB: On the MSN side, a checklist is one of the big things we use.
JT: One of the most powerful tools for a large site is to create accessible templates.
DB: Yes, that helps, but much more is needed. We also try to do things for what is common across site families.
WL: Also, every time you do this stuff, keep in mind everyone who will do whatever you want.

FURTHER SUMMARY

JT: So, other points: 1. how to give guidelines on process; 2. how to evaluate when done.
HB: How many other groups have you been getting this info from?
KD: You're the first. Will be talking about Schema later; theoretically easier because maybe more automatic. WAI is the most difficult because of human judgment. Just starting here at Sophia.
JT: Any last thing?
LK: Don't put all the load on accessibility.
WL: Must be usable to be accessible.
MC: Yes, but that's always true.
WL: It diverts effort if trying to make sites usable by all.
DB: If we focus too much on solutions for all, it takes effort away from accessibility.
JT: Also a difference in degree. Usability is different, because the effect is less severe.
WL: But if I can't order on the web, it's a complete barrier.
MC: As the definition includes cognitive disabilities more, they come closer together.

ACTION ITEMS

JT: Action items (to CM and WC): talk about database tools.
CM: Will work on Thursday with the system team on a database for checking tools. Should be online pretty soon. Will then work with Ian on a unified document for techniques /* document would come from database to produce doc */ and create views whereby the user can specify what view they want. So for an authoring tool, you can create a view that gives the techniques that I need; create the database so we can create views and collectively do the work.
JT: CM... you're working on a testing tool and instrument in addition... is that true?
CM: /* dropped */

----------

JT: Would ER look at things other than content?
LK: The resolution was to do just content.
JT: Are you working on a testing instrument, not just the database?
/* CM returns */
CM: Yes.
LK: Say more about the testing instrument?
JT: A tester for AU tools, to make the elements of human judgment as objective as possible.
CM: Here's how to do the test; it is a document, and in some cases it can be automated.
JT: It's the AERT for authoring tools.

NEXT MEETING

JT: Anything else? Next combined meeting is the first Tuesday of August, August 1, 2:30 Eastern Time. Email this to the lists.

--
Leonard R. Kasday, Ph.D.
Institute on Disabilities/UAP, and Department of Electrical Engineering
Temple University
423 Ritter Annex, Philadelphia, PA 19122
kasday@acm.org
http://astro.temple.edu/~kasday
(215) 204-2247 (voice)
(800) 750-7428 (TTY)

The WAVE web page accessibility evaluation assistant: http://www.temple.edu/inst_disabilities/piat/wave/