- From: Jim Allan <jimallan@tsbvi.edu>
- Date: Mon, 5 Jul 2010 13:26:59 -0500
- To: <kim@redstartsystems.com>, "'WAI-UA list'" <w3c-wai-ua@w3.org>
- Cc: "'Judy Brewer'" <jbrewer@w3.org>
- Message-ID: <007201cb1c6f$a84d0ac0$f8e72040$@tsbvi.edu>
Excellent summary, Kim. Thanks for doing this on short notice. Sounds like you had a positive impact.

Jim

From: w3c-wai-ua-request@w3.org [mailto:w3c-wai-ua-request@w3.org] On Behalf Of Kim Patch
Sent: Thursday, June 24, 2010 2:09 PM
To: WAI-UA list
Subject: W3C Workshop on Conversational Applications

Greetings,

A summary of my experience bringing accessibility/usability input to the W3C Workshop on Conversational Applications -- Use Cases and Requirements for New Models of Human Language to Support Mobile Conversational Systems (http://www.w3.org/2010/02/convapps/agenda.html) follows. (I used a court reporter microphone to take minutes by speech for one of the sessions, and several people were interested in how that all worked.)

Cheers,
Kim

5-minute paper presentation:

- I stressed that users who are disabled are a good testbed and that accessibility improves usability, e.g. a disabled user who can't use her hands, or an able-bodied user who's talking on the phone while making dinner.
- Mentioned three specific users they could picture and I could refer back to -- Jill, who has repetitive strain injuries and finds it painful to type; Don, who is a quadriplegic; and Jason, who has memory problems after a traumatic head injury (these are all people I know).
- Summarized the use cases in the paper.

Use cases:

We all came up with use cases and spent a lot of time discussing them -- these were mine. The first three are probably the most important. They were among the 19 we voted on. Each use case pairs a problem with what users need.

1. Speech commands are hard to remember, especially when they're inconsistent across the tasks the user wants to carry out.
   Users need a central way to discover, adjust, organize and share speech commands (or any type of input command), and need to be able to use it to easily standardize like commands across programs (a rough sketch follows this list).

2. Processes sometimes change the mouse/cursor focus in ways that confuse people. This is especially bad for non-mouse users because their next action is less likely to include cursor placement.
   Users need a way to tell the computer not to change focus, to change focus only for critical actions, and/or to put the focus back.

3. Users are afraid to use the system because something might go wrong.
   Users need the ability to undo everything, including actions. Where actions are difficult to undo, such as submit, users need the ability to mark them, for instance by making all submit buttons red.

4. It's difficult to get users to correct speech recognition errors correctly.
   Users need a non-cumbersome correction process, like "Try again" or "Change to". This would also better allow software makers to collect information that will improve recognition.

5. It frustrates people when the computer gets things wrong that they wouldn't get wrong.
   Users need a mechanism to add their own intelligence when needed.

6. People who can't use their hands need to be able to use speech for everything.
   Users need standard ways to open, move among and close programs (or better yet, a good default that can be adjusted).

7. The speech recognizer sometimes gets overwhelmed, for instance when someone else nearby starts talking, and is unavailable to the user while it deals with the extra noise. This frustrates users.
   Users need an easy way to zero out the speech recognizer. A keyboard shortcut would be best.
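Not something from the workshop -- just an illustrative sketch, in Python, of the kind of central, user-adjustable command registry the first two use cases ask for. All names here (Command, CommandRegistry, EditorA, the keystrokes) are hypothetical:

    # Sketch: one registry that maps a user-level action to whatever each
    # program actually expects, so commands can be discovered, adjusted,
    # standardized and shared in one place.
    from dataclasses import dataclass, field

    @dataclass
    class Command:
        action: str                                      # user-level name, e.g. "close document"
        phrases: list = field(default_factory=list)      # spoken phrases that trigger it
        per_program: dict = field(default_factory=dict)  # program name -> keystroke/script

    class CommandRegistry:
        def __init__(self):
            self._commands = {}

        def add(self, cmd):
            self._commands[cmd.action] = cmd

        def discover(self, fragment):
            # Let the user search commands by any part of a spoken phrase.
            return [c for c in self._commands.values()
                    if any(fragment in p for p in c.phrases)]

        def standardize(self, action, phrase):
            # Use one spoken phrase for the same action in every program.
            cmd = self._commands[action]
            if phrase not in cmd.phrases:
                cmd.phrases.insert(0, phrase)

        def resolve(self, phrase, program):
            # What should actually be sent to the active program?
            for cmd in self._commands.values():
                if phrase in cmd.phrases and program in cmd.per_program:
                    return cmd.per_program[program]
            return None

    # Example: one phrase closes a document in two different (hypothetical) editors.
    registry = CommandRegistry()
    registry.add(Command("close document",
                         phrases=["close document"],
                         per_program={"EditorA": "Ctrl+W", "EditorB": "Ctrl+F4"}))
    registry.standardize("close document", "close that")
    print(registry.resolve("close that", "EditorB"))   # -> Ctrl+F4

The per_program mapping is where standardization happens: the user keeps one spoken phrase and the registry translates it into whatever each program expects, which is also what would make command sets easy to share.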
Highlights:

None of my use cases made the top five that were written up as proposals at the end of the workshop, but elements of the first and second ended up in the written proposals.

As our small group talked about the third one, Matt mentioned he was going to push something similar in another working group. People resonated with the idea that the user needs more control, including the ability to disable and enable sets of speech commands on the fly. Others were proposing mechanisms that would allow systems to do this. I pointed out that the user needs similar control.

There were several times during the workshop when people mentioned, to me or generally, that they hadn't realized usability was so important. At the end, when we all went around and said a few words about the workshop, three of the dozen people called out usability as something they had gained more insight about.

At one point we were talking about the importance of having good defaults rather than prescribed wording. Paolo suggested that it would be good to tap the accessibility groups for defaults like these.

--
___________________________________________________
Kimberly Patch
President
Redstart Systems, Inc.
(617) 325-3966
kim@redstartsystems.com
www.redstartsystems.com - making speech fly
Blog: Patch on Speech
Twitter: RedstartSystems
___________________________________________________
Received on Monday, 5 July 2010 18:21:37 UTC