IoT Accessibility Use Cases

Mark,

Thanks for the invitation to attend the RDWG meeting for the discussion 
of IoT.  Henny Swan (TPG), Susann Keohane (IBM) and I have been 
discussing IoT Accessibility with an informal goal of creating IoT 
Accessibility use cases to share with groups working on IoT and WoT.

This is a quick synopsis of our ideas to date:

1) Make the output of the sensors standardized and available to the
accessibility APIs (a rough sketch of such output follows the examples):

Examples:

  * a smart watch with multiple functions, such as tracking activity
    or blood pressure; the device itself may not have an accessible
    UI, but the user can access the data through a mobile app or
    website.
  * self-driving cars, where the interface in the car may not be
    accessible but the car itself could be managed through an app.
  * a home thermostat, which is challenging because many of these
    devices have a multi-function UI for both displaying and setting
    the temperature.
  * a smart thermostat out of reach of a wheelchair user; the
    thermostat data needs to be sendable to a mobile or web app.
  * a blind person who needs the data and controls sent to a mobile
    or web app that is accessible on their mobile device, even if the
    original controller is not accessible.
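
As a very rough sketch (every name below is hypothetical, not taken
from any spec), the standardized, machine-readable output we have in
mind might look like this in TypeScript:

    // Hypothetical sketch only: a standardized reading a thermostat or
    // watch could expose so any accessible app can present it.
    interface SensorReading {
      deviceId: string;               // stable identifier for the device
      label: string;                  // human-readable name, e.g. "Hallway thermostat"
      value: number;                  // current reading
      unit: "celsius" | "fahrenheit"; // explicit unit, never baked into an image
      settable: boolean;              // true if the app may also write a setpoint
    }

    // An accessible app consumes the data instead of the device's own UI.
    function describeReading(r: SensorReading): string {
      return `${r.label}: ${r.value} degrees ${r.unit}`;
    }

The point is that the reading is data, so a screen reader, a web app,
or a wearable can each present it in whatever form the user needs.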

2) Interaction with the sensors needs to be available to the AAPIs:

  * Example: an automated home where an alert needs to blink the
    lights for someone who is deaf. The user needs to be able to
    program a pattern of light blinking as a unique visual indication
    that replaces each different audio alert (e.g. the door alarm
    should have a different blinking pattern than the oven timer); a
    rough sketch follows.
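
Sketching that idea (the pattern format and function names are our
own invention, purely illustrative):

    // Hypothetical sketch only: user-programmable blink patterns, so
    // each audio alert maps to a visually distinct light signal.
    type BlinkPattern = number[];            // alternating on/off durations in ms

    const userPatterns: Record<string, BlinkPattern> = {
      doorAlarm: [200, 200, 200, 200, 1000], // two quick flashes, long pause
      ovenTimer: [800, 800],                 // slow, steady blink
    };

    async function flashAlert(alert: string, setLight: (on: boolean) => void) {
      const pattern = userPatterns[alert] ?? [500, 500]; // default pattern
      for (let i = 0; i < pattern.length; i++) {
        setLight(i % 2 === 0);               // even index = on, odd = off
        await new Promise((done) => setTimeout(done, pattern[i]));
      }
      setLight(false);                       // always end with the light off
    }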

3) Data sent to the user must be text and not an image of text:

  * Example: a smart thermostat that sends an image of the
    temperature, which cannot be spoken by a screen reader when sent
    to a mobile app.

4) Data sent should be modality independent (e.g. a battery charger
should send "discharged, charging, full", not "red, yellow, green").

5) Signage:

Example of an airport sign: a blind person is looking for a
restaurant. The airport's web app knows where each sign is located
and has data about finding the restaurant. The sign's location data
could be sent to the user's navigation app on their phone. Microsoft
has new glasses that do in-building navigation. Directions could buzz
a wearable for someone who is deaf, or be sent as text to a wearable.
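
The data such a sign might broadcast could look something like this
(a sketch under our own assumptions; no real sign exposes exactly
these fields):

    // Hypothetical sketch only: machine-readable sign data a phone app
    // can speak aloud, show as text, or forward to a wearable.
    interface SignBeacon {
      signId: string;        // identifies this particular sign
      location: string;      // e.g. "Terminal B, near Gate 14"
      destinations: { name: string; directions: string }[];
    }

    const example: SignBeacon = {
      signId: "sign-42",
      location: "Terminal B, near Gate 14",
      destinations: [
        { name: "Restaurant", directions: "50 meters ahead on the left" },
      ],
    };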


6) Cognitive issues:

  * It is important to have simplified interfaces for controllers,
    with clear resets when a user disables something important (e.g.
    an elderly person accidentally turning off the heat in a smart
    house on a cold winter night); a rough sketch follows.
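
One way such a safeguard might work (our own sketch; the types and
prompts are invented for illustration):

    // Hypothetical sketch only: guard critical settings behind an
    // explicit confirmation, with one obvious reset to a safe state.
    interface CriticalSetting {
      name: string;          // e.g. "heat"
      enabled: boolean;
      safeDefault: boolean;  // the state the reset action restores
    }

    function disableSetting(s: CriticalSetting, confirmed: boolean): CriticalSetting {
      if (!confirmed) {
        // Ask in plain language before turning off anything critical.
        console.log(`Are you sure you want to turn off the ${s.name}?`);
        return s;            // unchanged until the user confirms
      }
      return { ...s, enabled: false };
    }

    function resetSetting(s: CriticalSetting): CriticalSetting {
      return { ...s, enabled: s.safeDefault }; // one clear step back to safety
    }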

7) Controller Interfaces:

  * Providing skins for controller apps to meet varying accessibility
    needs: magnified for low vision, low contrast, high contrast, or
    simplified for different purposes (a rough sketch follows).
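
A sketch of how skins might be described (field names are ours, just
to make the idea concrete):

    // Hypothetical sketch only: presentation "skins" selected per
    // user; the controls underneath stay identical across all skins.
    interface Skin {
      fontScale: number;     // e.g. 2.0 for a magnified low-vision view
      contrast: "low" | "normal" | "high";
      simplified: boolean;   // hide advanced controls when true
    }

    const skins: Record<string, Skin> = {
      lowVision:   { fontScale: 2.0, contrast: "high",   simplified: false },
      lowContrast: { fontScale: 1.0, contrast: "low",    simplified: false },
      simplified:  { fontScale: 1.0, contrast: "normal", simplified: true  },
    };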

We think we are just starting to scratch the surface.  I'm glad to see
there is a WAI group that is looking at these issues.

jeanne

Received on Monday, 13 July 2015 17:57:57 UTC