Re: Current WoT Specs and Accessibility

Right, by “Alexa” I really meant “any voice assistant”; I have only personally investigated the Amazon service, although my understanding is that they all work similarly.
At any rate, I have copied my response to the issue and we should probably follow up the conversation there...
Michael McCool

From: Janina Sajka <>
Date: Wednesday, May 4, 2022 at 9:09 AM
To: Mccool, Michael <>
Cc: Kazuyuki Ashimura <>, <>, W3C WAI Accessible Platform Architectures <>, <>, <>, <>
Subject: Re: Current WoT Specs and Accessibility
Thanks for this prompt and substantive response, Michael. I wish only to
emphasize that the use case should see Alexa as a stand-in for any voice
based system, e.g. Siri, Google Home, etc., because these names are used
in descriptions of devices, and people make purchasing decisions
believing this will make those devices accessible to them. Sometimes
this turns out to be true, but other times not. We want to be able to
document and point to the software that sets up the voice interaction,
e.g. a lamp that requires you to set up an account on Smart Life
(inaccessible) in order to get Alexa working.

In other words, I don't think we're looking for yet another layer.
Rather, we want to expose the key layer already present. Philips Hue,
for example, is, I believe, quite accessible, though I have no direct
experience of using it.



Mccool, Michael writes:
> Janina,
> The WoT chairs discussed this and we will discuss more later today.  I will make some brief comments here but also forward to the public WoT mailing list for more input and mention it in the main WoT WG call today.
> I looked into Alexa integrations a while ago. My understanding is that some code needs to be written and uploaded to the Alexa service to integrate Alexa with a cloud service supporting an IoT device.  WoT TDs expose a standardized interface, so in theory some “middleware” code could be written to allow ANY IoT device described with a WoT TD (aka Thing) to be integrated into Alexa, and WoT Discovery allows TDs to be accessed via search mechanisms (which Alexa middleware could, in theory, also support).  However, we have not written up this use case or prototyped it, and doing it “properly” would involve some work.  It also requires the IoT device to be visible on the public internet (local connections alone are not sufficient, and there are security implications to making devices visible on the internet); but it is possible.  We should probably explore what it would take to do this in our next charter, and see if there are any gaps we need to fill (such as metadata in TDs supporting voice interactions).
> Michael McCool, WoT co-chair
> From: Janina Sajka <>
> Date: Wednesday, April 27, 2022 at 4:11 PM
> To: Mccool, Michael <>
> Cc: Kazuyuki Ashimura <>, <>, W3C WAI Accessible Platform Architectures <>, <>
> Subject: Current WoT Specs and Accessibility
> Dear Michael, All:
> I am mindful that APA has yet to sign off on WoT documents. I'm hopeful
> we can quickly resolve the question that has arisen that would satisfy
> our concerns.
> For that to happen I need to briefly recall our last formal interactions
> around the time of TPAC 2020. WoT had a series of use cases articulated,
> and APA's Research Questions Task Force provided comments on
> accessibility aspects of those use cases.
> One question that arose from WoT was what specific accessibility use
> case might need to be covered. As I recall, we were unable to articulate
> anything specific at the time. We now do have a use case which is indeed
> consequential, even though it's strictly informational.
> We have discovered that many IoT devices are marketed as "works with
> Alexa" or "with Google Home." Since Alexa and Home are generally
> considered good for accessibility support, the customer may conclude
> purchasing the device will provide something they can use. But this
> isn't necessarily so, because there is an additional layer of software
> required to manage the device which isn't always disclosed in device
> marketing.
> I'm referring to the ecosystems of device management such as Philips Hue
> or SmartLife, just to name two examples. Among ourselves we have begun
> referring to this level of device management as "middleware," having no
> notion whether there's a more appropriate term current in WoT. We'll be
> happy to change our terminology, so please advise on that.
> The point, however, is that these middleware apps--whatever their proper
> class name is--may or may not be accessible. When they're inaccessible,
> there's very little chance the customer enticed by "works with Alexa"
> will ever succeed at getting the device to work with Alexa without the
> assistance of a nondisabled third party helper.
> So, our concern is that this "middleware" layer needs to be
> systematically (and programmatically) exposed, so that accessibility
> advocates can advocate for accessibility support in "middleware," and so
> customers can purchase with informed confidence.
> Is this data layer exposed in the APIs your group is developing? If not,
> how can we meet this accessibility use case going forward?
> --
> Janina Sajka (she/her/hers)
> The World Wide Web Consortium (W3C), Web Accessibility Initiative (WAI)
> Co-Chair, Accessible Platform Architectures


Janina Sajka (she/her/hers)

The World Wide Web Consortium (W3C), Web Accessibility Initiative (WAI)
Co-Chair, Accessible Platform Architectures

Received on Thursday, 12 May 2022 16:41:56 UTC
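[Editorial illustration] The TD-based "middleware" Michael describes above can be sketched in a few lines: generic code reads a W3C WoT Thing Description (TD) and derives the endpoints a voice-assistant skill would call, so that any device described by a TD is handled the same way. This is a minimal sketch only; the sample TD and its URLs are hypothetical, though the field names (title, actions, forms, href) follow the WoT Thing Description core vocabulary.

```python
# Hypothetical TD for a lamp; real TDs follow the W3C WoT Thing
# Description vocabulary (title, properties, actions, forms, href).
SAMPLE_TD = {
    "title": "Desk Lamp",
    "properties": {
        "status": {
            "type": "string",
            "forms": [{"href": "https://lamp.example.com/status"}],
        },
    },
    "actions": {
        "toggle": {"forms": [{"href": "https://lamp.example.com/toggle"}]},
    },
}

def voice_intents(td):
    """Map each action affordance in a TD to the endpoint a skill would call.

    Any Thing described by a TD gets identical treatment, which is the
    point made in the thread: one piece of middleware could cover every
    TD-described device, rather than per-device integration code.
    """
    intents = {}
    for name, affordance in td.get("actions", {}).items():
        forms = affordance.get("forms", [])
        if forms:
            intents[f"{td['title']}: {name}"] = forms[0]["href"]
    return intents

print(voice_intents(SAMPLE_TD))
# prints {'Desk Lamp: toggle': 'https://lamp.example.com/toggle'}
```

A real integration would additionally register these intents with the voice service and proxy the invocations to the device's endpoints, which is where the public-internet visibility and security concerns raised in the thread come in.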