RE: Silver Requirements Issues

On the measurability / process points, I think Jason covered one method well.

Allowing some requirements to be scored rather than treated as true/false may make sense in certain cases; the requirement for Silver should be to explore that with an eye on how conformance and accessibility-supported will work (or whatever concepts they become).

I think it has mostly been me talking about using process, and I will try to pull together a decent bit of text that could be used in the requirements doc.

As a quick overview, my thinking comes from the discussions about some of the previously proposed COGA SCs, and what the right tool for the job is.

Some people have raised usability testing as a potential way to pass something, which I don’t think would work, even for organisations that can afford it. Useful previous post:
https://lists.w3.org/Archives/Public/w3c-wai-gl/2017AprJun/0654.html


There are some ISO standards (e.g. 9001, 27001) of which I only have second-hand knowledge, but my understanding is that you have to evaluate something (e.g. the security risk factors) and document your mitigations. It isn’t a concrete pass/fail; you have to justify your decisions. There are also some strong default practices, e.g. all our docs have version history at the front.

So, in an accessibility context, the requirements for plain language might involve a process rather than an outcome.

The guidance could start with when you need to consider it, e.g.: Does the navigation have over 20 items? Do you already run usability testing on your site? (Plus other positive/negative factors.)

If your site is in scope (rather than your page, perhaps?), then you will need to document how you mitigated the issue, which could be by using a dictionary of common words, running usability testing with particular audiences, or another method.

In that way there could be a *layer* of guidance that does not apply to all websites, and encourages good, user-centred design practice when appropriate.
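
To make that more concrete, here is a very rough sketch in Python of how a layered, process-based check might be recorded. The trigger questions, the 20-item threshold and the mitigation names are just the examples above turned into code, as assumptions, not a proposed normative list:

# Rough sketch only: names, the 20-item threshold and the mitigation
# list are assumptions taken from the examples above, not a proposal.
from dataclasses import dataclass, field

RECOGNISED_MITIGATIONS = {
    "dictionary_of_common_words",
    "usability_testing_with_target_audiences",
    "other_documented_method",
}

@dataclass
class Site:
    nav_item_count: int
    runs_usability_testing: bool
    documented_mitigations: list = field(default_factory=list)

def plain_language_layer_applies(site: Site) -> bool:
    # Trigger questions: is this site in scope for the process layer?
    return site.nav_item_count > 20 or not site.runs_usability_testing

def meets_process_requirement(site: Site) -> bool:
    # Out of scope: the layer simply does not apply.
    if not plain_language_layer_applies(site):
        return True
    # In scope: at least one recognised mitigation must be documented.
    return any(m in RECOGNISED_MITIGATIONS for m in site.documented_mitigations)

The point being that the “pass” is about having evaluated the issue and documented a mitigation, not about a single measurable property of the content.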

If anyone is more familiar with this approach, please chip in; I’m trying to get more details for my own understanding. (Grrr, paywalls on standards, what’s that about?!)

Cheers,

-Alastair


From: White, Jason J

A second element of measurability, as I understand the proposed Requirements, is to move away from a simple tripartite pass/fail/not applicable distinction. How this would work in the over-all conformance scheme would need to be determined, of course.

For example, if numerical scores are associated with various accessibility requirements, then a conformance level could be defined in terms of achieving certain scores in individual requirements, or in aggregate across a set of requirements, or both. The first case effectively reconstructs a pass/fail arrangement, but the second is more complex (e.g., the average, or total score across four different requirements must exceed a threshold in order for web content to conform at a given level). This raises the question of what the permissible scores would mean in various cases, all of which “pass” so far as over-all conformance is concerned.
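
As a toy illustration (invented requirement names, a 0–5 scale and thresholds chosen purely for the example), the two arrangements might look like this:

# Toy example only: requirements, scores and thresholds are invented.
scores = {"plain_language": 3, "headings": 4, "contrast": 5, "link_purpose": 2}

# Case 1: every requirement must reach a per-requirement threshold
# (effectively reconstructs pass/fail).
per_requirement_pass = all(score >= 3 for score in scores.values())  # False

# Case 2: the average across the set must reach an aggregate threshold.
aggregate_pass = sum(scores.values()) / len(scores) >= 3.5           # True

The second case is where the interpretation question bites: link_purpose only scores 2, yet the content “passes” in aggregate, so we would still need to say what that low individual score means.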

I’m also definitely interested in process-based requirements as a complement to content-based ones.
