Re: Costs of testing with Silver

My view is that the premise “Accessibility is the standard quality without additional cost” describes the values and ethics held by an organization / publisher, and that instilling those values is the goal of our professional community of accessibility advocates. The role of convincing others that accessibility is a fundamental value that grows business does not belong to the guidelines that inform how best to achieve it. That role is outside of guidelines. It belongs to us as people.

Once the guidelines are no longer “very very very difficult to understand” (and subsequently to test conformance to), the role of advocacy should become easier. But advocacy still focuses on people and on having values that support people – not on guidelines.

Most of the points made thus far on the cost consideration seem to make or at least suggest several concurrent assumptions:

  1.  A website / content is produced by an organization / company / business
  2.  There are three sizes for such an organization: large, small, and very small
  3.  That organization does not have a value or ethic or mandate to be accessible
  4.  If that organization has a legal obligation to be accessible, they adopt both the conformance and checklist mentality to explicitly meet the requirements of conformance (and not of people)
  5.  The organization’s process for achieving conformance occurs at the end through testing and remediation and does not move left to planning and design and mitigation
  6.  The organization’s cost for this process is fixed or at least measurable
  7.  The number of tests required, and subsequently the associated time and cost, increases if there are any additions or variants to the current criteria in order to conform
  8.  The time in “minutes tested per page” is a single event

These are assumptions that should be validated. The web (and the other adjacent domains that Silver may cover) is far more variable than this. Values and ethics are variable. Cost is variable. People are variable.

I am not suggesting that supporting people can be done without cost.
But I am suggesting that if Silver as a standard considers cost, then there must be a method, formula, or framework that understands cost, and whatever the result of that calculation, it must not be used to limit any criteria, conformance model, or needs of people. If Silver suggests that {x} level of support has {y} estimated associated cost, it must do so in a way that guides people toward ideas for reducing cost, and not toward reducing the level of support.

I hope that we can address the original concern without introducing anything that could imply that less support = less cost.

I am looking forward to continued conversation and possible solutions.

Cheers,


Charles Hall // UX Architect, Technology

charles.hall@mrm-mccann.com
w 248.203.8723
m 248.225.8179
360 W Maple Ave, Birmingham MI 48009
mrm-mccann.com




From: Makoto Ueki <makoto.ueki@gmail.com>
Date: Saturday, September 8, 2018 at 3:20 AM
To: Silver Task Force <public-silver@w3.org>
Cc: Wilco Fiers <wilco.fiers@deque.com>
Subject: Re: [EXTERNAL] Costs of testing with Silver
Resent-From: Silver Task Force <public-silver@w3.org>
Resent-Date: Saturday, September 8, 2018 at 3:20 AM

Hi Wilco and all,

I've just joined this mailing list. This is a very interesting thread. Let me join the discussion.

Wilco Fiers wrote:
> The question is, how do we enable organisations with a small budget to still use Silver?

This is a tough question.

Wilco Fiers wrote:
> Another perspective I want to add is that organisations are already using "light" versions of WCAG.

I like the "light" version/level approach. Actually, I've been using a "basic/minimum 10 points you should do at least" list for beginners, which I picked up from WCAG SCs/Techniques. It works very well. I got inspired by WAI's "Easy Checks".

In Japan, we don't have any legal pressure to make web content accessible to PwD, even for the public sector. So it has been greatly challenging for us to promote web accessibility in Japan.

I'd say that it is not easy for any organization to make a WCAG conformance claim, even at Level A. People tend to think that it is "All or Nothing". When it comes to web accessibility, they won't do anything about accessibility if they are unlikely to achieve Level A conformance, because Level A is the lowest level.

That's why I like the approach which will make many more web pages more accessible.

Wilco Fiers wrote:
> I've created a Google spreadsheet to help us see how the cost of audits disproportionately impacts low-budget websites:

This is very interesting. Thanks so much for sharing this.

I'd say that "Minutes tested per page" and "Hourly rate a11y tester" depend on the web pages. If the web page is "simple", then "Minutes" and "Hourly rate" can be reduced. And "small" web sites with a small budget tend to be "simple". So "% cost of testing" can be smaller.

Wilco Fiers wrote:
> You argue that it's economically viable for web developers to do a free WCAG 2 audit for every $5k website, as a standard practice. Not just "easy check" style testing, but a full WCAG 2 audit. Can you show me any organisations that do this today?

In terms of "standard practice", the situation in Japan is splitting into two groups. One is "Accessibility is the standard quality without additional cost". The other is "Accessibility is the additional value with extra cost". It depends on their values and their way of thinking.

What I've seen so far in Japan is the latter attitude in most cases. There is a large web design/development company in Japan which assures its customers of "Level A" quality without extra cost, but it's an exceptionally rare case.

On the other hand, web professionals in Japan still don't understand what to do to make their web content accessible. Most of them imagine that they need to do very special things which they would not normally do, and that those things will work only for PwD. WCAG 2 is very very very difficult to understand. That is the root of all the issues we have today.

Once they understand that the basic things, like the "Easy Checks" items, are NOT special, they might take the former attitude: "standard quality without additional cost".

Anyway, I prefer the "light" version/level approach, which would allow many more web pages to be more accessible to a wider range of users than having Level A as the lowest hurdle. That would enable organisations with a small budget to still use Silver.

I hope my English works :-)


Cheers,
Makoto

On Fri, Sep 7, 2018 at 18:56, Wilco Fiers <wilco.fiers@deque.com> wrote:
Hey all,
@Mike Crabb: I think this is very interesting stuff. I am aware that work is already happening that could be used to solve the problem I've outlined. Having different requirements based on the type and complexity of the content you are testing makes total sense to me. I am looking forward to seeing those models, and I think it's very much worth the effort to try to work out how we can have testing with Silver average out at around 5% of the total website budget.

@John F. I ask that you keep an open mind to this idea of adjusting requirements based on the complexity and type of content. As Mike suggested, work is already happening on this. Let's at least try to solve the issue, instead of rejecting it on principle.

You argue that it's economically viable for web developers to do a free WCAG 2 audit for every $5k website, as a standard practice. Not just "easy check" style testing, but a full WCAG 2 audit. Can you show me any organisations that do this today?

Wilco

On Fri, Sep 7, 2018 at 5:26 AM Victoria Clark <fromtheturtlesback@gmail.com> wrote:
Hello all,

This is the first time I've responded to a thread but, boy, was this a whopper of a discussion! I like the idea of a tiered system of conformance, as this hierarchy is something I have seen used across multiple organizations. I'm used to a hierarchy based on level of access: blockers (one or more PwD would be blocked from digital content), poor ease of use (not blocked, but it is difficult, takes longer, and/or is confusing), and enhancements/usability. I like the added layer of certain functions being required of the user agent vs. of the development.

On Thu, Sep 6, 2018 at 1:03 PM Alastair Campbell <acampbell@nomensa.com> wrote:
Hi Everyone,

I think we would struggle to put together a document that provides well-defined levels for types of organisation, or even sizes of project. Many projects are updates to an existing web estate, so lots of small projects could then avoid requirements.

I’ll try to be solution-focused, and suggest that:


  *   We lead with the user-requirements as ‘guidelines’ (as I suggested previously), with general and per-technology specific criteria underneath that guideline.
  *   Each guideline could have levels, like A/AA/AAA, except that it cuts the criteria into levels instead of the guidelines. E.g. WCAG 1.3.1 for HTML could be split into:

     *   Guideline: The design is represented with appropriate structure and metadata.
     *   HTML Gold: Every element uses the right tag/attributes, and are appropriately nested (manual test).
     *   HTML Silver: Headings and lists are used and correctly nested, labels and for/ID relationships are valid.
     *   HTML Tool Bronze: The CMS provides a headings feature for content authors, and warns about full lines of bold text.
     *   HTML Bronze: Headings and lists are used (with some pre-defined auto-wcag style tests)
(A quick, off-the-top-of-my-head example.)

  *   The requirement is not split between levels, but the amount of effort needed to achieve it might be.
  *   Some requirements are weighted more towards user-agents and authoring tools. If Wix/Squarespace/Wordpress et al provided options for (more) accessible output, smaller organisations would have less testing to do.
  *   One for John: At bronze the requirement for focus styles could be placed on the user-agent, but for silver/gold the requirement could be for the site.
  *   There could be a ‘slice’ of the criteria that are aimed at sites using a good tool provider, reducing the testing ‘surface area’.
NB: The tool provider would need to say that they fulfil the other requirements, so it becomes a marketing & procurement issue rather than a site development issue.
  *   I take John’s point that we have little or no leverage with the user-agents, but if we lead with the user-requirement, and provide ‘techniques’/methods across websites/UA/authoring tools, it will make it much clearer where the effort needs to be applied!
  *   If we go down the route of levels for organisation capability, then it should be tied to other activities they are doing. For example, usability testing could be a valid method if the organisation already runs such testing in general, and that puts them above the small scale. This supports my recurring point that some things should be process-based rather than content based.
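To make the shape of this proposal easier to picture, here is one hypothetical encoding of a single leveled guideline as data, loosely following the 1.3.1 example above. The structure, level names, and lookup helper are illustrative assumptions only, not a proposed Silver format.

```python
# Hypothetical encoding of one leveled guideline, following the sketch above.
# The shape and names are illustrative only, not a proposed Silver data model.

guideline = {
    "statement": "The design is represented with appropriate structure and metadata.",
    "criteria": {
        "HTML": {
            "gold":   "Every element uses the right tag/attributes, correctly nested (manual test).",
            "silver": "Headings and lists are used and correctly nested; label for/ID relationships are valid.",
            "bronze": "Headings and lists are used (pre-defined auto-wcag style tests).",
        },
        "Tool": {
            # A criterion aimed at the authoring tool rather than the site itself.
            "bronze": "The CMS provides a headings feature and warns about full lines of bold text.",
        },
    },
}

def criterion_for(guideline, technology, level):
    """Look up the criterion to meet for a given technology at a given level."""
    return guideline["criteria"].get(technology, {}).get(level)

print(criterion_for(guideline, "HTML", "bronze"))
```

The point of the encoding is that the requirement (the guideline statement) stays the same at every level; only the criteria slice differs per technology and level, which is what would let a site on a good tool provider claim a smaller testing surface area.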

Cheers,

-Alastair



--
Wilco Fiers
Senior Accessibility Engineer - Co-facilitator WCAG-ACT - Chair Auto-WCAG

Received on Saturday, 8 September 2018 16:43:41 UTC