Re: Scoring and Dashboards

Hello,

My concern with dictating the speed at which scores "deteriorate" is that
different situations have different "decay" rates. As we've discussed in
the Silver meetings, an accessible site will become inaccessible because:
1) the code changes without accessibility testing and remediation, 2) the
content changes without accessibility testing and remediation, or 3) the
browser, user agent, or assistive technology (AT) is updated and no longer
works well with the code. A tool built and used on an intranet will become
inaccessible much more slowly (perhaps over years) than a public website
with content updated daily and new code released every week (perhaps over
days). I personally do not see how the AG or WAI can dictate a set time
frame for when something becomes less valuable from an accessibility
point of view.
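
To make the contrast concrete, here is a toy sketch in Python (all
numbers are invented for illustration): under any single dictated decay
schedule, two sites with radically different change rates end up with
exactly the same decayed score.

    # Toy illustration (all numbers invented): a fixed decay schedule
    # ignores how often a site actually changes.

    FIXED_HALF_LIFE_DAYS = 180  # a hypothetical one-size-fits-all schedule

    def decayed_score(initial_score, age_days, half_life_days):
        """Exponential decay: the score halves every half_life_days."""
        return initial_score * 0.5 ** (age_days / half_life_days)

    # An intranet tool shipping twice a year vs. a public site deploying weekly:
    for label, releases_per_year in [("intranet tool", 2), ("public site", 52)]:
        score = decayed_score(95.0, age_days=365, half_life_days=FIXED_HALF_LIFE_DAYS)
        print(f"{label}: {releases_per_year} releases/yr -> decayed score {score:.1f}")
    # Both get the identical decayed score (~23.3) after a year, even though
    # one changed twice and the other changed 52 times.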

I continue to believe that the best way to address this is to require a
date tested for all tests, and to include a statement in the
understanding documents about the importance of testing frequency and the
risk of including older tests in dashboards or conformance claims. A great
feature for dashboard providers would then be to let customers adjust
their settings so that reports reflect their development and deployment
situation.
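
As a rough sketch of what that could look like on the dashboard side
(the setting name, threshold values, and data shapes below are all
hypothetical, not proposed normative text):

    from datetime import date

    # Hypothetical per-customer setting: how old a test result may be before
    # the dashboard flags it, reflecting the customer's own release cadence.
    STALE_AFTER_DAYS = 90  # e.g. a weekly-release site might pick 90 days;
                           # an intranet team might pick 730 or more.

    def flag_stale_tests(test_results, today=None):
        """Annotate each result with its age and whether it exceeds the threshold.

        test_results: list of dicts with 'name', 'score', and 'date_tested'.
        """
        today = today or date.today()
        report = []
        for result in test_results:
            age = (today - result["date_tested"]).days
            report.append({**result, "age_days": age, "stale": age > STALE_AFTER_DAYS})
        return report

    results = flag_stale_tests([
        {"name": "keyboard navigation (manual)", "score": 4, "date_tested": date(2019, 5, 1)},
        {"name": "automated page scan", "score": 92, "date_tested": date(2020, 5, 8)},
    ], today=date(2020, 5, 11))
    for r in results:
        print(r["name"], "- tested", r["age_days"], "days ago",
              "(STALE)" if r["stale"] else "")

Either way, it is the required date tested that makes this kind of
per-customer adjustment possible at all.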

Regards,

Rachael

On May 11, 2020, 12:10 PM -0400, John Foliot <john.foliot@deque.com>, wrote:

Hi All,

During our calls last week, the use-case of monitoring conformance
dashboards was raised.

One important need for *on-going score calculation* will arise in these
scenarios. After a bit of research, it appears that many accessibility
conformance tools already offer this functionality today.

Please see:

https://docs.google.com/document/d/1PgmVS0s8_klxvV2ImZS1GRXHwUgKkoXQ1_y6RBMIZQw/edit?usp=sharing

...for examples that I was able to track down. (Note: some examples today
remain at the page level - Google Lighthouse, for example - whereas other
tools offer composite or aggregated views of 'sites', or at least of
'directories'.)

It is in scenarios like this that I raise the question of 'depreciating'
user-testing scores over time (in the same way that new cars depreciate
when you drive them off the lot, and continue to do so over the life of the
vehicle).

Large organizations are going to want up-to-date dashboards, which
mechanical testing can refresh quickly. The more complex and
labor-intensive tests, however, will be run infrequently over the
life-cycle of a site or its content, and I assert that this infrequency
will have an impact on the 'score': user-test data that is 36 months old
will likely be 'dated' by the end of that period, and may in fact no
longer be accurate.

Our scoring mechanism will need to address that situation.
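
One way a scoring mechanism might account for that (a sketch only; the
linear weighting and the 36-month horizon are assumptions for
illustration, not a worked-out proposal):

    # Sketch: linearly discount a user-test score by the age of the test,
    # reaching zero weight at an assumed 36-month horizon.

    MAX_AGE_MONTHS = 36  # assumed horizon past which old user-test data counts for nothing

    def depreciated_score(raw_score, age_months, max_age_months=MAX_AGE_MONTHS):
        """Linear depreciation: full credit when fresh, no credit at the horizon."""
        weight = max(0.0, 1.0 - age_months / max_age_months)
        return raw_score * weight

    print(depreciated_score(80.0, age_months=0))   # 80.0 - just tested
    print(depreciated_score(80.0, age_months=18))  # 40.0 - half credit at 18 months
    print(depreciated_score(80.0, age_months=36))  # 0.0  - 'dated' data no longer counts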

JF
--
*John Foliot* | Principal Accessibility Strategist | W3C AC Representative
Deque Systems - Accessibility for Good
deque.com
"I made this so long because I did not have time to make it shorter." -
Pascal

Received on Monday, 11 May 2020 16:22:21 UTC