Re: Scoring and Dashboards

From: Jeanne Spellman <jspellman@spellmanconsulting.com>
Date: Fri, 22 May 2020 13:48:30 -0400
To: public-silver@w3.org
Message-ID: <cd2ab77f-689c-8034-0763-fb4f8cca140c@spellmanconsulting.com>
I completely agree that clients want dashboards and they want to know 
how they compare with themselves over time. No question.

I think the important issue here is: do they want the features of their 
dashboard set as a standard requirement? All accessibility tool makers 
provide different features, and they use those differences both to 
distinguish themselves competitively and to meet the needs of specific 
industry sectors.

I think we could do harm to the industry if we started writing 
requirements for what a dashboard must contain and how it should work.

I think our job is to write the accessibility standards that industry 
needs harmonized around the world, without telling tool makers how to 
build their tools and what features to put in them. I don't think W3C 
belongs in the dashboard business. Deque, Tenon, Siteimprove (to name 
only a few who responded to this thread) are in that business and know 
best what they want to give the customers they serve. Different 
industries or sectors want different dashboards.

I would never argue against the importance of dashboards and measuring 
performance over time. I just don't think standardization is needed 
there, and it could actually stifle innovation.


On 5/11/2020 12:08 PM, John Foliot wrote:
> Hi All,
> During our calls last week, the use-case of monitoring conformance 
> dashboards was raised.
> One important need for *on-going score calculation* will be for usage 
> in these scenarios. After a bit of research, it appears that many 
> different accessibility conformance tools already offer this 
> functionality today.
> Please see:
> https://docs.google.com/document/d/1PgmVS0s8_klxvV2ImZS1GRXHwUgKkoXQ1_y6RBMIZQw/edit?usp=sharing
> ...for examples that I was able to track down. (Note, some examples 
> today remain at the page level - for example Google Lighthouse - 
> whereas other tools are offering composite or aggregated views of 
> 'sites' or at least 'directories'.)
> It is in scenarios like this that I question the 'depreciation' of 
> user-testing scores over time (in the same way that new cars 
> depreciate when you drive them off the lot, and continue to do so over 
> the life of the vehicle).
> Large organizations are going to want up-to-date dashboards, which 
> mechanical testing can facilitate quickly, but the more complex and 
> labor-intensive tests will be run infrequently over the life-cycle of 
> a site or its content. I assert that this infrequency will have an 
> impact on the 'score': user-test data that is 36 months old will 
> likely be 'dated' by then, and in fact may no longer be accurate.
> Our scoring mechanism will need to address that situation.
> JF
> -- 
> *John Foliot* | Principal Accessibility Strategist | W3C AC Representative
> Deque Systems - Accessibility for Good
> deque.com <http://deque.com/>
> "I made this so long because I did not have time to make it shorter." 
> - Pascal
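
[JF's "depreciation" idea could be sketched, purely illustratively, as an 
age-based decay applied to a user-test score. The function name, the 
exponential model, and the one-year half-life below are all assumptions 
for discussion - nothing here is part of any Silver/WCAG 3 proposal.]

```python
from datetime import date

def depreciated_score(raw_score: float, test_date: date,
                      today: date, half_life_days: float = 365.0) -> float:
    """Decay a user-test score exponentially by its age.

    Illustrative sketch only: the half-life and the exponential decay
    model are assumed here, not specified by any Silver scoring proposal.
    """
    age_days = (today - test_date).days
    return raw_score * 0.5 ** (age_days / half_life_days)

# A user-test score of 80 measured 36 months ago would, under this
# assumed one-year half-life, be heavily discounted on today's dashboard:
current = depreciated_score(80.0, date(2017, 5, 11), date(2020, 5, 11))
```

[With a linear model, or a different half-life, the numbers change; the 
point is only that a dashboard consuming infrequent user-test data needs 
*some* explicit rule for how stale results count toward the score.]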
Received on Friday, 22 May 2020 17:48:44 UTC