Re: Can Silver have normative technology specific requirements?

Hey David,
Do you think, with such a guidelines / methods solution, that AG could
reliably produce and maintain enough methods for enough technologies? That
is my biggest concern about this. Yes, technology specific measurable
methods would be fantastic, but it is very resource intensive to develop
and maintain them. That's effectively what ACT Rules are, and it's a
complex and resource intensive task to keep those up to date, let alone
develop new ones.

W

On Thu, Nov 22, 2018 at 9:14 PM David MacDonald <david100@sympatico.ca>
wrote:

> Patrick asks:
> > I thought the point of non-normative techniques was that what really
> counts is fulfilling a success criterion, and that to do that, there can
> usually be more than one particular way of doing that...hence the
> techniques being non-normative ("here's one way of doing it" versus
> "this is the one true way you HAVE to do it"). If anything, with more
> standards/technologies coming around now, this holds even more true than
> before. Unless I'm missing a nuance here?
>
> David:
> It's true that one advantage of separating the SCs from the techniques was
> that it left room for authors to create their own techniques. However, that
> could also be done if we were to integrate the techniques and success
> criteria. We could have, say, 5 technology specific methods, and then a
> sixth method could look like a technology agnostic WCAG 2 SC to cover any
> outlier situations. Most people will ignore the last cryptic technology
> agnostic method and follow the easy to understand technology specific
> methods.
>
> The main reason we separated SCs from techniques, as I remember, was to
> solve the problem of needing a long term stable standard which didn't
> require revisions in an environment where technology is constantly
> changing. But the new W3C requirement of frequent updates gives us the
> opportunity to not have them separated, because the constantly updated
> standard can keep up with the constantly updating technology.
>
> Cheers,
> David MacDonald
>
>
>
> CanAdapt Solutions Inc.
>
> Tel:  613-806-9005
>
> LinkedIn
> <http://www.linkedin.com/in/davidmacdonald100>
>
> twitter.com/davidmacd
>
> GitHub <https://github.com/DavidMacDonald>
>
> www.Can-Adapt.com <http://www.can-adapt.com/>
>
>
>
> Adapting the web to all users
> Including those with disabilities
>
> If you are not the intended recipient, please review our privacy policy
> <http://www.davidmacd.com/disclaimer.html>
>
>
> On Thu, Nov 22, 2018 at 1:24 PM Patrick H. Lauke <redux@splintered.co.uk>
> wrote:
>
>> On 22/11/2018 16:22, David MacDonald wrote:
>> > Brainstorming has begun about what the next major version of the WCAG
>> > may look like. This is an attempt to contribute to that process, with
>> > its historical perspective as a consideration. The universal response
>> > to the WCAG appears to be that it has made an amazing impact on global
>> > accessibility and is a unified standard around which the global
>> > community can rally. However, it is difficult to understand, and its
>> > technology agnostic language sometimes seems cryptic to those who are
>> > implementing it on their sites. Almost all of the criticisms of WCAG
>> > 2.0 can be boiled down to “it’s hard to understand” and “it needs to
>> > make room for soft requirements that are hard to test”.
>> >
>> > To explore how we got here, let’s go back to 1999, with the release of
>> > WCAG 1.0. It was a breakthrough standard in which design concepts such
>> > as colour and HTML specific requirements were all mixed together in a
>> > very flat standard. It was a huge success and began to get legal
>> > recognition. One of the things that led to its quick adoption was that
>> > it was easy to understand, and it made a big difference for people with
>> > disabilities. However, it was very prescriptive for designers and was
>> > also vulnerable to changes in technology. Legal frameworks and standards
>> > historically move much slower than technology.
>> >
>> > We endeavoured to solve this problem in WCAG 2.0. The W3C process was
>> > that normative documents went through a long, rigorous process to become
>> > a standard (years). There was also a category of supporting documents
>> > that were non normative and easier to update. WCAG 2.0 extracted the
>> > characteristics of the 1.0 requirements into technology agnostic
>> > normative success criteria, with separate non normative technology
>> > specific techniques to meet those success criteria. It was a huge
>> > success and WCAG 2.0 has survived 10 years. But its longevity and
>> > stability came at a high cost. It had 4 layers: 3 were normative
>> > (Principles, Guidelines, Success criteria) and one layer (technology
>> > specific techniques) was non normative.
>> >
>> > In the last few years the W3C has evolved its approach to standards
>> > toward a more iterative one, perhaps inspired by Agile. There is now a
>> > requirement for standards to be frequently updated on 18 month - 2 year
>> > cycles. No more 10 year development cycles such as we had in WCAG 2.0.
>> > At first I was concerned that this change in approach would cause
>> > problems for the future of WCAG.
>> >
>> > However, during TPAC I realized that frequent updates to the standard
>> > could solve the dilemma of separating normative success criteria from
>> > non normative techniques, where users had to look in 2 places. Frequently
>> > published standards could keep up with technology. So we might be able
>> > to integrate the techniques into the *normative* part of the standard
>> > and merge them with the testable/measurable Success Criteria, into what
>> > the Silver Task Force is calling “methods”. These would be normative.
>> > The 12 WCAG 2.0 guidelines would expand in their role and become
>> > general guidelines under which these methods could be grouped. So
>> > instead of 4 layers of guidance which cause the reader to look in
>> > several places to know what to do, there would be only 2 layers
>> > (Guidelines and Methods). It would overcome the problem of having
>> > technology agnostic success criteria which are hard to understand.
>> > The methods would say what to do and how to do it and also be the unit
>> > of measurement of conformance. There would be only one place to look in
>> > order to know what to do to meet the requirement.
>> >
>> > So I'm suggesting we explore integrating the techniques with the SCs
>> > to become *normative* methods that are updated on the regular 18
>> > month - 2 year cadence of the normative document cycle. The general
>> > information such as "Make code correspond to visual layout" would be
>> > guidelines under which all of the methods rest.
>>
>> I thought the point of non-normative techniques was that what really
>> counts is fulfilling a success criterion, and that to do that, there can
>> usually be more than one particular way of doing that...hence the
>> techniques being non-normative ("here's one way of doing it" versus
>> "this is the one true way you HAVE to do it"). If anything, with more
>> standards/technologies coming around now, this holds even more true than
>> before. Unless I'm missing a nuance here?
>>
>> P
>> --
>> Patrick H. Lauke
>>
>> www.splintered.co.uk | https://github.com/patrickhlauke
>> http://flickr.com/photos/redux/ | http://redux.deviantart.com
>> twitter: @patrick_h_lauke | skype: patrick_h_lauke
>>
>>

-- 
Wilco Fiers
Senior Accessibility Engineer - Co-facilitator WCAG-ACT - Chair Auto-WCAG

Received on Thursday, 22 November 2018 20:40:14 UTC