
Web Content Accessibility Guidelines 2.0 - comments on the "Baseline Technology Assumption"

From: Tina Holmboe <tina@greytower.net>
Date: Wed, 1 Dec 2004 21:21:00 +0100 (CET)
Message-Id: <200412012021.iB1KL04W024429@asterix.andreasen.se>
To: public-comments-wcag20@w3.org


  I would like to express my concern that this concept has made it into
  a document such as the WCAG 2.0 WD at all. The idea that a baseline
  technology - a.k.a. a "lowest common denominator" - can be defined goes
  against the very platform and client independence that the World Wide
  Web is meant to embody.

  It is a reality that most developers work with some sort of baseline
  technology in mind. In the past, however - specifically with WCAG 1.0 -
  no such baseline was written into the WCAG itself.

  This means that as long as the guidelines are met, developers can
  choose their own baseline. This, in turn, means that the baseline follows
  from the accessibility requirements: support for a technology that cannot
  meet certain guidelines becomes a "feature", not a "requirement". Under
  the new proposal, the requirements instead become dependent on a lowest
  common denominator -specified- by the W3C.

  The WCAG 2.0 WD mentions client-side scripting explicitly: "For
  example, WCAG 2.0 would assume that user agents and assistive
  technologies can effectively interact with scripted content....".
  However, the document also states that: "The design principles in this
  document represent broad concepts that apply to all Web-based content.
  They are not specific to HTML, XML, or any other technology. This
  approach was taken so that the design principles could be applied to a
  variety of situations and technologies, including those that do not
  yet exist."

  Taken together, these two statements paint a bleak picture. For
  instance, requiring that a user-agent can effectively interact with
  scripted content -including- such technologies as do not yet exist
  means that a UA such as Google's spider must not only support JavaScript
  and ECMAScript - effectively parsing and executing these scripts while
  spidering a website - but also be ready to support any number of other
  client-side scripting languages.

  In effect: a randomly selected UA would fail to conform if it did not
  support an equally randomly selected client-side scripting language,
  whether standardized or not. ISO/IEC 13816:1997 ISLISP springs to
  mind. Given a DOM API, a site could in all honesty claim that its
  content is accessible even if implemented in ISLISP.

  The implications are staggering. The first conclusion one is forced to
  draw is that the W3C WAI is effectively ignoring (a) those who today do
  not have a UA with support for client-side scripting, (b) those who, for
  one reason or another, whether by choice or not, turn scripting off, and
  (c) the search engines which provide many groups of disabled people with
  a way of locating information.

  The next conclusion is that the "Baseline Technology Assumption" is
  opening the way for some frightfully sloppy thinking. It is said
  that:

   "The result would be a more stable WCAG 2.0 as well as better integration
    with UAAG to put the responsibility for the appropriate parts of the
    accessibility issue on the appropriate parts of the Web technologies (user
    agents versus Web content)."

  What I read here is the following: "If the technology we, as developers,
  want to use is not supported by the user-agent of individual X but is listed
  in the priority 1 checkpoints of UAAG 1.0, then this is a problem we can
  shuffle off onto X, and still claim that we are accessible."

  One consequence can be seen quite clearly by observing the following
  requirement from UAAG 1.0, priority 1:

   "Render content according to format specification (e.g., for a markup
    language or style sheet language)."


  The assumption here is that the UA handles the style sheet language as
  specified. This, in effect, means that the author need not worry
  about graceful fallback *at all*. For the WCAG to require that an author
  test in -all- user agents is clearly impractical - but for the WCAG to
  effectively grant authors a free approval-of-accessibility via the faulty
  assumption that any and all UAs which do not support a certain level of
  technology need not apply is just as bad.
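  To make the fallback point concrete, here is my own minimal sketch - a
  hypothetical illustration, not taken from any W3C document - of treating
  scripting as "value added" rather than as a baseline, so that the content
  remains reachable in UAs, and spiders, without script support:

```javascript
// Hypothetical sketch - the function name and markup are my own
// illustration of "script as value added", not a W3C example.
function renderNavigation(scriptingAvailable) {
  // Baseline: plain HTML links that any UA - including a search-engine
  // spider or a text browser - can follow without executing script.
  const baseline = '<ul><li><a href="/archive">Archive</a></li></ul>';

  if (!scriptingAvailable) {
    return baseline; // graceful fallback: the content itself is untouched
  }

  // Enhancement: the same links, with behaviour layered on top. Nothing
  // below this point is required in order to reach the content.
  return baseline.replace('<ul', '<ul data-enhanced="true"');
}
```

  Whether scripting is available or not, the same links are present; the
  script only adds to them. Under such a model no baseline beyond HTML need
  be assumed.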

  I suggest, in the strongest terms, that the "Baseline Technology Assumption"
  be removed from the WCAG 2.0 WD. Keep firmly in mind that even in a welfare
  state with good support for the disabled, these groups are not automatically
  provided with the very latest in hard- and software - and the reality of
  many users is far, far from such an ideal situation.

  Any "Baseline" that remains in WCAG 2.0 must not go beyond

     * HTTP.

     * HTML - at the very least giving the user access to a linearized version
       of the content.

  Any and *all* other technologies must be "value added", and must not in
  any shape or form be demanded.

  I am embarrassed to feel the need to remind the WCAG 2.0 WD authors what
  the first 'W' in 'WWW' represents. Old users - figuratively and literally -
  and old equipment will not disappear from reality during the first half
  of 2005 to conveniently make way for this new set of guidelines.


     - Tina Holmboe, Greytower Technologies (UK) Ltd.
     - David Dorward
Received on Wednesday, 1 December 2004 20:21:09 GMT
