- From: Bruce Bailey <bbailey@clark.net>
- Date: Mon, 27 Mar 2000 15:15:58 -0500
- To: <joelsanda@yahoo.com>, <w3c-wai-ig@w3.org>
Dear Joel,

More comments in line...

> I think a good part of my criticism with WCAG #6.3
> lies in what I argue are tensions in the WCAG 1.0
> Guidelines.

What tension? I think the WCAG read like poetry! (smile)

> One of the assumptions of the WCAG is that accessible
> web pages don't have to look different than current
> web pages do. In other words, we don't have to give up
> all the eye candy web users are used to - like Flash,
> Shock, MouseOvers, and so on. I'm not a big fan of
> that stuff, but those technologies are probably here
> to stay.

Right, but authors are free to change their pages if they choose to. The main idea is that "artistic vision" need not be sacrificed for accessibility. This ties in nicely with your next observation...

> Indeed, WCAG 11.4 says a separate page should be built
> only "If, after best efforts, you cannot create an
> accessible page, provide a link to an alternative
> page...". Big problem, coupled with WCAG #6.3.

Right, because one of the earliest and most vocal objections to creating accessible web pages was the perceived need to create two versions of everything. This turns out to be not only unnecessary but also ill-advised. Still, this is an option pursued by many, not the least of which is Microsoft. Adhering to the guidelines by creating a "text-only parallel site" is not a bad thing; in fact, it is perfectly acceptable. (It's just that most people here seem to think it is the amateur's way out.) If nothing else, it gets you by 6.3 -- but you still have to figure out how to make your content work without JavaScript!

> Here's the problem: Most complex applications deployed
> over the web use JavaScript for a variety of
> functions, and most of that is client side. For anyone
> to adopt the WCAG Guidelines, they'd have to
> re-engineer significant portions of their application.
> They'd also have to re-engineer server configurations
> and development teams/build schedules/testing to
> accommodate server-side scripting languages. A lot of
> resources.

Charles McCathieNevile has already given you a better reply than I could on this point. Now, I know just enough JavaScript and Perl to be dangerous, but it's pretty obvious to me that complex applications would usually be better implemented on the server side. (A little sketch of what I mean is in the postscript below.) If someone is comfortable with Basic, they are going to try and use that programming language to create applications, rather than taking the time to learn C. This common enough human behavior leads to all sorts of difficulties. It explains the practice, but it doesn't excuse it.

> We've spent the better part of a year developing our
> next generation back end, which uses JavaScript -
> client side - for all sorts of functionality. A good
> part of it in response to disabled user testing.

Too bad you didn't spend that time learning Perl or PHP! (wink)

> For us to adopt the WCAG, we'd have to go back and
> reinvent our already working wheel - as will nearly
> all the complex (forms that write to a database,
> eCommerce, and so on) sites.

Sorry, the WCAG was very close to its current working form more than a year ago!

> Adoption of the WCAG will require not only convincing
> folks accessible web design is good and will benefit
> them, but also they should regress their sites against
> Lynx. That's an extremely hard, and IMHO, presumptuous
> claim to make.

I am amazed that site developers are willing to test their content against "17 versions of Navigator" but not a single non-Netscape / non-MSIE browser!
Lynx sees the web in much the same way that search engines, web crawlers, and other AIs do. By designing a site to spec the first time, you get compatibility with the next generation of web surfing devices -- this includes the browsers that are already being built into cell phones, PDAs, and (soon) your car dashboard! Is this really that hard a claim to pitch?

> In a similar vein with the real world, the ADA never
> mandated getting rid of stairs, but rather making
> accommodations that permitted the use of assistive
> technology. I think requiring Lynx compatibility (or
> suggesting it as a litmus test) is akin to saying
> architects have to drop the use of stairs. Instead,
> architects made accommodations to their design
> standards to accommodate the existing technologies,
> with the assumption most people are going to use the
> same level of assistive technology - wheelchairs or
> similar modes of transportation.

The analogy doesn't quite hold. Is it just for the ADA that we have elevators? There are lots of examples of new architecture where ramps are the PREFERRED means to go up and down a level. Lynx is not the litmus; the W3C HTML 4 specifications are! (It does happen that Lynx handles some HTML 4 aspects better than IE 5, but that's beside the point.)

> The flip side of this is to say that web designers
> have to not only code for 4.x+ browsers, but make
> their sites work in a DOS browser, or come close to
> working in that browser.

The WCAG addresses author content. It says nothing about what browser a person is using. The WCAG does NOT require that you make your site work with, for example, JAWS 3.5 and MSIE 5.01. You don't have to test your pages with JFW 2.0 and NN 2.02 either!

> Hence - the tensions I think exist in the WCAG 1.0. To
> use CSS, proper table markup (HTML 4.0), and
> synchronized multimedia the user agent MUST be a 4.x
> browser, and that will more than likely have to be IE,
> since Netscape supports so little of the CSS and HTML
> Recommendations. Indeed, to use CSS-P (to avoid tables
> for layout) you HAVE to use JavaScript for Netscape to
> correctly render CSS-P formatted page elements.

The WCAG (and the W3C HTML specifications, for that matter) are fixed, stable documents. Yes, it is a problem for users that these standards are not supported well by the major market browsers. As a content provider, you pretty much have two choices. (1) You try and code to satisfy the undocumented and arbitrary capriciousness of the two most popular browsers. N.B., this is a moving target, so if you choose this path, you should expect to re-code your entire site every six months or so. (2) You code to the specifications of the formal published grammars. Customers using your site now may get different cosmetic results six months from now (when they upgrade to an HTML 4 compliant browser), but your pages remain stable. Everyone -- no matter what browser they are using -- gets the same relative content.

> And the NOSCRIPT tag doesn't help - some versions of
> Netscape 4.x don't support that and display the
> contents of NOSCRIPT on the page.

You are correct that certain browsers are broken. Neither Navigator nor MSIE displays NOSCRIPT content when the user disables scripting. There is nothing a content author can do about that. The WCAG does not address this issue. (The UA group does, but that's another story!) The WCAG is a stable reference document; it cannot possibly hope to address how to accommodate non-standards-compliant behavior on the part of the user agent!
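For readers who have not run into this, the markup pattern in question looks roughly like the following. This is my own minimal example, not an excerpt from the WCAG. Note the HTML 4.01 doctype (coding to the published grammar, as per choice (2) above) and the "<\/P>" escape, which keeps the parser from ending the SCRIPT element early.

  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
      "http://www.w3.org/TR/html4/strict.dtd">
  <HTML><HEAD><TITLE>NOSCRIPT demo</TITLE></HEAD>
  <BODY>
  <SCRIPT type="text/javascript">
  document.write("<P>Today is " + new Date() + ".<\/P>")
  </SCRIPT>
  <NOSCRIPT>
  <P>Your browser is not running scripts, so I cannot show
  today's date.</P>
  </NOSCRIPT>
  </BODY></HTML>

Per the HTML 4 specification, a browser with scripting turned off should render the NOSCRIPT content in place of the script's output. The broken behavior described above is exactly this: some browsers fail to do so, and no markup on the author's part can repair it.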
> In conclusion, (finally), the tensions amount to a
> mandatory adoption of WCAG #11.4 (second page built to
> the WCAG), which in essence means at least two
> (Netscape and IE), if not three (Netscape and IE and
> Accessible), web sites will have to be built if an
> organization wants to conform to the WCAG.

It is certainly not the intent of the W3C that authors code once for IE and once for NN. The fact that many sites do this is, of course, misguided. But if one is willing to code twice, then one has no good argument against coding a third time! If one wants to code only once, choose to follow the formal HTML 4.01 specifications and forget about the transitory behavior of IE and NN!

> These tensions, user expectation of what pages should
> look like, and the wide use of client side JavaScript
> (great idea given the unpredictability of bandwidth)
> spell, I argue, a very long and steep uphill battle
> for widespread WCAG adoption.

The WCAG has an uphill battle, but I don't think JavaScript envy is the main problem!

> Apologies for the long post - residual grumpiness that
> I am unable to debate this at CSUN because my company
> didn't send me this year! <GRIN>

I have great empathy with this sentiment!

I should point out that the usability testing you are doing is a very good thing and quite commendable. Most folks, even some interested in access, don't bother much (or at all) with this step. It is very important. The fact remains, though, that most shops don't have access to screen readers or relationships with blind customers who could beta test for them. If you follow the WCAG, however, such resources are not necessary. Even with those resources at your disposal, you have discovered how easy it is to fall afoul of the formal guidelines! Doing real-world usability testing is much more onerous than adhering to the WCAG, however!

You mentioned 17 versions of Netscape. IE has a number of releases too. For that matter, so does JAWS, and JFW is only one of several different screen readers! Let's be optimistic and decide that there are only 10 versions of IE and NN that you need to test. Well, there are at least that many versions of popular screen readers out there. Are you going to do 100 (10 x 10) usability tests to make sure that your pages render as you think they do? Hmm, how much testing did you do? How confident are you of the results? How many different screen readers did you use?

On the other hand, would you prefer to have 100 (or more) versions of the WCAG so that content authors could pick the combination of browser and assistive technology they were interested in accommodating? Checkpoint 6.3 cuts hard and sharp and deep. Do YOU want to try and write clear, unambiguous guidelines for when JavaScript is accessible and when it's not? I wouldn't go near that task with the proverbial ten-foot pole!
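P.S. Here is the sort of server-side approach I had in mind above. It is only a rough sketch of my own devising (the script name, field name, and messages are all invented for illustration), but notice that the form works in ANY browser, scripting or no, because the validation lives on the server:

  <FORM action="/cgi-bin/order.pl" method="post">
  <P><LABEL for="qty">Quantity:</LABEL>
  <INPUT type="text" id="qty" name="qty">
  <INPUT type="submit" value="Place order"></P>
  </FORM>

  #!/usr/bin/perl -w
  # order.pl -- the form handler; any browser can reach the server
  use strict;
  use CGI;

  my $q   = CGI->new;
  my $qty = $q->param('qty');

  print $q->header('text/html');
  if (defined $qty and $qty =~ /^\d+$/) {
      # ... record the order here ...
      print "<P>Thank you, your order for $qty has been received.</P>";
  } else {
      print "<P>Please go back and enter a whole number for quantity.</P>";
  }

A client-side script can still check the field before submission as a convenience, but when scripting is absent or disabled, nothing is lost.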