- From: Paul Bohman <paulb@cpd2.usu.edu>
- Date: Fri, 24 Aug 2001 15:24:58 -0600
- To: "Charles F. Munat" <chas@munat.com>, "WAI Guidelines WG" <w3c-wai-gl@w3.org>
[C Munat] Experiment design is difficult, and even the best scientists often discover too late that they have forgotten to account for some variable.

[Paul] Very true. As Kynn said in that post: "If you are doubtful of my figures, I urge you to try the analysis yourself to see what numbers you might get."

[C Munat] Assuming that Kynn got the split right, it is still possible (even likely) that the difference is the result of differences in specificity.

[Paul] This is a very good point, and one that Kynn made as well: "Another thing to add is that certain disability types may have. . . needs which can be expressed in fewer check-points."

[C Munat] There are plenty of other reasons why the numbers could come up the way Kynn's did.

[Paul] Very true. Interpretation is the hardest part of any sort of study.

[C Munat] So what does this mean? It means that Kynn's experiment tells us NOTHING.

[Paul] Charles, your strength is in your command of the concept of _understatement_ <smile>. I disagree with your assertion that it tells us nothing. The numbers and percentages are the result of one person's inquiry into the composition of the guidelines. From what I can tell, Kynn is not necessarily advocating that we increase or decrease the percentage of requirements related to any specific disability (although he may have opinions on the issue). Mere percentages do not convey the full meaning of the data they are meant to represent. Still, his analysis shows a few interesting facts. For example: for whatever reason, there are more guidelines that benefit those who are blind than those with other disabilities. Whether this is a result of the nature of the disability, the ease of validation, the amount of knowledge about the disability, the advocacy of blind people, or something else, it is still worth looking at. In fact, just seeing the numbers causes us to reflect on the matter. We may disagree with the findings, the methodology, and so forth, but, as you have said in other posts, it is important to have this sort of information because it makes us think.

[C Munat] We can't use these results because we haven't designed the experiment properly, and there may be serious problems with our results. If we depend on these results in any way, we could be making a big error.

[Paul] Replication of studies, or alternative versions of them, is the usual way of remedying this kind of complaint.

[C Munat] But the biggest problem with this sort of informal experiment is that it may fool us into thinking that we DO know something.

[Paul] Good point. We have to be careful about jumping to conclusions.

[C Munat] Worse, this sort of experiment gives support to partisanism.

[Paul] If the results are interpreted that way, then yes, you're right.

[C Munat] Here is what I recommend: Instead of looking for bias in the WCAG, why don't we look for needs that haven't been addressed?

[Paul] It seems to me that this analysis can be used as one of many tools to do just that.

[C Munat] Has it occurred to anyone that it might take more checkpoints to address the needs of one group than it does another? (Kynn acknowledges this in his comments about photo-epileptics.) Who cares how many checkpoints address this group or that group? This isn't a contest to see whose is longer.

[Paul] Good point. We should be very careful about using Kynn's methodology as a litmus test for equality across disability types in the guidelines. In fact, I can say right now that it is my opinion that we should NOT use his methodology for this purpose.

[C Munat] The real question is HAVE WE FAILED TO ADDRESS ANY NEEDS? If we have, then please state the specific problem, and, if possible, some solutions. SOLUTIONS ALONE DON'T CUT IT. . . .

[Paul] This sounds like a good idea to me. Perhaps I can spend some time on this one. I'd be interested to see others do the same.

Paul Bohman
Technology Coordinator
WebAIM (Web Accessibility in Mind)
www.webaim.org
Utah State University
www.usu.edu