- From: Andy Heath <andyheath@axelrod.plus.com>
- Date: Wed, 02 Oct 2013 01:39:52 +0100
- To: Charles McCathie Nevile <chaals@yandex-team.ru>
- CC: Madeleine Rothberg <madeleine_rothberg@wgbh.org>, Charles Myers <charlesm@benetech.org>, "a11y-metadata-project@googlegroups.com" <a11y-metadata-project@googlegroups.com>, "public-vocabs@w3.org" <public-vocabs@w3.org>
I agree with Chaals (McN) on this - to say "none" doesn't allow for other kinds of hazards to be thought of later (though Chaals - re-indexing must, I guess, be something you do sometimes). For example, one area we haven't covered (yet) is cognitive - many resources are already labelled (in the informal content description) with wording like "may contain upsetting material". These kinds of terms might be beyond the scope of accessibility metadata, but I would not personally want to set that in concrete yet.

What this discussion does illustrate is that when new terms come along there will be a need to re-index (or re-interpret) what we know about a resource. As technology advances and we become able to determine more about particular resources, and to do more with them, we may need to do that as well.
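(For concreteness, the two shapes of accessHazard being discussed in this thread would look roughly like the snippets below in schema.org-style microdata. The property and value spellings are only illustrative - they are the ones used in this discussion, not anything from a finalised spec.)

    <!-- Illustrative only: names taken from this thread, not a finalised spec. -->

    <!-- "none" approach: one blanket claim; absence of the property means
         "not checked". If a new hazard term is defined later, old "none"
         claims have to be re-interpreted. -->
    <div itemscope itemtype="http://schema.org/CreativeWork">
      <meta itemprop="accessHazard" content="none"/>
    </div>

    <!-- explicit-negation approach: each hazard that was checked is stated
         positively or negatively, so new terms can be added later without
         touching what was already asserted. -->
    <div itemscope itemtype="http://schema.org/CreativeWork">
      <meta itemprop="accessHazard" content="flashing"/>
      <meta itemprop="accessHazard" content="noSound"/>
      <meta itemprop="accessHazard" content="noMotionSimulation"/>
    </div>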
For that reason, I'm not sure how I feel about the proposed AccessMode=X+Y (e.g. AccessMode=Visual+Auditory) model, because *any* structure like this expressed in the metadata associated with the resource may constrain what we can "know" about it as the kinds of transformations and inference we can make increase.

It also bothers me that some of this information may be inferred from the context at the time. For example, it might be that in the case above the auditory modality is only an equivalent for the visual in some environments where, say, particular supports are available, maybe particular AT. Or it may be that at a later stage, as technology improves, we might augment that auditory modality with extra information discovered by an automated analysis of that resource and related resources used by others, or from feedback from other users. So information available only at delivery time might change that relationship; if it's coded in the explicit metadata we are stuck. Similarly, there may be information that is available only at search-engine indexing time (i.e. not available at authoring time).

Surely there is scope here for search engines to compete in the marketplace on the indexing algorithms and on the way the matching to user preferences and specific contexts is done?

andy

> On Wed, 02 Oct 2013 00:10:42 +0200, Charles Myers
> <charlesm@benetech.org> wrote:
>
>> Charles McN had a great idea when he brought this up. But it may
>> actually be a bit simpler to specify. Rather than say
>>
>> * noFlashing
>> * noMotionSimulation
>> * noSound
>>
>> in addition to the three properties we have today
>>
>> * flashing
>> * motionSimulation
>> * sound
>>
>> we might just want to have a state of "none" (saying that you checked
>> and that there are no hazards that you are aware of).
>>
>> That would change the spec to
>>
>> * flashing
>> * motionSimulation
>> * sound
>> * none (or noHazard)
>>
>> which makes it cleaner. I think that saying the negative of each of
>> the three properties would be a bit tedious. And, of course, not
>> having the property means that it has not been checked.
>
> Yeah, but we would want to be pretty sure that "none" really means none,
> and that nobody will identify a new hazard in the future that we didn't
> notice. While I suspect we are "close enough" in practice, I prefer to
> be really, really conservative in this case.
>
> I am thinking of the case where we discover that flashing at 3-7Hz is a
> problem, but certain colour changes at a given frequency that don't
> actually come across as flashing also turn out to cause problems. If we
> can figure out what they are and define them in 2023, I'd hate to have a
> million resources that say they have no hazards, when in fact we could
> get a few thousand of them properly marked with the particular hazard
> they do contain.
>
> We need to be aware that this is messy. "Invisible metadata" will have a
> certain rate of error that can increase over time - 1/3 might not be an
> unreasonable guess, although I hope it is much lower than that. As I
> argued earlier, this is still better than 80%, if we only get 20% of
> active hazards marked under the current approach. But it still implies a
> real level of risk to real people. "No risk" is a 'brave' statement, and
> I am not sure that I believe we know enough to make it reasonably
> accurately.
>
> Adding the noFlashing, noSound, etc. seems to me a reasonable thing to do.
>
> cheers
>
> Chaals
>
>> On Oct 1, 2013, at 1:38 PM, Madeleine Rothberg
>> <madeleine_rothberg@wgbh.org> wrote:
>>
>> Chuck has updated the issues list to include the discussion of whether
>> accessHazard should state positive or negative information. See that
>> post and my comments, which are also below, at:
>> http://www.w3.org/wiki/WebSchemas/Accessibility/Issues_Tracker#accessHazard_-_Ok_as_is.2C_or_should_it_be_negated_in_sense.3F
>>
>> I believe we need both accessHazard=flashing and accessHazard=noFlashing,
>> etc. This is because there are three cases we'd like to distinguish:
>>
>> 1. checked and it's fine
>> 2. checked and it is NOT fine
>> 3. didn't check
>>
>> "Didn't check" can be signified by no metadata -- this will be most of
>> the content on the Web. In cases where someone has checked, let's record
>> both positive and negative states.
>>
>> -Madeleine

andy
andyheath@axelrod.plus.com
--
__________________
Andy Heath
http://axelafa.com
Received on Wednesday, 2 October 2013 00:40:23 UTC