- From: Al Gilman <Alfred.S.Gilman@IEEE.org>
- Date: Mon, 9 Aug 2004 11:34:15 -0400
- To: <wai-xtech@w3.org>
** summary: WCAG is refining Success Criteria to be applied as a minimum standard for all web content. PF is reviewing the infrastructure of W3C technologies for how it supports these capabilities *plus* more aggressive strategies for assistance, which may be applied selectively where the content is known in advance to be targeted for the education of high-risk learners, or is of a governmental or safety nature where very widespread understanding is highly critical.

So we should recognize at least the two following 'better' performance cases:

- binding or enabling traceability to "the right definition" is better for serving minor-language speaker groups, including users of sign and symbol languages.

- likewise, binding into a thesaurus network with synonym relationships denoted in machine-interpretable ways is better than just retrieving dictionary or glossary entries without this comparative information.

WCAG may have limited the 'programmatically located' success criterion to retrieving a short list of dictionary or glossary entries. The range of techniques that PF wants to learn how to support through metadata exceeds this, in that it also seeks techniques to meet the above two 'better' conditions.

Details in the context of the thread below.

Al

At 3:20 PM +0300 8/8/04, Lisa Seeman wrote:

>> > 1. To what extent, if implemented, would this concretely benefit
>> > people with cognitive disabilities?
>
>To me, a page is inaccessible if someone could theoretically
>understand the concepts in the page but cannot because of the page's
>presentation/format.
>
>Many people could theoretically understand the concepts behind a page
>but cannot, because they have a reading, word, or language
>related disability.
>
>Symbolic presentation makes the content accessible to them.
>A lexicon enables symbolic presentation.
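[A minimal sketch of how a lexicon can enable symbolic rendering, and of the quality gap between a "unique sense" binding and a fallback guess. The lexicon format, sense names, and symbol identifiers below are invented for illustration; they are not SWAP's or any real product's data model.]

```python
# Minimal sketch: a lexicon that maps word senses to symbols.
# All data and names here are hypothetical.

# A tiny "lexicon": each word maps to one or more senses, and each
# sense carries a symbol identifier a symbolic renderer could display
# instead of (or alongside) the text.
LEXICON = {
    "draw": [
        {"sense": "draw#sliding-box", "symbol": "SYM:drawer"},
        {"sense": "draw#tie-game", "symbol": "SYM:tie"},
    ],
    "lever": [
        {"sense": "lever#bar-tool", "symbol": "SYM:lever"},
    ],
}

def to_symbols(words, sense_bindings=None):
    """Render a word list symbolically.

    sense_bindings: optional author-supplied map of word -> unique
    sense, playing the role of the 'unique sense' binding discussed
    in the thread. Without it we fall back to the first candidate,
    which may be wrong -- exactly the quality gap described above.
    """
    sense_bindings = sense_bindings or {}
    out = []
    for w in words:
        entries = LEXICON.get(w)
        if not entries:
            out.append(w)  # no lexicon entry: pass the text through
            continue
        bound = sense_bindings.get(w)
        if bound:
            entry = next(e for e in entries if e["sense"] == bound)
        else:
            entry = entries[0]  # ambiguous fallback
        out.append(entry["symbol"])
    return out

# Author binds "draw" to its sliding-box sense:
print(to_symbols(["tighten", "the", "draw"],
                 {"draw": "draw#sliding-box"}))
```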
This is the link where I believe that the use cases PF is currently looking to support through format features and the 'programmatically located' success standard of WCAG deliver different levels of performance, and we need to be aware of this.

The difference has to do with the quality of the semantic backup that is bound to the "term use" in the web content [1]. If the binding links to a "unique sense," then more users will succeed after machine transliteration than if the binding only retrieves a short list of candidate semantic backups. In many cases it is readily achievable for the author to resolve such ambiguities. So since there is a non-zero volume of use cases where there is material benefit and the difference is readily achievable, it belongs in the set of uses we consider in reviewing format capabilities.

In addition, if the reference is to something which is supplied with thesaurus relationships as well as with "definitions," that will also make a material improvement in the outcome when translating to symbol languages, sign languages, or any other transliteration where there is not a large market supporting high-capital-investment translation services.

>Hence a lexicon enables accessibility...
>
>QED
>
>> > 2. The proposal only addresses word (sometimes called lexical)
>> > meaning, not sentence meaning. Are there any testable strategies
>> > available today or in the near future that can help to clarify or
>> > disambiguate larger components of a text?
>
>Sentence ambiguity (syntactic ambiguity) is normally caused by a word
>ambiguity.
>
>E.g., take the sentence "tighten the draw with the lever."
>If the word "with" means "using," the sentence has one meaning.
>If "with" means "connected to," the sentence has a second meaning.
>Resolve the word ambiguity, and you resolve the sentence ambiguity.
>
>In putting together SWAP I did consult with linguists such as
>Daniel Berry, whose expertise is in defining the different forms of
>ambiguity found in natural language.
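[The point that resolving a word's sense collapses the sentence ambiguity can be shown with a toy sketch. The sense names and the two-reading model below are invented purely for illustration.]

```python
# Toy illustration: two readings of "tighten the draw with the lever"
# collapse to one once the sense of "with" is fixed.
# Sense names are invented for this sketch.

READINGS = [
    {"with": "with#using",
     "gloss": "use the lever as a tool to tighten the draw"},
    {"with": "with#attached-to",
     "gloss": "tighten the draw that is connected to the lever"},
]

def readings_for(sentence_readings, bindings):
    """Keep only readings consistent with the author's sense bindings."""
    return [r for r in sentence_readings
            if all(r.get(word) == sense
                   for word, sense in bindings.items())]

# Unbound: the sentence is ambiguous (two readings survive).
print(len(readings_for(READINGS, {})))

# Binding "with" to one sense leaves a single reading.
print(len(readings_for(READINGS, {"with": "with#using"})))
```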
>I would like more conclusive research to be done on it. But that will
>always be the case.
>We should not run from something that solves 90% of a problem because
>we are not all the way there yet.
>
>Note: This does not address implied meanings such as sarcasm. For that,
>SWAP allows an annotation/RDF statement where you can explicitly state a
>secondary or implied meaning, and give it a type (like sarcastic).
>
>I would be happy to see that type of thing included too, especially for
>semantic pragmatic disorder.
>
>What we are really doing is more important: we are generalizing the
>concept of an equivalent, away from the text equivalent, to different
>types of equivalents for different types of content and content sections.
>The different types include literal equivalent, text equivalent,
>detailed description, summary, secondary meaning....
>
>All of which is what RDF is made for.
>
>> > 3. From Gregg's proposal it appears that the author is
>> > specifying the dictionaries. However, as a user I might want to
>> > take control of this, for example to select dictionaries that
>> > offer translations into my preferred language. It is important
>> > that if user agents or assistive technologies implement this,
>> > they provide override facilities.
>
>Err - the user agent can do what it likes, but it will help the agent
>to know what was meant.
>
>> > 4. Is there a reasonable range of online dictionaries out there for
>> > various languages?
>
>> SWAP: A commercial product made by UBAccess -
>> http://www.ubaccess.com -
>> which is what Lisa Seeman does as
>> a day job.
>
>Yes - when I am not volunteering for the W3C...
>
>Importantly - WordNet at Princeton has a great lexicon (and it is
>in RDF :)
>
>We at ISOC IL (Israel accessibility) are building a Hebrew one with
>vowels and pronunciation, for accessibility.
>
>With the right system any lexicon can be used.
>
>> Babylon: http://www.babylon.com
>
>etc. etc...
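[The annotation idea above — attaching a typed secondary or implied meaning to a content fragment as an RDF-style statement — can be sketched as plain triples. The vocabulary URIs and function names below are invented for illustration; they are not SWAP's or any published ontology's actual terms.]

```python
# Sketch of the kind of RDF statement a SWAP-style annotation implies:
# attaching a typed secondary meaning to a content fragment.
# The example.org vocabulary below is hypothetical.

EX = "http://example.org/equiv#"

def annotate(subject, equiv_type, value):
    """Return (subject, predicate, object) triples linking a content
    fragment to an annotation node carrying a typed equivalent."""
    ann = subject + "/annotation-1"  # hypothetical annotation node
    return [
        (subject, EX + "hasEquivalent", ann),
        (ann, EX + "equivalentType", equiv_type),  # e.g. "sarcastic"
        (ann, EX + "value", value),
    ]

triples = annotate(
    "http://example.org/page#para3",
    "sarcastic",
    "The literal praise here is meant ironically.",
)
for t in triples:
    print(t)
```

In a real deployment these triples would be serialized in RDF/XML or another RDF syntax and linked from the page, so that a user agent could retrieve the typed equivalent and choose how to present it.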
Received on Monday, 9 August 2004 15:34:55 UTC