- From: Detlev Fischer <detlev.fischer@testkreis.de>
- Date: Mon, 22 May 2017 16:22:27 +0200 (CEST)
- To: lisa.seeman@zoho.com
- Cc: w3c-wai-gl@w3.org, public-cognitive-a11y-tf@w3.org
lisa.seeman wrote on 22.05.2017 15:55:
> It looks like we are more comfortable with this direction - but we would need some testing tools before CR.
> So far as I know the IBM tool is not free, and the Microsoft tool requires a subscription.
> A way to move forward is to put it in the next version of WCAG 2.1 and reach out to the companies for a free version of the tool.

In my view, any automatic tool that checks the commonality of words by applying a generic algorithm is bound to produce incorrect results whenever a site covers a specific domain with specific terms (i.e., very often). Synonyms where one term can replace another without also introducing a shift of meaning are the exception, not the rule. Then there is the homonym problem: the same term means different things in different contexts or domains. A tool that offers a meaningful analysis would have to be capable of inferring the respective domain and its vocabulary and adapting its algorithm accordingly.
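To illustrate the point, here is a rough sketch (my own toy example in Python, not how the IBM or Microsoft tool actually works) of a naive commonality check against a generic word-frequency list. On a specialist site it flags perfectly appropriate domain vocabulary as "uncommon":

    # Toy illustration only: a naive commonality check against a
    # generic list of "common" words, with no notion of domain.
    GENERIC_COMMON_WORDS = {"the", "a", "in", "is", "are", "of", "and", "to"}

    def flag_uncommon(text, common_words=GENERIC_COMMON_WORDS):
        """Return the words a generic frequency list would flag as 'uncommon'."""
        return [w.strip(".,") for w in text.lower().split()
                if w.strip(".,") not in common_words]

    # On a cardiology site, correct domain terms get flagged as problems:
    print(flag_uncommon("The stent reduces stenosis in the artery"))
    # -> ['stent', 'reduces', 'stenosis', 'artery']

Without knowledge of the domain vocabulary, such a check cannot distinguish genuinely obscure wording from terms the site's audience expects and needs.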
Received on Monday, 22 May 2017 14:23:27 UTC