- From: Steve Lee <stevelee@w3.org>
- Date: Tue, 28 Apr 2020 12:11:39 +0100
- To: John Foliot <john.foliot@deque.com>
- Cc: public-cognitive-a11y-tf <public-cognitive-a11y-tf@w3.org>
Thanks John, that raises a number of thoughts.

This SC requires the specification of 2 things:

1) Which controls are important

The SC is restricted to only controls in a process, but even then the critical controls that the SC applies to are left to the designer to identify and treat appropriately (see 2). There is no concept of critical controls in HTML or the accessibility APIs, so if the SC is trying to distinguish these (and so for all users) something more is also needed for programmatic access. I don't think this is the intention, as the SC is largely restricted to certain controls to make it more palatable. Ideally ALL interactive controls would be treated the same. This SC is also limited to visual access as defined. Thus my conclusion is that personalisation to identify these controls is not required by this SC.

2) Visual affordances that they are controls that are interactive

Non-visual AT users can already identify ALL controls, as "role" is exposed for native HTML controls or (hopefully) with ARIA for custom controls. So we are filling a gap for visual users of the browser-presented UI who have cognitive accessibility requirements.

As we are finding, the definition of suitable visual affordances is nuanced and problematic for a number of reasons once designers move beyond native controls and styling (i.e. pretty much every web site). The conflict is between what designers want to achieve and what each user finds familiar or obvious when identifying controls.

This, then, is a clear case for personalisation. We don't want to limit designers' visual ambitions, but we could provide a mechanism for them to mark up their intentions to enable inclusive access. The personalisation feature that John highlights enables identifying any set of controls, which can then be styled appropriately by personalisation tools.

# In summary

Putting both these requirements together, I must admit John's solution of using personalisation makes much sense to me.
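A minimal sketch of what that mark-up-plus-personalisation mechanism could look like. The `data-action` and `data-destination` attribute names follow the Personalization Semantics Content Module drafts; the specific values, and the user stylesheet applying the affordance, are illustrative assumptions rather than anything required by a spec or SC:

```html
<!-- Illustrative sketch only: attribute names from the Personalization
     Semantics Content Module drafts; values and styling are assumptions. -->

<!-- The author tags critical controls with their purpose, without
     committing to any particular visual design: -->
<button type="submit" data-action="compose">Post</button>
<a href="/help" data-destination="help">?</a>

<!-- A personalisation tool (or user stylesheet) can then add whatever
     visual affordance this particular user finds familiar, instead of
     the author picking one style for everyone: -->
<style>
  [data-action],
  [data-destination] {
    border: 2px solid #005a9c;
    border-radius: 4px;
    padding: 0.25em 0.75em;
    background: #eef6fb;
  }
</style>
```

The point of the split is that the author supplies only the semantics ("this is a critical action"), while the presentation is chosen per user, so designers' visual ambitions and individual users' needs no longer have to be reconciled in a single design.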
However, Personalisation specification, implementation and adoption by designers are a long way off, so we won't get this important cognitive requirement addressed soon. In order to get an SC into WCAG 2.2 that addresses these cognitive needs, we have to quickly find a way of specifying an appropriate set of visual styles to identify controls. We also have to meet the stringent requirements on SCs due to WCAG's use both technically and legally.

Some other thoughts:

* Some of the Design Patterns are visually focussed in their current form. This is somewhat implicit, and possibly due to the cognitive gaps we've identified and the 'programmatic' emphasis of relevant SCs to address AT usage. The Visual Indicators SC lands right in the middle of this issue too. While existing SCs like Colour Contrast are visual in nature, SCs and personalisation need to support ALL accessibility requirements as a whole.

* This is where personalisation fits in with WCAG. There is ARIA, but it is fairly low level, ensuring (limited) behaviours and structure are communicated so AT does not lose any information. Personalisation is about content and presentation choices to support user preferences. Also, ARIA will not easily be extended and is seen as a stop-gap that should not be used unless necessary, e.g. for differently implemented semantics. Personalisation is desirable for all content in order to better support user preferences, such as cognitive ones.

* Personalisation is supplemental (like ARIA) and so requires additional developer buy-in, or it is not implemented. We need to figure out how to make it one of those things designers just do. SCs do carry more weight.

Steve

On 27/04/2020 16:47, John Foliot wrote:
> Steve writes:
>
> > it became apparent that when some of the existing SCs say
> > something should '*be programmatically available*' for AT, the very
> > same thing is desirable for cognitive a11y too.
>
> Hi Steve,
>
> Exactly!
> That is why we're working on a mechanism
> <https://w3c.github.io/personalization-semantics/content/index.html#simplification-explanation>
> to do just that within the Personalization Task Force. That is also why
> I keep arguing that this SC is taking a (IMHO) flawed approach - rather
> than attempting to dictate "designs" (aka 'visual indicators'), we
> should instead be mandating 'tagging' critical content with the
> appropriate semantics, and let helper-agents facilitate "the final mile"
> (i.e. close the gap for the individual user).
>
> I recognize that *today* we lack appropriate tooling to take advantage
> of these emergent semantics, but attempting to force visual indications
> of critical content on all pages, to me, also leaves out a crucial
> point: what type of visual indicator is best for *all* users(?), and this
> Draft SC pre-supposes that it is a single solution (and one that would
> also need to conform to other SCs, such as color contrast, reflow, space
> between controls, size of controls, etc.). Additionally, for critical
> components on a page (the type that would require such specific visual
> identification), 'visual indicators' also neglects to address the needs
> of non-sighted users with similar COGA-related needs - again,
> programmatic 'tagging' would be far more efficient there as well.
>
> Steve included a spreadsheet that included the following:
>
> > Ensure the Most Important Things on the Page are Easy to Find
> >
> > 1.3.5 Identify Input Purpose (programmatic only)
> > 1.3.6 Identify Purpose (programmatic only)
>
> This, in a nut-shell, is also the current problem: finding important
> things / making it easier to find important things.
> We knew about this
> need when we were working on WCAG 2.1, but because we lacked the
> appropriate tools then, we had to settle for only one of the three
> 'critical' types of component (inputs), and/but not 'actions
> <https://w3c.github.io/personalization-semantics/content/index.html#action-explanation>'
> nor 'destinations
> <https://w3c.github.io/personalization-semantics/content/index.html#destination-explanation>',
> which the visual indicators SC is now also attempting to address.
> (NOTE: the Personalization work has also already identified 'categories'
> of critical actions and destinations - so that this is scoped
> appropriately - and if anyone feels that there remain gaps there,
> please file an issue with that Task Force:
> <https://github.com/w3c/personalization-semantics/issues>)
>
> I do not for a second mean to suggest that this SC isn't attempting to
> address the needs of some users (and specifically COGA users), but I
> assert that the current approach is a brute-force attempt to address a
> very nuanced problem, and that once the Personalization work gains more
> traction, it will make achieving the goal significantly easier, while still
> not imposing overly onerous visual design requirements on the content
> creators. For example, we know that many users need or prefer
> high-contrast designs, or 'reversed color' designs as well (light text
> on dark background), yet we don't mandate that - we leave it for helper
> tools to meet those needs. I argue we're looking at essentially the same
> problem here, and should deploy the same types of solutions.
>
> JF
>
> On Mon, Apr 27, 2020 at 9:31 AM Steve Lee <stevelee@w3.org
> <mailto:stevelee@w3.org>> wrote:
>
> On 26/04/2020 22:15, David Fazio wrote:
> > +1000, Abi. What a wonderful reply.
>
> I totally agree
>
> On point 1, when I was recently compiling a list of SCs relating to our
> Patterns [1], it became apparent that when some of the existing SCs say
> something should 'be programmatically available' for AT, the very same
> thing is desirable for cognitive a11y too. This is one example.
>
> On another very general point, I noticed today that the Android Facebook
> app now has a post button that is pure text, not a hint of button or
> affordances. Sigh!
>
> Steve
>
> 1:
> https://docs.google.com/document/d/1y7IJFGejE6bCsG34hkrsVgLG2DVfYO-GjwCpyTnSwns/edit?usp=sharing
>
> > To your point "3. For desktop users, it is possible to move
> > focus and identify which elements are interactive. For example they
> > can tab to a link in text or an icon and find it is interactive."
> >
> > This puts an unacceptable amount of mental stress on users with
> > cognitive disabilities, resulting in mental fatigue that can shut
> > users down and increase the likelihood of errors exponentially, hence the
> > need for salient visual indicators.
> >
> > This message was sent from my iPhone. Please excuse any
> > typographic errors.
> >
> >> On Apr 26, 2020, at 1:57 PM, James A. <A.James@soton.ac.uk
> >> <mailto:A.James@soton.ac.uk>> wrote:
> >>
> >> Going back to the question on research to support the need for
> >> visual indicators for people with cognitive disabilities, it is
> >> well known within the group that there is a significant lack of
> >> research in this area. When we attempt to identify research for
> >> specific interface/interaction requirements, we have to infer from
> >> the documented needs of users and the limited studies available. And
> >> in the area of visual indicators, studies struggle to keep up with
> >> the quickly changing interface paradigms and design trends.
> >> This is
> >> underlined by the recent 2019 literature review into Mobile Devices
> >> and People with Learning [intellectual] Disabilities by Williams and
> >> Shekhar [1], which concluded:
> >>
> >> "Surprisingly, very little research appears to have been
> >> undertaken on screen manipulation – the actions of tapping, swiping
> >> or pinching... Perhaps in keeping with the lack of a rigorous and
> >> cumulative body of research, findings from the limited research
> >> undertaken make very few suggestions as to how devices could be made
> >> easier; only larger "buttons" (virtual, one assumes), a simplified
> >> interface, more training and simpler vocabulary being prominent.
> >> More research is clearly needed in both the touch element of data
> >> and command input, and issues around menu placement, length and
> >> hierarchy in a mobile environment. There is virtually nothing in the
> >> literature on this – only Kumin et al remark that "drop-down" menus
> >> were difficult for their cohort of adults with Down syndrome to
> >> navigate."
> >>
> >> In 2019 Williams also published a literature review on how
> >> people with learning disabilities interacted with web pages, but this
> >> only covered interacting with links in menus and not interactive
> >> elements within content or forms [2], again concluding that there are
> >> few systematic studies with these users.
> >>
> >> In non-disability-focussed research, a literature review on
> >> touchscreen interfaces [3] found that for all users "Visual cues on
> >> icons assist with targeting tasks" and "Icon depth cues equal to 10%
> >> of target size suggested". For users with motor-controlled
> >> disabilities (MCD) they found: "Overall performance is lower among
> >> MCD users than general population" and "Interface changes that improve
> >> performance for the general population will likely improve
> >> performance for MCD users as well".
> >>
> >> A study into older users [4] concluded: "Although textual buttons
> >> are common, they might not always convey the right affordances to
> >> older adults, and can mislead users to regard those buttons as
> >> non-actionable information. Consequently, also make sure that both
> >> the icon and the text trigger the same action; they should be
> >> working as a single element."
> >>
> >> On the point of whether it is appropriate to recommend a new SC
> >> when there is little research, I have always approached this
> >> proposal as closing a gap in the current WCAG 2.1, which has become
> >> more vital now that the majority of web interactions are undertaken
> >> on touch screen devices. This is because:
> >>
> >> 1. Users with disabilities who do not use sophisticated
> >> assistive technologies are unable to access information on the role
> >> of interactive elements which is provided to other users. That is,
> >> they are unable to access the information that is provided through
> >> complying with 4.1.2 Name, Role, Value. In reality this means that a
> >> screen reader user gains more information about how to interact
> >> with an item than a visually impaired or dyslexic user relying on
> >> text-to-speech. I've seen this happen in user research sessions where
> >> screen reader users have been able to access a form field which
> >> lacked any border, when other users with moderate visual impairment
> >> or cognitive disabilities were not aware that the element was a form
> >> field.
> >>
> >> 2. A strict reading of WCAG 2.1 does not require text links to
> >> have any indicators (only that they do not rely on colour). This
> >> seems to be a gap that should be resolved. The current proposals for
> >> the visual indicators SC only require content creators to use what
> >> is common best practice, such as using underline or bold with colour
> >> to indicate a link.
> >>
> >> 3. For desktop users, it is possible to move focus and identify
> >> which elements are interactive.
> >> For example, they can tab to a link
> >> in text or an icon and find it is interactive. There may also be
> >> contextual support through hovering to see tooltips. However, touch
> >> screen users do not have access to these functions and are reliant
> >> on visual indicators. We do not provide similar support to focus
> >> indicators for touch screen users, despite the majority of users
> >> using these devices to access web content and apps.
> >>
> >> 4. Non-text Contrast (1.4.11) states that "for active controls
> >> any visual information provided that is necessary for a user to
> >> identify that a control is present" has contrast. First, this means
> >> that in WCAG 2.1 we have pretty much defined visual indicators, and
> >> second, this has led to a situation where designers are actively
> >> encouraged not to provide them, as they then must make sure the
> >> indicators have sufficient contrast. I have had this conversation
> >> with a number of designers, particularly as iOS's and Android's own
> >> indicators do not comply with 1.4.11.
> >>
> >> On the points Patrick raised about the implications of adding
> >> SCs which then become law: outside of the US, most regulations and
> >> jurisdictions take into account that it is challenging for complex
> >> websites to meet all accessibility requirements. Instead they
> >> encourage transparency through providing accessible alternatives and
> >> ongoing improvements while working towards full compliance. As US
> >> legal cases are nearly all still referring to WCAG 2.0, it would be
> >> very disappointing if the AG felt unable to fulfil its remit to
> >> widen support for users with cognitive disabilities due to concerns
> >> that this could be interpreted as a legal design requirement.
> >>
> >> Best wishes
> >>
> >> Abi James
> >> COGA,
> >> University of Southampton
> >>
> >>
> >> [1] Williams, P., & Shekhar, S. (2019). Mobile devices and
> >> people with learning disabilities: a literature review.
> >> International Journal of Computer Science and Mobile Computing,
> >> 8(2), 34-43.
> >> [2] Williams, P.
> >> (2019). A Tangled Web? How People with Learning
> >> Disabilities Negotiate the World Wide Web: The Accumulating Evidence.
> >> [3] Orphanides, A. K., & Nam, C. S. (2017). Touchscreen
> >> interfaces in context: A systematic review of research into
> >> touchscreens across settings, populations, and implementations.
> >> Applied Ergonomics, 61, 116-143.
> >> [4] De Barros, A. C., Leitao, R., & Ribeiro, J. (2014). Design and
> >> evaluation of a mobile user interface for older adults: navigation,
> >> interaction and visual design recommendations. Procedia
> >> Computer Science, 27, 369-378.
> >>
> >>
> >> -----Original Message-----
> >> From: Patrick H. Lauke <redux@splintered.co.uk
> >> <mailto:redux@splintered.co.uk>>
> >> Sent: 26 April 2020 11:14
> >> To: w3c-wai-gl@w3.org <mailto:w3c-wai-gl@w3.org>
> >> Subject: Re: Research for Visual Indicators
> >>
> >> On 26/04/2020 00:04, David Fazio wrote:
> >> [...]
> >>> We can simply give a list in addition to Rachael's suggestion, that
> >>> gives designers room for creativity. It feels like this proposed SC is
> >>> being scrutinized to an unreasonable degree.
> >>
> >> Purely from my perspective, I'd say that this is because, for
> >> better or worse, WCAG is now essentially pulled into legislation,
> >> wholesale, in many places. So essentially, saying something
> >> normatively fails results in AGWG effectively saying that it should
> >> be "illegal" to do something.
> >> To me, there's a much heavier burden now on not just compiling
> >> SCs that are flawed/leave gaps, but also on not defining SCs that
> >> are unnecessarily restrictive. We've already seen many of the
> >> gaps/ambiguities from 2.0 and 2.1 (in SCs themselves, and in how
> >> there can be unexpected interactions between different SCs). This is
> >> particularly true when SCs are targeted at Level A or AA (less so
> >> with AAA).
> >>
> >> And of course, there is the fundamental tension that seems to always
> >> persist between needing to be specific enough so that normative
> >> definitions are clear-cut, and the sisyphean task of trying
> >> to explicitly provide (often pseudo-scientific) hard threshold
> >> values and complete lists of "dos" and "don'ts" that are measurable,
> >> versus more "human judgement" subjectivity, which is easier to
> >> define but leaves a lot of gray area.
> >>
> >> While I'm generally critical of SCs that are too handwavy and
> >> leave too much room for interpretation, I can also see how trying to
> >> be over-specific is problematic, particularly when it starts to try
> >> and make judgements on things like visual design (coming from a
> >> group whose membership is, admittedly, not made up of visual design
> >> practitioners or experts in the field, and where decisions on things
> >> like cut-off values are often just kind of fudged -
> >> thinking for instance of how we arrived at the CSS px value for
> >> Target Size, or the discussions around the timing thresholds in
> >> various 2.0 SCs, etc.).
> >>
> >> P
> >> --
> >> Patrick H.
> >> Lauke
> >>
> >> https://www.splintered.co.uk/ | https://github.com/patrickhlauke
> >> https://flickr.com/photos/redux/ | https://www.deviantart.com/redux
> >> twitter: @patrick_h_lauke | skype: patrick_h_lauke
> >>
>
> --
> *John Foliot* | Principal Accessibility Strategist | W3C AC Representative
> Deque Systems - Accessibility for Good
> deque.com <http://deque.com/>
>
Received on Tuesday, 28 April 2020 11:11:45 UTC