Re: Reasons of using Canvas for UI design

I have tried to 'take a step back' and ask some basic questions, and look for some basic principles, and then come up with some (probably basic) points and ideas.

It seems that using Canvas is most interesting and useful when the application (represented by the HTML context including Canvas, its supporting scripts, and so on) offers some user interaction model that is NOT available when using HTML and/or SVG.  These are the applications that must 'work' from an accessibility point of view, in my opinion.  An obvious example is the particle simulation one; another thought experiment I played with is below.  The issue is that if the application offers something more than just a visual (e.g. one can learn something, build something, or affect something), it ought to be accessible to the visually-impaired user.

The canvas and what's drawn on it are just the visible manifestation of the application; it's what those pixels mean, and what interaction with the application means, that we need to make accessible. So rather than asking 'how do we make canvas accessible?' I think we need to ask 'how do we make applications that use canvas accessible?'.

Ideally, the accessibility of these canvas-using applications should fall out of simply making the applications work at all; if there are extra, special provisions for accessibility, we know from experience that some authors won't take the trouble to use them, and accessibility suffers.  I don't know how to achieve that for canvas-based applications, but it's worth keeping in mind.

In a canvas-based application, it is the scripts (and their supporting infrastructure) that constitute the application; the canvas surface is just the visible rendering.  So I think it is the scripts that should bear the duty of providing accessibility.  Writing scripts that do hit testing is currently something of a pain; it may well be that if we can provide optimized hit testing for scripts to use, we can both ease the job of writing these applications and also make providing accessibility easier.  However, I do not think that the accessibility sub-system should interact directly with the 'model' that such a hit-testing support system might build.  Rather, the scripts should provide the final answer, supported (if they wish) by such a sub-system.
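To make the pain point concrete, here is a minimal sketch (in TypeScript; the names and shapes are invented for illustration, this is not a proposed API) of the kind of per-point hit testing a canvas-using script has to implement for itself today, using the existing isPointInPath call.  An optimized support system would essentially take over the loop below, while the script stays in charge of the final answer it hands to assistive technology.

```typescript
// Each logical object registers the path it occupies; the script, not the
// browser, answers "what is under this point?".

interface Region {
  id: string;                                       // application-level identity
  label: string;                                    // human-readable description for AT
  trace: (ctx: CanvasRenderingContext2D) => void;   // builds the region's path
}

class HitTester {
  private regions: Region[] = [];

  constructor(private ctx: CanvasRenderingContext2D) {}

  add(region: Region): void {
    this.regions.push(region);
  }

  // Return the topmost region containing (x, y), in canvas coordinates.
  hit(x: number, y: number): Region | null {
    for (let i = this.regions.length - 1; i >= 0; i--) {
      this.ctx.beginPath();
      this.regions[i].trace(this.ctx);
      if (this.ctx.isPointInPath(x, y)) {
        return this.regions[i];
      }
    }
    return null;
  }
}

// Usage: the script remains the final authority on what a point 'means'.
const canvas = document.querySelector('canvas')!;
const tester = new HitTester(canvas.getContext('2d')!);
tester.add({
  id: 'valve-1',
  label: 'Inlet valve, currently open',
  trace: ctx => ctx.rect(20, 20, 40, 40),
});
canvas.addEventListener('mousemove', e => {
  const region = tester.hit(e.offsetX, e.offsetY);
  // The application decides what to report (e.g. to an ARIA live region).
  console.log(region ? region.label : 'empty canvas');
});
```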

One thought experiment I took was this: the canvas represents a bio-reactor, in which various populations of algae, fungi, bacteria etc. are competing.  The user can interact with it using a 'virtual pipette' -- adding to the pipette by picking up material in one place, and dropping material from the pipette somewhere else (e.g. by right and left click-and-hold).  All the while, the organisms are reproducing, spreading, dying, shrinking, co-existing, etc.  In this, there are no 'paths' to test against; rather, the application is modelling a fluid situation.  The user can learn what the effect is of dropping various populations into the midst of others.  Apart from a legend "right-click and hold to pick up, left-click and hold to drop" (outside the canvas), how does the application convey what is being picked up, what's in the pipette, and what's going on in the reactor, to an accessibility-needing user?  'There is a check-box under the mouse, which says "Remember me"' comes nowhere close.  This application is not path-based, is not using 'standard controls', and so on.
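For concreteness, here is a sketch of what that interaction might look like in script (the data model and all names are invented for illustration).  It is not offered as an answer to the question above -- the announce() stopgap at the end only covers the pipette, not what is happening across the reactor -- but it shows where the question bites: the state lives entirely in script-held data, with no paths or standard controls for an accessibility API to latch onto.

```typescript
type Population = 'algae' | 'fungi' | 'bacteria';

interface ReactorCell { contents: Map<Population, number>; }

class Pipette {
  held = new Map<Population, number>();

  // Right click-and-hold: draw off some of whatever is under the pointer.
  pickUp(cell: ReactorCell): void {
    for (const [kind, amount] of cell.contents) {
      const taken = amount * 0.1;
      cell.contents.set(kind, amount - taken);
      this.held.set(kind, (this.held.get(kind) ?? 0) + taken);
    }
    announce(this.describe());       // the script must say what just happened
  }

  // Left click-and-hold: empty the pipette into the cell under the pointer.
  drop(cell: ReactorCell): void {
    for (const [kind, amount] of this.held) {
      cell.contents.set(kind, (cell.contents.get(kind) ?? 0) + amount);
    }
    this.held.clear();
    announce('Pipette emptied into the reactor.');
  }

  describe(): string {
    if (this.held.size === 0) return 'Pipette is empty.';
    const parts = [...this.held].map(([kind, amt]) => `${amt.toFixed(1)} units of ${kind}`);
    return `Pipette holds ${parts.join(', ')}.`;
  }
}

// One thing the script *could* do: write a description into an ARIA live
// region outside the canvas.  A stopgap, not a solution.
function announce(text: string): void {
  const region = document.getElementById('reactor-status');  // e.g. <div aria-live="polite">
  if (region) region.textContent = text;
}
```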

Applications that can use standard controls should use them!  Even overlaid on the canvas, if need be.  If they can use (or get some assistance from) path-based hit-testing, let's develop a support system that they can use for that.  If they are breaking new ground, let's ask what the scripts need to be able to do to make the resulting application accessible.  I feel sure that if we can answer that question, many cases will suddenly be seen to be workable.
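As a minimal sketch of the 'standard controls, even overlaid on the canvas' suggestion (element ids and geometry are invented): a real button positioned over the canvas, so keyboard focus, activation and the accessibility tree all come for free, and the canvas only paints in response to the same event the assistive technology sees.

```typescript
const wrapper = document.getElementById('stage')!;   // e.g. <div style="position:relative">
const canvas = wrapper.querySelector('canvas')!;
const ctx = canvas.getContext('2d')!;

// A native control, restyled and positioned over the canvas rather than
// repainted on it.
const startButton = document.createElement('button');
startButton.textContent = 'Start reaction';
Object.assign(startButton.style, {
  position: 'absolute',
  left: '16px',
  top: '16px',
});
wrapper.appendChild(startButton);

startButton.addEventListener('click', () => {
  // The canvas drawing reacts to the same activation the AT reports.
  ctx.fillStyle = 'green';
  ctx.fillRect(0, 0, canvas.width, canvas.height);
});
```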



David Singer
Multimedia and Software Standards, Apple Inc.
