[a11y-metadata-project] Video of Access for All demo

Hi everyone,

As Chuck Myers mentioned, we made a video of the Teachers' Domain demonstration of Access for All in a digital library. In it I explain how Teachers' Domain uses access modes and accessibility features to present accessibility metadata, both with and without stored user preferences. Chuck wrote a blog post about the video that includes code samples and a complete transcript, and the video itself is captioned. If you have questions or suggestions, please let us know.

http://www.a11ymetadata.org/accessibility-metadata-in-action-at-teachers-domain/
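
For anyone who wants the gist without watching the video first, here is a minimal, hypothetical sketch of the idea (the resource values, function name, and preference format are illustrative only; the actual code samples are in the blog post above):

    # Hypothetical sketch: given a resource's declared access modes and
    # accessibility features, decide which metadata to surface for a user,
    # with or without stored preferences. Not the actual Teachers' Domain code.

    resource = {
        "name": "Example science video",              # illustrative resource
        "accessMode": ["visual", "auditory"],          # modes needed to use it
        "mediaFeature": ["captions", "transcript"],    # adaptations it provides
    }

    def accessibility_summary(resource, preferences=None):
        """Return the accessibility features to display for this user."""
        features = resource.get("mediaFeature", [])
        if preferences is None:
            # no stored preferences: show every feature the resource declares
            return features
        # stored preferences: highlight only the features the user asked for
        return [f for f in features if f in preferences]

    print(accessibility_summary(resource))                # all declared features
    print(accessibility_summary(resource, {"captions"}))  # only captions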

-Madeleine

From: Charles Myers <charlesm@benetech.org>
Date: Monday, September 23, 2013 10:16 AM
To: Gerardo Capiel <gerardoc@benetech.org>
Cc: Charles McCathie Nevile <chaals@yandex-team.ru>, Liddy Nevile <liddy@sunriseresearch.org>, Rich Schwerdtfeger <schwer@us.ibm.com>, a11y-metadata-project@googlegroups.com, Alexander Shubin <ajax@yandex-team.ru>, Andy Heath <andyheath@axelrod.plus.com>, Dan Scott <dan@coffeecode.net>, Dan Brickley <danbri@google.com>, Egor Antonov <elderos@yandex-team.ru>, Emmanuelle Gutiérrez y Restrepo <emmanuelle@sidar.org>, Jason Johnson <jasjoh@microsoft.com>, George Kerscher <kerscher@montana.com>, Madeleine Rothberg <madeleine_rothberg@wgbh.org>, Matt Garrish <matt.garrish@bell.net>, public-vocabs@w3.org
Subject: [a11y-metadata-project] A request to set up some times for group conference calls to resolve the mediaFeature/accessMode issues for Accessibility Metadata

First, my apologies... after our conference calls in Moscow two weeks ago, I've been a bit preoccupied with other activities (including a general framework for understanding the metadata, loading data into a Learning Registry, and working out good search terms). All good, but it took time away from getting to closure on the schema.org proposal. I'm back.

To get a sense of urgency back into this, I'd like to get some conference calls scheduled to start hashing things out (and to get the email thread resuscitated). My proposal is that we plan two calls for this week (Tuesday and Friday) and then one for next Monday. I do not expect that we'll need three, but I'd rather plan ahead for people's calendars and then cancel if we've accomplished our goals. Our main issue with calls is finding a useful time, as we span North America and Western Europe, with possibilities in Eastern Europe and Australia. We've tended to hold calls in the early morning in California (7-9 AM PDT, pushing to 6 AM if required), which accommodates Western Europe well. I plan to put out a Doodle poll on this in a few hours, but would first like to know who wants to be on the calls; it doesn't make sense to invite the whole W3C public-vocabs list.

The specific requests are:

  *   Let me know if you want to be on the calls (reply just to me, not reply-all)
  *   Let me know whether you feel you must be on the calls, or consider yourself interested rather than crucial
  *   Let me know which times are better or worse for you (and the time zone you're in)

When we do get these scheduled, I will send the information out to both of the mailing lists and this group of people.

I'll be sending out a set of emails today that include:

  *   a recording of the demonstration that Madeleine did on the second day in Moscow, showing accessMode and mediaFeature in action
  *   a general framework for accessMode and mediaFeature for content, which will provoke some discussion, I'm sure
  *   a baseline 0.6 version of the current spec, including a few changes in bookFormat
  *   an updated, edited, and more useful issues list on the public-vocabs wiki to drive the next calls

On 9/16/2013 9:10 PM, Gerardo Capiel wrote:
Chaals,

I agree that mediaFeature is the most important property here, and I think you're right that we need to be thinking about how we can make implementation easy for developers of search software. Your point on accessMode is also particularly strong when we consider that users have often already filtered out a lot of content by the type of resource they have chosen to search for. For example, if a user is doing a video search on Google or Yandex, they have already narrowed the accessMode possibilities to visual and auditory combined. If they are deaf, they are most likely only looking to filter their video search by mediaFeature equal to captions or transcript, and if they are blind, they are most likely only looking to filter on mediaFeature equal to audioDescription or transcript. The transcript choice is a fallback for both cases.
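
To make that concrete, here is a rough, hypothetical sketch of that filtering step (the function and data are illustrative, not any search engine's actual implementation):

    # The video search has already fixed accessMode to visual + auditory,
    # so the remaining filter is on mediaFeature.  Transcript is the shared
    # fallback for both deaf and blind users.  Illustrative only.

    def filter_videos(results, user_need):
        """Keep only videos whose mediaFeature values satisfy the user's need."""
        wanted = {
            "deaf": {"captions", "transcript"},
            "blind": {"audioDescription", "transcript"},
        }[user_need]
        return [r for r in results if wanted & set(r.get("mediaFeature", []))]

    videos = [
        {"name": "Clip A", "mediaFeature": ["captions"]},
        {"name": "Clip B", "mediaFeature": []},
        {"name": "Clip C", "mediaFeature": ["audioDescription", "transcript"]},
    ]
    print(filter_videos(videos, "deaf"))   # Clip A and Clip C
    print(filter_videos(videos, "blind"))  # Clip C only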

During the development of the proposal we did a lot of work on use cases and also relied on significant prior work.  Chuck Myers is compiling that work in a more consumable form and will be following up shortly.

Cheers,

Gerardo

Gerardo Capiel
VP of Engineering
benetech

650-644-3405
Twitter: @gcapiel (http://twitter.com/gcapiel)
Fork, Code, Do Social Good: http://benetech.github.com/



On Sep 11, 2013, at 1:05 PM, Charles McCathie Nevile <chaals@yandex-team.ru> wrote:

On Wed, 11 Sep 2013 18:43:06 +0400, Gerardo Capiel <gerardoc@benetech.org> wrote:

Common use cases we have not really discussed much are based on what happens in integrated classrooms, where teachers are trying to serve both students with no disabilities and students with different disabilities. Below are some use cases based on my experiences with users of our Bookshare web library and its related applications:

[snipped some really good use cases]

Some of these use cases indicate a need for the user to know about the combination of access modes the resource uses and what adaptations exist within the resource to support the assistive technology used by the end user.  I believe that merging accessMode and mediaFeature into one property makes it harder for users to make those determinations themselves.

Yeah. I totally agree.

My basic thinking is conditioned in part by how I present this to engineers at Yandex, and in part by what I heard, which several times included the comment that "accessMode isn't really that helpful and we rely on mediaFeature and a set of complex algorithms for matching".

As I see things right now, mediaFeature seems really important as a way to rank results for things that can be useful. I see accessMode as a quick way of removing things that the user simply won't want from that sorting process.

The idea is that if we have 200 resources, and 120 are unable to meet the user's need, we save a LOT of processing by eliminating them quickly. Then we run the more complex ranking algorithms on the smaller set of things that, in principle, *could* meet the user's *needs* - but not all will be equally preferred.

One way I am thinking about it is that I want to use accessMode as a shortcut to make it cheaper to do more careful checking of mediaFeatures, so users get an exact match.
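
To illustrate, here is a rough sketch of my reading of this (not Yandex code; real matching is more nuanced than a simple subset test):

    # Stage 1: cheap accessMode filter to discard resources the user cannot
    # use at all.  Stage 2: costlier mediaFeature scoring on what remains.
    # Deliberately simplified -- in reality an adaptation (e.g. audioDescription)
    # can make an otherwise unusable accessMode acceptable.

    def search(resources, usable_modes, preferred_features):
        # stage 1: drop anything requiring a mode the user does not have
        candidates = [r for r in resources
                      if set(r.get("accessMode", [])) <= usable_modes]
        # stage 2: rank the survivors by how many preferred adaptations match
        def score(r):
            return len(preferred_features & set(r.get("mediaFeature", [])))
        return sorted(candidates, key=score, reverse=True)

    docs = [
        {"name": "Text article", "accessMode": ["textual"], "mediaFeature": []},
        {"name": "Captioned video", "accessMode": ["visual", "auditory"],
         "mediaFeature": ["captions"]},
    ]
    # a user who can only use textual content and prefers captions/transcripts
    print(search(docs, {"textual"}, {"captions", "transcript"}))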

I'll keep the use cases Gerardo posted, because they are along the lines I have been trying to build up. I think we actually need to make a lot of these, to work through the issues of how we best satisfy them.

cheers

Chaals

--
Charles McCathie Nevile - Consultant (web standards) CTO Office, Yandex
     chaals@yandex-team.ru         Find more at http://yandex.com




Received on Tuesday, 24 September 2013 03:42:43 UTC