Re: TED talk on algorithm bias

Hi Judy,

Thanks for re-posting Jutta's Medium post. She presented these same
thoughts as the keynote speaker at Web4All this year. While I cannot
disagree with some of the data-driven issues she surfaces, the area that
struck me most was this:

*Leveling the playing field*

"To address the issue of majority data overwhelming the needs of people at
the edge, I’ve been playing with the Gaussian curve or normal distribution.
I call it the “lawnmower of justice.” I cut off all but a small number of
repeats of any given data point. This forces the learning model to level
the playing field for the full spectrum of needs and pay attention to the
edge or outlying data as well. It levels the hill in the normal
distribution."
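To make the mechanism concrete: as I understand the quote, the "lawnmower" simply caps how many repeats of any given data point survive into the training set. Jutta did not share an implementation, so this is only a rough sketch of my reading of it; the function name, the `cap` parameter, and the toy data are all my own inventions.

```python
from collections import Counter

def lawnmower(samples, cap):
    """Keep at most `cap` repeats of any given value, "mowing" the
    peak of the distribution so outlying values carry relatively
    more weight downstream. `cap` is the lawnmower's "cutting
    height" -- the open question I raise below."""
    kept, counts = [], Counter()
    for s in samples:
        if counts[s] < cap:
            counts[s] += 1
            kept.append(s)
    return kept

# A peaked (roughly normal) distribution: many 5s, a few outliers.
data = [5] * 100 + [4] * 30 + [6] * 30 + [1, 9]
mowed = lawnmower(data, cap=10)
# The majority values are trimmed to the cap of 10 each,
# while the outliers (1 and 9) survive intact.
```

Note that everything hinges on the choice of `cap`, which is exactly the question that follows.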

To continue the analogy, the main question I have is: how long (or short)
is the right height for the grass? What should I set the lawnmower's
cutting height to, and why?

There is no doubt that we must be mindful (super mindful?) of the effect
of bias in our decisions, but at some point we also need to decide how
short a "grass" we find acceptable. Too short, and the grass will die when
there is a drought; too long, and the health and appearance of the lawn
suffer. Most turf grass (your suburban lawn) is recommended to be trimmed
to between 2 and 3 inches during the growing season. Yet if you maintain a
golf course, most putting greens can be mowed as low as 1/8 of an inch.

So, which is it for us? And as part of that decision, note that
maintaining a putting green costs significantly more than mowing your lawn
every weekend. So while I am certainly in favor of determining the optimum
grass height for us, we first have to determine what our grass will be
used for: beautifying our home, providing a tournament-level golf course,
or somewhere in between.

And because we're not actually cutting grass, "inches" is the wrong unit
of measure - but I've not seen or heard what our unit of measure is or
will be.

Just some thoughts to chew on from my perspective.

JF



On Wed, Jul 10, 2019 at 11:17 AM Judy Brewer <jbrewer@w3.org> wrote:

> Hi All,
>
> Joy Buolamwini's talk on algorithm bias is also very worth watching:
>
>
> https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms
>
> Jutta Treviranus, from OCAD University and former ATAG WG Chair, is
> doing interesting research on accessibility-specific AI bias.
>
>
> https://medium.com/datadriveninvestor/sidewalk-toronto-and-why-smarter-is-not-better-b233058d01c8
>
> - Judy
>
>
> On 7/10/2019 10:20 AM, Jeanne Spellman wrote:
> > Cyborg asked me to send this around and asks that those working on
> > conformance watch it:
> >
> > TED Talk: Cathy O'Neil - Weapons of Math Destruction
> >
> > There is a short version and the full version
> >
> > Short version: https://www.youtube.com/watch?v=_2u_eHHzRto
> >
> > Full version: https://www.youtube.com/watch?v=TQHs8SA1qpk
> >
> > I watched the short version and  thought it was well done. It is about
> > various kinds of bias and not specific to PwD.   Her points about the
> > data of the past continuing a bias into the future are cautionary.
> > We do not collect big data and our formulas are not sophisticated AI
> > algorithms, but the principles she cautions about apply, IMO.  There
> > are people in accessibility doing research on algorithmic bias against
> > PwD, and there are broader lessons from the research that could apply
> > to our work.
> >
> >
> >
> >
> --
> Judy Brewer
> Director, Web Accessibility Initiative
> at the World Wide Web Consortium (W3C)
> 32 Vassar St. Room 385, MIT/CSAIL
> Cambridge MA 02139 USA
> www.w3.org/WAI/
>
>
>

-- 
*​John Foliot* | Principal Accessibility Strategist | W3C AC Representative
Deque Systems - Accessibility for Good
deque.com

Received on Wednesday, 10 July 2019 20:57:49 UTC