Re: TED talk on algorithm bias

Hi John,

I’m missing a bit of context on this, but I read the book a while ago, and I think it’s worth drawing out the difference between:

  1.  A human-written algorithm, and
  2.  A machine-learning (ML) generated algorithm.

Both can be biased, but in current implementations the ML versions are black boxes. There is no transparency: after the learning phase you ask it a question and get an answer. If it is biased, you have no means of knowing in what way (apart from analysing the answers separately).
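To illustrate the point about analysing the answers separately, here is a minimal sketch (all names and numbers are hypothetical, not from any real system) of auditing a black box: since we cannot inspect the model itself, the only visibility we have is comparing its outputs across groups.

```python
# Sketch of black-box auditing: compare the model's answers per group,
# because the model's internals are opaque to us.
def audit_outcomes(model, applicants):
    """Return the model's average score per group, rounded for display.
    The only window into a black box is its outputs."""
    totals = {}
    for applicant in applicants:
        group = applicant["has_disability"]
        score = model(applicant)
        score_sum, n = totals.get(group, (0.0, 0))
        totals[group] = (score_sum + score, n + 1)
    return {group: round(score_sum / n, 3)
            for group, (score_sum, n) in totals.items()}

# A stand-in "black box" that (unknown to the auditor) penalises disability.
def opaque_model(applicant):
    return 0.4 if applicant["has_disability"] else 0.8

applicants = (
    [{"has_disability": True}] * 50 + [{"has_disability": False}] * 50
)
print(audit_outcomes(opaque_model, applicants))
# The disparity between groups is visible only in the aggregated answers.
```

The audit tells you *that* the outcomes differ by group, but not *why* — which is exactly the transparency problem.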

The book is a polemic, but the core problem is real because it is a logical extension of garbage in, garbage out. If the data (e.g. a real-life situation) is biased, the input the ML uses is biased, and it will continue that bias. That’s how it works.
For example, if the current data shows that people with disabilities are less likely to have a job (due to discrimination or otherwise), an ML-based assessment of job applicants would embed that bias unless some action is taken to prevent it.
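The job-applicant example can be sketched in a few lines. This is a toy model with invented numbers (the data, the discrimination factor, and the frequency-based "training" are all hypothetical), just to show the mechanism: a model fitted to biased historical outcomes scores equally qualified applicants differently.

```python
import random
from collections import defaultdict

random.seed(42)

# Synthetic historical records: (has_disability, qualified, was_hired).
# The embedded bias: equally qualified candidates with a disability
# were historically hired at half the rate (the invented discrimination).
records = []
for _ in range(10_000):
    has_disability = random.random() < 0.2
    qualified = random.random() < 0.5
    hire_rate = 0.8 if qualified else 0.1
    if has_disability:
        hire_rate *= 0.5  # the historical discrimination
    records.append((has_disability, qualified, random.random() < hire_rate))

# "Training": estimate P(hired) per group. The frequencies the model
# learns ARE the biased historical outcomes -- garbage in, garbage out.
counts = defaultdict(lambda: [0, 0])  # (disability, qualified) -> [hired, total]
for has_disability, qualified, hired in records:
    counts[(has_disability, qualified)][0] += hired
    counts[(has_disability, qualified)][1] += 1

def predict_hire_probability(has_disability, qualified):
    hired, total = counts[(has_disability, qualified)]
    return hired / total

# Two equally qualified applicants get very different scores:
print(predict_hire_probability(False, True))
print(predict_hire_probability(True, True))
```

Nothing in the "training" step is malicious; the model simply reproduces the pattern in the data, which is why the bias persists unless some action is taken to prevent it.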

I’m very interested in the work Judy mentioned; I’ll read up on that. A couple of days ago I sketched a normal distribution to indicate UX work, and then flattened it to indicate accessibility work, so that metaphor has been bouncing around my brain for a while!

What I’m not sure about is how this applies to accessibility guidelines. So far I have assumed we’d be talking about explicit, transparent calculations/algorithms rather than ML?

Cheers,

-Alastair


From: John Foliot

Thanks, Jeanne, for sharing these. I've not spent the requisite time with the longer video, but did review the shorter one.

I have to say that I am personally concerned by the seemingly definitive declaration that "...blind faith in big data must end..." as nobody (certainly not me) has suggested that we put blind faith in anything. But data is what we are working on and with; it is in many ways our stock in trade, and is what "measurement" is all about. Measurement is data (whether big or small).

Received on Thursday, 11 July 2019 09:49:35 UTC