Re: Inside TikTok's Killer Algorithm

Seems like other tech companies are following suit:

https://www.axios.com/big-tech-pushes-voter-initiatives-to-counter-misinformation-022f2de8-8e4e-4ccf-8b84-c7224c569d65.html

On Thu, Sep 10, 2020, at 12:26 PM, Wayne Chang wrote:
> https://www.axios.com/inside-tiktoks-killer-algorithm-52454fb2-6bab-405d-a407-31954ac1cf16.html
> 
> I thought this article was especially interesting from a digital 
> identity and policy perspective. TikTok is effectively building 
> profiles for all of its millions of users, not unlike every other 
> tech giant today. However, due to recent calls for transparency, it 
> is opening up its algorithms to a degree we generally haven't seen before.
> 
> > Once TikTok collects enough data about the user, the app is able to map a user's preferences in relation to similar users and group them into "clusters." Simultaneously, it also groups videos into "clusters" based on similar themes, like "basketball" or "bunnies."
> 
> Some questions it raised for me:
> - What happens when these "clusters" strongly correlate with sensitive 
> attributes including ethnicity, political views, and religion?
> - What are some standards-based ways that users can access not only 
> their own data but also reasonable representations of the specific 
> underlying algorithms being used to control their experience?
> - What could be governed here and by whom to increase user privacy and 
> freedom?
> - How can our ecosystem, data standards, and associated products be 
> used to articulate and enforce such policies?
> 
>
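The "clusters" mechanism the quoted article describes is essentially unsupervised clustering over preference vectors: similar users land in one group, similar videos in another, and recommendations flow between matching groups. A minimal k-means sketch of that idea (all data, dimensions, and names below are hypothetical illustrations for discussion, not TikTok's actual system):

```python
# Toy sketch of the "cluster" idea from the article: group users by
# preference vectors; videos would be clustered the same way by theme.
# Everything here is a hypothetical illustration, not TikTok's real pipeline.
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: returns (centroids, cluster assignment per point)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        for i, p in enumerate(points):
            assign[i] = min(range(k), key=lambda c: math.dist(p, centroids[c]))
        # Move each centroid to the mean of its members.
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centroids[c] = tuple(sum(dim) / len(members)
                                     for dim in zip(*members))
    return centroids, assign

# Hypothetical 2-D "preference" vectors, e.g.
# (interest in basketball, interest in bunnies).
users = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.8)]
centroids, assign = kmeans(users, k=2)
print(assign)  # users with similar preferences share a cluster id
```

Even this toy version makes the policy concern concrete: nothing stops a cluster dimension from correlating strongly with a sensitive attribute, which is invisible to the user unless the cluster assignments are exposed.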

Received on Sunday, 13 September 2020 23:27:11 UTC