- From: Wayne Chang <wyc@fastmail.fm>
- Date: Thu, 10 Sep 2020 12:26:35 -0400
- To: "W3C Credentials CG" <public-credentials@w3.org>
https://www.axios.com/inside-tiktoks-killer-algorithm-52454fb2-6bab-405d-a407-31954ac1cf16.html

I thought this article was especially interesting from a digital identity and policy perspective. TikTok is effectively building profiles for all of its millions of users, not unlike every other tech giant today. However, due to recent calls for transparency, they are opening up a lot of their algorithms in a way we generally haven't seen before.

> Once TikTok collects enough data about the user, the app is able to map a user's preferences in relation to similar users and group them into "clusters." Simultaneously, it also groups videos into "clusters" based on similar themes, like "basketball" or "bunnies."

(A toy sketch of what this kind of user/video clustering might look like is at the end of this message.)

Some questions it raised for me:

- What happens when these "clusters" strongly correlate with sensitive attributes such as ethnicity, political views, and religion?
- What are some standards-based ways that users can access not only their own data, but also reasonable representations of the specific underlying algorithms being used to control their experience?
- What could be governed here, and by whom, to increase user privacy and freedom?
- How can our ecosystem, data standards, and associated products be used to articulate and enforce such policies?
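
For intuition only, here is a minimal sketch of the clustering pattern the article describes, in Python with scikit-learn. Everything in it is an assumption: the data is synthetic, the theme names are just the examples from the quote, and this is not TikTok's actual system.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical themes, borrowed from the article's examples.
themes = ["basketball", "bunnies", "cooking", "politics"]

# Synthetic user profiles: each row is one user's share of watch time per theme.
users = rng.dirichlet(alpha=np.ones(len(themes)), size=200)

# Group users with similar preferences into "clusters."
user_clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit(users)

# A new user is mapped to the cluster of the most similar users...
new_user = np.array([[0.7, 0.1, 0.1, 0.1]])  # mostly watches basketball
print("assigned to cluster", user_clusters.predict(new_user)[0])

# ...and videos could be clustered the same way on theme features, so the app
# can recommend videos from the video clusters the user's cluster favors.

Even a toy like this makes the policy questions concrete: the cluster assignment is itself derived personal data, and nothing in the model prevents a cluster from lining up with a sensitive attribute.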