- From: Greg Scallan <greg@flipboard.com>
- Date: Sat, 18 Jan 2025 05:13:43 -0500
- To: Bob Wyman <bob@wyman.us>
- Cc: Social Web Incubator Community Group <public-swicg@w3.org>
- Message-Id: <99B647D0-BAE0-4707-A2B7-A59756A5779C@flipboard.com>
Thank you for this summary. I’m a huge fan of giving the end user the choice, rather than the current (typical) approach of ActivityPub servers, which gives the administrator of your instance the choice. Although you can choose to leave, it is not always easy to do so, and not everything moves with you, since that can be at the whim of your administrator (but that is a separate issue, really). Missouri seems to care about giving the user a choice over what content to see or not, and the notion of transparency seems well aligned with privacy-minded folk.

However, forcing default choices that might defeat the utility of the platform is a hard sell. Spammers exist and will leverage any successful platform in a significant way, and legislating that users should, by default, be forced to see that content would degrade everyone’s experience, rendering the service useless. We will need to see how exactly this law is written to judge whether it holds water.

Moderation is only one aspect of what the user sees in the Bluesky example; feed generators are the second, and each feed generator can make its own choice about what, how, and when you see something, regardless of your moderation choices. I’m not sure how this legislation would apply there, since those are run by a variety of organisations.

I’m not sure this really impacts ActivityPub as much as it impacts AP implementations. The Fediverse Auxiliary Service Provider Specification from Mastodon is a great first example of something that could enable these kinds of choices. If there were labeling services (whether on AP or not) available for an instance to choose from using that spec (https://github.com/mastodon/fediverse_auxiliary_service_provider_specifications), then theoretically the client could allow users to choose which ones to use and how they affect their timeline (a rough sketch of that client-side flow follows after the quoted message below).

Greg

> On Jan 17, 2025, at 10:52 pm, Bob Wyman <bob@wyman.us> wrote:
>
> Yesterday, Missouri's Attorney General announced plans to issue regulations <https://ago.mo.gov/attorney-general-bailey-promulgates-regulation-securing-algorithmic-freedom-for-social-media-users/> that would require social media platforms to “offer algorithmic choice” to users. Clearly, it will take some time for this plan to be published, studied, challenged in court, etc. It is also quite possible that the regulations will be targeted to only the largest services (i.e. Twitter, Facebook, etc.). Nonetheless, I think we should anticipate that the coming to power of Trump and MAGA Republicans is likely to spawn many such proposals in the coming years. Given this, I think it would be useful to at least consider what impact such regulations would have on Social Media systems that rely on ActivityPub and ActivityStreams.
>
> My guess is that the position of the "ActivityPub" community would be that, in a federated system composed of a multiplicity of independent interoperating servers -- each having a potentially different moderation approach -- it is not necessary for each individual server to offer algorithmic choice. Users are free to seek out and use a server whose default "algorithm" addresses their needs. However, this position might not be accepted as sufficient if the opinion is that the individual server, not the federated system as a whole, is considered to be the regulated "platform." The obvious question then becomes, what would need to be done to enable a federated service, even if very small on its own, to provide compliant algorithmic choice?
>
> Some will undoubtedly argue that the BlueSky support for a variety of "labeling" services, when combined with user-selected client algorithms capable of filtering, etc. based on labels, might be sufficient to provide the necessary algorithmic choice. If such an approach is sufficient, then one must ask if supporting it would require modification to the ActivityPub protocols and schemas? (i.e. Would we need to add a "content label" item that allows the annotation or labeling of posts, replies, collections, etc.?) Would a labeling service be able to rely on the existing server-to-server protocol? Or, would something tailored more to the specific requirements of labeling be necessary? Of course, it would be useful to ask if there is a less cumbersome or otherwise superior method for providing algorithmic choice. What do you think?
>
> While the text of the plan isn't yet available, the AG's press release does provide a sketch of what will eventually be published. See the list below or read the full release <https://ago.mo.gov/attorney-general-bailey-promulgates-regulation-securing-algorithmic-freedom-for-social-media-users/>:
>
> "Users are provided with a choice screen upon account activation and at least every 6 months thereafter that gives them the opportunity to choose among competing content moderators;
> No algorithm selection is chosen by default;
> The choice screen does not favor the social media platform’s content moderator over those of third parties;
> When a user chooses a content moderator other than that provided by the social media platform, the social media platform permits that content moderator interoperable access to data on the platform in order to moderate what content is viewed by the user; and
> Except as expressly authorized below, the social media company does not moderate, censor, or suppress content on the social media platform such that a user is unable to view that content if their chosen content moderator would otherwise permit viewing that content."
>
> bob wyman
>
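
To make the client-side choice discussed above a little more concrete, here is a minimal, purely illustrative sketch in Python. It assumes a hypothetical "contentLabel" extension property and made-up labeler URLs; nothing like this is defined in ActivityStreams, ActivityPub, or the FASP drafts today, and it is not how Bluesky's labelers actually work, just one possible shape. The idea is that the instance delivers posts unfiltered, a user-chosen labeling service annotates them, and the client applies the user's own hide rules.

# Hypothetical sketch only: the "contentLabel" property and the labeler
# URLs below are invented for illustration; they are not part of
# ActivityStreams, ActivityPub, or the FASP drafts.
from typing import Iterable

# An ActivityStreams Note, as a plain dict, annotated by a third-party
# labeling service via the assumed "contentLabel" extension property.
note = {
    "@context": [
        "https://www.w3.org/ns/activitystreams",
        {"contentLabel": "https://labels.example/ns#contentLabel"},  # assumed extension
    ],
    "type": "Note",
    "id": "https://instance.example/notes/1",
    "content": "Buy cheap followers now!!!",
    "contentLabel": [{"labeler": "https://labeler-a.example", "value": "spam"}],
}

def visible(post: dict, hidden_labels: set, trusted_labelers: set) -> bool:
    """Keep a post unless a labeler the user trusts applied a label the user hides."""
    for label in post.get("contentLabel", []):
        if label["labeler"] in trusted_labelers and label["value"] in hidden_labels:
            return False
    return True

def filter_timeline(posts: Iterable, hidden_labels: set, trusted_labelers: set) -> list:
    """Client-side moderation: the server delivers everything; the user's choices decide."""
    return [p for p in posts if visible(p, hidden_labels, trusted_labelers)]

# The user has chosen labeler-a and asked to hide anything it labels "spam".
print(filter_timeline([note], {"spam"}, {"https://labeler-a.example"}))
# -> []  (hidden, because the user's chosen labeler flagged it as spam)

The point of the shape, under those assumptions, is that moderation data travels as ordinary annotations on objects, so choosing a different labeler, or ignoring labels entirely, is a client-side decision rather than something the instance administrator decides for everyone.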
Received on Saturday, 18 January 2025 10:13:58 UTC