- From: Adam Lake <adam@mosaic.social>
- Date: Sun, 30 Apr 2023 08:59:41 -0400
- To: Marcus Rohrmoser <me+swicg@mro.name>
- Cc: public-swicg@w3.org
- Message-ID: <CAAqYVMBo_VjMB=mF2MLawUzkohbXSLsaLEzyPYc8uQjNp5sKyA@mail.gmail.com>
My understanding is that yes, ActivityPub does have account portability, but it's not implemented in a way that guarantees it. If a server host bans you, they're under no obligation to let you migrate your identity and data (posts) from one server to another.

As far as algorithmic moderation is concerned, as long as users can choose their algorithm, swap algorithms out, or choose not to use one, I don't see a problem. Don't we want flexibility and user choice? Maybe there will be some good content curation algorithms that help us have good information diets. There are also destructive ones, like Facebook's. Users should have the choice -- it shouldn't be up to developers how people decide to curate content.

Adam

On Sun, Apr 30, 2023, 5:30 AM Marcus Rohrmoser <me+swicg@mro.name> wrote:

> On 29 Apr 2023, at 15:29, hellekin wrote:
>
> > Composable, customizable curation and moderation is algorithmic
> > moderation. I'd rather not leave moderation to a machine.
>
> Definitely this has to come with appealability and responsibility, and
> it is beyond machine capability. It is a matter of personal dignity not
> to be decided upon by a machine.
>
> In civilised societies there are courts, judges and police to apply
> civil rules and finally settle disputes.
>
> Are we talking about arbitrary rules, like "you mustn't say Android in
> the Apple App Store", or about serious things that protect individuals?
> The latter is backed by civil law, right?
>
> /Marcus
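[Editorial illustration of the portability gap discussed above: Mastodon-style servers implement account migration with the ActivityStreams `Move` activity, which only the *origin* server can emit, so a banned user cannot force it. A minimal sketch, with all domains and account names hypothetical:]

```python
import json

# Sketch of an ActivityStreams "Move" activity, as used by Mastodon-style
# servers for account migration. The old (origin) server must send this to
# the user's followers; if it refuses (e.g. after a ban), migration stalls.
move_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Move",
    "actor": "https://old.example/users/alice",   # account being moved
    "object": "https://old.example/users/alice",  # what is moving (the actor)
    "target": "https://new.example/users/alice",  # destination account
}

# The destination actor must also point back at the origin via
# "alsoKnownAs", so followers' servers can verify the move is consensual.
new_actor_fragment = {
    "id": "https://new.example/users/alice",
    "alsoKnownAs": ["https://old.example/users/alice"],
}

print(json.dumps(move_activity, indent=2))
```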
Received on Sunday, 30 April 2023 12:59:56 UTC