Re: Special Topic Call - Social Web and CSAM: Liabilities and Tooling

Inline

On 8/4/2023 2:03 PM, David Somers wrote:
> I'd like to attend the meeting, but unfortunately I have a scheduling 
> clash, so instead some comments and observations:
>
> fediverse-csam-report takes a very USA-centric perspective, especially 
> with regard to classifying and reporting csam, so its direct 
> application in other jurisdictions is limited; what would be very 
> useful is what applies elsewhere; for the sites it examined a 
> breakdown of the numbers by jurisdiction would be better.
Stanford is a US university, after all! That said, judging by today's 
country stats from fedidb.org, I'm not sure the direct application will 
be THAT different in the EU, which hosts the lion's share of fediverse 
servers and users, once the Digital Services Act goes into effect. For 
recent EU/EC guidance on CSAM liabilities and best practices, see the 
report that user @Federico shared on the SocialHub thread 
<https://socialhub.activitypub.rocks/t/about-child-safety-on-federated-social-media/3447/9> 
about this issue. CSAM is, like tax evasion and money laundering, a 
global problem that tends to seek out the weakest link in the global web 
anyway, regardless of liability regimes; even server operators under no 
legal or regulatory obligation to keep it off their servers may not want 
the reputation that comes with being that weak link.
>
> I think csam is a bit of a distraction from the underlying root issue 
> which is that of being able to undertake (effective) moderation; 
> moderation encompasses the gamut of csam, hate speech, etc.; 
> furthermore what is applicable in one jurisdiction may not be 
> applicable in others, and needless to say things can get more than 
> complicated when working across jurisdictions vis-à-vis liability and 
> reporting requirements. And then there's a whole thing about CW and 
> user or system filtering.
Yup. The above-linked thread and others on SocialHub, as well as on the 
IFTAS Matrix server, have been detailing the intricacies of these 
related problems. The trick is to craft interoperable solutions that 
layer onto and interact with each other, rather than one-off fixes to 
each problem in isolation. This is very much what SWICG's wheelhouse 
should be, if I understand the W3C as well as I hope to: encouraging the 
most efficient and modular solutions rather than per-problem, 
per-implementation quickfixes 
<https://socialweb.coop/blog/firefish-cloudflare-quickfix-r2-tutorial/>.
>
> I think it would be useful to have some conversations with PhotoDNA, 
> SafeSearch, NCMEC, etc, about how to work with them... or whether to 
> funnel through "ActivityPub extensions for attestation" as proposed in 
> section 5.5. is the way (in which case, who is going to develop, 
> maintain, and administer such a service).
I sincerely believe that a FEP specifying how to express and parse such 
per-asset attestations, with an extension vocabulary and test vectors 
(if not a full test suite), would be the way to go; I would prefer they 
not be "API-locked" to a given commercial solution, however, and that is 
hardly a trivial assessment to make, even for a single implementation.
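
For concreteness, here is a minimal sketch (a Python dict standing in 
for whatever JSON-LD shape a FEP would actually pin down) of what one 
per-asset attestation could look like. Every type, property name, and 
context URL below is a hypothetical placeholder, not an existing 
vocabulary:

    # Hypothetical sketch only: the extension @context URL, the
    # "MediaAttestation" type, and every property name below are
    # placeholders that an actual FEP would have to define.
    attestation = {
        "@context": [
            "https://www.w3.org/ns/activitystreams",
            "https://example.org/ns/media-attestation/v0",  # placeholder
        ],
        "type": "MediaAttestation",
        # Key the attestation to a digest rather than a URL, so it
        # survives re-uploads and mirroring of the same asset.
        # (Example value: sha256 of the bytes b"test".)
        "assetDigest": "sha256:9f86d081884c7d659a2feaa0c55ad015"
                       "a3bf4f1b2b0b822cd15d6c15b0f00a08",
        "scanner": "https://scanner.example/",  # service making the claim
        "result": "no-match",                   # e.g. match / no-match / error
        "published": "2023-08-04T12:00:00Z",
    }

The hard part is the assetDigest line: an attestation keyed to a hash 
that only one vendor can compute is useless to everyone else, which is 
what the next paragraph is about.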

Section 5.5 did not strike me as a normative proposal, just a kind of 
easily (or already) prototyped one-off solution; generalizing it to work 
across implementations, many of which may be uncomfortable using 
PhotoDNA or competitors of similar size and business model, will 
definitely be a more complex discussion than fits in a one-hour CG 
meeting. It would probably be easier to design an adequately flexible 
attestation-sharing mechanism by working backwards from a media-asset 
identity/deduplication mechanism, since it's only worth sharing such 
per-asset information if all parties share the same identification 
scheme for those assets. Content-addressing of the sort IPFS is famous 
for might make more sense than organizing all attestations around the 
hash function PhotoDNA uses... and might also help with deduplication. 
Orthogonally, "distance-hashing" or "perceptual hashing" à la ISCC might 
make sense for some large-scale media-heavy servers, but those are 
possible solutions way down the road from agreeing on the shape of the 
shared problem :D
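
To illustrate the difference between those two identifier families, 
here's a minimal Python sketch (assuming Pillow is installed; the 
average-hash below is a toy stand-in for real perceptual hashes like 
PhotoDNA, PDQ, or ISCC). A cryptographic content address only matches 
byte-identical files, while a perceptual hash puts visually similar 
images at a small Hamming distance:

    import hashlib
    from PIL import Image  # Pillow; assumed available

    def content_address(data: bytes) -> str:
        """Exact identity: two assets share this ID only if byte-identical."""
        return "sha256:" + hashlib.sha256(data).hexdigest()

    def average_hash(path: str, size: int = 8) -> int:
        """Toy perceptual hash: downscale, grayscale, threshold at the mean.
        Visually similar images (re-encoded, resized) land on nearby values."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits

    def hamming(a: int, b: int) -> int:
        """Distance between two perceptual hashes; small means 'looks alike'."""
        return bin(a ^ b).count("1")

A content address answers "have I seen exactly these bytes?", which is 
what deduplication and attestation-keying need; a perceptual hash 
answers "have I seen something that looks like this?", which is the 
much harder question the large scanning vendors specialize in.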

In any case, we'll take good minutes of the meeting, and I'm sure the 
[broader moderation-tooling, reporting, and attestation-sharing] 
conversations will continue here on-list and in all the venues linked 
above.

>
> best,
> David Somers
>
> On Mon, Jul 31, 2023, at 12:47, Dmitri Zagidulin wrote:
>> Hi everyone,
>>
>> In light of the recent report Addressing Child Exploitation on 
>> Federated Social Media 
>> <https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media> and 
>> the many important resulting conversations (such as this megathread 
>> <https://mastodon.social/@det@hachyderm.io/110782896576855419>), 
>> SWICG would like to host a Special Topic Call on "Social Web and 
>> CSAM: Liabilities and Tooling", this coming Friday, August 4th, 2023, 
>> at 9am Eastern / 6am Pacific / 3pm CET, at:
>>
>> https://meet.jit.si/social-web-cg
>>
>> We're very excited to be joined by special guests, David Thiel and 
>> Alex Stamos, from the Stanford Internet Observatory!
>>
>> The Chairs
>
-- 
@bumblefudge in a few places, including https://chainagnostic.org
