Re: Special Topic Call - Social Web and CSAM: Liabilities and Tooling

The report, Child Safety on Federated Social Media
<https://purl.stanford.edu/vb515nd6874>, mentions the CyberTipline API
<https://report.cybertip.org/ispws/documentation/> of the US-based National
Center for Missing & Exploited Children (NCMEC)
<https://www.missingkids.org/> and suggests that it would be useful to
provide mechanisms to make it easier to file reports using that API. (See
page 11.) US law apparently requires providers of electronic communication
services or remote computing services to make such reports. (See: 18 USC 2258A
<https://www.law.cornell.edu/uscode/text/18/2258A>). Also, US law requires
that the NCMEC forward reports to appropriate Federal, State, or foreign
authorities. The NCMEC says that during 2021 it sent 75,000+ take-down
notices to companies and that, on average, offending media was removed
within 27 hours. (See:
<https://www.missingkids.org/gethelpnow/isyourexplicitcontentoutthere#:~:text=In%202021%2C%20we%20sent%2075%2C000%2B%20notices%20to%20companies.%20On%20average%2C%20images%20were%20taken%20down%20within%2027%20hours.%C2%A0>)
I can't find any statement concerning the number of reports NCMEC made to
law enforcement agencies.
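
For concreteness, below is a rough sketch of what automated filing from an
instance's moderation tooling could look like. It is only an illustration:
the endpoint paths, XML fields, and response parsing are my assumptions
about the API and must be checked against the documentation linked above,
and ESP credentials have to be arranged with NCMEC before anything like
this could run.

# Rough sketch only. The endpoint paths ("/submit", "/upload", "/finish"),
# the XML schema, and the response format are assumptions to verify against
# https://report.cybertip.org/ispws/documentation/ before use.
import re
import requests

API_BASE = "https://report.cybertip.org/ispws"   # assumed production base URL
AUTH = ("esp-username", "esp-password")           # credentials issued by NCMEC

# 1. Open a report by POSTing report XML (minimal, hypothetical fields).
report_xml = """<?xml version="1.0" encoding="UTF-8"?>
<report>
  <incidentSummary>
    <incidentType>Child Pornography</incidentType>
    <incidentDateTime>2023-07-31T12:00:00Z</incidentDateTime>
  </incidentSummary>
  <reporter>
    <contactPerson><email>abuse@example.social</email></contactPerson>
  </reporter>
</report>"""
resp = requests.post(f"{API_BASE}/submit", data=report_xml.encode("utf-8"),
                     headers={"Content-Type": "text/xml; charset=utf-8"},
                     auth=AUTH)
resp.raise_for_status()
report_id = re.search(r"<reportId>(\d+)</reportId>", resp.text).group(1)

# 2. Attach the offending media to the report.
with open("reported-media.jpg", "rb") as media:
    requests.post(f"{API_BASE}/upload", params={"id": report_id},
                  files={"file": media}, auth=AUTH).raise_for_status()

# 3. Finish the report so it enters NCMEC's triage queue.
requests.post(f"{API_BASE}/finish", params={"id": report_id},
              auth=AUTH).raise_for_status()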

Some questions that might be addressed during the call, or in email prior
to it:

   - Are operators of ActivityPub instances considered to be "providers of
   electronic communication services or remote computing services" who, if in
   the USA, have a legal obligation to make reports to the NCMEC?
   - Do laws in other nations require CSAM reporting and/or establish
   organizations like the NCMEC to receive such reports?
   - Are there any other US or non-US laws that might require ActivityPub
   instance operators to make reports of other kinds of illegal speech?
   - Have any operators of ActivityPub services received NCMEC take-down
   notices? Is this a common or an unusual event?
   - Have any users of ActivityPub services been prosecuted for
   distributing CSAM via ActivityPub? If so, did any of those prosecutions
   result from reports to the NCMEC?
   - Does anyone know whether NCMEC reports are required for
   computer-generated media that does not depict actual children (i.e.
   CG-CSAM)?
   - If the operators of an instance did, in fact, make a practice of
   filing NCMEC or similar reports, would it be useful to announce this on
   their site? If so, would it be useful to define some standard means by
   which instance operators could announce that they do so? (e.g. some site
   metadata, a standard badge, etc.? A hypothetical sketch follows this
   list.)
   - The 25 Mastodon instances studied in the report included Japanese
   instances, which are known to carry lots of CSAM. How much of a problem
   is CSAM on non-Japanese instances?
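
Regarding the metadata/badge question above: no such standard exists
today, but as a purely hypothetical illustration, an instance could
publish a small JSON document at a well-known URL describing its reporting
practices. The path, the field names, and the example.social domain below
are all invented for the example.

# Purely hypothetical: there is no agreed metadata standard for announcing
# CSAM-reporting practices. The path "/.well-known/csam-reporting" and all
# field names below are invented for illustration.
import json

metadata = {
    "reports_to": ["NCMEC CyberTipline"],                  # bodies the operator reports to
    "jurisdiction": "US",                                   # where the operator is established
    "abuse_contact": "mailto:abuse@example.social",         # hypothetical contact address
    "policy": "https://example.social/about/csam-policy",   # hypothetical policy page
    "last_updated": "2023-08-01",
}

# Written to the instance's static files, to be served at e.g.
# https://example.social/.well-known/csam-reporting
with open("csam-reporting.json", "w") as f:
    json.dump(metadata, f, indent=2)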

bob wyman

On Mon, Jul 31, 2023 at 12:48 PM Dmitri Zagidulin <dzagidulin@gmail.com>
wrote:

> Hi everyone,
>
> In light of the recent report Addressing Child Exploitation on Federated
> Social Media
> <https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media> and
> the many important resulting conversations (such as this megathread
> <https://mastodon.social/@det@hachyderm.io/110782896576855419>),
> SWITCH would like to host a Special Topic Call on "Social Web and CSAM:
> Liabilities and Tooling", this coming Friday, August 4th, 2023, at 9am
> Eastern / 6am Pacific / 3pm CET, at:
>
> https://meet.jit.si/social-web-cg
>
> We're very excited to be joined by special guests, David Thiel and Alex
> Stamos, from the Stanford Internet Observatory!
>
> The Chairs
>

Received on Monday, 31 July 2023 20:38:38 UTC