Re: Special Topic Call - Social Web and CSAM: Liabilities and Tooling

Responses to Bob's questions inline, with the biggest "I AM NOT A LAWYER"
disclaimer ever. My responses draw on my experience putting together, with
the Harvard CyberLaw Clinic, a legal guide for small federated server
operators that we plan to release this fall.

I answer the questions that I have some information about; I leave the
others quoted but unanswered in case someone is reading the thread and
wants to chime in.

On Mon, Jul 31, 2023 at 1:39 PM Bob Wyman <bob@wyman.us> wrote:

>
> Some questions that might be addressed during the call, or in email prior
> to it:
>
>    - Are operators of ActivityPub instances considered to be "providers
>    of electronic communication services or remote computing services" who, if
>    in the USA, have a legal obligation to make reports to the NCMEC?
>
Yes. Critically, the legal obligation arises once an operator becomes
aware of the presence of CSAM. An operator is not legally obligated to
actively scan or search for CSAM, and is not on the hook for hosted
material they do not know about; rather, they must report to NCMEC once
they become aware of it. They must then retain a copy of the material for
90 days (I believe to assist law enforcement if needed) and delete it
after that period.
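
For operators wondering what the retention requirement looks like
mechanically, here is a minimal sketch (in Python; every path and field
name below is hypothetical, invented for illustration rather than taken
from the statute or from NCMEC guidance) of a cleanup job that deletes
retained copies once the 90-day window has elapsed. It assumes that, at
report time, the material was copied into a quarantine directory alongside
a JSON sidecar recording a timezone-aware ISO 8601 report date:

    #!/usr/bin/env python3
    # Hypothetical sketch: retain reported material for 90 days, then
    # delete it. Assumes each reported item was copied into QUARANTINE_DIR
    # when the NCMEC report was filed, as <id>.bin plus an <id>.json
    # sidecar whose "reported_at" field is a timezone-aware ISO 8601
    # timestamp. All names here are invented for illustration.
    import json
    import pathlib
    from datetime import datetime, timedelta, timezone

    QUARANTINE_DIR = pathlib.Path("/var/lib/instance/ncmec-quarantine")
    RETENTION = timedelta(days=90)

    def purge_expired(now=None):
        now = now or datetime.now(timezone.utc)
        for meta_path in QUARANTINE_DIR.glob("*.json"):
            meta = json.loads(meta_path.read_text())
            reported_at = datetime.fromisoformat(meta["reported_at"])
            if now - reported_at >= RETENTION:
                # The 90-day window has passed: delete the retained copy
                # and its metadata.
                meta_path.with_suffix(".bin").unlink(missing_ok=True)
                meta_path.unlink()

    if __name__ == "__main__":
        purge_expired()

Run from cron once a day. The point is only that "retain for 90 days, then
delete" is simple to honor mechanically once reported material is
segregated from the instance's normal media storage.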

>
>    - Do laws in other nations require CSAM reporting and/or establish
>    organizations like the NCMEC to receive such reports?
>    - Are there any other US or non-US laws that might require ActivityPub
>    instance operators to make reports of other kinds of illegal speech?
>
Under US law, the only reporting requirement I am aware of is CSAM to
NCMEC. I can't speak to the rest of the world.


>
>    - Have any operators of ActivityPub services received NCMEC take-down
>    notices? Is this a common, or an unusual event?
>    - Have any users of ActivityPub services been prosecuted for
>    distributing CSAM via ActivityPub? If so, were any of them as a result of
>    reports to NCMEC?
>    - Does anyone know if NCMEC reports are required for
>    computer-generated media that does not depict actual children? (i.e.
>    CG-CSAM)
>
The answer here is "no one knows for sure," but Thiel et al. have another
paper on CG-CSAM that states:

    Note that while some CG-CSAM which does not qualify as photorealistic
    may be legally classified as obscenity and thus not subject to the
    same legal reporting standards as CSAM under 2256, instances of
    photorealistic CG-CSAM that could reasonably be a depiction of a known
    victim will need to be reported by platforms.

Source:
https://stacks.stanford.edu/file/druid:jv206yg3793/20230624-sio-cg-csam-report.pdf


>    - If the operators of an instance did, in fact, make a practice of
>    filing NCMEC or similar reports, would it be useful to announce this on
>    their site? If so, would it be useful to define some standard means by
>    which instance operators could announce that they do so? (e.g. Some site
>    metadata, standard badge, etc?)
>    - The 25 Mastodon instances studied in the report included Japanese
>    instances which are known to carry lots of CSAM. How much of a problem is
>    CSAM on non-Japanese instances?
