Re: Safe harbo(u)rs: A structural proposal for interop

Thanks to both Vittorio and Hellekin for their feedback on the proposal. Some responses below:

Hellekin writes:

> define 'dominant companies' -- a lot of them will try and figure out a way to play underdog
> define the requirements -- it may be difficult to find a one-size-fits-all list of requirements

I think both of these questions misapprehend what's going on with the DMA and the ACCESS Act.

Both of these bills have statutory definitions - based on a mix of market cap and number of users - that define "gatekeepers" or "dominant platforms." The number of companies captured by these rules is quite small - fewer than ten - though exactly how many depends on the specific bill (and neither bill is law yet; both might change before passage, if they pass at all).

I'm arguing that we should endorse this position:

"However these bills define gatekeepers (and without endorsing that definition, which may well be insufficient), we think that rather than mandating compliance with a standard, the bills should mandate compliance with the requirements, with the standard constituting a safe harbour."

How the requirements are set is *also* not yet fully settled. The DMA discusses working with an SDO; the ACCESS Act describes creating a standardisation committee composed of a mix of reps from the platform, from new market entrants, from NIST, and from academia or civil society (the first version capped the number of participants from every group *except* the dominant platform; I helped the drafters fix this by pointing out that it could yield a committee consisting of 500 FB lawyers and engineers, a NIST lawyer, two engineers from startups, an EFF rep, and an academic!).

Either way, these bills contemplate that there *will* be a set of requirements produced, somehow. Whether or not my proposal is adopted, the eventual standard will be based on those requirements, so again, I'm asking that we endorse the position that:

"However these bills arrive at requirements (and without endorsing that process, which may be inadequate), we think that rather than mandating compliance with a standard, the bills should mandate compliance with the requirements, with the standard constituting a safe harbour."

I think that this group could very well produce a similar one-pager on how requirements should be derived, but I'm trying to decompose these techno-policy questions into individual questions on which we can find consensus and make clear recommendations.

So while Hellekin raises some good, chewy suggestions on how to derive requirements ("declare a set of wanted features, and group them into a charter"), I think that's a separate project from this one. I'm arguing that *irrespective* of how the requirements are derived, we should endorse the principle that the small number of dominant firms (however they are defined) should be given the choice of either complying with the requirements *or* adopting the standard and securing a safe harbour.

Both Hellekin and Vittorio raise the issue that allowing compliance through either a standard or a requirements-compliant API would increase the burden on smaller interoperators. Again, I think this reflects a misapprehension of what both DMA and ACCESS are seeking.

In both cases, there is no suggestion that all dominant firms will adopt the *same* standard. Rather, they contemplate that each platform will have its own interop standard because each does something different - Google is not Facebook, Facebook is not iOS. That means that any interoperator seeking to take advantage of these bills' new regimes will already have to implement a separate function, with separate libraries and API calls, for each platform they wish to interoperate with.
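
To make the per-platform-glue point concrete, here is a minimal, purely illustrative sketch. None of the names, endpoints or payload shapes below come from the DMA, the ACCESS Act or any real gatekeeper interface - they are hypothetical - but the structure is the same either way: the interoperator codes against a common internal interface and maintains one adapter per gatekeeper, whether each gatekeeper exposes a mandated standard or its own documented API:

    # Purely illustrative: hypothetical per-platform adapters behind a
    # common internal interface. Nothing here is drawn from any real
    # gatekeeper API or from either bill.
    from abc import ABC, abstractmethod


    class GatekeeperAdapter(ABC):
        """Common internal interface; one concrete adapter per gatekeeper."""

        @abstractmethod
        def send_message(self, recipient: str, body: str) -> dict:
            ...


    class PlatformAAdapter(GatekeeperAdapter):
        # Hypothetical platform "A": its own endpoint, field names and
        # (in a real implementation) its own auth scheme and libraries.
        def send_message(self, recipient: str, body: str) -> dict:
            return {
                "endpoint": "https://interop.platform-a.example/v1/messages",
                "payload": {"to": recipient, "text": body},
            }


    class PlatformBAdapter(GatekeeperAdapter):
        # Hypothetical platform "B": different endpoint, different schema,
        # different code - because B is not A.
        def send_message(self, recipient: str, body: str) -> dict:
            return {
                "endpoint": "https://api.platform-b.example/interop/send",
                "payload": {"recipient_id": recipient, "message_body": body},
            }


    # The interoperator's product keeps this per-platform glue either way.
    ADAPTERS = {"platform-a": PlatformAAdapter(), "platform-b": PlatformBAdapter()}

That glue has to exist under either regime; what changes is only which upstream interface it targets.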

So the question is: is it more burdensome to interoperate by adopting a standard or by coding against a documented API with a FLOSS reference implementation? Are there more opportunities for shenanigans in one or the other?

These are, effectively, game theory questions, and the answers depend on our views of when it is easier to get the referee (a judge or regulator) to punish cheating.

So let's say there's a mandatory, SDO-derived API. Dominant firms may seek to influence that API to make it less useful for competitors (I think this is *very* likely). Capturing an SDO is remarkably simple and has a well-developed playbook: large firms with lots of money can assign full-time staff to fill roles like editor, secretary, etc., and those roles wield enormous influence in standards development.

Even if there is a mandated structure for the committee - like the one contemplated by the revised ACCESS Act - there are effectively undetectable ways of securing this outcome. For example, if FB is only allowed two reps on the committee, each of them could be backed by 1,000 full-time FB employees who ghost-write their contributions, allowing the reps to "volunteer" to fill key administrative positions. Meanwhile, civil society groups and startups are left trying to fund the time for their own reps to check FB's mischief: FB produces a 1,000-page draft authored by hundreds of paid staffers but submitted under one staffer's name, and a startup has to parse and critique that draft with whatever time it can wring out of overstretched employees who are also running the day-to-day business.

Ultimately, the production of a good standard requires that the dominant platform's opponents - startups, civil society, outside experts, whoever - make a successful appeal to a referee (a regulator, a judge) about specific ways in which the standard is deficient. That appeal is likely to rest on comparing the standard to whatever requirements it is supposed to fulfill (again, we should think about how those requirements are derived, but that is a separate question - whatever move we make assumes that there will be requirements).

This is the *same* process for seeking redress in the event that a dominant firm opts to produce its own API and claim it satisfies the requirements and therefore complies with the law.

There are other avenues of attack open to dominant platforms; for example, they could adopt a compliant API (either the standard or one of their own devising) and then alter their internal operations to sabotage it. An example is the 2012 Massachusetts automotive Right to Repair law, which mandated that automakers provide access to diagnostic information traveling over a car's wired CAN bus. The automakers responded by creating in-car *wireless* networks that weren't covered by the mandate (in 2020, Massachusetts voters passed a ballot initiative plugging this hole, but the eight intervening years saw a mass exit of independent mechanics, who either changed careers or went to work for the dominant car makers' repair depots).

This is a profound weakness in either approach - mandating compliance with a standard *or* a safe harbour regime. I have other ideas about how to fix it (creating an "interoperators' defense" that can be used to counter cybersecurity, copyright, patent and contract claims against reverse-engineers, scrapers, etc.: https://pluralistic.net/2022/02/05/time-for-some-game-theory/#massholes), but again, I'm trying to break this down into digestible chunks.


Hellekin also writes, quoting my original proposal:

>> As part of this obligation, any firm that offered its own API could be mandated to offer a FLOSS reference implementation of a library for interacting with it.
>
> This I find difficult to support, because it puts the burden on the smaller actors again. Let me explain: it sounds a bit like ToS;DR. The approach chosen by ToS;DR is to scrape all companies' terms-of-service pages and watch for changes. One part of the work is to analyze the ToS and classify them on a scale of acceptability. This is very labour-intensive work, and it only grows as more companies have to be watched.
>
> Another, simpler approach would be to declare a set of wanted features and group them into a charter. Such a charter would make things very easy for everyone: users, lawyers, enforcers, and well-intentioned companies. People would have a single text to read, and could even have a plugin telling them whether a service complies with what they consider the minimum set of features they want. Obviously, companies relying on deception would not be able to agree to such a charter, as it would be dead easy to check for compliance.
>
> I think this kind of approach is to be preferred when trying to tackle such huge asymmetric power. If you put more of the burden on smaller players - as in requiring them to implement ever-changing corporate interfaces - then you're on the wrong path. We must aim for less work, not more.
>
> Besides, there are side effects that are not easy to take into account. For example, when Google Reader was discontinued, how many people lost links? Sure, they could save an OPML file with all their feeds. But nobody could rewrite years of web links pointing to Google Reader, so those links - and the network effect - were lost. We must be wary of such side effects, which are arguably unavoidable when so much content and so many relationships sit behind walled gardens.
>
>> I think a lot of the "standardization is bad for innovation" critique is in bad faith and basically advances a nihilistic, "Nothing can be done, this is as good as it gets" position meant to create paralysis.
>
> Sure, it's in the same vein as "free software is bad for business".

On 2/10/22 01:53, Vittorio Bertola wrote:
> 
> 
>> On 09/02/2022 15:14, Cory Doctorow <cory@eff.org> wrote:
>>
>> What do you all think?
> 
> There has been a lot of discussion around this problem in the European open source policy community that works on the DMA.
> 
> There seems to be rough consensus that mandating the use of a standard, rather than mandating the opening of a custom API, is a superior solution, because the API only enables "centralized" interop, i.e. each other app/provider talking to the gatekeeper and back, while a standard would naturally prompt full interoperation, i.e. everybody speaking with everybody else.
> 
> This would make a big difference especially for SMEs, startups and open source projects, which would never have the resources to keep up with multiple different APIs independently managed by each dominant/big player.
> 
> Also, by pointing to an open standard developed through an open process at one of the existing SDOs, we avoid the risk of the gatekeepers manipulating their APIs to make interop hard or impossible (e.g. by frequently making non-backwards-compatible changes - this is what YouTube does to break third-party downloaders all the time). There is some concern that open SDOs can be captured by the gatekeepers, and there will be safeguards against that, but this process would still be much more favourable to the community than the gatekeeper just doing its own thing or negotiating in private with the regulators.
> 
> The point around innovation is mostly moot; moreover, only the gatekeeper has obligations (all the others could actually choose not to interoperate), and those obligations only relate to "industry-standard features", i.e. a set of basic features that can be defined in a relatively objective way (for example, pick the top 10 services on the market and see which features are available in at least 7 of them). Also, Internet standardization has always worked in cycles: you standardize some features, then someone innovates on top of them, then the others copy, then the new feature is brought into the standard. We just need to keep doing what we have always done.
> 
> The FSFE actually managed to table an amendment to this effect that unfortunately was rejected. As a company coalition, we are now pushing the following text:
> 
> "through the use of open standards and through contractual and technical conditions that do not [unnecessarily] hinder the interoperation by any kind of third party providers[, including SMEs and non-commercial entities]."
> 
> It's very hard to push new text at this stage, but we'll see.
> 
