[docs-and-reports] Align our principles with what we're building (#58)

martinthomson has just created a new issue for https://github.com/patcg/docs-and-reports:

== Align our principles with what we're building ==
This came up in the discussion on #52.  It was not clear that the text about researchers and auditors was consistent with the sorts of things that are possible within the systems we're building.  At that time, I said:

If you are looking for a problem statement, try this on for size:

Any system that handles user data in the aggregate needs to provide strong constraints that limit the possibility that the data is misused.  However, the uses of data that are permitted within those constraints might still admit narrower forms of abuse.  For measurement, this might involve selectively targeting individuals or groups of individuals in order to obtain increasingly actionable data about their online activities.

For advertising purposes, this sort of targeting is often a primary goal of measurement systems. A problem arises when this targeting is repeated to the point that it puts individuals at greater risk of exploitation based on the information that is obtained.
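To make the risk of repetition concrete, here is a minimal sketch of why repeated queries against a noise-protected aggregate are dangerous.  All names (`noisy_query`, `repeated_attack`, the noise scale) are illustrative assumptions, not drawn from any concrete proposal: the point is only that averaging many noisy answers to the same question cancels the noise.

```python
import random

# Hypothetical sketch: a measurement API answers an aggregate query
# (say, a count for some audience segment) with calibrated noise added.
# The names and parameters here are illustrative, not from any proposal.

TRUE_COUNT = 42     # the sensitive aggregate the system protects
NOISE_SCALE = 10.0  # standard deviation of the noise added per answer

def noisy_query(rng: random.Random) -> float:
    """One protected answer: the true value plus Gaussian noise."""
    return TRUE_COUNT + rng.gauss(0.0, NOISE_SCALE)

def repeated_attack(n: int, seed: int = 0) -> float:
    """Average n answers to the *same* query; the noise averages out."""
    rng = random.Random(seed)
    return sum(noisy_query(rng) for _ in range(n)) / n

# A single answer is uncertain to within roughly NOISE_SCALE, but the
# average of many answers concentrates around the protected value, which
# is why systems must budget or rate-limit repeated queries.
```

With a few thousand repetitions the estimate typically lands within a small fraction of `NOISE_SCALE` of the true value, so a per-answer noise guarantee alone does not prevent the targeted, repeated measurement described above.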

The distinction between abusive uses and ordinary uses of these systems could be hard to make without additional information about the inputs to the system.

The measurement systems being proposed all rely on oblivious computation to some degree.  This means that access to their internal operation reveals no meaningful information.  As a result, most of the information of interest is held by companies in the advertising market: ad techs, publishers, and advertisers.
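As a minimal sketch of why inspecting a helper's internal operation reveals nothing, here is additive secret sharing, one common building block for oblivious aggregation.  The helper roles, modulus, and function names are illustrative assumptions; real proposals differ in detail.

```python
import secrets

MOD = 2**32  # arithmetic in a fixed ring (illustrative choice)

def share(value: int, n_helpers: int = 2) -> list:
    """Split a value into additive shares; each share alone is uniform noise."""
    shares = [secrets.randbelow(MOD) for _ in range(n_helpers - 1)]
    last = (value - sum(shares)) % MOD
    return shares + [last]

def reconstruct(shares: list) -> int:
    """Only the sum of all shares recovers the original value."""
    return sum(shares) % MOD

# Each helper sums its own shares across many reports; combining the two
# per-helper totals yields the aggregate without either helper ever
# seeing an individual report.
reports = [3, 5, 7]
helper_a, helper_b = [], []
for v in reports:
    a, b = share(v)
    helper_a.append(a)
    helper_b.append(b)

aggregate = (sum(helper_a) + sum(helper_b)) % MOD  # equals sum(reports)
```

Inspecting either helper shows only uniformly random values, which is the sense in which access to the system's internal operation reveals no meaningful information; the interesting data lives in the inputs and the revealed aggregates.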

The key challenge in accessing that information is that anything needed to detect abuse is also virtually guaranteed to be commercially sensitive.  Revealing information about the conduct of measurement also reveals how advertisers place their advertisements, how they structure their bidding strategies, and even details of their clients.

It might be possible for an independent researcher or auditor to gain access to this sort of information.  They might be able to convince participants to allow access to the information for certain narrow purposes.  The current environment does not establish good incentives for market participants to accede to that sort of inspection.  Inspection carries risks both to that commercially sensitive data and to the reputation of the advertiser, with no real upside.

The question we need to ask is whether any change to how the system operates might make it more open to these sorts of aggregate, independent accountability mechanisms.  In doing so, we need to balance those goals against the commercial sensitivities of advertising participants, while sustaining the high standards we have set for privacy.

_Originally posted by @martinthomson in https://github.com/patcg/docs-and-reports/pull/52#discussion_r1505372082_

Please view or discuss this issue at https://github.com/patcg/docs-and-reports/issues/58 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config

Received on Wednesday, 10 April 2024 19:07:35 UTC