[wg/pat] Formal Objection (charter review 2022)

From
   https://www.w3.org/2002/09/wbs/33280/PATWG-charter-2022/results

The Charter for this group states that its scope is to improve privacy 
in relation to digital advertising.

[W3C member] shares this group’s tenet that advertising as a 
funding model for the web is beneficial. Along with many other W3C 
members, [W3C member] supports efforts to ensure advertising remains an 
effective and efficient funding mechanism available to digital 
businesses. The risks to the open web are well documented, should
proposed mechanisms for advertising fail to adequately support 
decentralized digital properties:

“Second, blocking cookies without another way to deliver relevant ads 
significantly reduces publishers’ primary means of funding, which 
jeopardizes the future of the vibrant web. Many publishers have been 
able to continue to invest in freely accessible content because they can 
be confident that their advertising will fund their costs. If this 
funding is cut, we are concerned that we will see much less accessible 
content for everyone. Recent studies have shown that when advertising is 
made less relevant by removing cookies, funding for publishers falls by 
52% on average.”
[Building a more private web]

Advertising is a business process

Advertising is a business process. Digital advertising is paid by a 
marketer to a media owner, and hence is primarily a business-to-business 
process. Of course, the goals of both marketers and media owners are to 
attract, engage and achieve positive responses from consumers who are 
exposed to such paid content. To support the needs of responsible 
advertising, the Charter would benefit from clarifying some concepts.

Given that both publishers and marketers wish to understand which ads 
drive business outcomes better than others, some feedback loop is 
required. Ensuring some controls exist to act on this feedback 
enables marketers to improve the effectiveness of their spend, driving 
increased revenues for publishers. This exchange of information between 
publisher and marketer organizations is by definition a cross-context or 
cross-site exchange of data.
We do not believe the above is in dispute, as the Charter explicitly 
places cross-context and cross-site processing of Personal Information 
for advertising business purposes, such as remarketing, frequency 
capping and attribution, in scope.

Improving privacy when conducting business processing

We are all interested in improving privacy when conducting business 
processing of Personal Information.
We note that, in forming this working group, the drafters were unable to 
align on a clear definition of “privacy,” and instead took appropriate 
risk mitigation as the group’s focus.

To mitigate risks to specific individuals, the group often focuses on 
the output of data provided after the advertising business processing 
has been conducted. However, to appropriately address the concerns 
raised, we believe the Charter must also address concerns regarding the 
collection and processing of the input Personal Information.

Appropriate processing of Personal Information should be the focus, 
rather than which organization does such processing. There should be 
multiple mechanisms for web authors and media owners to work with 
business partners of their choice, rather than relying exclusively on 
consumer software manufacturers who also offer business advertising 
solutions for such processing. All actors must comply with regional data 
protection regulations, so any additional risk mitigations ought to be 
explicitly defined in terms of how a proposal reduces risks when an 
implementor processes Personal Information.

[W3C member] is concerned that, without modifications to the Charter 
language, there is a high risk of restricting competition for improved 
digital advertising while we seek to improve privacy outcomes for 
individuals.

We believe that appropriately addressing these issues during the Charter 
stage will improve the overall focus of the group and the utility of its 
work product. It will allow a full discussion of the range of use cases 
that may arise, without arbitrary limitation or framing that privileges 
particular businesses or their views on the appropriate role of the 
technology when it intersects important policy issues.

In line with the above, [W3C member] has grave concerns and, in a 
constructive spirit, raises a Formal Objection regarding the Private 
Advertising Technology Working Group Charter to ensure we can 
appropriately mitigate some potential, unintended consequences in the 
standard specifications this group proposes to develop.

Issue 1: The Charter is silent on which risk mitigation methods are in 
scope to address concerns
The Charter text states:

"The Working Group may consider designs that allow user agents for the 
same user — including non-browser agents, like Operating Systems — to 
collaborate in providing advertising features."

The Charter would be greatly improved by listing specific concerns and 
mitigation methods that it will pursue for this collection and 
processing of the input Personal Information, especially when 
collaborating in providing advertising features. For example, the 
Charter might rely on meaningfully informed consumer consent as the 
basis for reducing risk from such processing. If it were to pursue this 
approach, then how to meaningfully inform consumers of business 
processing of Personal Information would also be in scope.

Assuming it is useful for consumers to select the business software used 
by the digital properties they visit, one would logically expect the 
Charter not to limit the choices a user has to enable different software 
providers to “collaborate in providing advertising features.”

Consumers are unlikely to be informed of all business processing 
purposes or solution providers that support the digital properties they 
frequent, and it is questionable whether burdening them with such 
education is an appropriate approach to the necessary handling of 
business-initiated exchanges of innocuous Personal Information in our 
data-driven world. Yet without such information about business 
processing, consumers cannot make meaningful decisions regarding it. If, 
alternatively, one believes consumers should select which business 
solutions the digital properties they frequent use, then consumers ought 
to be free to choose other software solutions to perform this business 
processing, rather than having their choice limited exclusively to 
consumer software manufacturers.

We note that some reliance on software processing outside the user agent 
is contemplated in the Charter when it mentions specific use cases 
(e.g., Private Attribution). It would therefore be useful to further 
define which factors determine whether such processing is considered 
appropriate or, alternatively, increases risks to individuals.

If we look to the risks identified by data protection regulations, they 
include re-identification risks and illegal discrimination by using 
special category sensitive information, even when such data is not 
linked to a specific individual’s identity.

The Charter would be improved by making explicit which safeguards must 
be in place in any software that will collect and process the input 
Personal Information, and how any business software could apply the safe 
processing of cross-context and cross-site data that the current draft 
envisages being conducted by OS and browser manufacturers for 
advertising purposes. Examples of mitigation measures might include:

1) relying solely on de-identified input data for all processing, rather 
than identity-linked data,
2) conducting such business processing only with prior explicit, opt-in 
consent, or
3) ensuring audit logs for the records of such processing activity are 
easily available to end users.

Indeed, the Charter notably omits this final point: establishing 
practices and technologies that can aid software designers in proving 
compliance with risk-mitigated data processing.
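
Purely for illustration (all names and mechanisms below are hypothetical 
and not drawn from any proposal before the group), measures 1) and 3) 
might be combined as follows: processing events are keyed to a salted 
pseudonym rather than an identity, and the resulting audit trail can be 
exported to the end user on request.

    # Hypothetical sketch: de-identified processing keys plus a
    # user-accessible audit log, illustrating measures 1) and 3) above.
    import hashlib
    import secrets
    import time
    from collections import defaultdict

    class AdProcessingAuditLog:
        def __init__(self):
            # Per-account random salt so the pseudonym cannot be reversed
            # to an identity without the separately held salt.
            self._salts = defaultdict(lambda: secrets.token_hex(16))
            self._events = defaultdict(list)

        def _pseudonym(self, account_id: str) -> str:
            # De-identified key derived from a random salt, never exposed
            # alongside the account identifier itself.
            digest = hashlib.sha256(
                (self._salts[account_id] + account_id).encode()
            ).hexdigest()
            return digest[:16]

        def record(self, account_id: str, purpose: str,
                   data_categories: list) -> None:
            # Log what was processed and for which business purpose,
            # keyed only by the pseudonym.
            self._events[self._pseudonym(account_id)].append({
                "timestamp": time.time(),
                "purpose": purpose,                  # e.g. "frequency capping"
                "data_categories": data_categories,  # e.g. ["page category"]
            })

        def export_for_user(self, account_id: str) -> list:
            # Measure 3): the audit trail is easily available to the end user.
            return list(self._events[self._pseudonym(account_id)])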

Issue 2: The Charter is silent on how appropriate cross-context or 
cross-site business processing can be conducted
The Charter text states:

Here "privacy" minimally refers to appropriate processing of personal 
information.

The Charter admirably seeks to improve privacy for individuals when 
interacting with ad-funded digital businesses.

The Charter explicitly mentions the collection and processing of 
Personal Information for business advertising needs such as 
“remarketing,” “frequency and recency controls” (aka “frequency 
capping”) and “attribution.”
Given all the above business use cases require collection of user 
activity cross-site and cross-context to perform the appropriate 
processing that produces:

1. the output eligibility for a specific business’ advertising campaign 
(e.g., remarketing),
2. restricting delivery based on marketer-initiated parameters (e.g., 
frequency capping), and
3. the value to a marketer for prior exposures across media owners’ 
inventory (e.g., attribution),

the group expects that it is possible for such data collection and 
processing to be done appropriately.

If software envisaged by this Charter is to responsibly process 
cross-site or cross-context information for digital advertising (such as 
frequency capping and attribution), it would be useful to explain how 
such processing improves end user privacy.
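
As a purely illustrative sketch (hypothetical names, not a proposal from 
the Charter), a frequency-capping decision can be expressed as a 
cross-context exposure count keyed to a pseudonymous identifier rather 
than to an identity:

    # Hypothetical sketch: frequency capping needs cross-context exposure
    # counts, but can operate on a pseudonymous identifier rather than an
    # identity-linked profile.
    from collections import Counter

    class FrequencyCapper:
        def __init__(self, cap_per_campaign: int):
            self.cap = cap_per_campaign
            self.exposures = Counter()  # (pseudonymous_id, campaign_id) -> count

        def record_exposure(self, pseudonymous_id: str, campaign_id: str) -> None:
            self.exposures[(pseudonymous_id, campaign_id)] += 1

        def may_serve(self, pseudonymous_id: str, campaign_id: str) -> bool:
            # Suppress delivery once the marketer-set cap is reached.
            return self.exposures[(pseudonymous_id, campaign_id)] < self.cap

    # With a cap of 3, a fourth impression of the same campaign is suppressed.
    capper = FrequencyCapper(cap_per_campaign=3)
    for _ in range(3):
        capper.record_exposure("a1b2c3", "campaign-42")
    assert capper.may_serve("a1b2c3", "campaign-42") is False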

However, the Charter fails to describe how to distinguish appropriate 
from inappropriate cross-context and cross-site data 
collection and processing to support these business use cases. Without 
such guidance referenced in the Charter, it is difficult to understand 
which proposals or data processing practices would or would not properly 
be in scope.

We recognize that there are a wide range of perspectives on defining 
“privacy” as well as what is “appropriate” – and indeed for this very 
reason would like to ensure the Working Group remains open to 
responsible uses of data exchanges that benefit individuals, 
particularly from smaller organizations that by definition of their size 
must rely on more partners than their larger rivals.

Issue 3: The Charter is silent on how trade-offs in utility will be 
weighed against reductions in risk

The Charter focuses on addressing the needs of individual web users, 
which must be balanced with those of businesses involved in digital 
advertising. When a particular topic is too subjective, it may make 
sense not to preclude future innovation by overly constraining how 
software may function, but instead to err on the side of openness and 
choice.
As regards the Charter, this is simply the principle of keeping a more 
balanced perspective at the chartering stage. For example, there are 
serious concerns about arbitrary restrictions on some types of highly 
responsible and secure data handling on the basis that it involves 
moving data between sites or contexts. This is not always harmful and 
sometimes can be very helpful (e.g., frequency capping can reduce the 
annoyance that comes from overexposure). The Charter ought to bear in 
mind these forms of use cases. We feel that the Charter can be improved 
by making explicit its intent to balance the rights of users with the 
freedoms required to support the advertising needs of decentralized open 
web publishers. If helpful and beneficial data exchanges are 
restricted, this could have negative implications for end users as well 
as society.

The Charter ought to identify the weighting criteria involved in 
balancing the trade-offs between improved effectiveness for business 
outcomes and reduced risks to specific individuals from the business 
processing of their Personal Information. While the weighting may be up 
for debate, the Charter ought to list the criteria it will use in making 
such judgement calls. A potential list of business factors could include:

• efficiency of exchanges required for digital advertising,
• relative ability of publishers to generate revenue from digital 
advertising,
• relative effectiveness of paid media for marketers,
• cost of implementation by publishers or marketers, and
• foreseeable impacts on competition.

Of course, the above criteria ought also to be balanced with some 
practical improvement for individuals. Accordingly, it would be useful 
to list which risks are eliminated by specific proposals, even if they 
reduce utility or increase costs.
As part of the risk-mitigation measures, data protection regulations 
provide guidance on the appropriate privacy-by-design measures, which 
are largely consistent across regions. These same regulations clearly 
articulate risk-mitigation measures associated with the processing of 
Personal Information that are meant to appropriately balance the rights 
of individuals with the freedoms of the organizations with whom they 
interact.

The Charter would be improved by similarly and explicitly recognizing 
this need for balance.

Issue 4: The Charter should ensure risk mitigation is proportional to 
the concern of processing Personal Information

If we look to data protection regulations, they suggest measures to 
reduce the risk to specific individuals from the processing of Personal 
Information.

Data protection regulations (e.g., CPRA & HIPAA in the US and GDPR in 
the EEA) share the same goal of providing guidance on how to support 
responsible sharing of Personal Information, rather than preventing all 
sharing among organizations.

Differing regional data protection regulations supply largely conforming 
definitions of Personal Information (aka Personal Data). These same data 
protection regulations distinguish aggregate anonymous data from the 
concept of Personal Information. More importantly, these regulations 
distinguish identity-linked Personal Information, which poses higher 
risk, from pseudonymized Personal Information, which poses lower risk.

Many organizations that highlight their support for consumer privacy 
emphasize this key distinction. As just one example, Apple’s privacy 
policy notes that its software relies on such pseudonymous “random 
identifiers” that are not linked to the identity of Apple customers:

“Apple News leaves what you read off the record.
Apple News delivers content based on your interests, but it isn’t 
connected to your identity. So Apple doesn’t know what you’ve read.
Many news sources keep track of your identity and create a profile of 
you. Apple News delivers personalized content without knowing who you 
are. The content you read is associated with a random identifier, not 
your Apple ID.
You get editor-curated content and a personalized newsfeed so you can 
stay up to date with the latest news and stories. And because Apple News 
uses machine learning, the more you use it, the better your app gets to 
know what you like — without Apple ever knowing what you’re into.”
[Apple Privacy Policy]

The Charter would be improved by ensuring that proposals to responsibly 
share lower risk input data for digital advertising are within scope.
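
To make the distinction concrete, the following sketch (our own 
illustration, not Apple’s or any regulator’s actual mechanism) shows the 
same reading event represented at the three risk tiers discussed above: 
identity-linked, pseudonymized via a random identifier, and aggregate 
anonymous.

    # Hypothetical illustration of the risk tiers distinguished above.
    import secrets
    from collections import Counter

    # Identity-linked: highest risk, the activity is tied to an account.
    identity_linked = {"account_email": "reader@example.com",
                       "article": "sports/match-report"}

    # Pseudonymized: a random identifier replaces the identity; any mapping
    # back to the account, if retained at all, is held separately.
    random_id = secrets.token_hex(8)
    pseudonymized = {"random_id": random_id,
                     "article": identity_linked["article"]}

    # Aggregate anonymous: only per-article counts remain, with no per-user key.
    aggregate = Counter()
    aggregate[pseudonymized["article"]] += 1

    print(pseudonymized)    # interests keyed to a random identifier, not an account
    print(dict(aggregate))  # {'sports/match-report': 1}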

Issue 5: The Charter should afford users the same risk mitigation 
measures equally across all software manufacturers, regardless of 
whether a consumer-software manufacturer or a business-software 
manufacturer provides the advertising solution

The Charter text states:

“Ways in which new features might enable inappropriate processing 
include (but are not limited to) enabling of cross-site or cross context 
recognition of users or enabling same-site or same-context recognition 
of users across the clearing of state.

The Working Group may consider designs that allow user agents for the 
same user — including non-browser agents, like Operating Systems — to 
collaborate in providing advertising features.”

The Charter fails to outline which safeguards must be put in place by OS 
or browser manufacturers, or other user agents, in their own collection 
and processing of personal information, to address concerns related to 
cross-site or cross-context recognition.

Most OS and browser manufacturers offer individuals the ability to 
register an account that discloses their identity to this consumer 
software. There must be some technical or organizational measures to 
ensure such organizations can collect and process digital activity, but 
not mingle it in ways that would increase risks to specific individuals’ 
privacy. This recalls the example cited above, where cross-site or 
cross-context information is associated with a random identifier rather 
than linked to an individual’s identity.

The Charter could be improved by clarifying how other entities can rely 
on mechanisms similar to those contemplated for user agents, browsers or 
operating systems to reduce the risk to specific individuals associated 
with the collection and processing of Personal Information.

It is practical to ensure we design solutions that enable individuals to 
disable recognition of the same user after clearing state. However, the 
Charter seems to overlook the need for individuals to clear recognition 
of their identity from the user agent or other web-navigating consumer 
software, while not interfering with their account recognition or even 
non-authenticated data that is a key part of their desired interaction 
with various digital properties across the open web.

Indeed, the second sentence quoted from the Charter above seems to 
support enabling cross-device use cases that allow for matching of the 
same user’s activity for business processing purposes across devices or 
user agents, so long as these are provided by business-to-consumer 
software manufacturers.

Should the Charter mean that risk to specific individuals is reduced 
when a user agent, browser or OS manufacturer performs such business 
processing, then the Charter would be improved by explicitly describing 
which risks are eliminated when conducted by such consumer software.

Without such justification, the Charter suggests that it is concerned 
only with benefitting consumer software manufacturers that compete with 
software solutions provided by manufacturers specializing only in 
business advertising solutions. While likely not intended, we note that 
this is analogous to the concern raised in the TAG’s review of First 
Party Sets, which found that the “proposal can result in detrimental 
effects to the greater web ecosystem. It is likely that this proposal 
only benefits powerful, large entities that control both an 
implementation and services.” [TAG Review]

Without such explicit guidance, the Charter would unintentionally favor 
organizations that have the time and resources to participate in the 
W3C, without incorporating appropriate market feedback from the digital 
properties and marketers whose businesses this group seeks to provide 
with advertising solutions.

Issue 6: Ensure the Charter does not inappropriately limit discussion to 
channels of distribution or divisions of markets that would violate the 
W3C Antitrust and Competition Guidance

The W3C Antitrust Guidelines include:

“W3C does not play any role in the competitive decisions of W3C 
participants nor in any way restrict competition…. For example, 
participants should not discuss product pricing, methods or channels of 
product distribution, division of markets, allocation of customers, or 
any other topic that should not be discussed among competitors.”

We note with approval that the Charter makes explicit reference to the 
W3C’s Antitrust policy.
However, to comply with this policy, the Charter could be improved by 
ensuring that it does not restrict “channels of product distribution” or 
“division of markets” through its current language, which seems to limit 
all processing of personal information to being conducted inside user 
agents.

It is fair that business advertising software not involving use of 
consumer software as an input (e.g., content creation or media mix 
modeling) ought to be beyond the scope of this Charter. However, so long 
as the business advertising specifications contemplated by the Charter 
involve a user’s interaction with a digital property, it seems prudent 
not to limit appropriate business advertising processing exclusively to 
consumer software manufacturers, allowing them to divide the market 
amongst themselves and their chosen partners or to become the exclusive 
channel for accessing business advertising solutions. Were the Charter 
to limit the scope of acceptable proposals to only those where business 
processing is controlled by business-to-consumer software manufacturers, 
without evidence that such processing puts the interests of users ahead 
of the business software manufacturers themselves, then it poses a high 
risk of limiting rival business software providers’ ability to compete 
on the merits. What if the business-to-consumer software manufacturer 
were simply to charge rivals to participate in business solutions for 
digital markets? As such, the current Charter raises competition 
concerns.

The Charter language as drafted does not provide guidance on when the 
group should prevent a given data processing practice altogether and 
when it should permit that practice only under specific circumstances, 
such as when it poses lower risk to a specific individual. Indeed, 
without modification the current draft suggests that a given 
business-processing purpose is less of a concern purely because it is 
bundled with the business-to-consumer software of the same manufacturer. 
This is unlikely to be the intent, as privacy risks attach to the nature 
of the input data collected, the purpose for which it is processed, and 
the harms that could reasonably occur, rather than to which software 
manufacturer conducts the same data processing of identical data inputs.

It is undeniable that a user interacting with a digital property must 
involve some consumer software. However, as stated above, there is 
currently a lack of definition around the safeguards any 
business-to-business ad system must have in place to conduct the 
processing of this input data for the business advertising solution 
appropriately.

If the Charter were to state that specific business advertising software 
running on a server may have access to raw cross-context input data, but 
other software may not, then it must be more explicit about why certain 
server software may obtain this data while other software may not. For 
example, the explicit reference to Private Attribution Measurement 
emphasizes how the post-processed output data will be protected from 
other recipients learning about the input data, but is silent on how 
this software itself reduces risks to specific individuals from the 
cross-site or cross-context processing of this same input data.

The Charter text states:

“Private Attribution Measurement
This specification defines how to privately measure advertisement 
attribution/conversion rates without revealing whether any individual 
user converts or does not.”
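
The quoted deliverable does not prescribe a mechanism. Purely as an 
illustration of the property it describes (a hypothetical sketch, not 
the group’s actual design), one common approach is to release only a 
noise-protected aggregate, so the published conversion count does not 
reveal whether any single user converted:

    # Hypothetical sketch: release a noisy aggregate conversion count so the
    # output masks whether any individual user converted.
    import random

    def noisy_conversion_count(per_user_converted: list, epsilon: float = 1.0) -> float:
        true_count = sum(per_user_converted)
        # Laplace noise with scale 1/epsilon (sensitivity 1: one user changes
        # the true count by at most 1), generated as a difference of two
        # exponential samples.
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return true_count + noise

    # Useful in aggregate (a campaign-level conversion rate over 10,000 users),
    # while any individual contribution is hidden within the noise.
    conversions = [random.random() < 0.05 for _ in range(10_000)]
    print(noisy_conversion_count(conversions))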

The Charter would be improved by explicitly listing the distinguishing 
characteristics that separate responsible business-to-business software, 
especially that focused on meeting advertising needs, from 
business-to-business software conducting inappropriate cross-context and 
cross-site processing of personal information.


REFERENCES

- [Building a more private web] Justin Schuh, Director, Chrome 
Engineering, Google (August 22, 2019). 
https://www.blog.google/products/chrome/building-a-more-private-web

- [Apple Privacy Policy] Apple Privacy Policy (last accessed: September 
19, 2022). https://www.apple.com/privacy

- [TAG Review] TAG Review Feedback on First Party Sets (April 7, 2021). 
https://github.com/w3ctag/design-reviews/blob/main/reviews/first_party_sets_feedback.md

Received on Wednesday, 10 January 2024 15:52:16 UTC