CDNs for common goods done right?

Hello dear community around the TAG,

In an effort to finalise the “conformance checks” that an education 
ministry has required from a company I work for, privacy concerns have 
had a huge impact. Among the biggest concerns: the fact that Content 
Delivery Networks (CDNs) serving common goods such as open-source 
projects (e.g. jQuery, Bootstrap) direct requests to servers in 
far-away countries was considered unacceptable. Indeed, a recent court 
ruling found that embedding Google Fonts in a website without explicit 
consent is a breach of the EU GDPR.

My question concerns the resource justification that was given: using 
such common URLs is good for performance and resource usage, since a 
stylesheet or script that has already been downloaded and parsed should 
be reused rather than fetched again by every site that references it. 
Is there not a markup-based method to achieve this in ordinary web 
browsers?
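
For concreteness, the pattern in question looks roughly like this (the 
URLs and version numbers are only illustrative):

```html
<!-- The classic shared-CDN pattern: many unrelated sites reference the
     same public URL, in the hope that the browser fetches and parses
     one copy and reuses it from cache for all of them. -->
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<link rel="stylesheet"
      href="https://cdn.jsdelivr.net/npm/bootstrap@5.2.0/dist/css/bootstrap.min.css">
```

Since browsers introduced per-site cache partitioning, however, this 
cross-site reuse no longer actually happens.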

I would expect such common goods to be identifiable as “always the 
same” regardless of the URL they come from, and “verifiable” through 
some signature mechanism, so that they could be safely cached and 
shared among sites without any security concern. Cache partitioning 
should not be needed for them, just as it is not needed for the 
standards-conformant software parts of the browsers themselves 
(e.g. the HTTP stacks, the way JS is run…).
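
The “verifiable” half already exists in markup as Subresource 
Integrity (SRI): the integrity attribute makes the browser reject the 
file if its digest does not match the declared value, regardless of 
which server returned it. A minimal sketch, with a placeholder where 
the real base64 digest would go:

```html
<!-- Subresource Integrity (SRI): the browser computes the SHA-384
     digest of the fetched file and refuses to execute it if the
     digest differs from the declared one, so the same content is
     verifiable no matter which URL serves it. -->
<script src="https://some-cdn.example/jquery-3.6.0.min.js"
        integrity="sha384-PLACEHOLDER_BASE64_DIGEST"
        crossorigin="anonymous"></script>
```

What is missing is the other half: browsers do not currently use that 
digest as a shared, cross-site cache key, precisely because of the 
cache-partitioning decision discussed below.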

[This article by Tim Perry discourages CDNs for such 
resources](https://httptoolkit.tech/blog/public-cdn-risks/), and the 
reasons are security-motivated. A peer-to-peer-like protocol is among 
the hopeful bright futures; however, something based on HTTP(S) sounds 
completely accessible and much nearer.

Did I miss a point?
Has this already been considered?

I think that TAG or its community might know.

Thanks in advance.

Paul

Received on Thursday, 8 September 2022 08:11:07 UTC