W3C home > Mailing lists > Public > ietf-http-wg@w3.org > April to June 2014

Re: Trusted proxy UI strawman

From: Nicolas Mailhot <nicolas.mailhot@laposte.net>
Date: Tue, 24 Jun 2014 00:14:31 +0200
Message-ID: <72e72733463ac7a3a5e040d2f0a8d448.squirrel@arekh.noip.me>
To: "William Chan (陈智昌)" <willchan@chromium.org>
Cc: "Nicolas Mailhot" <nicolas.mailhot@laposte.net>, "Peter Lepeska" <bizzbyster@gmail.com>, "HTTP Working Group" <ietf-http-wg@w3.org>, "Martin Thomson" <martin.thomson@gmail.com>

Le Lun 23 juin 2014 22:44, William Chan (陈智昌) a écrit :
> On Mon, Jun 23, 2014 at 4:31 AM, Nicolas Mailhot <

> I
> think the key point is that, given the web composition model of sourcing
> scripts and other active content, trusting example.com will require
> trusting whoever example.com delegates to via active content, and it's
> often the case that the end user doesn't trust those third parties.

Unfortunately, users are not informed of the extent of those delegations by
their tools, so there is no pushback and delegations are proliferating to
worrying levels.
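To make the delegation concrete (hypothetical domains, not from the thread): a single script tag gives the third party the same privileges as the embedding page, and only an opt-in mechanism like a Content-Security-Policy header lets the site owner even enumerate its delegates; the end user still sees none of it.

```html
<!-- Page served from example.com (hypothetical). The sourced script runs
     with the page's full privileges: it can read the DOM and script-visible
     cookies, and issue requests in example.com's name. -->
<script src="https://cdn.widgets.example/like-button.js"></script>

<!-- The site can opt in to limiting whom it delegates to via a response
     header, e.g.:
       Content-Security-Policy: script-src 'self' https://cdn.widgets.example
     but nothing in this chain of delegation is surfaced to the user. -->
```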

> Comparing to a random http://browse.here.for.a.good.time.com website
> sounds
> like a straw man. Isn't the discussion about giving the proxy a privileged
> position in the network to examine/modify traffic to all websites, not
> just
> the random ones?

The discussion is about how to enable users to identify and accept (or not)
this processing. Regardless of the conclusion of this discussion, other
third parties are getting pervasive enough that they already have the
level of access people object to in proxies.

> I'm not sure I follow this. Don't you think CDNs have an incentive not to
> be completely evil here?

I like the "completely" here

> Usually the way this works is the content owner
> pays a CDN to host its content, with the expectation that the CDN won't be
> evil.

Or the content owner is paid to include some CDN content (ads, sometimes
malware-infested), or the CDN is made freely available for other reasons…

The user is not informed in any way of who hosts what and why, and has
little legal recourse even if he acquired this information. What little
trust infrastructure exists in browsers or TLS is DNS-based, but cloud
CDNs blow it apart by hosting unrelated content on the same common domains.

Trust can only exist if there is someone clearly accountable in case of
misbehaviour, and the "clearly" part is missing here for anyone but the root
site owner (and sometimes not even then).


>>
>> Frankly, if you forget the quick money angle, a delegating web site is
>> about as (or usually less) trusted as a mixed content web site. And if
>> you
>> accept to use a site that delegates anything to 666.cloud.com for taxes
>> or
>> banking you deserve what you will get.
>>
>
> I'm not sure what you're referring to by the quick money angle here =/

I think news web sites will include pretty much any delegation nowadays to
scrape together a little money. They'll soon reference more third-party
sites than a search result page does.

> You seem to assert that if a website sources content
> from a third party like Facebook for the Like button, then it should be OK
> for a local operator's proxy to MITM that.

I assert that when a significant proportion of the web requires executing
scripts directly sourced from Facebook or Google or whatever, those
operators are de facto given the same power a proxy would have, with less
room to wiggle away and less accountability (avoiding a proxy only requires
changing networks). Delegation would not be so worrisome if network
effects were not concentrating it on a handful of platforms.

-- 
Nicolas Mailhot
Received on Monday, 23 June 2014 22:15:16 UTC