W3C home > Mailing lists > Public > ietf-http-wg@w3.org > April to June 2014

Re: Trusted proxy UI strawman

From: 陈智昌 <willchan@chromium.org>
Date: Mon, 23 Jun 2014 16:44:05 -0400
Message-ID: <CAA4WUYjhc0F=qZMG78+MptwY_Kh4F0_Ldg_B9Tt9Ko0CeyEd+w@mail.gmail.com>
To: Nicolas Mailhot <nicolas.mailhot@laposte.net>
Cc: Peter Lepeska <bizzbyster@gmail.com>, HTTP Working Group <ietf-http-wg@w3.org>, Martin Thomson <martin.thomson@gmail.com>
On Mon, Jun 23, 2014 at 4:31 AM, Nicolas Mailhot <
nicolas.mailhot@laposte.net> wrote:

> On Thu, 19 June 2014 at 21:33, William Chan (陈智昌) wrote:
> > That said, I do not agree with your assertion that referencing content
> > from
> > a third party is equivalent to being willing to trust a proxy. I am
> > frankly
> > surprised to hear such a statement. Does anyone else agree with this?
> I think you need to step back a little and get some perspective:
> 1. first, trust is not a mathematical concept; it's a human evaluation of
> the behaviour of other humans (trust does not apply to dead matter or
> automatons, it applies to the people that use them for a particular
> purpose). Since humans are not perfect, trust is fundamentally not
> transitive. You can try to limit the scope of the trust as much as possible
> to limit trust loss during delegation, but that's all you can do: limit loss.
> That is why any serious security policy will use MAC (mandatory access
> control) and not DAC (discretionary access control).
> That is why, despite iron-clad contracts, Boeing had to re-internalise
> production to rein in the effects of chained subcontracting (and it was
> *not* a new discovery: every decade bean-counters manage to convince a big
> organisation that the sanity rules against deep subcontracting can be
> lifted).
> That is why, even with perfect polling processes, if you let your elected
> representatives elect another layer of representatives, and so on, it does
> not take many steps before the resulting assembly makes decisions contrary
> to the wishes of the original electing population (some will say even one
> level of subdelegation is too much).
> This is why Ponzi schemes are so effective.
> Trust is not transitive, and the more delegations you accept, the less
> trust remains, irrespective of technical measures.

Yeah, I more or less agree with you here. There are some ambiguities I'd
quibble over, but yes, I largely agree with what you've written. I think the
key point is that, given the web composition model of sourcing scripts and
other active content, trusting example.com requires trusting whoever
example.com delegates to via active content, and it's often the case that
the end user doesn't trust those third parties.

> 2. second, you've still not internalized that on the Internet trust is a
> three-way concept (site, network and user) and you're still evaluating
> trust from the site owner perspective ignoring totally other actors. This
> is why you consider a checksum (!) an appropriate trust safeguard when you
> would never accept it as sufficient for other forms of trust.

I suspect you are confused here by my referencing the SRI (Subresource
Integrity) proposal? In my earlier statement, I simply acknowledged the
problem of site owners needing to fully trust third parties in order to
source script from them, and noted that there are proposals to address that.
You seem to be jumping to conclusions about my stance on said proposals. If
you have feelings about that proposal, perhaps you should raise them in the
appropriate W3C forum.
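For context: SRI lets a page pin the expected cryptographic digest of a
third-party script via an `integrity` attribute, so the browser refuses a
copy whose bytes don't match. A minimal sketch of how such an integrity
value is computed, in Python (the function name `sri_hash` and the example
script bytes are mine, not from the proposal; sha384 is the digest commonly
used with SRI):

```python
import base64
import hashlib

def sri_hash(resource: bytes, algorithm: str = "sha384") -> str:
    """Return an SRI-style integrity value: '<algorithm>-<base64 digest>'."""
    digest = hashlib.new(algorithm, resource).digest()
    return f"{algorithm}-{base64.b64encode(digest).decode('ascii')}"

# The page author publishes this value alongside the script reference;
# the browser recomputes the digest over the fetched bytes and rejects
# the resource on any mismatch.
script_bytes = b'console.log("hello");'
print(sri_hash(script_bytes))
```

The relevant point for this thread: the checksum binds the *site owner's*
expectation to the fetched bytes; it says nothing about whether the *user*
trusts the party that produced them.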

> From a user point of view, a clearly identified local operator they can
> easily reach physically, and which operates under local laws, will often be
> more trustworthy than a random, difficult-to-identify site on the other
> side of the world (one that, as shown again and again, will decline any
> obligation under local laws when put on trial).

Comparing to a random http://browse.here.for.a.good.time.com website sounds
like a straw man. Isn't the discussion about giving the proxy a privileged
position in the network to examine/modify traffic to all websites, not just
the random ones?

> 3. third, CDNs, and especially JS CDNs, are the ultimate scammer's dream:
> they're more anonymous than PO boxes (usually computer-generated IDs on
> shared platforms you have to trust as a whole, with no way to identify any
> responsible party), they're outside any effective jurisdiction, and they
> execute on the marks' systems with direct access to the marks' info, using
> the marks' computing power. (And no one cares: not the delegator – it's not
> on his site – nor the delegatee – they're not his users.)
> CDNs have a long way to go before being remotely trustworthy from a user's
> point of view, and they have zero incentive to try, especially in an HTTP/2
> world where all power is delegated to the calling site.

I'm not sure I follow this. Don't you think CDNs have an incentive not to
be completely evil here? Usually the way this works is the content owner
pays a CDN to host its content, with the expectation that the CDN won't be
evil. If the CDN maliciously transformed content, then the content owner
would probably stop paying the CDN and use a competitor. Isn't that how it
works? Or are you talking about free CDN hosting for certain content, like
Google Hosted Libraries for stuff like jQuery?

In any case, I won't defend CDNs any further. There are plenty of CDN folks
here on this mailing list who can defend themselves just fine against these
sorts of claims.

> Frankly, if you forget the quick-money angle, a delegating web site is
> about as trustworthy as (or usually less than) a mixed-content web site.
> And if you agree to use a site that delegates anything to 666.cloud.com
> for taxes or banking, you deserve what you get.

I'm not sure what you're referring to by the quick money angle here =/

And yeah, delegating to dodgy origins is a terrible idea. But lots of third
party content is hosted by ostensibly reputable origins. For example,
social widgets are hosted by Facebook, Twitter, Google, Pinterest, etc. If
you're able to MITM and replace those social widgets, you can subvert large
chunks of the Web. You seem to assert that if a website sources content
from a third party like Facebook for the Like button, then it should be OK
for a local operator's proxy to MITM that. I find this very surprising.

> If browsers were serious about their user interests they would severely
> restrict the number and depth of delegations on any given page, and show
> an untrusted warning any time the delegation complexity went over a very
> low level, instead of trying to enable more delegation.

This seems to be tangential, so I will ignore it. If you are serious about
this, please re-raise in a separate discussion thread.


> --
> Nicolas Mailhot
Received on Monday, 23 June 2014 20:44:34 UTC
