Re: The TLS hammer and resource integrity

On 28 Mar 2012, at 05:55, Martin Thomson wrote:

> We have already touched on this in discussions of SPDY, but I wanted
> to make a statement on this prior to the meeting on Thursday.
> 
> TLS is a great tool in the protocol design toolbox.  It provides a
> great many things together: confidentiality, integrity,
> authorization, and so forth.  Therefore, it is very easy to pick up
> that tool and apply it without a complete analysis of the threat model
> and what aspects of that model it addresses.
> 
> Mixed content (see Monday's plenary topic) is a good example of where
> TLS doesn't necessarily provide the best trade-off of all the security
> options present.  The classic concern is that I have a TLS-secured
> page that pulls in content over HTTP in the clear.  That unsecured
> content can then potentially poison the entire page.

That is the problem of the web in general. As long as most pages
are not secured, the mixed content problem remains. Even sites served
100% over TLS have it, as the example below shows. But this is a reason
to make the move to 100% https everywhere.

For example, suppose I am reading a blog by an author I trust, and he
writes a review of his good experience shopping at some small company,
a story I perhaps heard through other channels and have every reason
to trust. I click on the link to go to that site. But a man-in-the-middle
attacker has replaced the link to the site he was writing about with a
link to the attacker's own proxy (in order to capture the money sent to
payment links he controls). It would then be very easy to fool me: I
could well end up making a purchase at the fake site. Even if the site
in question were 100% behind https, the links pointing to it could have
been attacked, because the mixed content came from links pointing TO
the web site, and links are how we, and search engines, navigate the
web.

That things are not completely broken shows how powerful the network
effect of links is. But that same network effect will make a huge
difference as people move TO TLS. On the web, as in the world, we are
not islands. We work together as a society, so we get a huge boost in
security as we secure our fellow citizens, and as they secure
themselves and us.


> 
> The property that is required in the mixed content scenario is
> integrity.  The host page might not care that confidentiality is
> maintained when requesting this content, but it really does care that
> the content matches the content that it expects.
> 
> Today, the only option we have available to deal with this problem is
> TLS.  And along with our integrity (and source authentication), we
> also get confidentiality.  This is occasionally desirable, but
> frequently, it is merely consequential.
> 
> One significant downside to this arrangement is that confidentiality
> also rules out intermediation options that could be hugely beneficial.
> Now it is no longer possible to cache copies of jQuery all over the
> web.  (TODO: deal with obvious CDN counter-argument)
> 
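This is exactly right, and the check is cheap. If the host page
publishes the hash of the representation it expects, then any
intermediary (a proxy, a cache, a peer) can serve the bytes over plain
HTTP and the client can still verify them. A rough sketch of the idea
in Python; the URL and digest below are hypothetical, purely for
illustration:

    import hashlib
    import urllib.request

    def fetch_verified(url, expected_sha256_hex):
        """Fetch a representation, possibly over plain HTTP or from an
        untrusted cache, and verify it against a pinned SHA-256 digest
        that the referring page is assumed to have published."""
        with urllib.request.urlopen(url) as resp:
            body = resp.read()
        actual = hashlib.sha256(body).hexdigest()
        if actual != expected_sha256_hex:
            raise ValueError("integrity check failed, got " + actual)
        return body  # tampering by any intermediary is detected here

    # Usage (hypothetical URL and digest):
    # jq = fetch_verified("http://cdn.example/jquery.min.js",
    #                     "9f86d081884c7d65...")

Note that confidentiality plays no part here: the cache does not need
to be trusted, only the hash does.
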
> Intermediation is a fundamental part of the web architecture and
> building a protocol that makes this inherently difficult would be a
> disservice to the web.
> 
> The separation of resource integrity from communication
> integrity/confidentiality is something that I know others have been
> thinking about.  I'd like to see this discussed in HTTP/2.0.
> 
> --Martin
> 
> long p.s. I should include a reference to the work from DECADE, which
> deals with exactly this sort of problem in an environment that
> consists entirely of unauthoritative "intermediaries".
> 
> One proposed solution, which should probably be at least considered,
> is to provide a content-specific identifier for a resource.  That is,
> a resource is identified by a hash of its representation, so that a
> modified representation can be easily detected.  This might actually
> be more restrictive than is entirely ideal, but it is worth knowing
> about: see draft-farrell-decade-ni.
> 
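That draft's "ni" names are straightforward to produce and to check:
take the SHA-256 of the representation and base64url-encode it without
padding. A minimal sketch in Python, assuming the ni:///sha-256;...
form from draft-farrell-decade-ni:

    import base64
    import hashlib

    def ni_name(representation):
        """Name a representation by its content, in the style of
        draft-farrell-decade-ni: ni:///sha-256;<base64url digest>."""
        digest = hashlib.sha256(representation).digest()
        b64 = base64.urlsafe_b64encode(digest).rstrip(b"=")
        return "ni:///sha-256;" + b64.decode("ascii")

    def matches(representation, ni_uri):
        """True if a (possibly tampered-with) representation still
        matches the name it was requested under."""
        return ni_name(representation) == ni_uri

    print(ni_name(b"Hello World!"))
    # ni:///sha-256;f4OxZX_x_FO5LcGBSKHWXfwtSx-j1ncoSt3SABJtkGk

With names like these, any intermediary can serve the resource from
anywhere, and the client can detect a modified representation without
trusting the channel it arrived over.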

Social Web Architect
http://bblfish.net/

Received on Wednesday, 28 March 2012 06:07:07 UTC