
Re: breaking TLS (Was: Re: multiplexing -- don't do it)

From: Henry Story <henry.story@bblfish.net>
Date: Tue, 3 Apr 2012 11:57:11 +0200
Cc: Martin Thomson <martin.thomson@gmail.com>, Stephen Farrell <stephen.farrell@cs.tcd.ie>, "William Chan (陈智昌)" <willchan@chromium.org>, Mike Belshe <mike@belshe.com>, Adrien de Croy <adrien@qbik.com>, Peter Lepeska <bizzbyster@gmail.com>, "ietf-http-wg@w3.org Group" <ietf-http-wg@w3.org>
Message-Id: <3BE8D067-5E2F-46A0-8680-91EBB647B3EE@bblfish.net>
To: Willy Tarreau <w@1wt.eu>

On 3 Apr 2012, at 08:31, Willy Tarreau wrote:

> Hi Martin,
> On Tue, Apr 03, 2012 at 02:34:06AM +0200, Martin Thomson wrote:
>> I suspect that, at the root of all this, is a desperate attempt by
>> administrators to retain some sort of control.  The fact is that the
>> way that we communicate on the web is vastly more complex than their
>> policy engine is capable of managing.  Adding real time communications
>> only complicates that further.
>> The quick and easy solution to the realtime communications mess is
>> blocking UDP.  My guess is that this is what we'll get.  Controlling
>> this is going to be quite complex.  That said, with a lot of work, I
>> can see how sites might be selectively allowed onto a whitelist.
> You know, I have several customers where the only way to look outside is
> to pass through a proxy, that's fairly common, especially in companies
> which run on non-rfc1918 addresses. At these places, it's very simple :
>  - port 80  => URL classification + content inspection
>  - port 443 => destination must match a short whitelist of allowed domains
>    that are directly related to employees' job
> And I'm seeing this becoming more and more common because it's simple and
> efficient to apply the web policies that managers want. In fact the first goal
> is not to ensure there is no data leak, the first goal is to try to protect
> the PCs against malware as much as possible by limiting their access to what
> they really need, because infected PCs cost *a lot* to an enterprise by
> preventing people from doing their job. Of course there is also the goal to
> avoid the temptation of entertainment. If people want to browse unfiltered,
> they do it with their smartphones and it's their problem.
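For concreteness, the proxy policy you describe could be sketched roughly like this (a toy illustration only -- the whitelist contents and function name are made up, not anything your customers actually run):

```python
# Toy sketch of the port-based proxy policy described above.
# The whitelist entries and the helper name are hypothetical.

ALLOWED_TLS_DOMAINS = {"mail.example-corp.com", "sso.example-corp.com"}

def proxy_decision(port, host):
    """Return what the proxy should do with a request to host:port."""
    if port == 80:
        # Plain HTTP: URL classification plus content inspection.
        return "classify-and-inspect"
    if port == 443:
        # TLS: destination must match the short whitelist of domains
        # directly related to employees' jobs.
        if host in ALLOWED_TLS_DOMAINS:
            return "allow"
        return "deny"
    # Everything else is blocked outright.
    return "deny"
```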

I have recently been thinking about this, and I believe it can be made to work in
a way that is probably a lot more flexible than what people think is possible
here. By using linked data [1] one could effectively get something like this to
function quite well. You can imagine governments publishing a resource linking to:
  - a resource linking to all the national banks' domains and identifiers, and
    similar foreign lists
  - a resource linking to company directories (i.e. the NASDAQ, the London
    Stock Exchange, and so on) which themselves list all the companies and link
    to them
  - local non-public-market companies and their foreign equivalents
  - lists of universities and public institutions
  - lists of charities and others
This would then create a global linked-data web of information about companies and
resources, to which one could add local, unofficial preferences that the different
players could use to set their access policies.
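A crude sketch of how a proxy might crawl such lists of lists into a whitelist (the data model here is purely hypothetical -- a real deployment would dereference RDF resources over HTTP rather than walk in-memory dicts):

```python
# Hypothetical sketch: a government-published resource links to further
# list resources (banks, stock exchanges, universities, ...), which
# eventually carry domains. A proxy could crawl the links and union the
# domains into its TLS whitelist. Real linked data would be RDF fetched
# over HTTP; plain nested dicts stand in for it here.

GOV_LIST = {
    "links": [
        # national banks list
        {"links": [], "domains": ["bank-one.example", "bank-two.example"]},
        # company directory, itself linking to listed companies
        {"links": [
            {"links": [], "domains": ["listed-co.example"]},
        ], "domains": []},
    ],
    "domains": [],
}

def collect_domains(resource, seen=None):
    """Follow 'links' recursively and gather every 'domains' entry."""
    if seen is None:
        seen = set()
    seen.update(resource.get("domains", []))
    for linked in resource.get("links", []):
        collect_domains(linked, seen)
    return seen

whitelist = collect_domains(GOV_LIST)
```

Local, unofficial preferences would then just be one more resource unioned into the same crawl.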

Anyway, if you keep the possibility of this type of thing in mind, you can see
that the problem lies not so much at the TLS layer as at a different layer.

I'll post something soon that goes into more detail on the architecture of this
kind of system.


[1] http://linkeddata.org/

> Regards,
> Willy

Social Web Architect
Received on Tuesday, 3 April 2012 09:57:52 UTC