
RE: report on EV and SSL MITM proxying

From: peter williams <home_pw@msn.com>
Date: Wed, 9 Mar 2011 02:12:23 -0800
Message-ID: <SNT143-ds919E5C204D74775A5DACF92C90@phx.gbl>
To: "'Henry Story'" <henry.story@bblfish.net>
CC: <public-xg-webid@w3.org>
Try to remember: an LDAP URI is a URI. Certs issued in the Windows world
have had LDAP URIs in them for a decade now. Try to remember also that
X.509 precedes LDAP's first glimmer of existence. Arguments tying LDAP to the
originality of testing for a cert's existence in a directory record just don't
fly.

 

You may want to study (from 2005 and earlier) the lots of interesting work done
on short-life client certs talking to an IdP:
http://grid.ncsa.uiuc.edu/presentations/gridshib-cip-seminar-dec05.ppt. There
are ideas to take from what others have done, in practice, with proxy certs,
IdPs, TLS client authn and a non-conventional https conception. This work is
still ongoing (and Tom will happily talk for hours about it..). It would also
be worth talking to David Chadwick's group about PERMIS, since they tried hard
to introduce complex authorization/policy languages there (similar to the IETF
proxy cert work, with its focus on policy languages). Little of it took hold
in the general web, though.

 

But, to business. Do we yet accept something important: that when
a MITM firewall is present (at a corporate boundary, or an ISP boundary for the
public, wired or wireless), the SSL client authn signature originated by
the browser will either not reach the resource server, or, if it does, the
SSL handshake's integrity service will prove it has been tampered with (since
it is bound to the browser->firewall connection, not the firewall->resource
server connection)?
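A toy sketch of why that binding fails, assuming a deliberately simplified model of TLS in which the client's CertificateVerify signature covers a hash of the handshake transcript (an HMAC stands in for the real public-key signature here, and function names like `sign_certificate_verify` are illustrative, not from any TLS library):

```python
import hashlib
import hmac

def sign_certificate_verify(client_key: bytes, transcript: bytes) -> bytes:
    # Stand-in for the client's CertificateVerify: in real TLS this is a
    # public-key signature over a hash of all handshake messages so far.
    digest = hashlib.sha256(transcript).digest()
    return hmac.new(client_key, digest, hashlib.sha256).digest()

def verify_certificate_verify(client_key: bytes, transcript: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign_certificate_verify(client_key, transcript), sig)

key = b"client-private-key (toy stand-in)"

# Connection 1: browser <-> MITM proxy. The client signs THIS transcript.
browser_to_proxy = b"ClientHello|ServerHello|proxy-cert|..."
sig = sign_certificate_verify(key, browser_to_proxy)

# Connection 2: proxy <-> origin server. Different transcript, so the proxy
# cannot replay the client's signature toward the resource server.
proxy_to_origin = b"ClientHello|ServerHello|origin-cert|..."

assert verify_certificate_verify(key, browser_to_proxy, sig)      # binds to hop 1
assert not verify_certificate_verify(key, proxy_to_origin, sig)   # fails on hop 2
```

The point the toy makes: the signature is a property of one TLS connection's transcript, so no amount of good will at the proxy can carry it intact to the second hop.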

 

This is what we most have to deal with. It strikes at the heart of
FOAF+SSL.

 

Now, I believe I've read the same concern stated by two others in the last
few days. I read it from Ryan (who advised looking for any user authn
mechanism OTHER than TLS client authn), and I read it from friends at Opera
(who minced words about the impact of SSL MITMing on client authn, trying to
find a diplomatic spin). As I said in my initial report, it's a spin zone -
with vendors trying to craft a forward-looking, modern story distinct
from the fact that https is simply no longer what was originally sold: an
end-to-end security protocol guaranteed by the power of crypto and the
inherent magic of certs.

 

You know you are talking to a secure site, said VeriSign in 1996. Your
credit card is safe in the hands of only the merchant, says Symantec these
days. Err, no it's not. Neither claim is true these days. Because of the
MITMing, and users "strangely" not recognizing the change of color of the
address bar (sometimes, for some sites, but not most - and never for those
that use self-signed server certs, or CAs not in the EV forum (e.g. CAcert)),
the credit card number posted over SSL to a million merchant sites is actually
visible at any number of intermediaries: unnamed, undisclosed, unnoticed. The
pre-2000 security claim and assurance is vacuous, that is. It's social
snakeoil. But what vendor is going to say that!

 

What was most interesting to me was that the statements were elicited from
folks evidently strongly influenced to act in defense of EV. EV is evidently a
sensitive topic, sufficiently emotive to have folks rationalize and defend
its design principles. While EV is on the one hand just a better
originally-qualified server cert, it's coming across as more than that -
a response to the SSL MITM issue that emerged after 2000 - insofar as
it works to that end. It's really broaching topics traditionally out of
scope in https: consent, notice, interception, control, etc. etc.

 

Take a good look at Tom Scavo's material, and the related technical specs.
Folks have been trying to make client certs work "effectively" for a long
time! His work has long been about mixing IdPs, https client authn, and LDAP
pulling of user records from a directory, given a name in a cert.

 

 

From: public-xg-webid-request@w3.org [mailto:public-xg-webid-request@w3.org]
On Behalf Of Henry Story
Sent: Wednesday, March 09, 2011 12:59 AM
To: peter williams
Cc: public-xg-webid@w3.org
Subject: Re: report on EV and SSL MITM proxying

 

 

On 9 Mar 2011, at 09:04, peter williams wrote:





On Galileo.

 

Galileo thought about stuff and reasoned. He was also tortured, by the state
of the day, for articulating ideas. That state lives in shame to this day.
Like states founded with slavery, history will never forget.

 

Put any marketing spin on it you like: seeing if a cert exists in a remote
file is not going to qualify as Galilean-grade science. First, it's not
novel; second, it's not original; third, it was all done 15+ years ago. All
we are doing is swapping form - one container for another.

 

Two things:

 

1. It was not done 15 years ago. 

  

  You keep saying that LDAP was designed to be able to do this. But LDAP IDs
are not URIs, and neither are LDAP attribute values. Without a global
namespace, the previous attempts could not succeed. The important change is
in this semantic atom, working together with an architecture - the web -
that was built with that as its building block.
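The "semantic atom" being argued over can be sketched in a few lines. This is a toy in-memory stand-in: a real verifier dereferences the WebID URI over HTTP and parses the RDF profile it gets back, whereas here a dict plays the role of the web; the URI and key values are invented for illustration.

```python
# Toy WebID/FOAF+SSL check. PROFILES stands in for the web of dereferenceable
# profile documents, mapping a WebID URI to the public keys it lists.
PROFILES = {
    "https://example.org/people/alice#me": {("modulus-abc", 65537)},
}

def verify_webid(webid_uri: str, cert_public_key: tuple) -> bool:
    # The client cert is trusted not because a CA signed it, but because the
    # profile that its subject URI dereferences to lists the same public key.
    return cert_public_key in PROFILES.get(webid_uri, set())

assert verify_webid("https://example.org/people/alice#me", ("modulus-abc", 65537))
assert not verify_webid("https://example.org/people/alice#me", ("modulus-evil", 65537))
```

The global namespace is what makes the lookup work from anywhere: any relying party can dereference the same URI, which is exactly what an LDAP ID confined to one directory could not offer.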

 

2. It is a simple step

 

Indeed: an extremely simple step. That is why it is invisible to so many.

 

It requires not much new learning, not many new facts. It requires the
security community to go through a paradigm shift, to see anew what they
were always looking at. They need to think distributed, when they tend to
think hierarchical. They need to think global, where they tend to think
closed; they need to think trust where they tend to think distrust.

 

The idea of the earth turning around the sun is also a simple step: it can
be said in one simple sentence. Certainly the Church at the time could argue
from its very complicated theory that it had numerous advantages, above all
established knowledge and past experience. But the forces of
globalisation gave new opportunities all the time to those who understood
how to take advantage of this conceptual turn.

 

 

Now, I think you are on to something when you want to see the web as a giant
computer, one whose atoms can all calculate in RDF'ised Boolean algebra, and
jointly compute to find satisfiable expressions of ever greater complexity,
of ever wider applicability. And, in that sense, one sees why one wants
trust and security - so that contributions to some subweb take its joint
computation forward, avoiding contamination by the malicious side of human
nature. The human world of the human-centric web is messy and somewhat
malicious (since it models its creators). The semantic web part is
supposedly going to be more machine-like, and thus nice and friendly. 

 

Not at all. I don't think the web makes anything nicer and friendlier than
it was. It just enables much larger network effects. 

 

But here you can see your old pre-globalisation thinking: you are thinking in
terms of hierarchical systems that would take the burden off the citizen.
But in fact all civilisation is built on trust. What we are enabling is for
that trust to be expressed all the way down to the citizen: the atom of trust.
Or we can put it the other way around: starting from the citizen as the atom
of trust, we can build the whole system on top.

 

But in fact we are not even that reductionistic. We allow companies as
entities too, which can have their own WebIDs. When companies give their
employees WebIDs, those do not identify a Person the same way a WebID on a
Freedom Box <http://www.youtube.com/watch?v=vkMXwy796p4> will, because
the company is part of the conversation, as we saw with the MITM proxying
case.

 

(Hmm... scifi movies about evil robots, notwithstanding)

But this is rather ideal, and somewhat abstract. At the same time, it's a
nice goal - one that is not typical "security" - and one that provides us
with a wider mission that embodies the semantic web aspects of this project.
This takes us beyond fiddling with old SSL, and even older certs, and gets us
back to the spirit of the WebID itself.

 

yes.





 

Of course, it would be nice if it actually worked, too, for the countless
millions of browsers behind a corporate or ISP firewall that insists on
speaking for them, always intermediating. What I hope we learned this week
is that for at least 50% of the web population, today, "the MITM firewall"
has immense power to spoof the best that server certs can offer, and will
normally prevent the SSL client authn message reaching its target site with
connection integrity - which means WebID doesn't really work in practice as
we defined it in the FOAF+SSL era. It only works in the lab, in conditions
too rarefied to relate to the real world of SSL, full of nastiness.

 

Wrong. It works behind firewalls: you get notified that the company wants to
look at your transactions, and your transaction is aborted. In those cases
you should simply stop doing any private business, or get a wireless device.

 

If companies want to participate themselves and give their employees WebIDs,
I proposed a simple way for them to work with firewalls and current
browsers. I think if a few companies try that out, then we will have some
real people thinking on these issues and, with time, proposals to the IETF
for ways to improve things.





 

Which isn't to say that we cannot change the praxis of https/SSL, where
certs (the proxy path) and/or SSL (the multiplexed path) are GREAT ideas to
follow up. There is lots of latent power, as yet unexploited, there. But, I
predict, this endeavor is going to involve as much cryptopolitical skill as
technical design skill. It's just the nature of crypto that, like science,
it is inherently political - something that really hasn't changed much
since Galileo's day.

 

yes. My father is a professor of Political Science. So I understand that world
well.

 

Henry

 





 

From: Henry Story [mailto:henry.story@bblfish.net] 
Sent: Tuesday, March 08, 2011 10:26 AM
To: peter williams
Cc: public-xg-webid@w3.org
Subject: Re: report on EV and SSL MITM proxying

 

 

On 8 Mar 2011, at 18:58, peter williams wrote:






After 15 years of trying and 10 major vendors, isn't it a bit surprising
that none of them has reduced it to a trivially easy UI? It's so easy!

 

Indeed, it is surprising, and it is easy to improve :-)






 Hardly a logical argument, Peter. Far too empirical!

 

So I suppose if Galileo found that the earth turns around the Sun, you would
have argued that this was impossible because for the previous million years
nobody had thought of it.

 

Science moves in weird ways. You can't argue from the fact that something
has not been done that it can't be done. And you can't argue that because your
opponents are rich they must be right. After all, Galileo had the Pope
against him. 

 

Think rather about what is NEW in what we are doing, and think about why there
was not much momentum for client-side certificates. I explained that already a
few times. Client-side certs that can only be used on one web site are NOT
very useful. They only become useful when usable globally. And they can only
do that with URIs. So the problems with client-side certs were:

  - using distinguished names

  - lack of linked data

  

So once we solve these problems, the rest is easy. There will then be an
incentive to improve the user interface.






 

Try to go through Ryan's argument; he actually answered you logically. In
defending EV, he distinguished it from other server certs that *some*
browsers will not clearly characterize as having been MITM'ed.

 

It may be useful to have browsers show when they are using proxy
certificates. If this is your UI improvement proposal, it sounds OK, but it's
not a big deal. You can see why not a lot of time was invested in this.
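One way a client could surface a proxy certificate is key pinning: compare the public key observed on the connection against a pinned value for the site. A hedged sketch only; the key bytes and helper names are invented for illustration, and a real implementation would hash the DER-encoded SubjectPublicKeyInfo taken from the actual TLS session.

```python
import hashlib

def spki_hash(public_key_der: bytes) -> str:
    # Fingerprint of the server's public key as presented on the wire.
    return hashlib.sha256(public_key_der).hexdigest()

# The pin the client remembers for the origin site (toy value).
PINNED = spki_hash(b"origin-server-public-key")

def looks_intercepted(observed_public_key_der: bytes) -> bool:
    # A MITM proxy re-signs the site under its own root, so the key the
    # browser actually sees no longer matches the pinned fingerprint.
    return spki_hash(observed_public_key_der) != PINNED

assert not looks_intercepted(b"origin-server-public-key")
assert looks_intercepted(b"corporate-proxy-public-key")
```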

 

Having thus acknowledged the world of SSL MITMing as a fundamental threat,
he then suggested: perhaps look for user id protocols OTHER than TLS client
authn.

 

My argument is that it is not a fundamental threat. You make it sound like
TLS does not work. But in fact it works very well. If you are in control of
your OS and machine, you will not be able to work through these proxies -
i.e., you will be blocked from revealing information to a third party, which
is what you want.

 

Whatever system you come up with will have this problem: if you need to use
someone else's network, and they want to see what you are doing and
communicating, they will be able to argue that they should not pass on the
information unless you give them the keys. Be that with TLS, XML encryption,
or anything else.

 






He outlines how a world involving any cascade of SSL MITMing proxies
(needed for social/corporate "interests") interferes with end-to-end client
authn, precluding its use beyond interaction with the first proxy in the
sequence. He didn't say it, but this is not a real constraint if that proxy
is now a websso IDP that translates the layer-4 assertion into a layer-7
signed token that bypasses transport bridging.
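That translation step can be sketched as follows. This is a minimal sketch, not any particular SSO spec: the token format, the shared key, and both function names are assumptions made for illustration. The proxy has verified the client at the TLS layer; it then re-asserts that identity as a signed application-layer token the origin server can check on its own.

```python
import base64
import hashlib
import hmac
import json

# Assumed: a key shared between the IdP-proxy and the resource server.
IDP_KEY = b"secret shared between IdP-proxy and resource server"

def issue_token(webid: str) -> str:
    # The proxy verified the client via TLS client authn (layer 4); it now
    # re-asserts that identity as a signed layer-7 token, so the assertion
    # survives the broken end-to-end TLS binding.
    claims = base64.urlsafe_b64encode(json.dumps({"webid": webid}).encode())
    sig = hmac.new(IDP_KEY, claims, hashlib.sha256).hexdigest()
    return claims.decode() + "." + sig

def check_token(token: str):
    claims, _, sig = token.rpartition(".")
    expected = hmac.new(IDP_KEY, claims.encode(), hashlib.sha256).hexdigest()
    if hmac.compare_digest(expected, sig):
        return json.loads(base64.urlsafe_b64decode(claims))["webid"]
    return None  # tampered or unsigned

token = issue_token("https://example.org/people/alice#me")
assert check_token(token) == "https://example.org/people/alice#me"
tampered = token[:-1] + ("0" if token[-1] != "0" else "1")
assert check_token(tampered) is None
```

Real deployments would use an established token format rather than a hand-rolled one; the sketch only shows why the bridging works at all.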

 

Social/corporate interests are not necessarily evil. They are larger agents,
and if they need to see what is going on on their network, they will only be
able to do so if you agree to give them access to your machine. And this
will be so whatever technology you use.

 

So in fact the legal uses of TLS work correctly, and the illegal uses of TLS
work correctly. There is a gray zone with corrupt CAs, but that will be
dealt with by IETF DANE and DNSSEC. 
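DANE's core check is simple to state: a DNSSEC-signed TLSA record carries (roughly) a hash of the certificate the domain intends to present, and the client compares. A sketch of the matching step only, assuming matching type 1 (SHA-256 over the full certificate); the DNS lookup and DNSSEC validation are omitted, and the byte strings are toy values.

```python
import hashlib

def tlsa_matches(cert_der: bytes, tlsa_association_data: bytes) -> bool:
    # TLSA matching type 1: SHA-256 over the certificate's DER encoding must
    # equal the association data the domain owner published in (signed) DNS.
    return hashlib.sha256(cert_der).digest() == tlsa_association_data

cert = b"server certificate DER bytes (toy)"
record = hashlib.sha256(cert).digest()  # what the domain owner publishes

assert tlsa_matches(cert, record)
assert not tlsa_matches(b"cert minted by a corrupt CA", record)
```

This is why a corrupt CA's cert fails: it can carry the right name, but it is not the certificate the domain owner committed to in DNS.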

 

 

Not really my words, and really not my argument. I just elicited the words
and argument from folks that others here *should* find reputable.







 

From: Henry Story [mailto:henry.story@bblfish.net] 
Sent: Tuesday, March 08, 2011 12:03 AM
To: peter williams
Cc: public-xg-webid@w3.org
Subject: Re: report on EV and SSL MITM proxying

 

Some folks suggested that life would be all rosy, in webland, if the browser
displayed which client cert had been presented to a given website (per tab,
presumably). How come those over-complicating security-designer types just
don't do simple and obvious things, when it's ALL so EASY if one just thinks
minimally and logically!

 

I have not seen an argument in what you have put forward that shows that
this is not an easy thing to do. The only argument from browser vendors I
have heard is that client certs are not widely used, and so it has not been
a priority for them.

 

 

 

Social Web Architect
http://bblfish.net/

 


 
Received on Wednesday, 9 March 2011 10:12:59 GMT
