
Re: Fwd: New Version Notification for draft-nottingham-http2-encryption-02.txt

From: Adrien de Croy <adrien@qbik.com>
Date: Sun, 15 Dec 2013 23:40:18 +0000
To: William Chan (陈智昌) <willchan@chromium.org>
Cc: "Brian Smith" <brian@briansmith.org>, "Stephen Farrell" <stephen.farrell@cs.tcd.ie>, "Paul Hoffman" <paul.hoffman@gmail.com>, "HTTP Working Group" <ietf-http-wg@w3.org>
Message-Id: <em980ecf7d-c330-4c7a-a32b-390b23e266f6@bodybag>

Hi

I'd need to trawl the last four months or so of messages to pick all the 
bits out.

But I think the general issues are these:

Nothing is truly free.
You get what you pay for.

There's more to PKI than just the issuing of certs.  How can we trust 
certs if the CA has made no attempt to properly validate the identity 
of applicants?  How can a CA afford to run DoS-proof OCSP servers if 
you don't pay them?  How can they also afford to keep their private key 
secure against an increased risk of attack (increased incentive / 
reward for breaking it), etc etc etc.

How can a CA currently offering free certs suddenly provide 1000 times 
(or more?) as many free certs and still have them be free?

I just don't think it stacks up for there still to be free certs that 
are worth anything once everyone has to have a cert, and the place that 
leaves us is worse than where we are now: a proliferation of worthless 
(untrustable) certs on a vast scale.

I think this is why people have been proposing alternatives to CA-issued 
certs.

Adrien



------ Original Message ------
From: "William Chan (陈智昌)" <willchan@chromium.org>
To: "Adrien de Croy" <adrien@qbik.com>
Cc: "Brian Smith" <brian@briansmith.org>; "Stephen Farrell" 
<stephen.farrell@cs.tcd.ie>; "Paul Hoffman" <paul.hoffman@gmail.com>; 
"HTTP Working Group" <ietf-http-wg@w3.org>
Sent: 16/12/2013 12:03:28
Subject: Re: Fwd: New Version Notification for 
draft-nottingham-http2-encryption-02.txt
>I don't recall this argument being sunk. Can you provide a reference
>or explain why?
>
>On Sun, Dec 15, 2013 at 2:14 PM, Adrien de Croy <adrien@qbik.com> 
>wrote:
>>
>>  I'm pretty sure this argument (there are free certs so we should all
>>  use them for everything) has been floated and sunk about 3 times on
>>  this list.
>>
>>  Maybe we need some place where we can collect these arguments and the
>>  results of them, so we can post referrals to that place instead of
>>  doing that work over and over?
>>
>>  Adrien
>>
>>
>>  ------ Original Message ------
>>  From: "Brian Smith" <brian@briansmith.org>
>>  To: "Stephen Farrell" <stephen.farrell@cs.tcd.ie>
>>  Cc: "William Chan (陈智昌)" <willchan@chromium.org>; "Paul Hoffman"
>>  <paul.hoffman@gmail.com>; "HTTP Working Group" <ietf-http-wg@w3.org>
>>  Sent: 15/12/2013 09:40:58
>>  Subject: Re: Fwd: New Version Notification for
>>  draft-nottingham-http2-encryption-02.txt
>>
>>  On Sat, Dec 14, 2013 at 11:20 AM, Stephen Farrell
>>  <stephen.farrell@cs.tcd.ie> wrote:
>>
>>>  Possibly a different thread really but...
>>>
>>>  On 12/14/2013 05:20 AM, William Chan (陈智昌) wrote:
>>>  > Anyhow,
>>>  > we don't support any type of opportunistic encryption, especially
>>>  > unauthenticated. We want people to use https://, therefore we more
>>>  > or less only plan to support HTTP/2 for https:// URIs. Let me know if
>>>  > this still leaves anything unclear.
>>>
>>>  What that leaves unclear for me is how the current 30-40% of web
>>>  sites that are set up for some form of TLS will suddenly become
>>>  99%. Without some other action on helping sites get certs, my
>>>  prediction is that it just won't happen.
>>
>>
>>  We need to focus our effort on that problem.
>>
>>  There are already at least three commercial CAs, trusted by browsers,
>>  that give away free certificates: StartCom (restricted to non-business
>>  use), GlobalSign (restricted to open source projects), and GoDaddy
>>  (restricted to open source projects). These CAs give away an inferior
>>  good (presumably) in the hopes of you eventually upgrading to their
>>  non-free goods. The main problem with these CAs' freemium models is
>>  that their decision process for whether you qualify for the free
>>  product isn't (and cannot be) automated. However, I believe there is
>>  an opportunity for us (browser makers in particular, and the IETF
>>  community in general) to create a new kind of inferior good in the
>>  certificate space that CAs (possibly other than the ones I mentioned)
>>  may be willing to give away for free, in a way that CAs can be
>>  comfortable with, without jeopardizing their businesses. Note: when I
>>  say "inferior good," I use "inferior" in the economic sense only; I
>>  think we'd insist that such certificates have security properties at
>>  least as good as what we already accept as the minimum in browsers
>>  today.
>>
>>  Even if such efforts were to fail, we still wouldn't be at the point
>>  where completely unauthenticated encryption is the only option left.
>>  There are other ways of authenticating servers than punting to a
>>  commercial CA. We should make sure we have thoroughly exhausted these
>>  alternatives before giving up.
>>
>>>  I think it's all the more puzzling when contrasted with other cases
>>>  where people claim that we can't do X because that'd cause a problem
>>>  for 1% of the web, but yet here you seem to be saying it's ok to
>>>  do this when it'd cause a problem for 60-70% of the web. (I don't
>>>  recall whether or not you've made such a claim, William.)
>>
>>
>>  When it comes to breaking interoperability or regressing performance,
>>  small percentages like 1% matter. The fact that most connections web
>>  browsers make are not encrypted+authenticated is a huge problem that
>>  needs to be addressed with strong action, but it isn't acute in the
>>  way that a compatibility or performance regression is.
>>
>>  Difficulty with certificates doesn't explain why bing.com, reddit.com,
>>  tumblr.com, baidu.com, wikipedia.org, and other top sites aren't
>>  HTTPS-only. Social issues (Wikipedia has been very open about how
>>  politics affects its HTTPS deployment) and performance issues are much
>>  more serious, and those issues won't be properly addressed by adding
>>  opportunistic encryption to HTTP/2.
>>
>>  Do third-party advertising sites (the kind whose cookies are being
>>  used to de-anonymize users) use HTTP instead of HTTPS because they
>>  can't afford certificates? No. Performance, scalability, the pain of
>>  migrating websites from http:// to https:// URLs, and lack of
>>  motivation seem to be the problems. Web browsers can encourage them to
>>  move to HTTPS by getting them on our HSTS preload lists (so the
>>  browser "fixes" those http:// links to https:// links automatically)
>>  and by doing other things. For example, at Mozilla we've long had a
>>  desire to strip cookies from third-party requests that aren't HTTPS.
>>  It seems like now is the time to figure out how to make that work.
>>  We've already seen big advertisers make changes like this to
>>  accommodate our recent mixed-content blocking changes. I'm confident
>>  that such advertisers would be willing to accommodate further changes,
>>  if nudged a little bit.
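
[Editor's aside, not part of the original message: the HSTS preload
mechanism mentioned above roughly amounts to serving a strong enough
Strict-Transport-Security header. A minimal sketch of such a check --
`hsts_preload_ready` is a hypothetical helper, and the thresholds are
assumptions loosely modeled on browser preload submission criteria,
which have changed over time.]

```python
# Sketch: does a Strict-Transport-Security header look eligible for a
# browser HSTS preload list? Thresholds are illustrative assumptions,
# not any browser's actual submission rules.
def hsts_preload_ready(header: str, min_age: int = 31536000) -> bool:
    directives = [d.strip().lower() for d in header.split(";")]
    max_age = 0
    for d in directives:
        if d.startswith("max-age="):
            max_age = int(d.split("=", 1)[1])
    return (max_age >= min_age
            and "includesubdomains" in directives
            and "preload" in directives)

print(hsts_preload_ready("max-age=31536000; includeSubDomains; preload"))  # True
print(hsts_preload_ready("max-age=3600"))                                  # False
```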
>>
>>>
>>>  Even if only as a backup in case that 30-40% -> 99% transition
>>>  fails, I'd hope folks do continue working on ways to provide
>>>  opportunistic encryption for HTTP/2.0.
>>
>>
>>  I agree that it is reasonable to continue to explore unauthenticated
>>  encryption options. However, I encourage people to support the efforts
>>  to go further--to try to hit a home run instead of trying to bunt.
>>
>>>
>>>  On the current draft - it seems quite odd to ignore the existing
>>>  anon-DH ciphersuites when trying to do opportunistic encryption.
>>
>>
>>  The way cipher suites are currently negotiated in TLS, with the
>>  client saying which cipher suites it supports and the server choosing
>>  one, suffers from the same problem that ALPN causes for
>>  http2-tls-relaxed: the client is telling potential MitMs whether or
>>  not they will get caught. I appreciate, and agree with, the
>>  fundamental aims of the perpass effort. However, I think way too much
>>  emphasis is being put on the "passive" part. We need to remember that
>>  perfect is the enemy of the good, but at the same time it would be
>>  unfortunate to spend a huge amount of effort trying to prevent
>>  passive attacks while making active attacks easier to carry out.
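
[Editor's aside, not part of the original message: the leak Brian
describes is visible with the standard Python ssl module -- a client's
cipher-suite list is sent in the clear in the ClientHello, before any
encryption starts. A minimal sketch; the exact suite names depend on the
local OpenSSL build.]

```python
import ssl

# A TLS client advertises every cipher suite it is willing to use in its
# ClientHello, in cleartext -- so an on-path attacker can read the list
# (e.g. spot anon-DH suites) before deciding whether tampering would be
# detected. This prints the suites a default client context would offer.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
for suite in ctx.get_ciphers():
    print(suite["name"])
```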
>>
>>  Cheers,
>>  Brian
>>  --
>>  Mozilla Networking/Crypto/Security (Necko/NSS/PSM)
>
Received on Sunday, 15 December 2013 23:40:19 UTC
