- From: Aryeh Gregor <Simetrical+w3c@gmail.com>
- Date: Fri, 7 May 2010 16:18:58 -0400
On Fri, May 7, 2010 at 1:06 PM, Juuso Hukkanen <juuso_html5 at tele3d.net> wrote:

> You asked many questions, and unfortunately all you missed the
> auth="verisign" argument, which _is_ enough to prevent all practical (,even
> if they are all theoretical!,) man-in-the-middle attacks.

You haven't explained what it does. Did you mean that the <meta> tag should
include a certificate as well as a public key? If so, how is that better
than HTTPS?

> Maybe someone can show a _complete_ alternative Javascript & https solution
> about how those can be achieved in a computer or PDA-device without
> javascript support.

Just serve the page using HTTPS, and have a normal HTML form. It will
transmit the username and the password, and the server can salt and hash the
password. (You could also easily have the client salt and hash the password
using JavaScript before submission, but this doesn't improve security once
you're using HTTPS.) A minimal sketch of this appears further down in this
message.

> I am not suggesting replacing https with anything, government and business
> sites can and should keep on using it. I am suggesting a small easy to use
> mini-encryption which would be enough for those 90% of sites should salt
> their passwords and encrypt sensitive data and but who currently aren't.

It is not sufficient, because it's trivially circumventable by a
man-in-the-middle attack. It therefore provides no security against any
attacker. It also provides no greater assurance of security on the server
side, because anyone who's competent enough to include this meta tag will
probably also be competent enough to hash and salt passwords on their own.

> Most servers are already configured to read the requested pages before
> submitting those over the internet.

I'm not aware of any HTTP server that attempts to parse outgoing HTML
content. Could you provide a specific example? In particular, I'm rather
certain that neither Apache, nor IIS, nor lighttpd parses outgoing HTML
pages, and those account for most servers already.

> For example my above form-page has a small php-script inside which the
> server program must notice; as the PHP-program needs to compile the script.
> Client never sees the <?php echo $_SERVER['PHP_SELF']; ?> part but is
> instead shown an URL. To implement meta-encrypt tag would just require
> (on/off) configuring server program to read the header of requested page
> and see if there is a meta-encrypt tag in there the server calls a program
> which decrypts! the client submitted data.

This requires HTTP servers to implement an HTML parser, and to run it every
time a page is submitted. They don't do this right now. HTML parsing is very
complicated and slow -- have you looked at the HTML parsing algorithm
<http://www.whatwg.org/specs/web-apps/current-work/multipage/parsing.html#parsing>?

> 1) Man-in-the-middle problem; which doesn't exists because
>        a) those are just academic mind games

If so, there's no reason for encryption at all. You can just send the
content unencrypted if no one is going to intercept it.

>        b) if auth="verisign" is used as external CA

Naming the CA is not enough to certify anything. You need to provide the
actual certificate, e.g. in X.509 format
<http://en.wikipedia.org/wiki/X.509>. To get a certificate, you will
typically have to pay a CA some sum of money, making it prohibitive for
casual sites. What's the advantage over HTTPS at this point?
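To make the suggestion above concrete, here is a minimal sketch of the
"complete alternative": an ordinary form served over HTTPS, with the salting
and hashing done on the server. The file name, the choice of SHA-256 with a
per-user random salt, and the store_user() call are illustrative assumptions,
not anything prescribed in this thread.

    <?php
    // login.php (hypothetical): one page that shows a plain form on GET and
    // salts and hashes the submitted password on POST.  Serve it only over
    // HTTPS; the browser then encrypts the whole request in transit, so the
    // form itself needs no special markup.
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        // Random per-user salt; store only the salt and the hash,
        // never the plaintext password.
        $salt = bin2hex(openssl_random_pseudo_bytes(16));
        $hash = hash('sha256', $salt . $_POST['password']);
        // store_user($_POST['username'], $salt, $hash);  // storage left abstract
    } else {
        echo '<form method="post" action="login.php">'
           . '<input type="text" name="username">'
           . '<input type="password" name="password">'
           . '<input type="submit" value="Log in">'
           . '</form>';
    }
    ?>

Nothing here depends on new markup: the transport (HTTPS) provides
confidentiality and authentication, and the hashing is an ordinary
server-side responsibility.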
> 2) HTTPS = good (even if it is typically NOT used with forms

Many sites use HTTPS for everything, including login. Most sites don't, but
that's mainly because 1) it's hard to get a certificate (you don't solve
this), and 2) it doesn't work well with shared hosting (there are better
solutions to this in progress).

> 3) password salting = webmasters duty to do it (which 50% forget), after

Then why will they remember to add the <meta> tag you suggest? They'll just
forget both. This is a problem, but the only solution is to have the browser
act more securely by default, regardless of what the webmaster does.

> 4) Declaring encrypt action doesn't fit into HTML (; then why is there a
> form method get/post)

HTML is the wrong place to do encryption, because once you receive the page,
it might have already been tampered with. The entire connection needs to be
specified as secure-only from the beginning, so that the client will abort
if it receives unencrypted or unauthenticated content. This is what HTTPS
does. By the time you get to the actual contents of the document, it's
impossible to know whether they've been secured.

You haven't explained yourself very clearly, but in summary, this is what I
think you're trying to do and why it doesn't work:

1) You're trying to provide protection against MITM attacks that's easier to
use than HTTPS. This fails, because a) you still need a certificate (the most
annoying part of HTTPS), and b) an MITM could just alter the outgoing HTTP
request to remove the encryption request, get the plaintext reply, and
encrypt that itself, with the client none the wiser; a toy sketch of this
kind of in-transit rewriting appears at the end of this message. (But you
seem to say you don't think MITM is a serious problem, so I don't know why
your proposal includes encryption at all.)

2) You're trying to provide a different way for authors to salt and hash
passwords. This is a useful goal, but your proposal doesn't do it, because
there's no reason to believe authors will know to use the <meta> tag if they
don't already know to salt and hash on the server side. Mozilla's Account
Manager <https://mozillalabs.com/blog/2010/03/account-manager/> is a step in
this direction that aims to solve this problem and others, and encourages
webmasters to comply by making their non-compliance obvious and (potentially)
making their life easier.

You really need to explain 1) why your proposal is actually better than
HTTPS, and 2) why you think your tag will actually encourage more authors to
salt and hash their users' passwords. Taking a defensive or dismissive stance
will not help your proposal.
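As a toy illustration of point 1b, here is one variant of the in-transit
rewriting: the attacker swaps the advertised key before the client ever sees
the page. The <meta name="encrypt"> syntax and the key placeholders are
invented here purely for illustration and are not the proposal's actual
syntax.

    <?php
    // What the origin server sends: a page advertising its public key so the
    // client can encrypt the form submission to it (invented syntax).
    $pageFromServer = '<meta name="encrypt" content="SERVER-PUBLIC-KEY">'
                    . '<form method="post" action="/login">...</form>';

    // What a man in the middle forwards instead: the server's key replaced
    // by the attacker's own (stripping the tag entirely works just as well).
    $pageSeenByClient = str_replace('SERVER-PUBLIC-KEY', 'ATTACKER-PUBLIC-KEY',
                                    $pageFromServer);

    // The client now "encrypts" the password to the attacker's key; the
    // attacker decrypts it, re-encrypts it to the real server's key, and
    // forwards it, so neither end notices anything.  Without a certificate
    // checked against a trusted root before any content is used, the page
    // cannot vouch for itself.
    echo $pageSeenByClient;
    ?>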
Received on Friday, 7 May 2010 13:18:58 UTC