
Re: Re[2]: Secure Chrome and Secure MetaData (correction)

From: Rachna Dhamija <rachna@deas.harvard.edu>
Date: Wed, 5 Jul 2006 14:13:55 -0400
Message-Id: <5E516992-5388-4E75-A4BF-77C92AAD264D@deas.harvard.edu>
Cc: "spam filter" <spam+w3c@jeff-nelson.com>, "James A. Donald" <jamesd@echeque.com>, public-usable-authentication@w3.org
To: Chris Drake <christopher@pobox.com>

On Jul 5, 2006, at 1:13 PM, Chris Drake wrote:

>
> Hi Jeff,
>
> You wrote:-
>
> sf> We need to determine techniques which are unspoofable, such as
> sf> personalization known only to the user ...
>
> This is why I am trying to eradicate the word "Chrome" - since you're
> suggesting non-chrome things as solutions here - the continued use of
> that word is serving only to give everyone the wrong impression and
> point newcomers down known dead-end paths.

The personalization that Jeff refers to can be accomplished in many  
ways, by modifying the website or the chrome:

1) Allow the user to share a personalized secret (e.g., a  
personalized image) with the server, which is then used to modify the  
content of a web page.  For example see the Passmark solution:
http://www.passmarksecurity.com/

2) Allow the user to share a secret with the chrome, so that the  
secret is displayed within the chrome to establish a trusted path  
between the user and the chrome.  For example, see my Dynamic  
Security Skins proposal:
http://people.deas.harvard.edu/~rachna/papers/securityskins.pdf

Other combinations of these two approaches are possible that allow  
the personalized secret to be displayed within the chrome or within  
the content of a page, regardless of whether the secret resides  
locally or at a remote server.
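The two approaches can be sketched roughly as follows. This is only an illustrative sketch, not the actual Passmark or Dynamic Security Skins implementation; every name and data structure here is hypothetical.

```python
# Sketch of the two personalization approaches (all names hypothetical).
# Approach 1: the secret is held by the server and rendered in page content.
# Approach 2: the secret is held locally and rendered in the browser chrome.

SERVER_SECRETS = {"alice": "green-dragon.png"}   # approach 1: server-side store
CHROME_SECRETS = {"alice": "blue-meadow-skin"}   # approach 2: local browser store


def page_with_personalization(username):
    """Approach 1: the server embeds the user's shared image in the
    login page, so the user can check it before typing a password."""
    image = SERVER_SECRETS.get(username)
    if image is None:
        return "<p>Unrecognized user</p>"
    return f'<img src="/secrets/{image}"> <form>password: ...</form>'


def chrome_with_personalization(username):
    """Approach 2: the browser renders a locally stored secret in its
    own chrome, outside the reach of page content, to establish a
    trusted path between the user and the chrome."""
    skin = CHROME_SECRETS.get(username, "default-skin")
    return f"[chrome skin: {skin}] [page content rendered below]"
```

The key design difference: in approach 1 the secret crosses the network and is presented by the remote server, while in approach 2 it never leaves the local machine and no web page can draw over it.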

> This entire debate has been going round-and-round in circles for some
> months now.  I think it's time to start documenting problems, and
> posing some solutions.
>
> Here again is my threat table.  How about you turn your suggestions
> #'s 1 through 7 into a "solutions and techniques" table and add it on
> to the end of my stuff.
>
> We can then look through my list of problems, work out which ones your
> list of solutions solves, and get a good idea about what's missing.

Eric Rescorla has started to develop a taxonomy of requirements/ 
features, which would be good to incorporate:
http://www.educatedguesswork.org/movabletype/archives/2006/06/notes_on_web_au.html

> I like where you're heading with your thoughts - although I can't get
> my head around lost passwords, or how you can block the theft of
> "personalization" *and* passwords - it seems either one or the other
> can be protected - not both?  that is: you can either alert users to
> fake sites (after they've potentially already given their password to
> a spoof one, which can then go ahead and impersonate them anyhow...),
> or you can show random strangers what the users "personalization" is
> (which does potentially block users from giving away passwords to
> spoof sites ... until the spoof sites wise up and go get the
> "personalization" info from the legit site in real time...)

I think that you are referring to the first case of personalization I  
mentioned above, where the secret is shared with the server and  
presented by the server to the user on the website.  In the Passmark  
solution, for example, a man in the middle can capture the user's  
secret personalization.  I do not think this is the only approach;  
other schemes are possible that require more effort on the part of an  
attacker to capture the user's secret and credentials.
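The real-time relay attack Chris raises against server-held personalization can be sketched in a few lines. This is a toy model under assumed names, not any vendor's actual protocol: a phishing proxy simply forwards the victim's username to the legitimate site, fetches the personalized cue, and replays it, so the victim sees exactly the image they expect.

```python
# Toy model of the real-time relay attack on server-held
# personalization (all names hypothetical).

LEGIT_SECRETS = {"alice": "green-dragon.png"}


def legit_site_image(username):
    """The legitimate site returns the user's personalization image."""
    return LEGIT_SECRETS.get(username)


def phishing_proxy(username):
    """A man-in-the-middle proxy: query the real site in real time,
    then serve the correct image back to the victim, who sees the
    cue they expect and proceeds to enter their password."""
    return legit_site_image(username)
```

Since the proxy's output is indistinguishable from the legitimate site's, the personalization cue alone cannot stop this attacker; raising the attacker's cost requires the extra measures mentioned above (e.g., binding the secret to the chrome or to the channel rather than to page content).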

Rachna 
Received on Wednesday, 5 July 2006 22:11:40 GMT
