
RE: nasty nasty bug in chrome

From: peter williams <home_pw@msn.com>
Date: Wed, 9 Feb 2011 09:48:50 -0800
Message-ID: <SNT143-ds33305D869A3ACC7948E1C92ED0@phx.gbl>
To: "'Henry Story'" <henry.story@bblfish.net>
CC: <ltorokjr@gmail.com>, <nathan@webr3.org>, <public-xg-webid@w3.org>
I'll try to stay astride the professional snobbery that abounds in security
communities working on "trusted desktops" - saying only nice things about
the ATTEMPT to push the envelope. Then we can analyze it wrt "trusted browsing",
and then how the webid protocol interacts with these "metaphors",
particularly in the area of sharing state or dialog design for cert selectors
or pin/bio requests.

For years, some users have used a compartmented mode workstation (particularly
analysts like Manning, trusted not to contaminate the categories). You can
think of private browsing mode as a light version of what others did
when constraining the window manager (since not all apps are browsers), so
that drag and drop (just as an example) didn't allow one to subvert a policy
that TS-marked data (from window X) cannot be written down to an S-marked
paragraph (in window Y). It's the "information leakage" problem, between
compartments, reduced to windows in a frame buffer or browser instances in a
process pool/tree.

From "anonymous browsing" - which suggests the threat is the server - we got
to more insider threats, which the compartmentation is trying to address.
These are more subtle issues than: is someone snooping? (Of COURSE they are;
but who cares!) They address the: it's unusable if I have to place my thumb
on the fingerprint reader every time I open a "protected tab"; the smartcard doing
client authn has to be inserted merely to browse; I have to be within 5m range
of the internet access device for the RFID to range properly, on a cheap PC
in an internet cafe.

The topic generalizes best (and we can thank the websso folks here, I think)
into a consent issue. The client cert release (or personal pin/bio) is just
part of the topic of consent, which has to be "manageable".

But this is all good. I think we know that the webid protocol will not work
effectively till the 1994-era cert dialog is "updated" - which of course is
where MSFT went in their infocard work. Something is to be said for now
understanding the RATIONALES of that work - done 5+ years ago now; since it
bridges the topics of "consent", "trusted desktops", "websso", and even
client certs for https!

-----Original Message-----
From: Henry Story [mailto:henry.story@bblfish.net] 
Sent: Wednesday, February 09, 2011 6:53 AM
To: Peter Williams
Cc: ltorokjr@gmail.com; nathan@webr3.org; public-xg-webid@w3.org
Subject: Re: nasty nasty bug in chrome

> 3. how does webid protocol work in the transference of control between
(trusted) desktops hosting browser and non-browser https instances, each
engaged in a (compartmented?) run of the webid protocol?

You mean something like drag and drop of resources onto the desktop, or from
the desktop to the browser. Or since any application can be a browser of
data, dragging resources across applications?

That is a good question. In theory there is no problem, but in practice it
is quite possible to place identity information in a URL, even if this
cannot be very strong. So an anonymous web site can - even when cookies
don't work - create such URLs and use those to track visitor activity across
the site. This is not foolproof, as the user could paste the URL and mail
it to someone, but I suppose even that can be interesting.
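[A minimal sketch of the tracking mechanism described above - how a server could mint a per-visitor token into its links and correlate the next request, even with cookies disabled. All function and parameter names here are hypothetical, for illustration only.]

```python
# Sketch: cookieless tracking by embedding a per-visitor token in URLs.
# Names (mint_tracked_url, visitor_from_request, 'v') are invented.
import secrets
from urllib.parse import urlencode, urlparse, parse_qs

def mint_tracked_url(base, path, visitor_id):
    """Rewrite an internal link so it carries the visitor's token."""
    return f"{base}{path}?{urlencode({'v': visitor_id})}"

def visitor_from_request(url):
    """Recover the token from the next request, linking the two hits."""
    qs = parse_qs(urlparse(url).query)
    return qs.get('v', [None])[0]

token = secrets.token_urlsafe(8)   # minted on the first "anonymous" hit
link = mint_tracked_url("https://example.org", "/page2", token)
assert visitor_from_request(link) == token   # server correlates the visits
```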

So now you drag such a URL onto your authenticated app and drop it there.
That app makes the request, and the server can now draw some relation
between the initial browsing experience and the identity.

One solution to that is to warn the user in drag and drop mode of this
danger, and future operating systems could even point out to the user that
by doing this he has potentially left anonymous mode - depending on what the
receiving app does with the URL.
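[One hypothetical mitigation, not proposed in the thread itself: before the authenticated app dereferences a dropped URL, strip the query string and fragment, where such per-visitor tokens would most often hide. The function name is illustrative only, and of course this breaks URLs whose query is load-bearing.]

```python
# Sketch: sanitize a URL dropped from an anonymous context before an
# authenticated app fetches it, dropping query and fragment.
from urllib.parse import urlsplit, urlunsplit

def sanitize_dropped_url(url):
    parts = urlsplit(url)
    # keep scheme, host and path; discard query and fragment
    return urlunsplit((parts.scheme, parts.netloc, parts.path, '', ''))

print(sanitize_dropped_url("https://example.org/doc?v=abc123#sec2"))
# https://example.org/doc
```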

> Date: Wed, 9 Feb 2011 15:08:49 +0100
> Subject: Re: nasty nasty bug in chrome
> From: ltorokjr@gmail.com
> To: home_pw@msn.com
> CC: nathan@webr3.org; public-xg-webid@w3.org
> Hi,
> 2011/2/9 Peter Williams <home_pw@msn.com>: is it a bug?
>   I don't know what an incognito window is.
> Nathan is probably referring to the "mode" that is also available in
> Firefox as "Start private browsing".
> This means all cookies and any kind of previous browsing history that 
> might help identify you at the server is reset/unavailable. It should 
> mimic/replicate the situation of landing on a website for the first 
> time. Any kind of action taken during this private browsing session is 
> purged after you terminate the session (i.e. close the window).
> But the general rule is that a browser can itself be replicated, and it
> can inherit the SSL state of its replicator.
> I believe that is exactly the thing that "private/incognito browsing" should
> not allow.
> I am not sure how the rest relates to this, but I found it very insightful.
> Thanks!
> Las
> You have to remember (and IETF NEVER got this) that https targets
hypermedia, and the browser concept. Pre-tabs (and pre-popups), one expected
to be looking at page X, and want to see page Y in parallel. The UI notion
of linking didn't allow for this. One could "duplicate" a browser frame
however, and then link from there - producing a view of X and Y.
> If X was an https hypermedia document supported by n SSL sessions, 
> and m SSL connections, so too must the replicant of X be (since on X' only, the
user may refresh before linking on).
> This all generalizes to the browser behaviours in Mozilla (and its strict
emulators, such as IE) on client certs. This "theory of UI and state"
controls when a cert dialog is shown, and when, behind the scenes on
the nth SSL session handshake on 1 TCP connection, the client signing key for
(RSA) client authn is automatically shown to have been used, without
prompting the user for pin or bio. Similar arguments hold for shared cookie
stores when one opens an "incognito" IE instance - a place where famously the IE
security model differs from Mozilla's - as anyone who builds server-side
session managers in windows knows.
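[A toy model of the state-sharing question above - not Chrome's or Mozilla's actual code. If the incognito profile shares the normal profile's TLS session cache, a resumed session skips the client-cert prompt, which is exactly the reported bug. All class and method names are invented for illustration.]

```python
# Toy model: per-profile vs shared TLS session cache.
class SessionCache:
    def __init__(self):
        self.sessions = {}          # host -> previously authenticated session

class Profile:
    def __init__(self, cache):
        self.cache = cache

    def connect(self, host):
        if host in self.cache.sessions:
            # abbreviated handshake: identity silently reused
            return "resumed: no cert prompt"
        self.cache.sessions[host] = "client-cert session"
        return "full handshake: cert prompt shown"

shared = SessionCache()
normal = Profile(shared)
incognito = Profile(shared)         # the bug: should get its own empty cache

assert normal.connect("example.org") == "full handshake: cert prompt shown"
assert incognito.connect("example.org") == "resumed: no cert prompt"
```

Giving the incognito profile `Profile(SessionCache())` instead would force the full handshake (and the prompt) again, which is the behaviour private browsing is meant to guarantee.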
> If you observe the behaviour of other invokers of https with client authn
(e.g. the infocard "trusted desktop" of the Windows window manager), its
behaviour is not mozilla-compatible. It's not trying to be a browser with its
link concept, after all; it's trying to be an authentication protocol SEF.
And, it has to support a browser, classical windows apps doing (web
services) client/server, and ajax callbacks and ajax sockets. It's still a
consumer and user of the webid protocol, I believe, even though it's not a
> This webid protocol has rapidly gone from rescuing client certs from
obscurity and 15-year-old waits for national smartcards... to something
> > From: henry.story@bblfish.net
> > Date: Wed, 9 Feb 2011 11:38:35 +0100
> > CC: public-xg-webid@w3.org
> > To: nathan@webr3.org
> > Subject: Re: nasty nasty bug in chrome
> > 
> > On 9 Feb 2011, at 02:21, Nathan wrote:
> > 
> > > 
> > > It appears that if you webid auth in chrome, then open a new
incognito window, then go to the same website again, it'll automatically
send your cert and auth you w/o asking..
> > 
> > Did you report that bug? It's worth doing it. They are very responsive. 
> > Just send us the bug ID here, and we can all vote on it :-)
> > 
> > Henry
> > 
> > 
> > > 
> > > 
> > 
> > Social Web Architect
> > http://bblfish.net/
> > 
> > 

Social Web Architect
Received on Wednesday, 9 February 2011 17:49:30 UTC
