
Re: Security: Cookies

From: Kevin J. Dyer <kdyer@draper.com>
Date: Mon, 20 Mar 2000 07:57:13 -0500
To: Clover Andrew <aclover@1VALUE.com>, "'www-talk@w3.org'" <www-talk@w3.org>
Message-id: <4.2.2.20000320072805.00d113b0@imap>
At 03:09 AM 3/20/00 , Clover Andrew wrote:
>Specifically, most browsers allow cookies to be sent and received on
>embedded objects in a web page: frame, object, embed, and image.
>
>When a user inputs a URL on www.a.com they are implicitly agreeing that
>their access can be logged by a.com and may be used for marketing
>purposes. However, if www.a.com/index.html includes an image stored
>at images.b.com, the user will unknowingly be allowing b.com to log not
>only the access to images.b.com, but also, by implication, the original
>access to www.a.com. If b.com ensures that it has embedded images on
>a great number of sites, it can use a cookie at images.b.com to tie
>together accesses to all its partner sites and obtain a detailed
>report on individuals' browsing habits.

But the cookie can only travel from the User Agent back to www.a.com
(or, if it was set as a domain cookie, to all of a.com), and likewise
for b.com.  The cross-link to information that resides on b.com is the
cornerstone of what HTML and the WWW are all about: the ability to
create virtual documents (pages) that span multiple sites.  Unless you
have given a.com or b.com specific information that they can place in
their cookies, you can only be targeted as an IP address coming from
a .com or an ISP.  Since most ISPs use DHCP, this is short-lived
information.  All they can really do is track you statistically, as
long as they don't have personal data.
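To make the point concrete, here is a rough sketch of the domain-matching
rule a User Agent applies before returning a cookie (a simplified model
in the spirit of the Netscape cookie spec / RFC 2109, not any particular
browser's implementation; the host names are the ones from the example
above):

```python
# Simplified cookie domain matching: a cookie is returned only to hosts
# that "tail-match" the domain it was set for.

def cookie_sent_to(cookie_domain: str, request_host: str) -> bool:
    """Return True if a cookie set for cookie_domain would be sent
    when the User Agent requests request_host."""
    if cookie_domain.startswith('.'):
        # Domain cookie: sent to every host under that domain.
        return (request_host.endswith(cookie_domain)
                or request_host == cookie_domain.lstrip('.'))
    # Host cookie: sent only to the exact host that set it.
    return request_host == cookie_domain

# A cookie set by images.b.com never travels to www.a.com:
print(cookie_sent_to('images.b.com', 'www.a.com'))     # False
print(cookie_sent_to('images.b.com', 'images.b.com'))  # True
# A domain cookie for .a.com covers all of a.com:
print(cookie_sent_to('.a.com', 'www.a.com'))           # True
```

So the cookie itself stays within b.com; what b.com learns about the
visit to a.com comes from seeing its own cookie alongside the request
for the embedded image.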


>more on this: http://www.tiac.net/users/smiths/privacy/banads.htm
>
>The solution is to stop browsers from sending cookies to places the
>user would not expect for the URL they typed. At the moment the best


A better solution does exist, at least for the sites that require
authentication, but the two big browser developers haven't given the
community any indication that they want to support it: Digest
Authentication.
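For reference, the core of Digest Authentication (RFC 2617, shown here
in its simplest form, without the optional qop extensions) is that the
password never crosses the wire; only an MD5 digest of it, bound to a
server-supplied nonce, does.  A minimal sketch, with the user, realm,
and nonce values borrowed from the RFC's own example:

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode('ascii')).hexdigest()

def digest_response(user, realm, password, method, uri, nonce):
    """Compute the Digest Authentication response value
    (RFC 2617, without the optional qop extensions)."""
    ha1 = md5_hex(f'{user}:{realm}:{password}')  # shared secret, never sent
    ha2 = md5_hex(f'{method}:{uri}')
    return md5_hex(f'{ha1}:{nonce}:{ha2}')

# The server, knowing HA1, can verify this response without the
# cleartext password ever appearing in the request.
resp = digest_response('Mufasa', 'testrealm@host.com', 'Circle Of Life',
                       'GET', '/dir/index.html',
                       'dcd98b7102dd2f0e8b11d0f600bfb0c093')
print(resp)
```

Because the response is tied to the server's nonce, a third-party site
that somehow observed it could not replay it elsewhere, unlike a
long-lived identifying cookie.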

Otherwise, finer-grained control, like most of our spam filters, needs
to be added to the User Agents, so the end-user can control who can
create virtual and persistent connections on their machine.  The web
has always required trust on both ends of the connection.  The end-user
needs to be able to tweak that trust more, now that the web has become
more commercialized.
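Such a per-site filter in the User Agent might look something like the
following (a hypothetical sketch, not a feature any browser shipped: the
deny list and the policy knob are made up, and the rule simply refuses
cookies offered by hosts other than the one the user actually typed):

```python
# Hypothetical User-Agent-side cookie filter, in the spirit of a spam
# filter: the end-user decides who may set persistent state on their
# machine.

BLOCKED = {'images.b.com'}   # user's deny list (illustrative)
ALLOW_THIRD_PARTY = False    # global policy knob (illustrative)

def accept_cookie(typed_host: str, setting_host: str) -> bool:
    """Decide whether to store a cookie offered by setting_host while
    the user is visiting the site they typed (typed_host)."""
    if setting_host in BLOCKED:
        return False
    if setting_host != typed_host and not ALLOW_THIRD_PARTY:
        # Cookie from an embedded object (image, frame, ...) on another
        # site: the user never asked for this host.
        return False
    return True

print(accept_cookie('www.a.com', 'www.a.com'))     # True
print(accept_cookie('www.a.com', 'images.b.com'))  # False
```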


                                 Kevin

===========================================================
Kevin J. Dyer				     Draper Laboratory  MS 35
Email: <kdyer@draper.com>		     555 Tech. Sq.
Phone: 617-258-4962			     Cambridge, MA 02139
FAX: 617-258-2061                            http://www.draper.com

----------------------------------------------------------------------
	    _/_/_/_/    _/          _/  _/  _/        _/     _/_/_/_/
	   _/      _/   _/_/     _/_/  _/  _/_/     _/   _/
	  _/       _/  _/ _/   _/ _/  _/  _/  _/   _/    _/_/_/
	 _/      _/   _/  _/ _/  _/  _/  _/    _/ _/            _/
	_/_/_/_/   _/    _/    _/  _/  _/        _/  _/_/_/_/
        Data Management & Information Navigation Systems
=========================================================== 
Received on Monday, 20 March 2000 07:57:51 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Wednesday, 27 October 2010 18:14:24 GMT