
Re: "CSI just called, you're in."

From: Rigo Wenning <rigo@w3.org>
Date: Tue, 18 Oct 2011 18:55:58 +0200
To: public-privacy@w3.org
Cc: Bjoern Hoehrmann <derhoermi@gmx.net>
Message-ID: <6411359.iJoEiE4bg4@longtarin>

I tried to get the partners in the PrimeLife research project to work on some of 
these access-control-like constraints for wider dissemination control, because I 
think the area is still very much in need of research. They did not include my 
request in the access control language. But Ronald Leenes wrote great articles 
about audience segregation, and we implemented some of those ideas on the Elgg 
platform at http://clique.primelife.eu/ 
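To make the idea concrete for readers unfamiliar with it: audience segregation means each post is shared with a named audience, and viewers only see posts addressed to an audience they belong to. A minimal illustrative sketch, assuming a toy model (this is not the actual Clique/Elgg implementation; all names here are hypothetical):

```python
# Toy model of audience segregation (hypothetical; not the Clique/Elgg code):
# each post carries a named audience, and a viewer sees a post only if the
# author placed them in that audience, or they authored the post themselves.

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    audience: str          # e.g. "colleagues", "family"

@dataclass
class User:
    name: str
    # audiences this user belongs to, keyed by author: author -> set of audiences
    memberships: dict = field(default_factory=dict)

def visible_posts(posts, viewer):
    """Return only the posts whose audience the viewer belongs to."""
    return [p for p in posts
            if viewer.name == p.author
            or p.audience in viewer.memberships.get(p.author, set())]

posts = [
    Post("alice", "conference notes", "colleagues"),
    Post("alice", "holiday photos", "family"),
]
bob = User("bob", {"alice": {"colleagues"}})
print([p.text for p in visible_posts(posts, bob)])  # ['conference notes']
```

The point of the separation is that the segregation decision is made per post and per relationship, rather than as a single global public/private switch.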

It is good to see that Google Plus at least tries some of this audience 
segregation.

I think we need further research on the "right to be forgotten", i.e. on erasing 
content from the web. But this is tricky. Remember the cancel messages in 
NNTP? There were cancel-message wars going back and forth. I also don't 
believe the web can technically guarantee the total erasure of content. But 
there should be ways to make some material harder to find, as it may ruin 
lives, even after years. 

The attempts I've seen so far are not really compelling. I think this 
is also an architectural question for the TAG. 



On Sunday 16 October 2011 02:03:47 Bjoern Hoehrmann wrote:
> Hi,
>   http://imgur.com/gallery/yarZ7 I had been waiting for something like
> it for some time now with the proliferation of image search technology.
> Here in Germany http://www.ccc.de/de/hackerethics hacker ethics include
> rules to use public data and to protect private data. I wonder where
> this fits in. Note that the depicted person could have modified his own
> image here, and then this would be prominently linking this citizen's
> private political expression to his professional life; or someone could
> have simply claimed as much with similar consequences for them if he'd
> deny authorship either way in fear of repercussions, of refuse to deny
> or confirm because it's nobody's business (which, legally, may be the
> case depending on where he lives).
> As it is, society does not consider the question of private versus
> public a trivial binary matter; it's common to separate the Whether and the
> How in access control. We might say anyone can access some records, but
> they have to access them in a certain place and they cannot make copies,
> for instance. Or, more simply, you can access some files through indices
> and browsing, but there is no automated full text search.
> Here in Europe, some would like to have such notions on the web as well,
> under titles like the "right to be forgotten" and others. A prominent
> case recently was in Spain where, as I recall it, Google was ordered to
> remove pages from its indices to protect the privacy of some people.
> As I wrote this, the view counter on the image increased by 60,000. As
> yet, there is not a single comment finding this use of the image search
> feature problematic. I wonder what the reaction would be like if the
> search feature had proper facial recognition support, requiring no bit
> of effort to find the original image, or, for that matter, if there was
> no notion of uncovering some kind of fraud as is suggested in this case.
> regards,
Received on Tuesday, 18 October 2011 16:56:13 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 20:23:53 UTC