- From: Peter Ansell <ansell.peter@gmail.com>
- Date: Thu, 27 Mar 2008 11:52:08 +1000
- To: "Karl Dubost" <karl@w3.org>
- Cc: kidehen@openlinksw.com, "Phil Archer" <parcher@icra.org>, "Semantic Web" <semantic-web@w3.org>, "foaf-dev Friend of a" <foaf-dev@lists.foaf-project.org>
On 27/03/2008, Karl Dubost <karl@w3.org> wrote:
> On 26 March 2008 at 22:53, Kingsley Idehen wrote:
> > Phil Archer wrote:
> >> Imagine a teenage girl who is being abused at home. She uses her
> >> social network to call for help. Luckily, she finds it and manages
> >> to escape the dangerous home life. Now she wants to keep in touch
> >> with her new support network but become invisible to her former
> >> abuser.
> > She assumes a new Identity via a new URI in her new Social Space.
> >> In short, any privacy control needs to support changing
> >> circumstances.
> > One URI dies and another is born :-)
>
> Not that simple. This is a very binary statement and our social life
> is not binary. The flaw of the Web with regard to privacy is the
> change in structure.
>
> # Information Opacity
>
> In our social structures, information:
> * takes time to travel
> * is replicated with errors
>
> In some contexts, people will consider this to be bad. In fact, it is
> necessary in many others. Phil Archer gave an example which is good
> because it relies on opacity. Changing the URI will not solve the
> issue. As soon as someone connects the dots (old-uri sameAs new-uri),
> suddenly the whole system is aware of it (information travels faster
> on the Web) and the replication is identical (no errors).

In this context I like the way that Facebook works on multiple privacy
levels and strictly enforces a rule that personal information about a
user should never be stored offline by applications, so one can delete
oneself and create a new entity with no connections to the old. Someone
would have to have been allowed to view the information in the first
place before they had enough knowledge to assert new-uri sameAs old-uri.

I think a no-caching directive for FOAF would be a good thing, so you
can disappear, together with making mbox_sha1sum the absolute default
so that your email address is not stored on servers outside of your
control. After all, there is (theoretically) a single SHA-1 sum that
matches a single email address, and it is only being used for identity
purposes anyway (a quick sketch of the computation is below).

> Our social structures need information opacity. We need to be able to
> lie, we need to be able to evolve in time and not necessarily have a
> record accessible. My perimeter of knowledge in the past was limited
> to my close environment and to the transmission of voice messages
> given face to face. Then the phone came and helped to accelerate this,
> and now with the Web the travel time is becoming even shorter.
>
> Someone can take your photo, put it on Flickr, someone else can
> identify you in the photo, and people can comment and say things
> which were known only in one community, time, and context. There is
> very little way for an individual to say to the system, "erase me".
> There is very little way for an individual to remove themselves from
> the Web. Even if you decide not to publish something, people will put
> you in the system. That is very bad.

You could forgo a social life and become a hermit; that might stop
people annotating your photos with details about you. Not being able
to erase one's identity might be a good thing if you are more worried
about identity theft than about identity disappearance. Are you an
enemy of the government? If not, there is nothing to be worried about.
IngSoc is good! (by definition)
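For anyone who hasn't looked at how foaf:mbox_sha1sum works, it is
simply the SHA-1 of the full mailto: URI, not of the bare address. A
minimal sketch in Python (the address here is made up):

    import hashlib

    def mbox_sha1sum(email):
        # FOAF hashes the complete mailto: URI, so prepend the scheme
        # before hashing.
        return hashlib.sha1(("mailto:" + email).encode("ascii")).hexdigest()

    print(mbox_sha1sum("alice@example.org"))

The hash identifies a mailbox for smushing purposes without publishing
the address itself, although anyone who already knows the address can
of course recompute the hash and confirm a match.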
> For address book information, and the initial question of Henry, I
> would say that personal information should not be moved from one
> system to another without the consent of the person.
>
> Example: all my address book is on my computer; there are around 600
> people in it. I will *never* put it in an online address book like
> Plaxo, Yahoo Mail, etc., because I don't have the right to do so.
> People don't necessarily want to be indexed in the databases of
> private companies.

Whether you have the right to keep information about someone depends on
how strictly you interpret the laws, where they live, where you live,
where the data lives, and what treaties exist between the three places.
Private companies have been keeping personal information since the
beginning of capitalism... that's just the way things work. How much
does "personal information" cover? Is it a whitelist or a blacklist
within the context of the three legal environments that the Web is
operating in?

> As an individual I have no way to inform the Web that I don't want to
> be in this Acme Inc. database. I have no way to say to someone, "I
> don't want my mail to go to a Google Mail address" (sometimes it goes
> there by forwarding).
>
> I don't have choices.

Why would you say that you don't want your email to go to a Google Mail
address? You are posting to a public mailing list... you have a public
identity. Nabble.com doesn't necessarily ask permission to republish
data from public mailing lists, beyond subscribing its address to the
list. I definitely do not agree with the premise of "once a triple,
always a triple": things should be able to be deleted or denied. But
you can only do that once you have identity and trust, so where do you
propose we start?

Peter Ansell
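PS: To make "deleted or denied" concrete, in RDF terms it would mean
retracting triples from whatever store holds them. A minimal rdflib
sketch, with hypothetical URIs, of what a "forget me" operation could
look like; the hard part is not the code but persuading every store
that cached the data to run it, which is exactly where identity and
trust come in:

    from rdflib import Graph, URIRef

    g = Graph()
    # Hypothetical cached copy of someone's FOAF data.
    g.parse("http://example.org/cached-foaf.rdf")

    me = URIRef("http://example.org/people/old#me")  # hypothetical person URI

    # Retract every triple in which the person appears as the subject
    # or the object.
    g.remove((me, None, None))
    g.remove((None, None, me))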
Received on Thursday, 27 March 2008 01:52:45 UTC