Re: PLING - Call to Action....

Hi Jean Pierre

Firstly the rule set should be of the type "anything not expressly
allowed is forbidden". This is the only way you can ever control
anything, since you can never list the complete infinite set of things
that should be forbidden.
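In other words, permission checking is default-deny. A minimal sketch in Python (the purpose names here are purely illustrative, not from any real ruleset vocabulary):

```python
# Default-deny ("anything not expressly allowed is forbidden"):
# the rule set only lists allowed purposes; everything else is refused.

ALLOWED_PURPOSES = {"service-delivery", "billing"}  # example rule set

def is_permitted(purpose: str) -> bool:
    """A purpose is permitted only if it is explicitly listed."""
    return purpose in ALLOWED_PURPOSES

assert is_permitted("billing")
assert not is_permitted("profiling")  # never listed, so forbidden
```

Note there is no "deny list" at all: the infinite set of forbidden things is handled by the default, and only the finite allowed set is enumerated.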

jeanpierre.lerouzic@orange-ftgroup.com wrote:
> Hi David,
> 
> Thanks for your answer. Let's suppose a silly example:
> 
> I go now to the web site of www.malware.com with a privacy ruleset
> which doesn't express I want my personal data to be used to profile
> me 

In which case it cannot be used to profile you, since your rule set 
only states what the data can be used for.


> (there isn't any way to forbid it).

Yes there is: by not giving explicit permission for it.
The data controller should ask your permission for the things he wants 
to do with your personal data, and then he should do only those things 
and nothing else.

> 
> Indeed www.malware.com will profile me. 

Then the audit will catch them out, since they have done something that 
was forbidden.


> Suppose that later I suspect
> I am profiled because I saw advertisements at the BBC web site which
> were really tailored for me.
> 
> How can I know for sure that: (1) The BBC sold web page space to
> malware.com which could even be a subcontractor of a nice company. 
> (2) malware.com is profiling me?
> 
> Now suppose an insider at malware.com informed me that I was
> profiled: 

An internal audit. Saves having to get an external one :-)

> (3) How can I prove that I went to malware.com with a
> privacy ruleset that didn't express authorization to profile me?

Because malware would have had to keep an audit trail of your input, 
and the auditor can see from this that you did not authorise profiling.
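As a toy illustration (the ruleset format and action names are hypothetical, not the W3C vocabulary), the auditor's job amounts to comparing the logged actions against the ruleset recorded at the time of the request:

```python
# Hypothetical auditor check: flag any logged action that was not
# explicitly authorised by the ruleset recorded with the request.

recorded_ruleset = {"allowed": ["service-delivery"]}  # what the user sent
action_log = ["service-delivery", "profiling"]        # what malware.com did

violations = [a for a in action_log
              if a not in recorded_ruleset["allowed"]]
# "profiling" is flagged: it was never authorised, so it is forbidden
```

The check only works, of course, if both the ruleset and the action log were recorded faithfully, which is what the audit-trail requirement below is about.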

Now if malware is so bad as to delete, modify, lose or corrupt all its 
audit records, then the auditor ought to smell a rat.


> 
> My HTTP "GET www.malware.com" wasn't recorded at that time and my
> policy could have been changed later.

In the TAS3 project we have audit records that are tamper-proof, and 
summaries are saved externally to the organisation, so it will be 
rather difficult for this to happen with a secure, trusted system.
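The TAS3 mechanism isn't detailed here, but one common way to make audit records tamper-evident is a hash chain, where each entry commits to its predecessor, so any later deletion or modification breaks the chain. A minimal sketch (illustrative only, not the TAS3 implementation):

```python
# Hash-chained audit log: each entry's digest covers the previous
# digest plus the new record, so tampering anywhere is detectable.
import hashlib
import json

GENESIS = "0" * 64  # digest used before the first entry

def append(log, record):
    """Append a record, chaining it to the previous entry's digest."""
    prev = log[-1]["digest"] if log else GENESIS
    body = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    log.append({"record": record, "digest": digest})
    return digest  # periodically publish this externally as a summary

def verify(log):
    """Recompute the chain; any edit or deletion makes this fail."""
    prev = GENESIS
    for entry in log:
        body = json.dumps(entry["record"], sort_keys=True)
        if hashlib.sha256((prev + body).encode()).hexdigest() != entry["digest"]:
            return False
        prev = entry["digest"]
    return True
```

Saving the latest digest with an external party (as TAS3 does with its summaries) means the organisation cannot quietly rewrite its own history: the externally held summary would no longer match.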


> 
> At a philosophical level: what is profiling anyway? Every company
> using cookies for pages that don't involve a personal account is
> doing profiling, isn't it? What is the meaning of a policy rule that
> authorizes profiling?

I am not going to enter that discussion just now. Suffice it to say 
that every organisation should be able to justify to a judge that its 
actions fell within the scope of the allowed policy rules.

regards

David

> 
> I would be happy to know how the three points could be proved to a
> judge, but I am open to any ideas and furthermore I may have
> misunderstood something. The semantics of the privacy ruleset also
> seem quite difficult to interpret in any non-specific context.
> 
> Best regards,
> 
> Jean-Pierre
> 
> 
> -----Original Message----- From: David Chadwick
> [mailto:d.w.chadwick@kent.ac.uk] Sent: Tuesday 17 August 2010 14:03 To:
> LE ROUZIC Jean-Pierre RD-MAPS-REN Cc: renato@iannella.it;
> public-pling@w3.org Subject: Re: PLING - Call to Action....
> 
> Hi Jean Pierre
> 
> Audit is clearly desirable, but the auditor has to audit against
> something. The user's privacy preferences (or rule set) would be
> something for the auditor to audit against, so in that context, they
> make sense.
> 
> regards
> 
> David
> 
> 
> jeanpierre.lerouzic@orange-ftgroup.com wrote:
>> Hi all,
>> 
>> Isn't the privacy ruleset approach similar to a weak audit
>> approach? I mean it's not very useful to specify some future
>> behaviour of a service provider if one is not sure whether she is
>> confronted with a real threat or not. The ruleset approach works well
>> with the nice guys, who will probably behave nicely anyway. The bad
>> guys will laugh at the privacy ruleset. Another thing about
>> auditability is that it involves some notarial recording; here with
>> the "privacy ruleset" there is no record of what the user
>> specified, so no legal enforcement could be achieved: the user's
>> terms about her interaction with the service provider will be lost
>> because nobody records them! This audit approach is not the same as
>> a policy approach, which enforces in real time.
>> 
>> Let me know your opinion,
>> 
>> Jean-Pierre
>> 
>> ----------------------------------------------------------------------
>>  -- *De :* public-pling-request@w3.org 
>> [mailto:public-pling-request@w3.org] *De la part de* Renato
>> Iannella *Envoyé :* mardi 17 août 2010 02:19 *À :* pling *Objet :*
>> PLING - Call to Action....
>> 
>> Dear PLINGers...
>> 
>> You may be interested in the outcomes of the recent W3C Workshop on
>>  Privacy for Advanced Web APIs - the report [1] states "the W3C
>> staff plans to propose a charter for a Privacy Interest Group...
>> Such an Interest Group could also provide a focal point for
>> privacy-related coordination with other interested standard
>> development organizations".
>> 
>> One of the other interesting activities of the W3C Device APIs and
>> Policy WG - reported from the Workshop - was the development of
>> "Privacy Rulesets" [2] - a way to describe user privacy
>> preferences.
>> 
>> Clearly, these impact on the future of PLING and our role in W3C.
>> 
>> We should discuss this at the next teleconference (at least) and 
>> online now...
>> 
>> Cheers
>> 
>> Renato Iannella http://renato.iannella.it
>> 
>> [1] http://www.w3.org/2010/api-privacy-ws/report [2]
>> http://dev.w3.org/2009/dap/privacy-rulesets/
> 

-- 

*****************************************************************
David W. Chadwick, BSc PhD
Professor of Information Systems Security
School of Computing, University of Kent, Canterbury, CT2 7NF
Skype Name: davidwchadwick
Tel: +44 1227 82 3221
Fax +44 1227 762 811
Mobile: +44 77 96 44 7184
Email: D.W.Chadwick@kent.ac.uk
Home Page: http://www.cs.kent.ac.uk/people/staff/dwc8/index.html
Research Web site: http://www.cs.kent.ac.uk/research/groups/iss/index.html
Entrust key validation string: MLJ9-DU5T-HV8J
PGP Key ID is 0xBC238DE5

*****************************************************************

Received on Wednesday, 18 August 2010 07:55:14 UTC