- From: Gerard CHOLLET <gerard.chollet@telecom-sudparis.eu>
- Date: Sat, 24 Aug 2024 09:20:17 +0200 (CEST)
- To: Deborah Dahl <Dahl@conversational-Technologies.com>, Dirk Schnelle-Walka <dirk@switch-consulting.de>
- Cc: public-voiceinteraction <public-voiceinteraction@w3.org>, hugues <hugues@shankaa.com>, Gerard CHOLLET <gerard.chollet@telecom-sudparis.eu>
Dear Colleagues,
I'm afraid I will not be available next Wednesday, August 28, at 4:30 pm CEST, sorry!
Have a good weekend, Gérard
_____________________________________________________________________
Gérard CHOLLET,PhD,DR,CNRS-SAMOVAR,gerard.chollet@TELECOM-SudParis.eu
Institut Polytechnique de Paris, IMT-TSP, 9 rue Charles Fourier, 91011 Evry
Tel: +33 1 75319628 , http://perso.telecom-paristech.fr/~chollet/
_____________________________________________________________________
----- Original Message -----
From: "Gerard CHOLLET" <gerard.chollet@telecom-sudparis.eu>
To: "Deborah Dahl" <Dahl@conversational-Technologies.com>
Cc: "public-voiceinteraction" <public-voiceinteraction@w3.org>, "hugues" <hugues@shankaa.com>
Sent: Saturday, August 17, 2024 09:15:50
Subject: Re: [voiceinteraction] minutes August 14, 2024
Dear Colleagues,
You may want to have a look at https://en.wikipedia.org/wiki/Personal_assistant
The IPAs mentioned in https://w3c.github.io/voiceinteraction/voice%20interaction%20drafts/paArchitecture/paArchitecture-1-3.htm
are really virtual assistants, which are not personal!? They could become personal if they memorized conversations with users
and verified the identity of the persons they converse with... Of course, there is a danger of privacy violations...
We would prefer that private data remain private and be shared only with trusted providers (such as your bank!) that implement
reliable identity-verification systems.
Cheers, Gérard
----- Original Message -----
From: "Deborah Dahl" <Dahl@conversational-Technologies.com>
To: "public-voiceinteraction" <public-voiceinteraction@w3.org>
Sent: Wednesday, August 14, 2024 21:48:04
Subject: [voiceinteraction] minutes August 14, 2024
The next call is August 28.
https://www.w3.org/2024/08/14-voiceinteraction-minutes.html
and below as text
[1]W3C
[1] https://www.w3.org/
- DRAFT -
Voice Interaction
14 August 2024
[2]IRC log.
[2] https://www.w3.org/2024/08/14-voiceinteraction-irc
Attendees
Present
debbie gerard hugues
Regrets
dirk
Chair
debbie
Scribe
ddahl
Contents
1. [3]privacy
Meeting minutes
privacy
gerard: personal assistant isn't really personal
. because it's in a call center or the cloud
debbie: maybe IPA providers shouldn't be called personal
because they don't belong to a person
hugues: the personal assistant has the history
. the whole dialog session could be in your pocket
. how to keep personal information away from a cloud provider
. the other issue is how do I increase my context with
encryption
debbie: could we put a privacy barrier between the Dialog and
the IPA providers
hugues: the provider shouldn't be able to access the context
hugues: if the provider requests confidential information, it
will need my approval
. for example, if I'm buying a flight, it's good if I don't
have to repeat the information
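The approval flow Hugues describes could be sketched roughly as follows. This is a minimal illustration only; the class, field names, and sensitivity classification are hypothetical and not part of the architecture draft:

```python
# Hypothetical sketch: the IPA provider may read sensitive context fields
# only after explicit, per-field user approval. Field names are invented.

SENSITIVE_FIELDS = {"passport_number", "payment_card"}  # assumed classification


class ConsentGate:
    """Sits between the user's dialog context and an IPA provider."""

    def __init__(self):
        self.approved = set()  # fields the user has explicitly approved

    def approve(self, field):
        """Record the user's approval for one sensitive field."""
        self.approved.add(field)

    def release(self, context, requested):
        """Return only fields that are non-sensitive or user-approved."""
        released = {}
        for field in requested:
            if field in SENSITIVE_FIELDS and field not in self.approved:
                continue  # withhold until the user approves
            if field in context:
                released[field] = context[field]
        return released


gate = ConsentGate()
context = {"name": "Alice", "passport_number": "X123", "destination": "Evry"}
# Without approval, the passport number is withheld:
print(gate.release(context, ["name", "passport_number"]))  # {'name': 'Alice'}
gate.approve("passport_number")
# After approval, both fields are released, so the user need not repeat them:
print(gate.release(context, ["name", "passport_number"]))
```

The point of the sketch is that approval is remembered, which matches the flight-booking example: once granted, the same information can be reused without prompting the user again.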
debbie: you have to balance security with driving the user
crazy
hugues: most people don't understand why we need security and
privacy
. a medical system is even worse
debbie: let's agree on privacy requirements
gerard: goes in 3.3.6
Brainstorming privacy requirements
1. keep user's personal information secure during interactions
2. enable authorization of personal information for user's
intended purposes
3. keep user's dialog secure
4. keep provider's dialogs secure
5. enable personal information to be used later as part of
context in future conversations
6. reuse of trained ASR should be possible, for example, by
feeding a vectorized model to improve understanding
7. enable any part of the dialog to be used later as part of
context
8. support levels of security (e.g., your name vs. your financial
situation); school records would have a different level. Each
piece of information could be assigned a rating, e.g., this
information has a level 5 rating, and the server would have to be
certified to handle that rating. Does this already exist, perhaps
as part of the trust infrastructure of the web, like SSL?
9. Some organizations, such as banks, will request
certification, and still want to be able to use their own
mechanisms
10. don't prevent organizations from using their own security
mechanisms
Amazon has special agreements with banks, so that the banks
trust them
This bank requirement is also for payment information
equivalent to single sign-on for businesses
11. is there something equivalent to a "trusted device" for
voice?
12. Does it trust the phone or the browser? The browser is more
correct; users need to authenticate a new device
13. what about two people using the same phone? For example,
biometrics
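Requirement 8 above could be sketched as a simple rating check. The field names and levels here are illustrative assumptions, not taken from the architecture draft:

```python
# Hypothetical sketch of requirement 8: each piece of personal data carries
# a sensitivity level, and a server may receive it only if its certification
# covers that level. Levels and field names are invented for illustration.

DATA_LEVELS = {
    "name": 1,
    "school_records": 3,
    "financial_situation": 5,
}


def may_share(field, server_certified_level):
    """A server certified up to level N may receive data rated <= N.

    Unknown fields default to the most sensitive level, so nothing
    unclassified leaks to an under-certified server.
    """
    return DATA_LEVELS.get(field, 5) <= server_certified_level


assert may_share("name", 1)
assert may_share("school_records", 3)
assert not may_share("financial_situation", 3)
assert not may_share("unknown_field", 4)  # unclassified data stays protected
```

Defaulting unknown fields to the highest level is one possible fail-safe design choice; the group would still need to decide who assigns ratings and who certifies servers.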
debbie: resume next time
Received on Saturday, 24 August 2024 07:20:25 UTC