Re: Use Cases | ACTION-13 Revisited [and latest API draft comments]

On Mon, Aug 27, 2012 at 11:53 AM, Davenport, James L.
<jdavenpo@mitre.org> wrote:
> Ryan,
> Yes, you are correct... The use case in 2.2 does cover both examples.
>
> One level lower, however, my two use cases tried to break out how (in the
> first use case) the user's out-of-band private key must somehow be extracted
> from an external source, and/or how (in the second use case) the external
> device's own cryptographic functions (which are associated with the
> out-of-band key) are used to unwrap and decrypt the message.
>
> This is related to other discussion threads today on key discovery. Since
> keys are internally bound to a cryptographic provider or module, how can we
> ensure that a particular out-of-band key is bound to a SUPPORTED
> cryptographic provider? My second use case shows the need to allow external
> cryptographic functions.

I'm sorry, I'm having trouble seeing how these two use cases differ,
and I'm especially not following your last point.

If you have an out-of-band key, you (are supposed to) have
out-of-band knowledge of how that key was provisioned and stored. The
two concepts are closely linked - if you know about a particular key
existing, you should know how it exists. If you don't know about a
particular key existing, then you cannot (reliably) discover how it
exists - at least, not without some other attestation (eg: a
certificate that proves it was issued by some authority, for which
the authority promises the keys it provisions are only issued to
secure elements).

If the out-of-band key is not available, then you can presume one of
the following:
- The user opted not to grant you access to the key
- The user agent does not support that particular out-of-band key

Conversely, if the out-of-band key is present, then you know it's
bound to a supported cryptographic provider. That's the only way it
could be present.
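
To make that concrete, here's a rough sketch of how I'd expect a page
to handle it. This is pseudocode, not the draft API surface:
discoverNamedKey() stands in for whatever pre-provisioned key
discovery mechanism we end up specifying, and decryptReportWith() /
showServerSideFallback() are likewise hypothetical helpers.

  // Hypothetical discovery of an out-of-band provisioned key,
  // identified by an application-chosen name. The name, the function,
  // and the callback shape are all placeholders, not the draft API.
  discoverNamedKey("benefits-report-key", function (key) {
    if (!key) {
      // Either the user agent has no supported provider for this key,
      // or the user declined to grant access. The page can't tell
      // which, and shouldn't need to - it just falls back.
      showServerSideFallback();
      return;
    }
    // If a handle came back at all, the key is by definition bound to
    // a cryptographic provider the user agent supports; no separate
    // "is this provider supported?" query is needed.
    decryptReportWith(key);
  });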

>
> -----Original Message-----
> From: Ryan Sleevi [sleevi@google.com]
> Sent: Monday, August 27, 2012 02:13 PM Eastern Standard Time
> To: Davenport, James L.
> Cc: Vijay Bharadwaj; Arun Ranganathan; public-webcrypto@w3.org;
> estark@mit.edu
> Subject: Re: Use Cases | ACTION-13 Revisited [and latest API draft comments]
>
> On Mon, Aug 27, 2012 at 6:11 AM, Davenport, James L. <jdavenpo@mitre.org>
> wrote:
>> On the draft API, section 2 "Use Cases," I would also still like to see
>> the previously mentioned "out-of-band provisioned" keys in the use cases:
>>
>>
>> A national agency uploads each citizen's benefits summary report to the
>> cloud. For privacy and security reasons, the reports are stored in an
>> encrypted form. This national agency announces on their web site the
>> availability of the reports. Each citizen can then open a browser, go to the
>> web site for this national agency and log on, which takes them to their
>> encrypted report. The encrypted report is fetched by the browser, along with
>> some HTML and JavaScript.  The JavaScript requests the Crypto API to decrypt
>> the report using an out-of-band provisioned key. The API returns the
>> decrypted report and the JavaScript then inserts it into the HTML, which is
>> then displayed on the browser screen.
>>
>> A financial brokerage firm generates quarterly reports for each of its members
>> and stores them in the cloud. For privacy and security reasons, the reports
>> are stored in an encrypted form. Each member can then open a browser, go to
>> the web site for the financial brokerage firm and log on, which takes them
>> to their encrypted quarterly report. The encrypted report is fetched by the
>> browser, along with some HTML and JavaScript.  The JavaScript requests the
>> Crypto API to decrypt the report using the decryption capabilities of an
>> external device, using this device's out-of-band provisioned key. The
>> API returns the decrypted report and the JavaScript then inserts it into the
>> HTML, which is then displayed on the browser screen.
>>
>> --JD
>
> James,
>
> Apologies if I've misread, but do the two examples you provide differ
> in the capabilities they require from the API, or only in how the API
> might be consumed?
>
> Further, aren't these both examples of the currently documented
> "Protected document exchange" use case (currently section 2.2)? Is
> there a capability or concept required in these use cases that is not
> documented in the existing use case?
>
>>
>> -----Original Message-----
>> From: Vijay Bharadwaj [mailto:Vijay.Bharadwaj@microsoft.com]
>> Sent: Monday, August 27, 2012 5:55 AM
>> To: Arun Ranganathan; Ryan Sleevi
>> Cc: public-webcrypto@w3.org; estark@mit.edu
>> Subject: RE: Use Cases | ACTION-13 Revisited
>>
>> Perhaps there is a case for locally encrypted content when you combine it
>> with a secure token.
>>
>> Take for example a web app that stores its local data encrypted to a smart
>> card (provisioned out of band, like we have been assuming all trusted smart
>> cards are). Then while the app is vulnerable if it is used after the user
>> agent is compromised, at least it raises the bar by requiring the attacker
>> to do a two-touch attack. An attacker who just compromises the user agent
>> cannot decrypt the locally stored data, because the user agent itself cannot
>> decrypt it without the token.
>>
>> To be more specific:
>>
>> Use case: encrypted local storage
>>
>> When caching sensitive data locally, an application may wish to ensure
>> that this data cannot be compromised in an offline attack. In such a case,
>> the application may leverage a key stored on a secure token distributed out
>> of band (such as a smart card) to encrypt the local cache. Thus, the cache
>> may only be decrypted by the application when the secure token is present;
>> at other times (such as when an attacker has stolen the machine) the local
>> cache is inaccessible and all operations will require online authentication
>> to the application's web service.
>>
>> -----Original Message-----
>> From: Arun Ranganathan [mailto:arun@mozilla.com]
>> Sent: Friday, August 17, 2012 7:57 AM
>> To: Ryan Sleevi
>> Cc: public-webcrypto@w3.org; estark@mit.edu
>> Subject: Re: Use Cases | ACTION-13 Revisited
>>
>> Ryan,
>>
>>
>> On Aug 16, 2012, at 7:16 PM, Ryan Sleevi wrote:
>>
>>> On Thu, Aug 16, 2012 at 3:55 PM, Arun Ranganathan <arun@mozilla.com>
>>> wrote:
>>>> While working through the use cases (per [ACTION-13]) with Wan-Teh
>>>> (wtc), we came up with the following:
>>>>
>>
>> <snip/>
>>
>>>> 1. The use cases rsleevi added to the draft [spec] are pretty solid;
>>>> they are only missing a "local storage" scenario, first mentioned on
>>>> the Wiki [cf. local].
>>>> [cf. local]
>>>> http://www.w3.org/community/webcryptoapi/wiki/Use_Cases#Storing_local_storage
>>
>>
>>> I'm a little concerned about the "local storage" case, and wondering
>>> whether it's something that would necessarily be in scope for this
>>> group.
>>>
>>
>>> Consider the example of IndexedDB, which uses "Keys" (IDB keys -
>>> http://www.w3.org/TR/IndexedDB/#key-construct ) and returns "Values" (
>>> http://www.w3.org/TR/IndexedDB/#value-construct ), and can
>>> alternatively be accessed via indices (
>>> http://www.w3.org/TR/IndexedDB/#index-concept ).
>>>
>>
>>> A naive assumption would be that this API would only protect the
>>> Values - not the keys, nor the indices. However, as practically
>>> deployed today, that wouldn't offer much protection, since both Keys
>>> and Indices often reveal quite a bit of information.
>>>
>>> Further, encrypting the contents involves a tradeoff between efficiency
>>> and privacy. Perfect privacy (storing no relationships about keys/indices,
>>> everything randomly distributed) is the worst efficiency, while
>>> perfect efficiency (which is what is afforded by today's IndexedDB)
>>> has no privacy/cryptography.
>>>
>>> A refinement might be to have the IndexedDB actually take a Key
>>> (Crypto API key), that it can use to protect however the IndexedDB is
>>> stored - keys, indices, everything. Call it an "EncryptedIndexedDB".
>>> This is better, in that it allows the user agent to decrypt on the fly
>>> (see caveat), and allows applications to use existing indices/keys.
>>> The caveat, however, is that encryption requires defining an
>>> encryption algorithm, and the choice of encryption algorithm directly
>>> affects the efficiency of the API. For example, under today's
>>> IndexedDB, a user agent can load data on the fly (eg: from disk), but
>>> under EncryptedIndexedDB with say, a block cipher alg like AES, it
>>> might have to read the entire DB into memory, then decrypt, in order
>>> to be able to offer this functionality.
>>>
>>> Even more fundamental, though, is the question of what attack this is
>>> trying to defend against. The arguments I've heard for encrypted
>>> local storage seem to be about a remote server, serving a web
>>> application, distrusting the client platform. If that's the case, it
>>> doesn't seem like any level of cryptography will save them. As I noted
>>> in the existing security considerations, it SHOULD be perfectly valid
>>> for a user agent to store a key in plaintext on disk, so what actual
>>> protections are afforded by this?
>>
>>
>> You're right -- if the use case is primarily about an untrusted multi-user
>> machine or virtual computing environment, we're only as safe as general user
>> safety anyway.  This doesn't seem to be a use case we can salvage, nor one
>> that should influence the API.  We should probably not include it.
>>
>> But:
>>
>>>
>>> If something like EncryptedIndexedDB is what is meant here, then this
>>> seems like something that would likely live in the Web Apps WG (since
>>> it's about extending IndexedDB).
>>>
>>
>> Maybe -- I doubt it's worth their while to solve for that use case either
>> :).  Interestingly enough (and not to confuse matters, but) we've just heard
>> from Facebook [FB-ScriptSigning] about localStorage (or IndexedDB) used as a
>> script cache.  People are already using IndexedDB and localStorage in
>> unsafe-ish ways.  Of course, we shouldn't confuse script signing with a
>> general use case for protected/encrypted local storage, but perhaps if we
>> jettison the "protected local storage" use case, we can bolster the
>> "document signing" use case to explicitly refer to documents extracted from
>> local storage for signature verification.
>>
>> This raises the sticky issue of types of documents.  We might naively say
>> that a script is no ordinary document, and can be used by the relevant JSON
>> primitive if it passes signature validation.
>>
>> In a nutshell, I'm saying: perhaps we cannot cater to an encrypted local
>> store use case, but we may be able to flesh out the use case for signature
>> verification, including extraction from local storage.  Our use cases should
>> encourage patterns of behavior that we think are desirable.  We can't
>> control or solve for undesirable patterns of behavior :)
>>
>>
>>> I just want to make sure that we're carefully considering the use case
>>> and the security implications before committing to them, as well as to
>>> figure out what parts of the spec may need to change in order to
>>> meaningfully implement them.
>>
>>
>> +1.
>>
>> -- A*
>>
>> [FB-ScriptSigning]
>> http://lists.w3.org/Archives/Public/public-webcrypto/2012Aug/0121.html
Received on Monday, 27 August 2012 21:34:21 UTC