Re: xAPI Badges integration (Including references to badge assertions in Experience API activity streams)

On 02/06/2015 10:53 AM, ☮ elf Pavlik ☮ wrote:
> On 02/05/2015 12:08 AM, Dave Longley wrote:
>> On 02/04/2015 04:51 PM, Nate Otto wrote:
>>> Friends and Badgers, (CC: Credentials Community Group)
>>>
>>> Andrew Downes, Jason Lewis, Ryan Smith and a few others have been doing
>>> some good work on defining how it might be possible to transport badges
>>> within xAPI "activity streams." (These streams are essentially
>>> collections of statements like "<entity> <verbed> <object>", or in this
>>> case, "Nate earned Badge #5".)
> Dave, I wonder if you've noticed the example of a *signed* statement in the xAPI spec
> * https://github.com/adlnet/xAPI-Spec/blob/master/xAPI.md#AppendixE
> * https://github.com/adlnet/xAPI-Spec/blob/master/xAPI.md#signature
> 

I hadn't seen it until you pointed it out just now. I took a quick look:

1. Signing is done at the statement level (I can't tell if you can sign
more than one statement at once; it may be limited to just one at a time).

2. It seems to use JWS and indicates the original serialization of the
data is included as an attachment to the statement. This means that the
same data is effectively represented twice. There are some notes about
verification that require that the "original" statement be reconciled
against the "received" statement to ensure they are logically
equivalent. I assume this means doing some sort of semantic comparison
between the base64-encoded JSON statement (the payload that was actually
signed) and the JSON statement that contains it as an attachment. There
are some special exception rules for certain properties that must be
ignored when doing this comparison.
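To make that reconciliation step concrete, here is a rough Python sketch of what I understand the comparison to involve: decode the base64url JWS payload back into a statement, then check logical equivalence against the received statement while skipping the exempt properties. The names in IGNORED_PROPERTIES are my guess at the exemption list, not a quote from the spec, and real JSON equivalence would need to be more careful than a dict comparison:

```python
import base64
import json

# Properties an LRS may add or alter after signing, so they are excluded
# from the logical-equivalence check. Illustrative list only; the xAPI
# spec's verification rules are authoritative.
IGNORED_PROPERTIES = {"id", "timestamp", "stored", "authority", "version",
                      "attachments"}

def strip_ignored(statement):
    """Drop top-level properties that are exempt from comparison."""
    return {k: v for k, v in statement.items() if k not in IGNORED_PROPERTIES}

def logically_equivalent(received, jws_payload_b64):
    """Compare a received statement to the statement that was actually
    signed (the base64url-encoded JWS payload)."""
    # JWS uses unpadded base64url; restore padding before decoding.
    padded = jws_payload_b64 + "=" * (-len(jws_payload_b64) % 4)
    original = json.loads(base64.urlsafe_b64decode(padded))
    return strip_ignored(received) == strip_ignored(original)
```

Note that even this toy version has to special-case properties and carry the signed bytes alongside the cleartext statement, which is the duplication I mentioned above.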

3. There's some other stuff in there about preferring that X.509
certificates and chains be specified and used, etc.

In short, my opinion is:

1. It's more complex than Linked Data Signatures 1.0 and you don't get
semantic comparison for free or in a generic way (RDF Dataset
Normalization).

2. It uses JWS in a custom way and uses an RDF-like data model that
isn't actually RDF despite being extremely close to it.

With their current approach, they can't use JWS tools out of the box
(they need to do something on top of them to *actually* verify
signatures) and they can't use RDF tools out of the box either.

Their spec uses statements of triples (subject-verb-object) (sounds just
like RDF to me). Their spec transports these statements using JSON
(sounds just like JSON-LD to me). Their spec defines how to digitally
sign statements, continue to transport them as clear JSON, and
semantically compare them when verifying (sounds just like Linked Data
Signatures to me).
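To illustrate the contrast, here is a toy sketch of the canonicalize-then-sign pattern that Linked Data Signatures uses. I'm substituting sorted-keys JSON serialization for real RDF Dataset Normalization and a keyed HMAC for a real signature suite, purely to show why semantic comparison falls out for free: any two serializations of the same data canonicalize identically, so there is nothing to reconcile at verification time.

```python
import hashlib
import hmac
import json

def canonicalize(document):
    # Stand-in for RDF Dataset Normalization: real Linked Data Signatures
    # canonicalize the underlying RDF dataset, not the JSON text. Here we
    # just fix key order and whitespace to get a deterministic byte string.
    return json.dumps(document, sort_keys=True, separators=(",", ":")).encode()

def sign(document, key):
    """Sign the canonical form, not whatever serialization arrived."""
    digest = hashlib.sha256(canonicalize(document)).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify(document, signature, key):
    # Re-canonicalize and recompute; no duplicate copy of the statement
    # and no property-by-property exception rules are needed.
    return hmac.compare_digest(sign(document, key), signature)
```

With this shape, a statement whose JSON keys were reordered in transit still verifies, because the signature covers the canonical form rather than one particular serialization.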

I think these parts of their spec should align with existing standards
work as their overall design goals seem to be nearly identical. I
haven't yet seen a good reason to deviate. They could focus more on
their value add and achieve better interoperability by building on
existing technology stacks instead of reinventing them with what appear
to be, at least to me, only slight changes.

Elf can speak more about aligning with the Social Web WG's Activity
Streams as well, as he's deep in that work.


-- 
Dave Longley
CTO
Digital Bazaar, Inc.

Received on Friday, 6 February 2015 17:12:53 UTC