Re: RDF Dataset Canonicalization - Formal Proof

For implementers:

https://www.openattestation.com/docs/getting-started

V3 has just been released: https://www.openattestation.com/docs/advanced/v3/overview

Current production implementations:
- Singapore government: https://www.tradetrust.io/
- Australian Border Force: https://igl.trade.np.cp1.abf.gov.au/

UN project to capture this as best practice for nations: https://uncefact.unece.org/display/uncefactpublic/Cross+border+Inter-ledger+exchange+for+Preferential+CoO+using+Blockchain

Cheers 

Steven Capell
Mob: 0410 437854

> On 28 Mar 2021, at 1:47 pm, Christopher Allen <ChristopherA@lifewithalacrity.com> wrote:
> 
> 
>> On Sat, Mar 27, 2021 at 7:22 PM Steve Capell <steve.capell@gmail.com> wrote:
> 
>> The Singapore government (https://www.openattestation.com/) does this already. Version 3 is W3C VC data model compliant.
>> 
>> Each element is hashed (with a salt, I think) and then the hash of the hashes is the document hash that is notarised.
>> 
>> The main rationale is selective redaction (because the root hash is unchanged when some clear text is hidden). But I suppose it simplifies canonicalisation too...
> 
> I’m a big fan of this approach, a form of redaction distinct from zk forms of selective disclosure.
> 
> There was an attempt to spec one here in the CCG three or four years ago, but it died on the vine.
> 
> I’d be interested in seeing this spec & implementation. Any links?
> 
> — Christopher Allen [via iPhone] 
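
For anyone wanting a concrete picture of the scheme Steve describes above, here is a minimal sketch in TypeScript. It is not the actual OpenAttestation algorithm (which flattens the document and has its own salting and hashing rules); the use of SHA-256, the salt format, and the field names are assumptions for illustration only. Each field is hashed with its own random salt, and the root hash is the hash of the sorted field hashes, so hiding a field's clear text and revealing only its hash leaves the root hash unchanged.

// Sketch only: salt and hash each field, then derive one root hash over the
// sorted field hashes. Hiding a field (publishing its hash instead of the
// salted clear text) leaves the root hash, and hence the notarised value, intact.
import { createHash, randomBytes } from "crypto";

const sha256 = (s: string) => createHash("sha256").update(s).digest("hex");

// Hash every field of a flat document with its own random salt.
function hashFields(doc: Record<string, string>) {
  const salted: Record<string, { salt: string; value: string }> = {};
  const fieldHashes: string[] = [];
  for (const [key, value] of Object.entries(doc)) {
    const salt = randomBytes(16).toString("hex");
    salted[key] = { salt, value };
    fieldHashes.push(sha256(`${salt}:${key}:${value}`));
  }
  return { salted, fieldHashes };
}

// Root hash over the sorted field hashes, so field order (and redaction of
// any field's clear text) does not change it.
const rootHash = (fieldHashes: string[]) =>
  sha256([...fieldHashes].sort().join(""));

// Example: notarise the root; to redact "value", disclose only its field hash.
const { salted, fieldHashes } = hashFields({
  exporter: "ACME Pte Ltd",
  value: "12000 USD",
});
console.log(rootHash(fieldHashes), salted);

A verifier who is given the remaining salted fields plus the bare hashes of the redacted ones can recompute the same root and check it against the notarised value, without ever seeing the hidden clear text.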

Received on Sunday, 28 March 2021 03:10:33 UTC