Re: A look at WoT specs from Linked Data and AWWW perspective

> On 7 Apr 2018, at 17:04, Martynas Jusevičius <martynas@atomgraph.com> wrote:
> 
> Dave,
> 
> then I see this as a failure of the specification. If the WoT specs merely specify the exchange of JSON messages and their schemas, what does it achieve? We already have HTTP for that. How does it advance the interoperability of IoT devices?

The use of JSON and Linked Data to describe the object interaction model (properties, actions and events) is distinct from the protocols used by platforms to support that interaction. Think of the difference between reading and writing an object property in a scripting language, from the code that you would need to write at the protocol level. The Web of Things is intended to decouple the protocols as people are likely to use different protocols for different circumstances.
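To make the decoupling concrete, here is a hypothetical sketch of a thing description, with illustrative field names rather than normative TD syntax: the interaction model declares a temperature property once, while the protocol choice is confined to the binding URIs, so the same property could be reached over CoAP on the local network or HTTPS via a gateway.

```json
{
  "name": "TemperatureSensor",
  "properties": {
    "temperature": {
      "type": "number",
      "writable": false,
      "forms": [
        { "href": "coap://sensor.local/temp" },
        { "href": "https://gateway.example.com/things/sensor/temp" }
      ]
    }
  }
}
```

A script reading the property never sees which binding was used; that is the point of separating the interaction model from the protocols.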


> I think rather than being message-agnostic, WoT should work on a domain vocabulary and should integrate more closely with SSN: https://www.w3.org/TR/vocab-ssn/
> I see SSN only mentioned once (!) in the text. How so?

The Web of Things Working Group charter scope excludes work on domain vocabularies as these require groups with domain specific experts.


> My main point: if WoT is to become a cornerstone specification for millions of devices on the web, the interoperability has to be absolutely bullet-proof. I mean space-mission bullet-proof. And that takes a sound, formally verifiable abstract model(s), in the form of algebras, semantics, and so on.

I am not sure that formally verifiable abstract models in the form of algebras are what’s needed. We definitely need interoperability at multiple layers including protocols, encodings, data formats, interaction models, security, semantics, and business requirements. The Web of Things is defining interoperability at an abstract API level, and linking to semantic models for semantic interoperability.

Consider commercial products. Companies want to differentiate their products from their competitors, and also want to provide a range of models to suit different customer needs. This means that the semantic models need to be highly modular, so that companies can effectively describe the capabilities of each product model using shared semantics.

A further challenge is that on the scale of the Web, we can’t expect everyone to use the same ontologies, and thus need to address how to map data across vocabularies with overlapping semantics. There is also a lot of work to be done on business requirements including privacy to support open markets of services for B2C and B2B.

The current charter for the Web of Things WG necessarily only addresses a small part of the overall vision. But that is as it should be: effective standards need to focus on particular challenges.


> Currently I see nothing of that sort, only prose "definitions" and a bunch of JSON examples -- that will not cut it.
> 
> These articles, even if long, explain my point rather well:
> https://www.theatlantic.com/technology/archive/2017/09/saving-the-world-from-code/540393/
> https://www.quantamagazine.org/formal-verification-creates-hacker-proof-code-20160920

Formal verification has yet to be shown to scale adequately. We need to get better at designing resilient systems and using AI to monitor and assess behaviour in real time. This is a predator-prey evolutionary battle where we need to commit the resources to stay ahead. In the WoT WG, I am very grateful to Intel in particular for the resources they are providing for work on security.

> 
> 
> On Fri, Apr 6, 2018 at 11:03 AM, Dave Raggett <dsr@w3.org> wrote:
> 
> 
>> On 6 Apr 2018, at 09:24, Charpenay, Victor <victor.charpenay@siemens.com> wrote:
>>  
>> > JSON is only one of the syntaxes -- RDF is the model, and constraints should be based on it.
>>  
>> Not exactly. RDF is the foundation of the TD model, sure. But the schemas embedded in a TD model the data a device exposes, not the TD itself. That data model can be based on SenML or the BLE GATT specification, for instance. WoT devices typically exchange a TD once, but sensor data tens or hundreds of times. It is important to optimize the latter exchanges; RDF would introduce significant overhead and is therefore not a good candidate. If it is necessary to merge data and metadata (e.g. a TD or some SSN description of a system), one would have to perform RDF lifting, which is a well-known procedure (see, for instance, RML or SPARQL-Generate).
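As an illustration of the overhead argument: a sensor reading in SenML (per the SenML drafts, later RFC 8428) is only a few bytes of JSON, with its semantics supplied once by the TD rather than repeated in every message.

```json
[
  { "n": "temperature", "u": "Cel", "v": 23.1 }
]
```

An equivalent RDF serialization would repeat the subject IRI, property IRIs, and datatype annotations in every exchange, which is what makes it a poor fit for high-frequency telemetry.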
> 
> I am not quite sure what Martynas is signifying in his comment. I agree with the idea that constraints should be expressible in terms of the RDF model for how applications expose and interact with things as objects. However, there is developer interest in the simple use of JSON to express data types. This can be mapped to constraints on the corresponding RDF model and, as you suggest, to RDF shape rules in SHACL; however, applications can more simply use the JSON expression directly to apply the constraints to the data.
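For example, a developer could state a value constraint in a JSON-Schema-like form (illustrative only, not normative TD syntax) and validate payloads against it with off-the-shelf JSON tooling, while a mapping layer translates the same constraint into SHACL for consumers working at the RDF level.

```json
{
  "type": "number",
  "minimum": -40,
  "maximum": 85
}
```

The constraint is applied once to the JSON payload; no RDF machinery is needed on the device or in the application unless semantic processing is actually wanted.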
> 
> Dave Raggett <dsr@w3.org> http://www.w3.org/People/Raggett
> W3C Data Activity Lead & W3C champion for the Web of things 

Dave Raggett <dsr@w3.org> http://www.w3.org/People/Raggett
W3C Data Activity Lead & W3C champion for the Web of things 

Received on Saturday, 7 April 2018 19:41:34 UTC