
Re: Which semantics?

From: Sebastian Samaruga <ssamarug@gmail.com>
Date: Mon, 27 Mar 2017 20:38:24 -0300
Message-ID: <CAOLUXBsB7n=EKpiXD7zGVNYvG+KTSeCmT0QR6aGXt_Z2T5L23g@mail.gmail.com>
To: HansTeijgeler <hans.teijgeler@quicknet.nl>
Cc: semantic-web@w3.org, pragmaticweb@lists.spline.inf.fu-berlin.de, public-rww <public-rww@w3.org>
Sorry, but I have to apologize: I've been reading all the previous drafts
I've posted and they seem like a mess, even to me. I don't seem to be able
to express the concepts about which I think I could be right.

That said, please be kind with me regarding this last one I'm posting. It
began as a TOC so I could arrange what I can explain best, and I'm beginning
to fill in the gaps.

The main idea is to be able to merge diverse data sources (from existing
applications' databases, for example) and, from them and their metadata,
expose 'declarative' application models which can be used for domain-driven
front ends or services.


On Feb 28, 2017 4:00 AM, "Sebastian Samaruga" <ssamarug@gmail.com> wrote:


Thanks for your time in replying to me. I learned about the standard a while
ago and became very interested. I'd like to put an effort into aligning
what I use as an 'internal' representation; please read the
first part of this doc:


and your models. At first sight the standard looks huge, and the examples
I've found are very specialized. But you say it may be used to render
models of many domains. What I'd like to know is how 'interoperable'
it is with respect to other Web standards (beyond its core) and which
applications may consume it if I tailor my 'Ports' to this kind of ontology.

Please read the latest posts on this thread, because previously there was
confusion regarding the scope of what I'd like to do relative to the
Semantic Web.
I'm also attaching the new doc as a PDF (Notes.pdf), in case you have
trouble reading the link, and an older (fuzzy) doc (Datastore.pdf) which
may serve as an index. Both are a (very early) work in progress, and I
apologize for their quality or even for their eventual value.


On Feb 15, 2017 7:38 AM, "HansTeijgeler" <hans.teijgeler@quicknet.nl> wrote:

Hi Sebastian,

Martynas describes what we are doing.


   1. defined a generic conceptual data model
   <http://15926.org/topics/data-model/index.htm> of 201 entity data types;
   2. created a reference data library <http://data.15926.org/rdl/> with
   15000 standard core classes, where required with local extensions thereof
   (e.g. supplier catalogs, standards bodies);
   3. created 180 generic templates
   <http://15926.org/15926_template_specs.php>, using entity types from
   that data model, to express small chunks of information;
   4. declared all OOIs (Objects Of Interest) by typing them with an entity
   type of the data model and a reference class from the library;
   5. mapped data from the proprietary format of the various
   applications/databases to specialized templates, defining those specialized
   templates with the applicable reference data;
   6. stored these declared OOIs and template instances in one or more RDF
   triple stores or quad stores that can be federated for SPARQL queries;
   7. time-stamped all declared OOIs and all template instances with the
   effective date-time and, if no longer valid, with the deprecation date-time.
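Steps 4 and 7 above can be sketched in a few lines of Python. This is only an illustration of the pattern (typed OOIs stored as time-stamped quads); none of the identifiers below come from the actual ISO 15926 data model or reference data library, and the predicate names are placeholders:

```python
from datetime import datetime, timezone

# Sketch of steps 4 and 7: declare an OOI by typing it with an entity type
# from the data model and a reference class from the library, and time-stamp
# every statement with its effective date-time.
# All URIs/CURIEs below are placeholders, not real 15926 identifiers.

def declare_ooi(store, ooi, entity_type, ref_class, effective):
    # Each entry is a quad: (subject, predicate, object, effective date-time)
    store.append((ooi, "rdf:type", entity_type, effective))
    store.append((ooi, "ex:classifiedAs", ref_class, effective))

def deprecate(store, ooi, when):
    # Step 7: a statement that is no longer valid gets a deprecation date-time.
    store.append((ooi, "ex:deprecatedAt", when.isoformat(), when))

store = []
t0 = datetime(2017, 2, 15, tzinfo=timezone.utc)
declare_ooi(store, "ex:Pump-101", "dm:InanimatePhysicalObject",
            "rdl:CentrifugalPump", t0)

# Only statements effective at or before the query time are "current":
current = [q for q in store if q[3] <= datetime.now(timezone.utc)]
```

A real deployment would of course put these quads in an RDF quad store and query them with SPARQL; the list here just makes the time-stamping idea concrete.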

By doing this, the lifecycle information of a process plant, from conceptual
design to operations and maintenance, can be integrated.
Since the data model is generic, with a proper reference data set the above
can be used for anything else, e.g. an airplane, ship, car fleet,
organization, or natural objects.

The source of this information is in the applications and systems that are
used throughout the lifetime of the facility. These need an import/export
capability.
Note, however, that this integration can only be done if the
information is made as explicit as possible (and affordable), without
shortcuts that leave out OOIs that are involved in other information.

Read more at http://15926.org

Use of the data model, reference data, and template specifications is free
under a GNU license.

On 14-2-2017 20:20, Martynas Jusevičius wrote:


I think it is useful to think about the merge operation between datasets.

Here I mean a "physical" merge, where records with the same
identifiers become augmented with more data, when multiple datasets
are merged together. A "logical", or "semantic" merge, with vocabulary
mappings etc., comes on top of that.

So if you take the relational or XML models, there is no generic way
to do that. With RDF, there is: you simply concatenate the datasets,
because they have a stable structure (triples) and built-in global
identifiers (URIs).

That said, you should try approaching things from another end: start
building a small but concrete solution and solve problems one by one,
instead of overthinking/reinventing the top-down architecture. Until
you do that, you will probably not get relevant advice on these
mailing lists.

On Tue, Feb 14, 2017 at 6:21 PM, Sebastian Samaruga
<ssamarug@gmail.com> wrote:

Sorry for being so ignorant, but couldn't the current frameworks available
for ontologies (at least the couple I know) be called 'semantic' (in the
sense of 'meaning', I suppose) if they could assert which of their
instances' statements and resources are equivalent (whether they are
expressed in a different language/encoding or as different 'contextual'
terms for the same subjects, for example)?

Another important 'semantics' gap is ordering (temporal or otherwise),
where a statement or resource should be interpreted at least in relation to
its previous and following elements.
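One common workaround for this ordering gap is to make the position explicit in the data itself, in the spirit of RDF's rdf:Seq container with its numbered membership properties (rdf:_1, rdf:_2, ...). A minimal sketch with plain tuples, where the subject and member names are illustrative:

```python
# Encode ordering explicitly, rdf:Seq style: each member of a sequence is
# attached with a numbered membership property, so the order survives a
# merge even though a set of triples is itself unordered.

events = ["ex:started", "ex:paused", "ex:finished"]

triples = {
    ("ex:run42", f"rdf:_{i}", member)
    for i, member in enumerate(events, start=1)
}

# Recover the original order by sorting on the membership index:
ordered = [o for _, p, o in
           sorted(triples, key=lambda t: int(t[1].split("_")[1]))]
print(ordered)  # ['ex:started', 'ex:paused', 'ex:finished']
```

This only gives order within one explicitly modeled sequence; it does not by itself give the kind of general temporal ordering between arbitrary statements asked about above.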

If my last posts were so blurry, it is because I'm trying to address some
of these issues, among others, while trying not to fall into the promise
that adhering to one format will free us all from any interoperability
hassles. Remember a similar promise from XML: "All we have to do is share
DTDs and interoperate." I'm still trying to give the format a twist (RDF
Quads), but I'll publish a Google Document open for comments.


Received on Monday, 27 March 2017 23:39:55 UTC

This archive was generated by hypermail 2.4.0 : Friday, 17 January 2020 17:10:59 UTC