
Re: SOA organised with RDF - update

From: Michael F Uschold <uschold@gmail.com>
Date: Tue, 4 Oct 2011 14:51:12 -0700
Message-ID: <CADfiEMOvpOtnkM=_TQyR0Nj=QsfwMWns2VH2z7HSEP6G+KodFQ@mail.gmail.com>
To: Frank Carvalho <dko4342@vip.cybercity.dk>
Cc: semantic-web@w3.org
Thanks for that.  A few comments...

On Tue, Oct 4, 2011 at 2:38 PM, Frank Carvalho <dko4342@vip.cybercity.dk> wrote:

> Hi Michael
>
> I will try to explain the rationale behind this. First of all, when you
> build a SOA there are many things to be taken into account. You may be
> lucky enough to be able to pick your favourite modeling tool for the
> requirement specifications. In our case System Architect was already a
> household system when we started, so we stuck to that.
>
> The problem is that the requirement specification is going to consist of
> UML Class diagrams, Use Cases, service definitions, BPMN diagrams, even
> service descriptions in MS Word, plus a bunch of other stuff.
> Furthermore, other modelling universes might be introduced.
>
> But all these areas are interconnected. The BPMN diagram may have an
> activity which is expanded into a Use Case. This Use Case may have
> sub-UCs and Use Case Steps, which again point to service definitions.
> Then, when we define services, we enter the XML WSDL and XSD world. When
> we develop and document the service we use javadoc, source code and more
> MS Word.
>
> The next problem is that the set of universes to include as metadata is
> invariably going to change over time. The next IT system may be built
> in .NET, and its documentation requires a different type of metadata.
> And anyway, each IT house documents its efforts in different ways. And
> then some bright mind takes a fancy to rule bases, and introduces rules
> in the requirement specification on a par with BPMN and Use Cases.
>
> And all these domains ARE interconnected, but are semantically very
> different areas. Each has its own semantic understanding of its domain,
> and very little notion of the semantics of its neighbours.
>
> At some point I started to examine formal OWL ontologies for UML Class
> diagrams, to see if it was useful. I found some articles describing a
> formal method of transforming Class diagrams into an OWL ontology. It
> was very nice, but it struck me that in order to do so, you simply had
> to make some tough decisions about the meaning of simple things such as
> inheritance, multiplicity etc. It is in the very nature of establishing
> formal semantics that you rule out all alternative interpretations.
>
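
To illustrate the kind of decision involved, here is a minimal sketch in plain Python, with triples as tuples; the model and all names in it are hypothetical. Mapping even a toy UML fragment to OWL-style triples forces one reading of generalization and multiplicity and rules out the rest:

```python
# Minimal sketch: mapping a tiny UML-like model to RDF-style triples.
# The mapping commits to exactly one interpretation: UML generalization
# becomes rdfs:subClassOf, and a multiplicity of "1" becomes a
# cardinality assertion. Other readings are deliberately ruled out.

uml_model = {
    "classes": ["Order", "RushOrder"],            # hypothetical classes
    "generalizations": [("RushOrder", "Order")],  # child -> parent
    "attributes": [("Order", "orderDate", "1")],  # class, attr, multiplicity
}

def to_triples(model):
    triples = []
    for cls in model["classes"]:
        triples.append((cls, "rdf:type", "owl:Class"))
    for child, parent in model["generalizations"]:
        # Decision: generalization means subclass, nothing weaker.
        triples.append((child, "rdfs:subClassOf", parent))
    for cls, attr, mult in model["attributes"]:
        triples.append((attr, "rdfs:domain", cls))
        if mult == "1":
            # Decision: multiplicity "1" means exactly one value.
            triples.append((attr, "owl:cardinality", "1"))
    return triples

for t in to_triples(uml_model):
    print(t)
```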

If you need alternative interpretations then create them.  If you don't know
whether an attorney is:
1. someone who passed a bar exam
2. someone who went to law school
3. someone who is currently practicing law

or whether a hospital is:
1. the building that health care is provided in
2. the legal organization that owns the building
3. the legal entity that has a number of beds registered for a specific kind
of medical services

then create three different things with three different names, and define
each of them accurately.
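
To make that move concrete, here is a minimal sketch in plain Python with triples as tuples; the ex: names and definitions are illustrative assumptions, not anything from a real ontology. Three distinct terms replace one overloaded "Hospital", each with its own definition:

```python
# Three different things with three different names, instead of one
# ambiguous "Hospital". Each gets its own name and its own definition;
# a later property can still relate them to each other.
triples = [
    ("ex:HospitalBuilding",      "rdf:type", "owl:Class"),
    ("ex:HospitalOrganization",  "rdf:type", "owl:Class"),
    ("ex:RegisteredBedProvider", "rdf:type", "owl:Class"),
    ("ex:HospitalBuilding",      "rdfs:comment",
     "The building in which health care is provided."),
    ("ex:HospitalOrganization",  "rdfs:comment",
     "The legal organization that owns the building."),
    ("ex:RegisteredBedProvider", "rdfs:comment",
     "The legal entity with beds registered for a kind of service."),
]

def definition(term):
    """Look up the human-readable definition of a term."""
    return next(o for s, p, o in triples
                if s == term and p == "rdfs:comment")

print(definition("ex:HospitalBuilding"))
```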


>
> But what if other interpretations are equally valid according to the
> original specifications? The problem in this area seemed to be, that the
> original UML Class diagram really did not have strict semantics at all,
> and a lot of things are left open to interpretation.
>

Exactly. Sometimes fuzziness is ok, sometimes it is not. If you want to root
out what things really mean, then the exercise of translating to OWL or any
other formal KR language is helpful.


>
> This is a big problem, because UML diagrams are meant as tools for
> communication, and people will interpret these things in different ways.
> They will also be taught to use UML in different ways, and are not going
> to accept a strict, "one true interpretation" reading of the
> semantics.
>
> Scale this problem up to all the other areas of information, and we will
> not be done with the big master ontology until SOA is a thing of the
> distant past.
>
> So my conclusion is, that any general ontology that tries to encompass
> all these different types of information is doomed to fail.
>

I don't know what the whole range of things is that you are talking about.
It depends on what you want the ontology for. That will dictate its scope,
level of detail, and formality.


>
> Instead I say, live and let live. Let each domain set its own rules of
> interpretation, and accept those.


Sure - it is really nice if everyone understands everyone else (rarely the
case, and a major cause of grief).


> Focus should be on the connection
> points between semantic islands. Make sure that one subgraph shares
> objects/subjects with another subgraph, so that the interconnectivity is
> established, and then make queries in each area on its own terms.
>

Indeed. So if the maintenance department means the 'building' when they talk
about a hospital, and if the legal department is talking about the legal
entity registered with 400 heart care beds, so be it.  Have them both, and
say that the (hospital) building isOccupiedBy the (hospital) legal entity w/
registered beds.  Everyone wins. This can be the basis for integrating data
and applications from different departments with their different
interpretations/views of the world.
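
As a sketch of that bridging pattern (plain Python, triples as tuples, all names hypothetical): two departmental islands keep their own vocabularies, and one bridging triple is enough to make a cross-island query possible.

```python
# Two departmental "islands" kept on their own terms, joined by a
# single bridging triple. All names are hypothetical.
maintenance = [  # facilities view: the building
    ("ex:Bldg7", "rdf:type", "ex:HospitalBuilding"),
    ("ex:Bldg7", "ex:floorArea", "12000"),
]
legal = [        # legal view: the registered entity
    ("ex:StMarys", "rdf:type", "ex:HospitalLegalEntity"),
    ("ex:StMarys", "ex:registeredHeartCareBeds", "400"),
]
bridge = [
    ("ex:Bldg7", "ex:isOccupiedBy", "ex:StMarys"),
]
graph = maintenance + legal + bridge

def beds_in_building(building):
    """Cross-island query: follow the bridge, then ask in legal terms."""
    for s, p, o in graph:
        if s == building and p == "ex:isOccupiedBy":
            entity = o
            for s2, p2, o2 in graph:
                if s2 == entity and p2 == "ex:registeredHeartCareBeds":
                    return int(o2)
    return None

print(beds_in_building("ex:Bldg7"))  # 400
```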

>
> It is true that to some extent you lose the capability of very general
> queries that span several "islands", but you gain a practical
> flexibility in that you can easily plug in new areas of knowledge, and
> change the understanding of the ones you already have. I suppose that if
> you really insist on a general high level ontology, this could probably
> be superimposed upon the metabase, but I would consider that an added
> bonus.
>
> The purpose of our metadatabase is to give an "imprint" of those aspects
> of the sources that are important to us. And for us important means
> those things that interconnect, and where there will be an impact if you
> change something.
>
> Therefore all predicates bear a relationship to the sources from where
> they came, and are rooted in the logic of those sources. To call it
> "ontologies" may even be too strong a word, as I am really talking about
> sets of fixed predicates of local relevance. There is nothing wrong in
> using a full blown OWL spec for an area, but it is just not something we
> have done so far.
>
> Terminology is therefore strongly local, but that is not bad at all. A
> query that spans several domains will be expressed with predicates that
> combine the features of these different domains, and they really make
> sense when you read them.
>
> It is similar to, but not exactly the same as, using ontologies as a
> database schema. "Database schema" kind of suggests a design-first
> principle - but in fact this is exactly the opposite.
> The database schema is open-ended and not fixed, and we may, at any time
> we like, backward engineer the current database meta-structure with a
> few simple queries. A sort of "Meta-structure of the day".
>
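
That backward-engineering step can be sketched in a few lines of plain Python standing in for a SPARQL `SELECT DISTINCT ?p WHERE { ?s ?p ?o }` query; the graph contents and names are hypothetical:

```python
# "Meta-structure of the day": recover the current schema from the
# data itself, rather than fixing it up front. All names hypothetical.
graph = [
    ("ex:uc12",  "ex:expandsActivity", "ex:bpmnAct3"),
    ("ex:uc12",  "ex:hasStep",         "ex:step1"),
    ("ex:step1", "ex:invokesService",  "ex:svcA"),
    ("ex:svcA",  "ex:describedBy",     "ex:wsdlA"),
]

def meta_structure(triples):
    """The set of predicates currently in use - the de facto schema."""
    return sorted({p for _, p, _ in triples})

print(meta_structure(graph))
```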
> I hope this makes sense and explains a little better what we are trying
> to do.
>
> Best
>
> /Frank
>
> > This is very intriguing, thanks for the update.  I'm curious about the
> > apparent lack of use for 'fixed' ontologies.  I get that in a highly
> > dynamic environment, having fixed metadata (whether as an ontology or
> > not) is a great hindrance.  What I'm curious about that you do not
> > discuss, is what the role is for the ontologies you do have.  One
> > obvious possibility is as metadata for the data, i.e. ontology as data
> > schema.  You say you have many ontologies, do you need to map them to
> > one another to do any kind of data integration/interoperability to get
> > over problems of different terminology?  In that context a single (not
> > necessarily fixed) ontology can act as a lingua franca allowing the
> > multiple ontologies to cross-reference and translate among each other.
>


-- 
Michael Uschold, PhD
   Senior Ontology Consultant, Semantic Arts
   LinkedIn: http://tr.im/limfu
   Skype, Twitter: UscholdM
Received on Tuesday, 4 October 2011 21:51:50 UTC
