W3C home > Mailing lists > Public > semantic-web@w3.org > May 2014

Business Intelligence + Semantic

From: Sebastian Samaruga <cognescent@gmail.com>
Date: Thu, 8 May 2014 16:32:53 -0300
Message-ID: <CAFnmbpU=5O266u4KAqnes=8yLFHXz5CRDpkcAe8j_8m6QWqGDA@mail.gmail.com>
To: "semantic-web@w3.org" <semantic-web@w3.org>, "pragmaticweb@lists.spline.inf.fu-berlin.de" <pragmaticweb@lists.spline.inf.fu-berlin.de>
(sorry, I'm posting this again because the previous mail had the links
wrong)

After a while, I've come up with this mix. I approach source transformation
using RDF as an underlying unifying model for, for example (and not limited
to): tabular, XML, JSON, and even OLAP data sources as input. Then I perform
an 'ETL' and inference step in a Loader layer, where I can infer types and
so on, and then populate a semantic/semiotic graph. The idea is that the
graph is flexible enough to be viewed through any of the APIs mentioned in
the document (Tabular, Neo4J, XML, JSON, etc.). Any of these APIs can be
implemented in an ad-hoc manner, so there is no limit if you need another
format. I try to explain the benefits of doing things this way in the
document, such as analysis, mining and drill-down. Apologies if I'm not
clear enough or even totally wrong with this; I wrote it not having my
medications at hand...

Really, the first link is a very summarized README of what is to be built.
The other link is the Google Code project where the sources are hosted;
check the 'Cognescent' folder to see an implementation of the semiotic
graph:
https://drive.google.com/file/d/0BxxuOINjaiBNRER3c3d3NnBaVWs/edit?usp=sharing

Best Regards,
Sebastian Samaruga
Received on Thursday, 8 May 2014 19:34:39 UTC
