
Re: connections

From: Kingsley Idehen <kidehen@openlinksw.com>
Date: Mon, 19 Apr 2010 09:19:49 -0400
Message-ID: <4BCC5875.7030006@openlinksw.com>
To: Danny Ayers <danny.ayers@gmail.com>
CC: greg masley <roxymuzick@yahoo.com>, Semantic Web <semantic-web@w3.org>, "dbpedia-discussion@lists.sourceforge.net" <dbpedia-discussion@lists.sourceforge.net>
Danny Ayers wrote:
> Thanks Kingsley
>
> still not automatic though, is it?
>   
Is it "Automatic or Nothing"?

What's mechanical to Person A might be automatic to Person B; both are 
individuals operating through their own context lenses (world views and 
skill sets).

What I can say is this: we can innovate around the Outer Join, i.e., not 
finding what you seek triggers a quest for missing-data discovery and/or 
generation. Now, that's something the Web as a discourse medium can 
actually facilitate, once people grok the process of adding Structured 
Data to the Web.
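
A minimal sketch of that Outer Join idea, using Python and SPARQLWrapper 
against the public DBpedia endpoint: the OPTIONAL pattern plus a !BOUND 
filter returns exactly the resources where the sought value is missing, and 
those gaps become the cue for discovery or generation. The dbo:growingPeriod 
property is a hypothetical placeholder, not a real DBpedia term.

from SPARQLWrapper import SPARQLWrapper, JSON

# Ask DBpedia for plants and keep only the ones missing an expected value --
# the unmatched side of the "outer join" is the signal that data is missing.
sparql = SPARQLWrapper("http://dbpedia.org/sparql")
sparql.setQuery("""
PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?plant WHERE {
  ?plant a dbo:Plant .
  OPTIONAL { ?plant dbo:growingPeriod ?period }   # hypothetical property
  FILTER (!BOUND(?period))                        # keep only the gaps
}
LIMIT 10
""")
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["plant"]["value"])
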


Kingsley
> On 18 April 2010 22:38, Kingsley Idehen <kidehen@openlinksw.com> wrote:
>   
>> Danny Ayers wrote:
>>     
>>> Kingsley, how do I find out when to plant tomatoes here?
>>>
>>>       
>> And do you find the answer to that in Wikipedia via
>> <http://en.wikipedia.org/wiki/Tomato>? Of course not.
>>
>> Re. DBpedia: if you have an Agriculture-oriented data space (ontology and
>> instance data) that references DBpedia (via a linkbase), then you will have
>> a better chance of an answer, since we would have temporal properties and
>> associated values in that Linked Data Space (one that we can even mesh with
>> DBpedia via SPARQL).
>>
>> Kingsley
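
A minimal sketch of the meshing described above, assuming a hypothetical 
Agriculture-oriented data space: its endpoint, the ex: ontology, and the 
planting-month and hardiness-zone properties are all invented for 
illustration; only dbr:Tomato is a real DBpedia resource. The point is that 
the temporal facts live in the domain dataset and simply reference DBpedia 
URIs, so SPARQL can join across both.

from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical Agriculture-oriented endpoint whose instance data links to DBpedia.
sparql = SPARQLWrapper("http://agriculture.example.org/sparql")
sparql.setQuery("""
PREFIX ex:  <http://agriculture.example.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>
SELECT ?month ?zone WHERE {
  ?advice ex:crop dbr:Tomato ;        # link into the DBpedia Linked Data space
          ex:plantingMonth ?month ;
          ex:hardinessZone ?zone .
}
""")
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["month"]["value"], row["zone"]["value"])
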
>>     
>>> On 17 April 2010 19:36, Kingsley Idehen <kidehen@openlinksw.com> wrote:
>>>
>>>       
>>>> Danny Ayers wrote:
>>>>
>>>>         
>>>>> On 16 April 2010 19:29, greg masley <roxymuzick@yahoo.com> wrote:
>>>>>
>>>>>
>>>>>           
>>>>>> What I want to know is: does anybody have a method yet to successfully
>>>>>> extract data from Wikipedia using DBpedia? If so, please email the
>>>>>> procedure to greg@masleyassociates.com
>>>>>>
>>>>>>
>>>>>>             
>>>>> That is an easy one: the URIs are similar, so you can get the pointer
>>>>> from DBpedia and get into Wikipedia. Then you do your stuff.
>>>>>
>>>>> I'll let Kingsley explain.
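
A minimal sketch of that URI correspondence: an English Wikipedia article URL 
and its DBpedia resource URI share the same final path segment, so hopping 
between them is just a prefix rewrite. The Tomato URL below matches the 
article cited later in the thread.

# Wikipedia article URL <-> DBpedia resource URI: same title, different prefix.
def wikipedia_to_dbpedia(article_url: str) -> str:
    title = article_url.rsplit("/", 1)[-1]
    return "http://dbpedia.org/resource/" + title

def dbpedia_to_wikipedia(resource_uri: str) -> str:
    title = resource_uri.rsplit("/", 1)[-1]
    return "http://en.wikipedia.org/wiki/" + title

print(wikipedia_to_dbpedia("http://en.wikipedia.org/wiki/Tomato"))
# http://dbpedia.org/resource/Tomato
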
>>>>>
>>>>>
>>>>>
>>>>>           
>>>> Greg,
>>>>
>>>> Please add some clarity to your quest.
>>>>
>>>> The DBpedia project comprises:
>>>>
>>>> 1. Extractors for converting Wikipedia content into Structured Data
>>>> represented in a variety of RDF-based data representation formats.
>>>> 2. A live instance with the extracts from #1 loaded into a DBMS that
>>>> exposes a SPARQL endpoint (which lets you query over the wire using the
>>>> SPARQL query language).
>>>>
>>>> There is a little more, but I need additional clarification from you.
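
A minimal sketch of point 2, querying the live instance over the wire through 
the public DBpedia SPARQL endpoint with Python and SPARQLWrapper. dbo:abstract 
is a standard DBpedia ontology property; the Tomato resource is just one 
arbitrary example, since Greg's actual use case isn't stated in the thread.

from SPARQLWrapper import SPARQLWrapper, JSON

# Pull the English abstract extracted from the Wikipedia "Tomato" article.
sparql = SPARQLWrapper("http://dbpedia.org/sparql")
sparql.setQuery("""
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>
SELECT ?abstract WHERE {
  dbr:Tomato dbo:abstract ?abstract .
  FILTER (lang(?abstract) = "en")
}
""")
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["abstract"]["value"])
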
>>>>
>>>>
>>>> --
>>>>
>>>> Regards,
>>>>
>>>> Kingsley Idehen       President & CEO OpenLink Software     Web:
>>>> http://www.openlinksw.com
>>>> Weblog: http://www.openlinksw.com/blog/~kidehen
>>>> Twitter/Identi.ca: kidehen
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>         
>>>
>>>
>>>       
>> --
>>
>> Regards,
>>
>> Kingsley Idehen       President & CEO OpenLink Software     Web:
>> http://www.openlinksw.com
>> Weblog: http://www.openlinksw.com/blog/~kidehen
>> Twitter/Identi.ca: kidehen
>>
>>
>>
>>
>>
>>     
>
>
>
>   


-- 

Regards,

Kingsley Idehen	      
President & CEO 
OpenLink Software     
Web: http://www.openlinksw.com
Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca: kidehen 
Received on Monday, 19 April 2010 13:20:23 UTC
