- From: Steve Harris <steve.harris@garlik.com>
- Date: Thu, 17 May 2012 16:22:21 +0100
- To: Kingsley Idehen <kidehen@openlinksw.com>
- Cc: public-lod@w3.org
- Message-Id: <D94CABED-6548-4C3C-B362-DE9C5DD7A2A0@garlik.com>
On 2012-05-17, at 00:04, Kingsley Idehen wrote:
> On 5/16/12 6:55 PM, Bernard Vatant wrote:
>>
>> Adrian
>>
>> Don't dream of accessing the Google Knowledge Graph and querying it through a SPARQL endpoint as you do for DBpedia. Like every piece of Google's critical technological infrastructure, I'm afraid it will be well hidden under the hood, and accessible only through the search interface. If they ever expose the Graph objects through an API, as they do for Gmaps, now THAT would be really great news.
>>
>> Kingsley says they have Freebase. Yes, but Freebase stores only 22 million entities according to their own stats, which is less than 5% of the overall figure, since Google claims 500 million nodes in the Knowledge Graph, and growing. So I guess they also have DBpedia and VIAF and GeoNames and you name it: whatever open, structured data they can get their hands on. Linked Data stuff, whatever the format.
>>
>> Bernard
>
> And it will be query accessible; this is something that's inevitable and unavoidable. This is the Web.

I doubt it. Google don't even allow API access to their search engine. I can still remember the days when they were a search company ;)

For them it's all about staying ahead of the competition so they can get more eyeballs on Google ads, and more tracking data (interactions with humans, basically). Providing APIs to their graph data doesn't help that aim. That doesn't mean they won't do it, but I don't think there's any reason for them to.

- Steve

--
Steve Harris, CTO
Garlik, a part of Experian
1-3 Halford Road, Richmond, TW10 6AW, UK
+44 20 8439 8203  http://www.garlik.com/
Registered in England and Wales 653331  VAT # 887 1335 93
Registered office: Landmark House, Experian Way, Nottingham, Notts, NG80 1ZZ
Received on Thursday, 17 May 2012 15:23:06 UTC
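
[Editor's note: the thread contrasts the closed Knowledge Graph with DBpedia's open SPARQL endpoint. The sketch below shows the kind of access Bernard is referring to: building a GET request against DBpedia's public endpoint at http://dbpedia.org/sparql. The query text and the use of the endpoint's predefined `dbo:` prefix are illustrative assumptions, not part of the original thread; the request is only constructed, not sent, since endpoint availability cannot be assumed.]

```python
# Illustrative sketch: constructing a query against DBpedia's public
# SPARQL endpoint, the open access model discussed in the thread above.
# The query and the dbo: prefix are assumptions for illustration only;
# the request URL is built but not sent.
from urllib.parse import urlencode

DBPEDIA_ENDPOINT = "http://dbpedia.org/sparql"

# Fetch the English abstract for one DBpedia resource (illustrative query).
query = """
SELECT ?abstract WHERE {
  <http://dbpedia.org/resource/Freebase_(database)> dbo:abstract ?abstract .
  FILTER (lang(?abstract) = "en")
}
""".strip()

# The SPARQL Protocol allows passing the query as a GET parameter;
# asking for JSON results via the format parameter.
params = urlencode({
    "query": query,
    "format": "application/sparql-results+json",
})
request_url = f"{DBPEDIA_ENDPOINT}?{params}"
print(request_url)
```

Sending `request_url` with any HTTP client (e.g. `urllib.request.urlopen`) would return a JSON result set, which is exactly the kind of programmatic access the thread notes Google does not offer for the Knowledge Graph.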