- From: Francisco Javier López Pellicer <fjlopez@unizar.es>
- Date: Tue, 05 Apr 2011 11:05:43 +0200
- To: Giovanni Tummarello <giovanni.tummarello@deri.org>
- CC: Brandon Schwartz <brandon@boomajoom.com>, "bvillazon@fi.upm.es" <bvillazon@fi.upm.es>, Richard Cyganiak <richard@cyganiak.de>, Martin Hepp <martin.hepp@ebusiness-unibw.org>, semantic-web <semantic-web@w3c.org>
I am starting to believe that tools such as Elda and Pubby should also publish sitemap/VoID descriptions of the exposed SPARQL endpoints, and even a robots.txt when they are launched in a standalone fashion. I mean, these tools should be active players in data/SPARQL endpoint discovery on the Web, not only passive data publishers.

Cheers,

-- fjlopez

Giovanni Tummarello wrote:
> That is indeed our current recommendation:
>
> Please see this:
>
> http://sindice.com/developers/publishing
>
> "How to Publish Web Data for Effective Discovery and Synchronization"
>
>
> On Tue, Apr 5, 2011 at 9:30 AM, Francisco Javier López Pellicer
> <fjlopez@unizar.es> wrote:
>> Hi,
>>
>> Meanwhile, we can use the Sitemap protocol to point to human-readable
>> (HTML+RDFa) VoID descriptions. I mean, a pragmatic "semantic" sitemap tool
>> should be a tool that creates, for a linked dataset:
>>
>> (1) its VoID description (this step is optional)
>>
>> (2) a standard sitemap (such as those produced by the tools in [1]) with
>> links to relevant resources in the linked dataset (mandatory) and to a
>> VoID description (optional but recommended)
>>
>> I think that this approach is simpler and doesn't require convincing SEO
>> consultants.
>>
>> In addition, we can use the Google extensions, for example the one about
>> Code Search [2]. This is a valid description:
>>
>> <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
>>         xmlns:codesearch="http://www.google.com/codesearch/schemas/sitemap/1.0">
>>   <url>
>>     <!-- the HTML+RDFa -->
>>     <loc>http://dbpedia.org/page/Armenia</loc>
>>   </url>
>>   <url>
>>     <!-- the data (the code in Google terms) -->
>>     <loc>http://dbpedia.org/data/Armenia.rdf</loc>
>>     <codesearch:codesearch>
>>       <codesearch:filetype>xml</codesearch:filetype>
>>     </codesearch:codesearch>
>>   </url>
>> </urlset>
>>
>> [1] http://code.google.com/p/sitemap-generators/wiki/SitemapGenerators
>> [2] http://www.google.com/support/webmasters/bin/answer.py?answer=75225
>>
>> Cheers,
>>
>> -- fjlopez
>>
>> Brandon Schwartz wrote:
>>> I think that as Google and the major search engines focus on the quality
>>> of information instead of quantity or simple backlink counts, they will
>>> begin accepting semantic sitemaps. In the meantime, I think that using
>>> both semantic and standard sitemaps is a viable option.
>>>
>>> As soon as SEO people are informed about the relevance that the semantic
>>> web has for them and semantic sitemaps are easily available (say, as
>>> extensions in CMS systems such as http://drupal.org/project/xmlsitemap),
>>> then I think it will take off.
>>>
>>> Sent from my iPhone
>>>
>>> On Apr 4, 2011, at 2:28 PM, Boris Villazón Terrazas <bvillazon@fi.upm.es>
>>> wrote:
>>>
>>>> Hi all
>>>>
>>>>> On 4 Apr 2011, at 13:58, Martin Hepp wrote:
>>>>>> I agree. But it is unlikely that Google will accept semantic sitemaps,
>>>>>> and it will be hard or impossible to convince SEO consultants to waive
>>>>>> a Google-valid sitemap in favor of a semantic sitemap. So as of now, I
>>>>>> think it is the best we can get.
>>>>> Yes, I agree with this assessment.
>>>> I'm talking from my ignorance ... but let's try to be optimistic.
>>>> Let's hope that some day Google will accept semantic sitemaps ... ;)
>>>>
>>>> Boris
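For concreteness, a minimal sketch (in Turtle) of the kind of VoID description such a tool could publish for the SPARQL endpoint it exposes. It assumes only the standard VoID vocabulary; the dataset URI is hypothetical, and the endpoint and example resource simply reuse the DBpedia URIs already mentioned in the thread.

    @prefix void:    <http://rdfs.org/ns/void#> .
    @prefix dcterms: <http://purl.org/dc/terms/> .
    @prefix foaf:    <http://xmlns.com/foaf/0.1/> .

    # Hypothetical dataset URI; the endpoint and example resource are the
    # DBpedia URIs used elsewhere in this thread.
    <http://dbpedia.org/void.ttl#dbpedia>
        a void:Dataset ;
        dcterms:title "DBpedia" ;
        foaf:homepage <http://dbpedia.org/> ;
        void:sparqlEndpoint <http://dbpedia.org/sparql> ;
        void:exampleResource <http://dbpedia.org/resource/Armenia> .

A standalone deployment could then list the URL of this description (or of an HTML+RDFa page carrying it) in its sitemap, and point crawlers at that sitemap with the standard "Sitemap:" line in its robots.txt.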
Received on Tuesday, 5 April 2011 09:06:41 UTC