
Re: Non-XHTML host languages for RDFa

From: Mark Birbeck <mark.birbeck@webbackplane.com>
Date: Mon, 30 Nov 2009 13:51:57 +0000
Message-ID: <640dd5060911300551p48916425tb1ea45d4f130fb5b@mail.gmail.com>
To: Ivan Herman <ivan@w3.org>
Cc: Toby Inkster <tai@g5n.co.uk>, RDFa Developers <public-rdf-in-xhtml-tf@w3.org>
Hi Ivan,

> [...]
> However: if we work on a generic XML+RDFa, we essentially have two
> possibilities:
> 1. we define some sort of a generic mechanism whereby an XML application
> language (or even the user!) can define its own set of keywords. This
> should be compatible with what we have in XHTML+RDFa, and it is then up
> to the SVG group to decide whether they want to use it or not
> 2. we scrap the whole mechanism of keywords except for XHTML for
> backward compatibility reasons.
> I must admit I am tempted to go for #2; the only reason we kept the
> keyword mechanism in RDFa was historical, and I do not see why this
> mechanism would have any particular value for other XML dialects
> where history is not a factor...
> [...]

I favour #1. :)

In my view, having short 'tokens' for URIs is the Holy Grail: if we
can get to this point, then RDFa will essentially become an amalgam of
Microformats' ease of use, HTML's ease of deployment, and RDF's
scalable and decentralised nature.

I discussed some of the advantages of 'tokenising the semantic web' in
a blog post a while back. Forgive me for quoting myself, but the key
idea is in the middle of the post:

  Whilst it's obviously true that having unqualified values like 'fn' and
  'url' makes it difficult to bring Microformats into the semantic web, we
  should be careful not to throw the baby out with the bathwater; what may
  be a weakness in terms of scalability is a strength when it comes to
  authoring documents. Authors need only use simple values in their
  documents, without having to get involved with XML namespaces or other
  forms of prefix mappings.

  Of course, at some point our dumb machines still need to know how to map
  the token, but it's a lot better to get the machines to do the work, and allow
  authors the freedom of using simple tokens. [1]
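
To make the "machines do the work" step concrete, here is a minimal
sketch of what a per-host-language keyword table might look like to a
processor. The vocabulary URI and the particular tokens below are just
illustrative (loosely modelled on XHTML's reserved @rel values), not
anything a spec has defined:

```python
# Illustrative only: a host language (or the user, under option #1)
# supplies a keyword table, and the RDFa processor expands bare tokens
# into full URIs so that authors never touch prefix mappings.
XHTML_VOCAB = "http://www.w3.org/1999/xhtml/vocab#"

# Hypothetical keyword table for one host language.
KEYWORDS = {
    "next": XHTML_VOCAB + "next",
    "prev": XHTML_VOCAB + "prev",
    "license": XHTML_VOCAB + "license",
}

def expand(token, keywords):
    """Map a bare token to a URI; an unknown token yields no mapping."""
    return keywords.get(token.lower())
```

So `expand("license", KEYWORDS)` gives the full URI, while an
unrecognised token simply produces nothing; the author only ever wrote
the simple token.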

My feeling is that we're getting closer to a solution for this second
step of the problem.



[1] <http://webbackplane.com/mark-birbeck/blog/2009/04/30/tokenising-the-semantic-web>

Mark Birbeck, webBackplane



webBackplane is a trading name of Backplane Ltd. (company number
05972288, registered office: 2nd Floor, 69/85 Tabernacle Street,
London, EC2A 4RR)
Received on Monday, 30 November 2009 13:52:40 UTC
