Re: Non-XHTML host languages for RDFa

On Mon, 30 Nov 2009 14:51:57 +0100, Mark Birbeck  
<mark.birbeck@webbackplane.com> wrote:

> Hi Ivan,
>
>> [...]
>>
>> However: if we work on a generic XML+RDFa, we essentially have two
>> possibilities:
>>
>> 1. we define some sort of generic mechanism whereby an XML application
>> language (and maybe even the user!) can define its own set of
>> keywords. This should be compatible with what we have in XHTML+RDFa and
>> it is then up to the SVG group to decide whether they want to use it or  
>> not
>>
>> 2. we scrap the whole mechanism of keywords except for XHTML for
>> backward compatibility reasons.
>>
>> I must admit I am tempted to go for #2; the only reason we kept the
>> keyword mechanism in RDFa was historical, and I do not see why this
>> mechanism would have any particular value for other XML dialects where
>> history is not a factor...
>>
>> [...]
>
> I favour #1. :)

In chatting with Ivan, I favour something like:
	take the applicable namespace
	append "/vocab#"
	use that as the missing prefix.
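
To illustrate the idea (a rough sketch only; the SVG namespace and the
term "title" below are just examples, not anything agreed):

	# Sketch: expand an unprefixed term using the host language's
	# namespace, following the rule above.
	def expand(term, host_namespace):
	    # take the applicable namespace, append "/vocab#", and use
	    # the result as the missing prefix
	    return host_namespace + "/vocab#" + term

	# e.g. in an SVG document:
	#   expand("title", "http://www.w3.org/2000/svg")
	#   -> "http://www.w3.org/2000/svg/vocab#title"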

Steven

>
> In my view, having short 'tokens' for URIs is the Holy Grail...if we
> can get to this point, then RDFa will essentially become an amalgam of
> Microformats' ease of use, HTML's ease of deployment, and RDF's
> scalable and decentralised nature.
>
> I discussed some of the advantages of 'tokenising the semantic web' in
> a blog post, a while back. Forgive me for quoting myself, but the key
> idea is in the middle of the document:
>
>
>   Whilst it's obviously true that having unqualified values like 'fn'
>   and 'url' make it difficult to bring Microformats into the semantic
>   web, we should be careful not to throw the baby out with the
>   bathwater; what may be a weakness in terms of scalability is a
>   strength when it comes to authoring documents. Authors need only use
>   simple values in their documents, without having to get involved
>   with XML namespaces or other forms of prefix mappings.
>
>   Of course, at some point our dumb machines still need to know how to
>   map the token, but it's a lot better to get the machines to do the
>   work, and allow authors the freedom of using simple tokens. [1]
>
>
> My feeling is that we're getting closer to being able to find a
> solution to this second step of the problem.
>
> Regards,
>
> Mark
>
> [1]  
> <http://webbackplane.com/mark-birbeck/blog/2009/04/30/tokenising-the-semantic-web>
>
> --
> Mark Birbeck, webBackplane
>
> mark.birbeck@webBackplane.com
>
> http://webBackplane.com/mark-birbeck
>
> webBackplane is a trading name of Backplane Ltd. (company number
> 05972288, registered office: 2nd Floor, 69/85 Tabernacle Street,
> London, EC2A 4RR)
>

Received on Monday, 30 November 2009 14:24:25 UTC