
[whatwg] Creative Commons Rights Expression Language

From: Paul Prescod <paul@prescod.net>
Date: Thu, 28 Aug 2008 05:51:03 -0700
Message-ID: <1cb725390808280551p15683718q7d2931a4db4e7b58@mail.gmail.com>
On Thu, Aug 28, 2008 at 2:28 AM, Ian Hickson <ian at hixie.ch> wrote:
>...
>
>> Site-specific hacks don't scale to the Web. A solution that scales will
>> require a single parser, not site-specific parsers (though site-specific
>> parsers will likely be a transition path.)
>
> To scale to the whole Web, the only thing I can see working is the
> computers understanding human language. I just don't see the whole Web
> marking up their data using fine grained semantic markup. We have enough
> trouble getting them to use <h1> and <p>.

When did it become necessary for every new HTML element to be used by
every author of every web page on the web? A huge amount of browsing
time is spent on the top hundred web sites. If they do it right, the
practice will filter down. If it doesn't, the web is still a better
place than if those top hundred sites had not used standards for
representing metadata.

> I think (some hip) sites will totally plug in, just as they already have,
> using site-specific scripts that can be downloaded by the users of those
> sites. I think a few will use simple domain-specific fine grained markup
> conventions (like Microformats); I think fewer still, possibly many but
> likely not a critical mass, will use RDF and RDFa.

Why would "hip sites" prefer site-specific scripts to standard
markup, standard scripts, and/or browser features? Is it really
logical for each of the top sites to invent its own markup and
scripts rather than cooperate on common tools?

> ...
> I don't see that tools like Ubiquity give any incentive to use RDF. The
> immediate reward from a hard-coded site-specific script is more effective
> than the compound reward of writing a generic script (typically a harder
> task), convincing at least one site to rewrite its markup to use a
> suitable convention, and then debugging the script to work around the bugs
> that that site has, even if one eventually convinces multiple sites to
> support the same conventions.

Good point. It turns out that we don't need standards bodies at all.
It is also easier *at first* for every site to write its own vector
markup or stylesheet language. It is even easier to invent your own
networking protocol than to get one standardized. (After all, you must
invent it before you can get it standardized.)

I don't see why you believe that metadata is uniquely immune to the
forces of standardization.

> This mirrors what happens today (e.g. GMail and other big
> sites have contacts APIs, a small number of sites have hCard, a very
> few have FOAF).

HTML has no standard mechanism for embedding contacts; hCard is a de
facto mechanism. Given how long it takes Web standards to work their
way through the ecosystem, I think it's doing okay. Google supports it
on some key sites, and so does Yahoo. Does it really need to be
supported on Bob's Hockey Team site in order to be a success? It
should be available and accessible to Bob if he wants the feature, but
if not, that's cool too. JavaScript is not necessary for every site
out there either.
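[Editor's illustration, not part of the original thread: the "single
parser, not site-specific parsers" argument above can be made concrete
with a sketch. One generic consumer can read contact data from any page
that uses the standard hCard class names (fn, org, email), instead of
one scraper per site. The snippet, names, and field subset below are
purely illustrative.]

```python
# Sketch of a generic hCard consumer: one parser handles any site
# that uses the standard microformat class names, rather than a
# hand-written scraper per site. Illustrative subset of hCard 1.0.
from html.parser import HTMLParser

HCARD_FIELDS = {"fn", "org", "email"}  # subset of hCard properties

class HCardParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.card = {}
        self._field = None  # hCard property currently being read

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        hit = HCARD_FIELDS & set(classes)
        if hit:
            self._field = hit.pop()

    def handle_data(self, data):
        if self._field:
            self.card[self._field] = data.strip()
            self._field = None

# The same parser works on any page following the convention:
snippet = '''<div class="vcard">
  <span class="fn">Paul Prescod</span>
  <span class="org">Example Corp</span>
  <a class="email" href="mailto:paul@prescod.net">paul@prescod.net</a>
</div>'''

parser = HCardParser()
parser.feed(snippet)
print(parser.card)
# {'fn': 'Paul Prescod', 'org': 'Example Corp', 'email': 'paul@prescod.net'}
```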

 Paul Prescod
Received on Thursday, 28 August 2008 05:51:03 UTC
