Re: Issue-57

From: Jonathan Rees <jar@creativecommons.org>
Date: Fri, 24 Jun 2011 12:16:24 -0400
Message-ID: <BANLkTikz2hWym8j1JOsAFymfmvuikHVPGA@mail.gmail.com>
To: Tore Eriksson <tore.eriksson@po.rd.taisho.co.jp>
Cc: www-tag@w3.org
2011/6/24 Tore Eriksson <tore.eriksson@po.rd.taisho.co.jp>:
> Jonathan Rees wrote:
>> Suppose the following holds:
>>
>>     <http://example/z> xhv:license <http://example/l1>.
>>
>> Suppose that I do a GET of 'http://example/z' and retrieve a "representation" R.
>>
>> My interlocutor wants me to be able to infer that
>>
>>     R xhv:license <http://example/l1>.
>>
>> so that my remix tool knows what license terms apply when using the bits in R.
>
> Sure, let's say this holds. However, when downloading R to a local file,
> creating a new resource <file:///tmp/z>, the licensing information might
> be lost. Wouldn't it be safer to use a license embedded in R (i.e. in HTML
> META tags)?

(Advising against using RDF is funny to hear from someone who puts RDF
in their email signature.)

These are all good questions, so thanks for listening.

You're switching problems on me. What was on the table was how the
amended httpRange-14 200 rule can be used for concretely helpful
communication. Now you are asking about the reliability of the
communication channel.
So even if you are right - and you are - I don't think this bears on
the question. The meaning of metadata expressed as statements
involving a URI does not change depending on what the communication
channel is, because URIs are *uniform* resource identifiers.

Not all metadata can be embedded, either.
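To make the inference at the top of the thread concrete, here is a
minimal sketch (plain tuples, no RDF library; the function name and
the "R" identifier are mine, not from the thread). It oversimplifies
in one respect the rule does not: it copies *every* statement about
<U> onto the representation, whereas only properties intended to
transfer, like xhv:license, actually should.

```python
# Triples as (subject, predicate, object) strings; this is the
# statement from the first message in the thread.
graph = {("http://example/z", "xhv:license", "http://example/l1")}

def infer_for_representation(graph, uri, representation_id):
    """Given that GET(uri) yielded a 200 representation, copy the
    statements about <uri> onto the representation (the 200 rule,
    oversimplified: real use would filter to transferable properties)."""
    return {(representation_id, p, o)
            for (s, p, o) in graph if s == uri}

# Retrieving 'http://example/z' yields a representation we call "R".
inferred = infer_for_representation(graph, "http://example/z", "R")
print(inferred)  # → {("R", "xhv:license", "http://example/l1")}
```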

> Jonathan also made this statement in an earlier mail:
>> If U is a dereferenceable
>> absolute URI, and M(<U>) for some metadata M, and a retrieval of U
>> yields a representation Z, then M(Z). E.g. if an information resource
>> has dc:title "Little mouse", then its associated representations do,
>> too. Conversely, if M(Z) consistently for Z retrieved from U, then
>> M(<U>).
>
> I do have a problem with the last part of this paragraph where the
> applicability of metadata is reversed. How are you supposed to determine
> whether M(Z) is consistent? Is this consistency at a specific point in
> time? How do I enumerate all possible representations of <U>?

OK, maybe not the best choice of words. Here is my reasoning.
Operationally M(<U>) says that if GET(U) yields 200 Z, then M(Z). The
individual writing M(<U>) may not know exactly when the GET will be
done or by whom (or with what request details), so if M(<U>) is going
to be true then M(Z) may have to be true of a variety of Zs, or at
worst all future Zs. The easiest way to express this in FOL is using
universal quantification, and on a whim I colloquialized this to
"consistently", for which I apologize.

> I assume that httpRange-14 tries to avoid this consistency check through
> enforcing these rules globally on the web. Doesn't read like sound
> engineering and also seems like a hard task to me...

I'm not sure where enforcement comes into this. We're talking about
establishing prior agreements so that we can communicate with each
other. It's always good when statements are checkable, and these
statements are, but that doesn't mean they can be or need to be
checked all the time.

If you expressed the same information in some other way the same
questions about whether you're understood or believed or correct would
arise, so there is nothing special in this situation regarding those
things, as far as I know. We're basically just talking about protocol
design.

One way to enforce M(<U>) would be by doing GETs on U from time to
time and checking whether M(Z) holds when you get a 200. This is no
different from enforcing a claim that the temperature in a particular
location never gets above 25 C. If someone can be held accountable,
you work with them to lower the temperature, or if you can control the
temperature yourself, you turn on the air conditioner. Or you file an
insurance claim, etc.
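The checking step itself is trivial once an exchange has happened; the
only subtlety is that a non-200 response tells you nothing, since the
rule is vacuous then. A sketch (the function names and the license
predicate are illustrative, not from the thread):

```python
def check_claim(status_code, representation, predicate):
    """Test a claim M(<U>) against one HTTP exchange for U.
    Returns True/False when the exchange is a 200 (the claim is
    testable), or None otherwise (the 200 rule imposes no constraint)."""
    if status_code != 200:
        return None  # non-200: vacuously compatible with M(<U>)
    return predicate(representation)

# Example predicate M: "the representation declares license l1".
def has_license_l1(z):
    return 'rel="license" href="http://example/l1"' in z

page = '<html><head><link rel="license" href="http://example/l1"></head></html>'
print(check_claim(200, page, has_license_l1))  # → True
print(check_claim(303, page, has_license_l1))  # → None
```

A monitor would just run this over periodic GETs and flag any False.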

If the statement comes from a reputable source, or the cost of
believing it when it is false is low, then there is less reason to
check.

Again, I am not saying that the httpRange-14 amended 200 rule is the
right tool for all jobs. I just want to clarify what it is, because
the first attack on it is always that it's nonsense or useless. That
doesn't make sense given how widely it's relied on. It may be "of
limited utility" (Larry's phrase) but I want to get past the
foolishness claim and confusion over how it works so that we can turn
the discussion to what needs to be done.

Jonathan

> Regards,
>
> Tore
>
> _______________________________________________________________
> <> dc:creator [
>   foaf:name "Tore Eriksson",
>             "トーレ エリクソン"@jp;
>   foaf:mbox_sha1sum "2bd9291b301f112775e118f96eb63314594b1a86";
>   foaf:workplaceHomepage <http://www.taisho.co.jp/> ].
Received on Friday, 24 June 2011 16:16:52 GMT