Re: Use cases

On 02.01.2011 19:01, Benjamin Hawkes-Lewis wrote:
> On Sun, Jan 2, 2011 at 10:56 AM, Julian Reschke <julian.reschke@gmx.de>
> wrote:
>>> The hypermedia interface is enabled by vendors agreeing on the common
>>> interpretation of the semantics of a vocabulary and building
>>> interface that reflects that interpretation, not by any particular
>>> mechanism of enforcing the uniqueness of vocabulary terms.
>>
>> So?
>
> My objection to arbitrary vocabularies was that they break the uniform
> interface. You said they don't because of namespaces. I think this shows
> that's a red herring.

They break in the same way that new media types do, at least as long as 
they are used in a format that's designed for this kind of extensibility.

> Agreed.
>
> But the implementors of the client software of the world wide web coming
> together to agree how they will evolve the uniform interface is
> different from producing gobbledygook only understood in "controlled
> environments" like an intranet with known software characteristics, or
> only understood via special communication between clients and servers
> like in an RPC service.

What does RPC have to do with this?

I understand that you're only interested in extending the format when 
"the implementors of the client software of the world wide web" agree. I 
happen to disagree with this point of view.

> ...
>> Just because there's more than one way doesn't mean the other way "can't" be
>> used.
>
> No need has been demonstrated.
> ...

It hasn't? So why didn't we stick MathML semantics into RDFa, Microdata, 
class attributes or data-* attributes?
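To make the contrast concrete, here is a rough sketch of what that would have looked like. The MathML fragment is real; the data-* variant is invented here purely for illustration, to show what "sticking the semantics into generic attributes" tends to produce:

```html
<!-- Native MathML inside text/html: the element structure *is* the semantics -->
<math>
  <mfrac>
    <mi>a</mi>
    <mi>b</mi>
  </mfrac>
</math>

<!-- Hypothetical data-* encoding (attribute names invented for illustration):
     generic spans carry no math semantics a consumer can rely on without
     out-of-band agreement on what "data-math" means -->
<span data-math="mfrac">
  <span data-math="mi">a</span>
  <span data-math="mi">b</span>
</span>
```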

>> I like progressive enhancement. It would be nice if it's always possible to
>> use.
>>
>> It works best if you start with data that's close enough to what HTML
>> already allows.
>
> When the text/html media type does not allow you to represent your data,
> maybe you're using the wrong media type …

I think you're missing the compound use case.
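By "compound" I mean one text/html document carrying prose, math, and vector graphics together. A minimal sketch, relying only on what HTML5 already allows for inline MathML and SVG:

```html
<!DOCTYPE html>
<title>Compound document</title>
<p>Prose, with an inline formula
  <math><mi>E</mi><mo>=</mo><mi>m</mi><msup><mi>c</mi><mn>2</mn></msup></math>
  and an inline graphic:
</p>
<svg width="20" height="20">
  <circle cx="10" cy="10" r="8"/>
</svg>
```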

>>> Or as Fielding puts it:
>>>
>>> "Distributed hypermedia provides a uniform means of accessing
>>> services through the embedding of action controls within the
>>> presentation of information retrieved from remote sites. An
>>> architecture for the Web must therefore be designed with the context
>>> of communicating large-grain data objects across high-latency
>>> networks and multiple trust boundaries."
>>>
>>> http://www.ics.uci.edu/~fielding/pubs/dissertation/introduction.htm
>>>
>>> Breaking RESTful expectations and endangering end-users in this way
>>> is the exact opposite of what W3C should be encouraging.
>
> [snip]
>
>> What does the 2nd sentence have to do with what we're discussing?
>
> You agreed above that "unrecognized vocabularies break the uniform
> interface". How is that not "Breaking RESTful expectations"?
>
> You agreed above that "unrecognized vocabularies" require consumers to
> apply "untrusted JS". How is that not "endangering end-users"?

Please stop putting words in my mouth.

> How is the mission of the W3C irrelevant to the use cases this task
> force should address?

Did anybody say that the mission of the W3C is irrelevant?

Can we please focus on actual technical questions?

>>>> But sometimes, annotating the HTML clearly is not the best approach, in
>>>> which case alternate embeddable vocabularies may be the better choice.
>>>
>>> Prove it.
>>
>> We just added MathML and SVG, right?
>
> That suggests we sometimes need to expand the core vocabulary by means
> of the standards process, not that we need to bypass the standards
> process.
>
>>> Please give a real example of a resource that you imagine *cannot* be
>>> represented in terms of the uniform interface provided by text marked
>>> up with generic HTML/MathML/SVG semantics.
>>
>> Oh, so you say that after adding these, no new use cases will ever
>> surface?
>
> No, I'm trying to focus the discussion on solving real human problems.

So embedding 2D vector graphics is a "real human problem", but embedding 
3D graphics is not?

>> There are many more vocabularies that might qualify; the 3D stuff is
>> one, Music and Chemistry might be others.
>
> These are entire vocabularies rather than the concrete examples of
> resources I was hoping for, but thanks for raising them nevertheless.
>
> 3D and music are essential aspects of the human experience and chemistry
> is one of the fundamental sciences, so it is very important that W3C treats
> them seriously and gets them right by adding any required features to
> the core vocabulary, rather than just trusting in the magic of
> distributed extensible gobbledygook to meet the W3C's commitments to
> interoperability, internationalization, accessibility, security, etc.

See, *I* don't think it's a good thing that we *have* to add them to a 
core vocabulary. It's nice that we *can* do it, but it's very bad if we 
*have* to.

> ...
>     - There are commercial incentives against making musical scores
>       available in an open format on the web, including bandwidth,
>       copyright, and vendor lock-in.
>     - We need to support multiple representations (different
>       scoring notations from different cultures, Braille, talking
>       music, possibly actual performance).
>     - There is more than one digital format for musical notation.
>     - Existing formats have interoperability problems.
>     - Existing formats confuse semantics and formatting.
>     - Some existing formats are hard to hand author (not least because
>       of XML).
>     - Rendering music notation is extremely complex.
>     - Browsers don't render the existing formats natively, although
>       there is some plugin support.
> ...

Yes. DRM can be bad (I don't like it). New formats should address 
accessibility. There may be multiple formats. Some are hard to author. 
Some are hard to consume. And so on.

Could you please remind me what this has to do with the current 
discussion? Do you seriously think that the HTML WG and/or the WHAT WG 
are the best places to standardize all kinds of things?

> ...
>> The only difference here is that you want central control. That's a process
>> question.
>
> Central steering is critical to the uniform interface.
>
> The "process" (getting vendors and users and authors together in a place
> where multiple concerns like accessibility and internationalization are
> addressed) is critical to the quality of the uniform interface.
>
> If you don't think central steering is useful, if you don't think having
> at least one format that provides a baseline of access to the riches of
> the internet is handy, then you don't need the W3C or IETF. Just publish
> gobbledygook as text/html. The internet police can't stop you.
> ...

First of all, I truly believe that competition is good (*). To make 
competition workable, you need extension points that people can use 
without having to consult a specific Working Group.
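HTML itself already ships one extension point of exactly this kind: data-* attributes, which any producer can mint without asking a Working Group. A sketch (the particular attribute names here are made up for illustration):

```html
<!-- data-* attributes: a sanctioned extension point; these specific
     names are invented for illustration, not taken from any spec -->
<ol data-playlist-source="local">
  <li data-track-id="42">First movement</li>
  <li data-track-id="43">Second movement</li>
</ol>
```

Of course, as the MathML case above shows, attribute-level extension points only go so far; the argument here is about having extension points at all, not about data-* being sufficient for everything.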

At some point, converging on a common format *can* be a good idea. When 
this happens, producers (as opposed to "vendors") and recipients 
*should* work together. The place for this could be the W3C, the IETF, 
OASIS, microformats.org, whatnot. It depends on the subject matter.

Best regards, Julian

(*) Should I mention RelaxNG?

Received on Sunday, 2 January 2011 20:34:53 UTC