
Re: END_SEGMENT and END_STREAM redundant

From: Roberto Peon <grmocg@gmail.com>
Date: Sat, 19 Apr 2014 20:59:53 -0700
Message-ID: <CAP+FsNfLjN+YonvVzx2fya-As2_qQ4fB-D5sgzT4BCLBO2UeAQ@mail.gmail.com>
To: David Krauss <potswa@gmail.com>
Cc: Adrian Cole <adrian.f.cole@gmail.com>, HTTP Working Group <ietf-http-wg@w3.org>
On Sat, Apr 19, 2014 at 7:23 PM, David Krauss <potswa@gmail.com> wrote:

>
> On 2014-04-20, at 9:14 AM, Roberto Peon <grmocg@gmail.com> wrote:
>
> > The implementation and API on top of that allow or disallow the
> expression of such things to the application.
> > It is the duty of the protocol to make it possible to express these
> things, not to mandate how it is used,
>
> It’s probably just an editorial issue, but the spec wording currently
> doesn’t clarify the intent or motivating semantics of segmentation, as
> shown by K Morgan’s recent thread.
>
> If there were a nice diagram of a segment-message or some BNF like you
> wrote in your earlier reply to me, then applications might be more likely
> to assign bits to their intended meanings and less likely to put excessive
> demands on the API.
>
>
No arguments there. It would be a good idea to add text to be clear that
END_STREAM and END_SEGMENT have different purposes.
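
Purely as an illustrative sketch (not proposed spec text): a receiver might
treat the two flags along the lines below. The names and bit values are made
up for the example; END_SEGMENT delimits a unit within an open stream, while
END_STREAM says nothing more is coming on that stream.

END_STREAM = 0x1    # illustrative flag bits only
END_SEGMENT = 0x2

class Stream:
    def __init__(self, stream_id):
        self.stream_id = stream_id
        self.segment_buffer = bytearray()
        self.closed = False

def deliver_segment(stream, segment_bytes):
    # Stand-in for handing one complete segment up to the application.
    print("stream %d: segment of %d bytes"
          % (stream.stream_id, len(segment_bytes)))

def on_data_frame(stream, payload, flags):
    # Payload always accumulates into the segment in progress.
    stream.segment_buffer.extend(payload)

    if flags & END_SEGMENT:
        # End of a segment: hand the unit to the application, but the
        # stream itself stays open for further segments.
        deliver_segment(stream, bytes(stream.segment_buffer))
        stream.segment_buffer.clear()

    if flags & END_STREAM:
        # End of the stream: flush whatever remains and move the stream
        # toward the closed state.
        if stream.segment_buffer:
            deliver_segment(stream, bytes(stream.segment_buffer))
            stream.segment_buffer.clear()
        stream.closed = True

# Two segments on one stream, the second also ending the stream.
s = Stream(1)
on_data_frame(s, b"first unit", END_SEGMENT)
on_data_frame(s, b"second unit", END_SEGMENT | END_STREAM)

The point of the sketch is only that the two flags trigger different actions
at the receiver; what a segment means to the application is, as above, up to
the application.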


> > unless it is necessary for interop.
> >
> > The spec defines the minimum for interop: each bit has its own meaning.
> Interpretation is up to the application.
>
> There’s some circular reasoning here. Interoperability refers to what
> intermediaries may change, or, to a lesser extent, which synonymous
> bit-codings portable APIs (e.g. JavaScript XHR) may merge. If an underlying
> representation may be changed according to the semantics it expresses, then
> relying on bits is not interoperable.
>
>
Interoperable implementations of the protocol might still not understand each
other at the application layer. That is not a problem the protocol can solve.


> A different programmer sensibility is often applied to binary coding than
> to text, but it’s best to use the same approach either way. A format
> defines the expression of a variety of messages, and those messages
> comprise the only defined meaning.
>
>
I suspect we're arguing semantics at a level where it no longer matters, but
the protocol cannot define a meaning: it defines a grammar.

At this point I suspect we're mostly violently agreeing.
-=R
Received on Sunday, 20 April 2014 04:00:20 UTC
