- From: <noah_mendelsohn@us.ibm.com>
- Date: Wed, 21 Oct 2009 19:58:42 -0400
- To: Mukul Gandhi <gandhi.mukul@gmail.com>
- Cc: Julian Reschke <julian.reschke@gmx.de>, www-tag@w3.org
Mukul Gandhi writes:

> Isn't HTTP 1.1 sufficient for us?

I don't think one ever knows for sure in advance, and for that reason I'm not entirely comfortable with your question. Certainly HTTP is an extensible protocol. People do discuss new headers from time to time. When HTTP 1.1 was created, it was to better meet the performance and other needs of the Web traffic that was being supported. Who knows? Maybe in a few years it will be decided that an HTTP 1.2 or even HTTP 2.0 is needed, perhaps to better support something like video streaming. Certainly, the fact that HTTP is so widely deployed means that major changes tend to impact a lot of people and implementations, and that somewhat raises the bar.

The HTTP 1.1 bis charter [1], which covers the next round of work on HTTP, states:

The working group will refine RFC2616 to:

* Incorporate errata and updates (e.g., references, IANA registries, ABNF)
* Fix editorial problems which have led to misunderstandings of the specification
* Clarify conformance requirements
* Remove known ambiguities where they affect interoperability
* Clarify existing methods of extensibility
* Remove or deprecate those features that are not widely implemented and also unduly affect interoperability
* Where necessary, add implementation advice
* Document the security properties of HTTP and its associated mechanisms (e.g., Basic and Digest authentication, cookies, TLS) for common applications

In doing so, it should consider:

* Implementer experience
* Demonstrated use of HTTP
* Impact on existing implementations and deployments

Note that this does allow for removal or deprecation of features, and that would in fact represent a change to the conformance requirements as far as I know (albeit in areas that aren't widely or interoperably implemented anyway).

Anyway, while it's possible in principle to state in advance that the community is attempting to freeze a specification for all time, I think that's rarely appropriate practice for the Web. The Web will last a very long time. The sorts of content and applications it will have to support, and the machines and networks that it will run on, will probably include things we can barely imagine now. I think all we can do is to try very hard to build specifications that will adapt as gracefully as possible, and to minimize incompatible or disruptive changes when new requirements do dictate that the specifications should be revised. Even mechanisms as core to the Web as URIs are being gradually adapted or augmented with new specifications like IRI. I would expect the same to be true of HTTP over time.

Noah

[1] http://www.ietf.org/dyn/wg/charter/httpbis-charter.html

--------------------------------------
Noah Mendelsohn
IBM Corporation
One Rogers Street
Cambridge, MA 02142
1-617-693-4036
--------------------------------------

Mukul Gandhi <gandhi.mukul@gmail.com>
Sent by: www-tag-request@w3.org
10/21/2009 07:42 PM

To: Julian Reschke <julian.reschke@gmx.de>
cc: www-tag@w3.org, (bcc: Noah Mendelsohn/Cambridge/IBM)
Subject: Re: has XML, XSLT, XQuery and XML Schema have reached a stable state

On Wed, Oct 21, 2009 at 9:54 PM, Julian Reschke <julian.reschke@gmx.de> wrote:
> HTML and HTTP do not appear frozen to me.

I can see that work is still continuing on HTML. I am just curious: what more can the community expect in terms of HTTP evolution? Isn't HTTP 1.1 sufficient for us?

--
Regards,
Mukul Gandhi
Received on Wednesday, 21 October 2009 23:59:25 UTC