
Re: Conformance Test for HTTP 1.1

From: Carl Kugler/Boulder/IBM <kugler@us.ibm.com>
Date: Mon, 9 Oct 2000 13:39:51 -0600
To: "Caveman" <hoffmankeith@hotmail.com>
Cc: "Mark Nottingham" <mnot@mnot.net>, "Miles Sabin" <msabin@cromwellmedia.co.uk>, <http-wg@cuckoo.hpl.hp.com>
Message-ID: <OF09EFC5B9.E8F8C700-ON87256973.006A259D@LocalDomain>

>I just want to put my two cents into this conversation:
>I think the idea of doing compliance testing is great.  And the idea of
>having one "check everything test" is also a good thought.  However, how
>we guarantee that the test scenarios created are actually following the

I was thinking along the lines of a script (or script fragment) for each
MUST in the spec.  MUSTs are supposed to be verifiable, right?  All
compliant implementations, regardless of manufacturer/developer, must do
the MUSTs, right?   Using scripts makes it easy for people to inspect a
script and correct it if it doesn't follow the spec.
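As a rough illustration of what one such per-MUST script might look like, here is a minimal Python sketch (the host and port are placeholders for the server under test).  The MUST it exercises is the RFC 2616 section 14.23 requirement that an HTTP/1.1 server respond with 400 (Bad Request) to any HTTP/1.1 request lacking a Host header:

```python
# Sketch of a per-MUST conformance check.  MUST under test (RFC 2616, 14.23):
# an HTTP/1.1 server must answer 400 (Bad Request) to any HTTP/1.1 request
# that lacks a Host header field.
import socket

def parse_status(response: bytes) -> int:
    """Extract the numeric status code from a raw HTTP response."""
    status_line = response.split(b"\r\n", 1)[0]   # e.g. b"HTTP/1.1 400 Bad Request"
    return int(status_line.split(b" ")[1])

def must_reject_missing_host(host: str, port: int) -> bool:
    """Send an HTTP/1.1 request with no Host header; pass iff the answer is 400."""
    request = b"GET / HTTP/1.1\r\nConnection: close\r\n\r\n"  # deliberately no Host:
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(request)
        response = sock.recv(4096)
    return parse_status(response) == 400
```

A harness would run `must_reject_missing_host("localhost", 8080)` (or whatever address the implementation under test listens on) and report PASS or FAIL; a full suite would just be one such function per MUST.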

>I think this is something better left to outside agencies to address.  The
>testing game tends to get to be too industry biased.  Whether intentionally
>or not you will see tests similar to this proposed one done and get totally
>different results depending on who does it.
>I know this actually sounds like a good argument to create a "standard
>test", but in my opinion this leaves the door too wide open to start
>skewing the tests in favor of one manufacturer/developer vs. another one.  I
>realize that there are currently many industry leaders involved in this
>organization and they provide valuable insights.  However, they are just
>involved in the CREATION of standards, not in judging the conformance to
>In short, while this is a good idea with the best interests of everyone in
>mind, I think this is probably stepping outside of the charter of the
>----- Original Message -----
>From: "Mark Nottingham" <mnot@mnot.net>
>To: "Miles Sabin" <msabin@cromwellmedia.co.uk>
>Cc: <http-wg@cuckoo.hpl.hp.com>
>Sent: Friday, October 06, 2000 11:30 AM
>Subject: Re: Conformance Test for HTTP 1.1
>> I think proxies are the biggest target, because they're so hard to
>> implement correctly, and so much more complex. In my experience, there's a
>> fairly wide variance in how implementors choose to interpret the spec.
>> Of course, once you do one for proxies, it's relatively easy to get client
>> and server test suites out of it.
>> On Fri, Oct 06, 2000 at 10:24:14AM +0100, Miles Sabin wrote:
>> > Mark Nottingham wrote,
>> > > I've lately been considering starting discussion of
>> > > development of something within the W3C, as it was involved
>> > > in the development of the HTTP, and has an established
>> > > history of developing similar tools (although I'm not sure if
>> > > W3C can formally commit resources).
>> > >
>> > > If anyone has any thoughts about this, please share them,
>> > > because I'd like to get this moving.
>> >
>> > This sounds like a fine idea (tho', as you say, it's an open
>> > question whether or not the W3C would be able to commit
>> > resources).
>> >
>> > Do you have any particular emphasis in mind: server, clients,
>> > or proxies, or all equal weight on all?
>> >
>> > Cheers,
>> >
>> >
>> > Miles
>> >
>> > --
>> > Miles Sabin                       Cromwell Media
>> > Internet Systems Architect        5/6 Glenthorne Mews
>> > +44 (0)20 8817 4030               London, W6 0LJ, England
>> > msabin@cromwellmedia.com          http://www.cromwellmedia.com/
>> >
>> --
>> Mark Nottingham
>> http://www.mnot.net/
Received on Monday, 9 October 2000 20:41:28 UTC