
Re: Conformance Test for HTTP 1.1

From: Caveman <hoffmankeith@hotmail.com>
Date: Sat, 7 Oct 2000 15:54:58 -0500
To: "Mark Nottingham" <mnot@mnot.net>, "Miles Sabin" <msabin@cromwellmedia.co.uk>
Cc: <http-wg@cuckoo.hpl.hp.com>
Message-ID: <LAW-OE45BXezZ7SApFZ00000b2a@hotmail.com>
I just want to put my two cents into this conversation:

I think the idea of doing compliance testing is great.  And the idea of
having one "check everything" test is also a good thought.  However, how do
we guarantee that the test scenarios we create actually follow the
specs?

I think this is something better left to outside agencies to address.  The
testing game tends to become too industry-biased.  Intentionally or not,
you will see tests similar to the one proposed here run by different
parties with totally different results.

I know this actually sounds like a good argument to create a "standard
test", but in my opinion it leaves the door wide open to skewing the tests
in favor of one manufacturer/developer over another.  I realize that there
are currently many industry leaders involved in this organization and they
provide valuable insights.  However, they are involved in the CREATION of
standards, not in judging conformance to them.

In short, while this is a good idea with the best interests of everyone in
mind, I think this is probably stepping outside of the charter of the
organization.

-kh

----- Original Message -----
From: "Mark Nottingham" <mnot@mnot.net>
To: "Miles Sabin" <msabin@cromwellmedia.co.uk>
Cc: <http-wg@cuckoo.hpl.hp.com>
Sent: Friday, October 06, 2000 11:30 AM
Subject: Re: Conformance Test for HTTP 1.1


>
>
> I think proxies are the biggest target, because they're so hard to
> implement correctly, and so much more complex. In my experience, there's a
> fairly wide variance in how implementors choose to interpret the spec.
>
> Of course, once you do one for proxies, it's relatively easy to get client
> and server test suites out of it.
>
>
>
> On Fri, Oct 06, 2000 at 10:24:14AM +0100, Miles Sabin wrote:
> > Mark Nottingham wrote,
> > > I've lately been considering starting discussion of
> > > development of something within the W3C, as it was involved
> > > in the development of the HTTP, and has an established
> > > history of developing similar tools (although I'm not sure if
> > > W3C can formally commit resources).
> > >
> > > If anyone has any thoughts about this, please share them,
> > > because I'd like to get this moving.
> >
> > This sounds like a fine idea (tho', as you say, it's an open
> > question whether or not the W3C would be able to commit
> > resources).
> >
> > Do you have any particular emphasis in mind: server, clients,
> > or proxies, or all equal weight on all?
> >
> > Cheers,
> >
> >
> > Miles
> >
> > --
> > Miles Sabin                       Cromwell Media
> > Internet Systems Architect        5/6 Glenthorne Mews
> > +44 (0)20 8817 4030               London, W6 0LJ, England
> > msabin@cromwellmedia.com          http://www.cromwellmedia.com/
> >
>
> --
> Mark Nottingham
> http://www.mnot.net/
>
>
Received on Saturday, 7 October 2000 21:55:56 EDT

This archive was generated by hypermail pre-2.1.9 : Wednesday, 24 September 2003 06:33:40 EDT