- From: Nathaniel Borenstein <nsb@nsb.fv.com>
- Date: Sun, 18 Jun 1995 15:08:22 -0400 (EDT)
- To: hardie@merlot.arc.nasa.gov (Ted Hardie)
- Cc: rating@junction.net, www-talk@www10.w3.org
Excerpts from mail: 14-Jun-95 Re: KidCode: Next steps Ted Hardie@merlot.arc.na (17090*)

> One set, in particular, I think needs to be addressed immediately: the
> idea that several different approaches should be implemented and sites
> should comply with all of them.  This, I believe, is a major mistake.

I'm not suggesting that there should be several incompatible
implementations of the same approach, merely that there are several
applicable technologies -- URL labelling, blacklists, rating
authorities, etc. -- which are all mutually compatible and can be
pursued independently.  While it is remotely conceivable, I seriously
doubt that any one of them will satisfy all the real and perceived
needs of the community.  Therefore, I see it as pointless to argue over
which is "best", and instead I prefer to focus on how to do my
particular favorite -- URL labeling -- in the best possible way.

> Mr. Borenstein has put forward his suggestion as an Internet-Draft; I,
> and many others, assumed from this action that he saw the issue as one
> which should be addressed by a standards body.

I understand that assessment, but an Internet-Draft is an
Internet-Draft, nothing more.  I *never* intended this to be something
that went down the whole standards track.  The goal here is to publish
an Informational RFC, which can be implemented or ignored by anyone who
sees fit.  It was published first as an I-D rather than an RFC because
we wanted to get constructive input on the best way to do URL-level
protection.

> By setting a single standard method, we are far more likely to get
> uniform usage (even if we do not get uniform compliance, the yardstick
> by which compliance is met will be easier to measure against).

I don't give a hoot how we measure compliance.  I want rough consensus
and working code.

> By setting a standard method through a recognized standards body like
> the IETF, we are also far more likely to get recognition outside of the
> U.S. market context; we need the experience of a group which has
> negotiated international cooperative networking standards.

The basic misperception here is that we intended this to go down the
standards track.  We did not.  Think of it as being analogous to the
mailcap RFC, which is informational only, specifies compatible naming
conventions for those who want them, and is happily ignored by those
who think that mailcap files are a bad idea.

> I personally believe that KidCode is overly complex, inelegant, and
> very easy to circumvent,

Well, it's trivial to circumvent.  It's a VOLUNTARY standard.  You can
circumvent it by saying "I won't use it."  So you'll get no argument on
that score.  As for "overly complex" and "inelegant", these charges are
meaningless unless you explain them.  Personally, I'm very concerned if
it is in any way overly complex -- simplification in such matters is
almost always good.  I have trouble getting too concerned about
elegance, however -- this is, after all, the Internet, and "elegant" is
not the word I would choose to describe most successful Internet
protocols.  (It does, however, describe an awful lot of the ones that
failed.)

> Because of the importance and likely duration of those issues, we need
> to think very carefully about scalability and extensibility; many of
> the current proposals may be workable now, but will not scale to meet
> the needs of the Web as it will be in just a few months time.

First of all, I agree completely.  However, I can't imagine ANYTHING
that scales better than a voluntary URL-based approach.  Its
scalability is one of the key reasons I prefer the URL approach.  What
doesn't scale about it?  Second, saying "we need to think very
carefully" is fine -- motherhood and apple pie -- but some of us
believe we have already done a fair amount of careful thinking.  If
there's something about the KidCode proposal that doesn't scale, I'd
like to know what it is.  Constructive input is ALWAYS welcome.
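(For what it's worth, the client-side check that a voluntary URL-labelling scheme implies can be sketched in a few lines.  The `KidCode` path-segment convention below is illustrative only -- the actual syntax is whatever the I-D specifies -- but it shows why the approach scales: the client alone inspects the URL, with no central authority or per-request lookup.)

```python
# Illustrative sketch of client-side URL-label filtering.  The
# "KidCode" path-segment convention here is hypothetical, not the
# syntax from the actual Internet-Draft.
from urllib.parse import urlparse

def is_labelled(url: str, marker: str = "KidCode") -> bool:
    """Return True if any path segment of the URL carries the label."""
    path = urlparse(url).path
    return any(seg.startswith(marker) for seg in path.split("/"))

def allow(url: str, block_labelled: bool = True) -> bool:
    # Voluntary scheme: the client decides locally whether to fetch.
    # No server cooperation beyond publishing the labelled URL is
    # required, so the cost is borne entirely by consenting clients.
    return not (block_labelled and is_labelled(url))
```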
> We need to think in the context of what standards will be emerging;
> implementing a proposal that will not work in the context of HTTP-NG
> or other emerging standards is ultimately counter-productive.

Well, if HTTP-NG breaks URLs, it's a non-starter, I predict.

> Worries over proposed legislation should not force us into something
> which could, easily, have a crippling effect on the growth of the Net
> or the Web.

We're not trying to force anybody into anything.  We are, however,
trying to move as fast as possible towards a single set of conventions
for those of us who want to implement URL-based labelling.  People who
want to pursue other mechanisms are free to do so, and I encourage
those efforts.

-- Nathaniel

--------
Nathaniel S. Borenstein <nsb@fv.com>          | When privacy is outlawed,
Chief Scientist, First Virtual Holdings       | only outlaws will have privacy!
FAQ & PGP key: nsb+faq@nsb.fv.com             | SUPPORT THE ZIMMERMAN DEFENSE FUND!
---VIRTUAL YELLOW RIBBON-->> zldf@clark.net (http://www.netresponse.com/zldf)
Received on Sunday, 18 June 1995 15:10:34 UTC