RAND clause and W3C's mission

The W3C is presently considering a policy that would permit patented
technologies in standards so long as they are licensed under "Reasonable
And Non-Discriminatory" (RAND) terms.  This policy would be a disaster
for the internet (which has thrived on entirely open standards while
proprietary networking technologies collapsed) and a violation of W3C's
stated goals and principles:

* UNIVERSAL ACCESS: To make the Web accessible to all by promoting
  technologies that take into account the vast differences in culture,
  education, ability, material resources, and physical limitations of
  users on all continents

  Consider a technology that would add $1.50 to the annual TCO of
  running a web server (irrespective of volume, purpose, etc.).  Such a
  license would probably qualify as RAND.  Yet even that fee might be
  prohibitively high for many residents of Kenya, for small charities,
  or for dependent children.  These groups (and others) would be shut
  out of the web, even though they may have a great deal to contribute
  culturally and substantively.

  Perhaps most importantly, it would block casual users, who at present
  say "hey, the Red Hat installer set up Apache for me, what the
  expletive, I'll try it out" and sometimes go on to become significant
  contributors, but who would never get involved if there were licensing
  obstacles in the way.

  You may think that micropayments cannot act as a tool of selective
  disenfranchisement, but consider the history of poll taxes in the
  U.S.  I do not accuse you of having the same motives, but I fear you
  may have the same effects.

* SEMANTIC WEB: To develop a software environment that permits each user
  to make the best use of the resources available on the Web

  Many web users rely on scripts of various sorts (from one-line bash
  scripts full of lynx commands to thousand-line Perl programs) to make
  better use of the web's content.  I myself have used awk and sed to
  create and interpret HTML documents, and I certainly could not have
  applied for developer licenses in order to do so.  To put it simply: a
  patent on HTML would have made it illegal or infeasible for me to make
  full use of existing web resources.
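
  As a minimal sketch of the sort of throwaway scripting I mean (the
  file names here are invented for illustration):

      # Turn a plain list of URLs into a browsable HTML page.
      echo '<html><body><ul>' > links.html
      sed 's|.*|<li><a href="&">&</a></li>|' urls.txt >> links.html
      echo '</ul></body></html>' >> links.html

  Nobody writing three lines of shell would negotiate a patent license
  first; under a RAND regime, this kind of casual use simply stops.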

* WEB OF TRUST: To guide the Web's development with careful consideration
  for the novel legal, commercial, and social issues raised by this
  technology

  Software patent law is unfinished and in desperate need of
  reform.  Many would argue that it is in need of elimination (I need
  not reproduce the arguments here; anyone who is unfamiliar with them
  is encouraged to read
  http://lpf.ai.mit.edu/Patents/against-software-patents.html by the
  League for Programming Freedom).  In any case, as a standards body,
  the W3C should keep away from such potential landmines, and as a body
  of software engineers, it should not appear to endorse so inadequate a
  legal system.

* INTEROPERABILITY: Specifications for the Web's languages and protocols
  must be compatible with one another and allow (any) hardware and
  software used to access the Web to work together.

  The use of patented protocols would lock out all the groups described
  earlier, as well as those who have ideological objections to patents
  (such as the Free Software Foundation, a very important entity in
  computing).

* EVOLUTION: The Web must be able to accommodate future
  technologies. Design principles such as simplicity, modularity, and
  extensibility will increase the chances that the Web will work with
  emerging technologies such as mobile Web devices and digital television,
  as well as others to come.

  A patented technology evolves only with the consent of the patent
  holder.  Suppose a corporation holding a patent on a critical web
  technology judges (probably correctly) that the web is becoming a
  threat to some other profit center.  It would probably refuse to
  develop a second version of the technology that was useful to the
  threatening application, and for as long as the patent lasted, no one
  else would be able to, either.

* DECENTRALIZATION: Decentralization is without a doubt the newest
  principle and most difficult to apply. To allow the Web to "scale" to
  worldwide proportions while resisting errors and breakdowns, the
  architecture (like the Internet) must limit or eliminate dependencies
  on central registries.

  I outlined above how a single patent holder could block web
  development through malevolence.  But it could happen just as easily
  through incompetence.  Some would doubt that a company could miss the
  applications of a patent so completely as to neither develop (nor
  permit to be developed) a needed technology, but in truth the history
  of computing is littered with companies exactly this blind (or, in
  many cases, with those companies' remains, which would help us even
  less).  Even in less extreme cases, patent entanglements could easily
  delay a standard by a month, or by a month *per patent*, which could
  lead to W3C standards consistently emerging too late to do any good.

  Avoiding reliance on a single node is a good design principle.  It's
  also a good meta-design principle.

Presumably the purpose of RAND is to let the web benefit from superior
(patented) technologies.  However, a list of such technologies would be
difficult to assemble.  GIF comes to mind, but PNG is quite thoroughly
equivalent, and the precedent is not encouraging
(http://www.gnu.org/philosophy/gif.html).  The internet has outperformed
proprietary networking technologies for thirty years and has had no
difficulty evolving and innovating.  With the free software/open source
movement stronger than ever, there is no reason to expect problems now.

I sincerely hope that you will remain true to your goals and principles,
and reject the proposed policy.

--Daniel Speyer
"May the /src be with you, always"
