Self-censorship using URLs

Matthew C. Clarke (
Tue, 5 Sep 1995 15:49:14 +0200


I want to raise the ugly issue of censorship and suggest a possible scheme
which uses URLs to implement a helpful and hopefully non-threatening form
of self-censorship. I understand that various people and organisations have
been looking into the possibilities of self-censorship already, but I have
not seen details publicised of any firm proposals.

Please bear with my first contact with the URI working group with patience.
Has this sort of suggestion already been debated? How can I join in the
debate? To whom should I write in order to promote this idea?

Of course, there will be a lot of controversy about whether ANY form of 
censorship is acceptable on the 'net. However, I think that a modification to 
the syntax of URLs could be of benefit to all -- allowing freedom of speech 
while at the same time alerting readers to undesirable (or at least undesired) 
content.

At the moment, a URL consists of two parts: a scheme followed by some
scheme-specific information. These parts are typically viewed as an access
method followed by a location. I propose that this should be extended to
include a third field which describes the item's content. Thus, the format
of a URL will include:

   1. The access method (i.e. info about the item's syntactical structure)
   2. An indication of the content (i.e. info about the item's semantic content)
        -- although this could be an optional field
   3. The location.

For instance, a URL might be --

(I'm sure there can easily be agreement about whether to use quotes, colons
or some other delimiters.)
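
The three-part form could be split mechanically. As a minimal sketch,
assuming colon delimiters and a double-quoted Content field (the exact
delimiters and the function name here are my own illustration, not a
settled convention):

```python
def parse_extended_url(url):
    """Split a URL of the hypothetical form scheme:"content":location
    into its three fields; the content field is optional."""
    scheme, rest = url.split(":", 1)
    if rest.startswith('"'):
        # A quoted Content field follows the scheme.
        content, location = rest[1:].split('":', 1)
        return scheme, content, location
    # Ordinary two-part URL with no Content field.
    return scheme, None, rest

print(parse_extended_url('http:"V4,golf"://host/path'))
print(parse_extended_url('ftp://host/file'))
```

Because the Content field is optional, existing two-part URLs would
continue to parse unchanged.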

This format allows a simple way for any sort of comment about an item to be 
recorded as part of its URL. In particular, certain codes could be used as 
abbreviations for subject classifications and censorship ratings. For instance, 
perhaps a code such as "V4" could become a standard way of denoting items whose 
content has a high degree of violence (rated from, say, 1 to 5).

The Content field of the URL could be used for many purposes and anyone's 
computer could make use of information filters based on this field. Someone 
could use the URL's Content field to search for items relating to golf. Someone 
else could filter out any items whose URL included a "V" rating over 2. There 
need be no rigid enforcement of such codes. If someone makes an item with 
violent content publicly accessible but fails to give it an appropriate "V" 
rating, then the normal net-pressure could show them that they have erred.
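
Such a filter need only inspect the Content field. As a sketch, assuming
comma-separated codes where "V" ratings run from 1 to 5 and plain words
name subjects (the code vocabulary and function names are my own
illustration):

```python
import re

def violence_rating(content):
    """Return the item's declared V rating (1-5), or 0 if none."""
    m = re.search(r'\bV([1-5])\b', content)
    return int(m.group(1)) if m else 0

def acceptable(content, max_violence=2):
    """True unless the item declares a V rating above the threshold."""
    return violence_rating(content) <= max_violence

def mentions(content, subject):
    """True if the Content field lists the given subject code."""
    return subject in [code.strip() for code in content.split(",")]

items = ["V4,action", "golf,sport", "V1,news"]
print([c for c in items if acceptable(c)])        # drops the V4 item
print([c for c in items if mentions(c, "golf")])  # finds the golf item
```

The same field serves both purposes mentioned above: a reader filtering
out high "V" ratings and a reader searching for items about golf.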

The debate about whether and how to censor electronic information is sure
to become hotter. The scheme I have described will provide a useful feature
for many purposes, one of which is an Internet convention on
self-censorship. This is an idea still in embryo, but surely something of
this type could benefit everyone.


Matthew C. Clarke <>
University of Natal, Pietermaritzburg, South Africa
(PGP Public Key available on request or by finger)