
Re: New issue: error recovery practices (Re: Proposed TAG Finding: Internet Media Type registration, consistency of use)

From: Al Gilman <asgilman@iamdigex.net>
Date: Thu, 30 May 2002 11:04:44 -0400
To: "Simon St.Laurent" <simonstl@simonstl.com>, "www-tag@w3.org" <www-tag@w3.org>

At 09:22 AM 2002-05-30, Simon St.Laurent wrote:

>On Thu, 2002-05-30 at 07:47, Al Gilman wrote:
>> The simplest way to derive 'don't recover quietly' from the business
>> notion of transparency is the following: to the extent that the
>> automated processing [by the web infrastructure as people go-between]
>> departs from what the author expected, it should do so in a way that
>> is clearly accountable to the user, the other stakeholder with
>> standing in the transaction.  Don't make the transaction suddenly
>> behave as though there is another stakeholder whose interests are 
>> being pursued; show clear traceability to the interests of the 
>> principals to the transaction.  Always interrupting the user on 
>> fault-detection events is overkill; the users will reject this.  So we
>> need a more indirect 'effective control' policy.  But it has to 
>> satisfy this top-level transparency requirement; it has to work in a
>> way such that the user is aware of the process and is convinced that 
>> they are in command of it.  
>Just as they are in command of what happens when they get a 404 message?

Yes, absolutely.

>Users are a lot more resilient and generally smarter than software in
>having flexible responses to errors.  They also learn better.  I don't
>think users would reject being interrupted provided that it worked like
>a 404 rather than a "I don't understand this script so I'll give you
>half the page".

In most cases the users are happy for the browser to 'upgrade' something which has an interpretation as HTML to HTML.  You and I are the exception, here.  I went into this a bit in my reply to Steven.

We need better accountability for erroneous HTTP headers and uncontrolled transgendering of documents by clients.  Accountability to content creators.  But we don't have the loops in the architecture for this yet.  It's all open-loop.

>> It is the user that one appeals to because they are presumed to be 
>> more immediately available.  In a real-time collaboration scenario, all 
>> parties are available to exercise initiative in establishing a mutually 
>> acceptable adjustment in the expected pattern of infrastructure 
>> utilization.
>And Web development is NOT  real-time collaboration.  

Web operation spans real-time telecommunication, telecollaboration, multimedia presentation delivery subsuming canned and real-time components, and the more historical asynchronous push-to-persistence and pull-from-persistence mode of interaction.  It's all in there.

Anything less is asking for a secure home in 'of historical importance' land.

>If you can't be
>bothered to test your pages before you present them to the users, you
>deserve all the ire you get for interrupting them.  Vendors who produce
>browsers which discourage such testing deserve equal and perhaps more.
>Your business-vocabulary stories are far too rosy and generous for a Web
>that often works - and works best - on a does this work/does this not
>work/okay I'll go someplace else if it doesn't basis.

You have to distinguish what works -- 404s and user choice of another resource to pursue -- from where we need to extend this closed-loop success: service chains that thread through a more multi-tiered industry.

It may be that the first place the incumbent practice majority-honors a text/plain indication on something that has a plausible interpretation as text/html is in the traffic from the HTTP binding of a Web Service that states the text/plain character of the traffic in its WSDL.  Maybe not unless it is text/plain served over SOAP.  Who knows?  I don't know.  But I do think that we stand a better chance for strictness in emerging practice than in retrofit.

There are simple reasons why things like META HTTP-EQUIV and META REFRESH for redirects emerged.  They made the content provider less dependent on the server operator, at a time when one couldn't count on the server operator to be responsive to the niceties of typing.
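The pattern looks roughly like this (a sketch, with an invented example URL): the provider embeds in the document itself the metadata that would otherwise live in HTTP headers or server redirect configuration.

```html
<html>
  <head>
    <!-- stands in for a Content-Type header the operator may never set -->
    <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
    <!-- stands in for a 3xx redirect: fetch the new URL after 5 seconds -->
    <meta http-equiv="Refresh" content="5; url=http://example.org/moved/">
  </head>
  <body>This page has moved.</body>
</html>
```

Everything here is under the content creator's control; nothing requires touching the server configuration, which is exactly why the pattern spread.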

The theory that the server operator and the content provider were one agent didn't hold up in the realities of the business tiering of the service delivery chain.  This undercut the proposition that the header field metadata were part and parcel of the utterer's created object.

>Helping people frequently means telling them no.

Customer satisfaction means not having to say this too often.

If there is no feasible option in view, one should think again, harder, before saying simply 'no' to a customer.  Sometimes it is a matter of bringing the feasible option(s) into view.

WCAG 2.0: if there are bars to some users using a resource, there have to be options; and the user has to be able to recognize that there are options and to exercise the option.

The silent 'upgrading' of de_facto text/html documents to HTML processing is functionally recognizable as a hardwired user preference.  It is arguably de_facto a strong-majority user preference.  So we have to go after the 'hardwired' and not the 'preference.'  The processing should be more inclusive of those who don't share this preference.  So the presumption of this preference should be configurable.
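To make the point concrete, here is a minimal sketch, not any browser's actual logic, of turning that hardwired behavior into a preference.  The function name, the flag, and the crude tag-based sniff are all illustrative assumptions:

```python
def effective_media_type(declared_type: str, body: str,
                         upgrade_plain_to_html: bool = True) -> str:
    """Return the media type a client should process the body as.

    upgrade_plain_to_html models the (presumed strong-majority) user
    preference as a configurable setting rather than a hardwired rule.
    """
    # A crude sniff: does the body plausibly read as HTML?
    looks_like_html = body.lstrip().lower().startswith(
        ("<!doctype html", "<html"))
    if (declared_type == "text/plain"
            and upgrade_plain_to_html
            and looks_like_html):
        # The silent 'upgrade': override the server's declared type.
        return "text/html"
    # The strict, accountable path: honor the declared type.
    return declared_type

# Preference on (the de facto majority default):
print(effective_media_type("text/plain", "<html><body>hi</body></html>"))
# -> text/html

# Preference off (the strict minority's choice):
print(effective_media_type("text/plain", "<html><body>hi</body></html>",
                           upgrade_plain_to_html=False))
# -> text/plain
```

The design point is only that the sniffing decision surfaces as a setting the user can see and flip, rather than disappearing into the client's wiring.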

Pushing for more change than that will result in valid push-back in terms of "this is not what our users want."  We don't need to go there; don't.


>Simon St.Laurent
>Ring around the content, a pocket full of brackets
>Errors, errors, all fall down!
Received on Thursday, 30 May 2002 11:05:29 UTC

This archive was generated by hypermail 2.4.0 : Friday, 17 January 2020 22:55:51 UTC