
RE: belittling designers, two kinds of accessibility

From: Al Gilman <asgilman@iamdigex.net>
Date: Fri, 20 Oct 2000 14:04:48 -0400
Message-Id: <200010201744.NAA1447549@smtp1.mail.iamworld.net>
To: Wayne Myers-Education <wayne.myers@bbc.co.uk>, w3c-wai-ig@w3.org
At 03:38 PM 2000-10-20 +0100, Wayne Myers-Education wrote:
>Anne,
>
>To respond:
>
>> The web is probably no longer what was originally 
>> envisioned, but the
>> original vision was limited and didn't take into account either the
>> mushrooming of bandwidth and technology, or the popularity 
>> that the web has
>> gained.
>
>Exactly what is limiting about a vision of a system of documents that is
>independent of the kit used to view them? 

AG::

But that's not the Web, even originally.  That is an SGML idea, which is only
accidentally involved in the origins of the Web.  The Web idea was not of
"documents independent of the kit used to view them."  The Web idea required a
user agent that implemented hyperlinks encoded as URI references.  Tim used
SGML because it was there, and did something he needed, which was to provide a
very simple way to hide the hrefs in the natural-communication medium. 
Separation of content vs presentation was not an issue involved in the web
innovation.  Nor was the fact that the initial medium was style-free text. 
That was an accident of the initial implementation.

The central features of Tim's innovation were a) the hypermedia premise, that
links to more information were marbled through natural-communication media, and
b) the URI premise, that the medium carrying the hyper-references did not
distinguish between references to ftp, gopher, WAIS, local files, or HTTP.  To
the hypermedia, they were all just references to external resources.  Period.
The differences were strictly hidden within the URI itself.
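The URI premise can be sketched in a few lines (a hypothetical illustration,
not part of the original mail): to the linking document every reference is an
opaque string, and the access scheme is recoverable only by parsing the URI
itself.

```python
from urllib.parse import urlparse

# Hypothetical example references; to the document that carries them,
# each is just an external resource -- the access mechanism (HTTP, FTP,
# Gopher, local file) is hidden inside the URI itself.
refs = [
    "http://www.example.org/page.html",
    "ftp://ftp.example.org/pub/readme.txt",
    "gopher://gopher.example.org/1/",
    "file:///usr/local/doc/notes.txt",
]

for ref in refs:
    parts = urlparse(ref)
    # Only the parsed scheme reveals how the resource would be fetched.
    print(parts.scheme, "->", parts.path)
```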

This put all external references into one superclass of object, and made the
URI namespace the World- [i.e. Internet-]wide scope of external references for
any Web media.  Any Web document could link to any Web resource and the user
_could get there_.

HTTP was a convenience function, and not an essential premise of the
architecture, even though the as-built Web often acts as though there are no
URIs other than HTTP URLs.  But there was no Web-intrinsic principle of
device-independence; only embedded actionable references into one unified
namespace.  That was the essential innovation of Tim's synthesis of the Web.

>                                         And the  mushrooming of bandwidth
>on which planet are you talking about? Here in the UK, I am going to be
>stuck with a 56K modem for some considerable time to come, and I know that
>there are many many people around the world who connect to the internet at
>still lower speeds. Fortunately, the original vision of the web is not
>limited to a specific level of bandwidth. Your vision of what the web might
>become, frankly, is limited to a certain minimum level of bandwidth, below
>which, there is no access at all for anyone. Thanks but no thanks. And
>anyway, that ain't the web.
>
>> At some time in the years to come, the "original" 
>> concept of the
>> web will go away and the "horseless carriage" as a toy of the 
>> privileged
>> will be gone ... it's almost there already!  
>
>Having large amounts of bandwidth *is* currently only for the privileged,
>and while this remains the case, the original concept of the web is here to
>stay. Thankfully. As for the horseless carriage analogy, now that we have so
>many horseless carriages running around on the roads we have serious
>consequent problems due to congestion and pollution. I wouldn't take the
>analogy too far, but I'd point out that the average speed for vehicles on
>the road in London, where I live, is the same today as it was 100 years ago,
>due to congestion and despite the fact that we 'all' have cars - in fact
>precisely because we 'all' have cars - and they get in each others' way.
>Similarly, if everyone had a massive bandwidth connection to the net it is
>not clear that this would speed stuff up much by itself, since the backbone
>of the net too would have to see a concomitant bandwidth increase of
>gargantuan proportions to handle the vastly inflated overall volume of
>traffic that would be the likely result.
>
>If we all had home pages one megabyte in size (ouch), I suspect it would be
>just as annoyingly long a wait to download on a fast connection as it is
>today with 100K home pages (ouch) over 56K modems. I'm not sure of the exact
>maths but suspect it isn't good. Meanwhile, textfiles are small and lean and
>multi-media isn't. And thus will it forever be.
>
>> But I cannot say that any one of them came to the web 
>> expecting to cherish
>> "documents" ... Should they be sent back to tv just because 
>> the originators
>> of the web didn't envision they would be here? Perhaps 
>> purists can argue
>> so, but these people are too real to me for those arguments 
>> to have much
>> flavor. 
>
>I'm sorry, Anne, but this argument sounds absurd to me. It is as if you have
>been given a fish and are complaining because it is not a bicycle. No-one is
>sending anyone back to tv. No-one is asking anyone to cherish documents.
>No-one is accusing anyone of not being real. But the web is still the web -
>a device-independent document collection. And a large collection of putative
>universally understandable multi-media documents is a large collection of
>putative universally understandable multi-media documents - and not the web.
>Crack the problem of device-independent multi-media and you may be onto
>something - SMIL may have part of the answer here - but it's still not going
>to change what the web is. That fish is still a fish, even if imaginative
>folks like ourselves can close our eyes and imagine wheels and a chain on
>it. The best bicycles - and I am being wholly serious - are not based on
>fish. Similarly, the bicycle here - the vision of device-independent
>multi-media universally understandable by all, ought not be based on the
>fish that is the web, but should be built as a bicycle from the ground up.
>
>Hope this makes sense.
>

AG::

It all makes perfectly logical sense.  What it fails is the reality check.  It
fails to face the facts of the Web either at its inception or today.  The
preponderance of new pages added to Web content, and the preponderance of
current hits against web pages today, belong to a post-Mosaic genre in which
point and click is much more intrinsic than "separation of content from
presentation."  It was Mosaic's use of what had been learned through the GUI
revolution to _reduce_ the separation between what you see and what you get
that was the final spark launching the Web into its hyper-growth rate.

I occasionally grumble about NCSA sometimes sounding as if it invented the
Web.  Tim made the most critical contribution, and he did not use a GUI.  But
that is the view of a computer scientist.  A marketing economist might well say
that most of the current Web market is predicated on a Web which also presumes
PPP and the point-and-click GUI in addition to Tim's seminal work.

What we are actually facing today is a segmentation of the Web market in some
national/cultural areas that is not the same world-wide.  In Japan the DoCoMo
phone processes standard websites fine.  If somebody tried to sell that in the
U.S., they would fail, because the "standard websites" in the U.S. are more
visually premised, more bandwidth-hungry, than the "standard websites" in the
Japanese market.  This is not to say they are wrong, or not the Web.  But they
are different, and the idea of one HTML document that fits all clients is
limited in what range of client processes (including communication bandwidth)
it can span.  Refer to the proceedings of the recent Device-Independence
workshop for discussion of how the content, and not just the presentation,
needs to change to address devices of different capability (if you divide
presentation from content more or less along the same lines as in HTML 4 vs.
CSS2).

http://www.w3.org/2000/10/DIAWorkshop/

The World-Wide ambitions of the Web to unify communication had better worry
about what is happening in response to broadband vs. mobile differences in the
U.S., because even though it is not world-wide yet, it is definitely the
future.  The strategy that "you just have to serve universal documents" cannot
necessarily be sustained in terms of the actual cost/effectiveness tradeoffs
facing the information-sourcing communities.

Making web pages or services 'universal' can only be justified to the point
where they are still saving you money by reducing the workload spent rendering
comparable service via email, phone, and personal assistants.  No web documents
are totally device-independent.  For the best we know how to do [WCAG 1.0],
the class of devices on which they depend is broad enough to retain the
flexibility to dodge the preponderance of sensory and motor disabilities.
[We're still working on cognitive.]  But these flexible documents still can't
deliver any communication, let alone effective communication, without some
device or other.  And as the clients get more diverse in capabilities, we have
to be prepared to seek the point of commonality between different Web
discourses deeper and deeper within the server processes.
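That last point, seeking the commonality deeper within the server, can be
sketched as server-side variant selection (a hypothetical illustration; the
function, variant names, and capability thresholds are all invented, not drawn
from the original mail):

```python
# Hypothetical sketch: the server holds one underlying content model and
# selects a variant per client capability, rather than serving a single
# "universal" document to every device.
CONTENT_VARIANTS = {
    # invented variant names, for illustration only
    "broadband-gui": {"images": True,  "max_page_kb": 500},
    "mobile":        {"images": False, "max_page_kb": 20},
    "text-only":     {"images": False, "max_page_kb": 10},
}

def select_variant(accept_header: str, bandwidth_kbps: int) -> str:
    """Pick a content variant from (assumed) client hints."""
    if "text/plain" in accept_header:
        return "text-only"
    if bandwidth_kbps < 128:
        return "mobile"
    return "broadband-gui"

print(select_variant("text/html", 56))     # a 56K modem client
print(select_variant("text/html", 10000))  # a broadband client
```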

What we [e.g. in WCAG 1.0] call 'content' has measured, not categorical,
independence of device and client processing availability assumptions.

Al

>Cheers etc.,
>
>Wayne
>
>
Received on Friday, 20 October 2000 13:39:35 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Tuesday, 19 July 2011 18:13:50 GMT