
Re: XHTML Applications and XML Processors [was Re: xhtml 2.0 noscript]

From: David Woolley <david@djwhome.demon.co.uk>
Date: Thu, 3 Aug 2006 22:15:59 +0100 (BST)
Message-Id: <200608032115.k73LFxv03412@djwhome.demon.co.uk>
To: www-html@w3.org

> First, a "page" weighing in at 50 kb will take roughly 20 seconds to
> fully load at 56 k baud.  This is certainly slow (I agree), but users at

A 50 kilobit message will take less than a second, even if it is
incompressible.  A 50 binary-kilobyte one will take about 7 to 8 seconds,
again on the unlikely assumption that it is incompressible.
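The arithmetic above can be sketched as a back-of-the-envelope calculation,
assuming the link actually runs at its nominal bit rate and the payload is
incompressible (the worst case):

```javascript
// Download time in seconds for a payload of a given size in bits,
// over a link of a given nominal rate in bits per second.
function downloadSeconds(bits, bitsPerSecond) {
  return bits / bitsPerSecond;
}

var modem = 56000;                  // 56 kbps modem, in bits/s

var fiftyKilobits = 50 * 1000;      // a 50 kilobit message
var fiftyKiloBytes = 50 * 1024 * 8; // 50 binary kilobytes, in bits

downloadSeconds(fiftyKilobits, modem);  // under a second (about 0.9 s)
downloadSeconds(fiftyKiloBytes, modem); // about 7.3 s, i.e. the 7-8 seconds above
```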

However, as a user of a 33 kbps modem, I find that the actual download
sizes of commercial web pages are a real problem.

> this connection speed are accustomed to this speed of page delivery,
> dynamic content or not.  So it may be frustrating, but I do not see
> "harm".

There are probably several classes of users of modems:

1) those who have only ever accessed typical commercial pages and have
never experienced a broadband or LAN connected web site, and therefore
just accept that it is in the nature of the web to be slow.  I doubt
that many of these exist.

2) those who only access typical commercial sites, but have experienced
broadband, who would probably prefer much more responsive sites, but
consider the cost (money or hassle) of upgrading to be too high (possibly
an older user whose system was set up by a relative, or someone in a
rural area of a poor country).

3) those who also access sites written for information content, rather
than appearance, or otherwise understand HTML, who may not want to
upgrade for the above reasons, but are frustrated by the inability of
many web authors to do such a simple thing as creating a fast web site.

Some other observations:

I think it is a reasonable expectation to be able to interact as soon
as an element appears.

However, I would note that typical commercial designs actually frustrate
the appearance of the user interface elements by:

a) having the start of the download filled up by a large scripting library;

b) using auto-layout tables whose size depends on images, so that the
browser doesn't have enough information to render the page until it has
nearly all arrived (I think some browsers do a fix-up, re-rendering as
the sizes become known, rather than waiting);

so it would seem that authors don't often care about fast interaction.
It's not even as though they want to force people to read the advertising,
in most cases.

I'm also aware of cases where people have deliberately delayed the enabling
of HCI elements until all the scripting, etc. has loaded, because it makes
things more predictable that way.

When I've thought about this in the past, I've concluded that the current
specifications are inadequate when there are multiple download streams.
E.g., if, as ought to be good practice, the scripting library is in a .js
file, does anything specify what happens if a function is called before
the parallel load of the scripting file completes (and the declaration
hasn't yet been seen)?
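In the absence of a specified behaviour, authors have to defend against
that race themselves.  A minimal sketch of one such defence, queuing calls
made before the external file's declarations exist and replaying them once
the library signals it has loaded ("initSearch" here is a hypothetical
library function, and "lib" stands in for the scope the .js file fills in):

```javascript
var lib = {};      // stands in for the scope the external .js file populates
var pending = []; // calls attempted before the parallel load completed

function safeCall(name /*, ...args */) {
  var args = Array.prototype.slice.call(arguments, 1);
  if (typeof lib[name] === "function") {
    return lib[name].apply(null, args); // library already loaded: call now
  }
  pending.push({ name: name, args: args }); // otherwise defer the call
}

// Invoked when the external script file has finished loading;
// replays anything the user triggered in the meantime.
function onLibraryLoaded() {
  while (pending.length > 0) {
    var call = pending.shift();
    lib[call.name].apply(null, call.args);
  }
}

// Simulate the race: the user interacts before the script arrives.
var log = [];
safeCall("initSearch", "query");  // queued rather than a runtime error

// Later, the script file finishes loading and defines its functions.
lib.initSearch = function (q) { log.push("initSearch:" + q); };
onLibraryLoaded();                // the queued call is replayed
```

This only addresses the "function not yet declared" half of the problem;
nothing in it makes the load order itself any more predictable, which is
presumably why some authors simply disable the interface until everything
has arrived.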
Received on Thursday, 3 August 2006 21:17:05 UTC
