
Re: Improving traffic condition@inet

From: Peter Teoh <peter@iti.gov.sg>
Date: Fri, 22 Mar 1996 13:48:10 -0800
Message-Id: <3153201A.5C5B@iti.gov.sg>
To: www-talk@w3.org
Cc: Mike Wexler <mwexler@frame.com>, peter@iti.gov.sg
Mike Wexler wrote:
> 
> > YES, MAN
> > I suggest that the GIF files have to be compressed, too.
> > GIF --> JPEG
> > The quality doesn't decrease very much... but the size does.
> I would strongly recommend against this. JPEG is often smaller than GIF
> for photographic images. This is not necessarily true of computer
> generated artwork. It is also not necessarily true for small (thumbnail)
> types of images. For computer generated artwork, JPEG can significantly
> degrade the quality of text or other sharp lines that occur in the
> picture.



Gao Hong wrote:
> 
> I have an idea about improving the traffic condition on the Internet.
> We all have had the boring experience of waiting for a document to
> transfer from the Internet.
> 
> I mean we should define a compression standard for use on the Internet.

> Gao Hong

The Internet is dominated by WWW traffic (meaning JPEG, GIF, and HTML
documents). I think this is because of the nature of HTML: so many web
pages carry graphics, most of which are not necessary.   Several ways
to ease the traffic problem are:

1.   Change the HTTP protocol itself.

Define priorities for the different types of image files and text
files.   Some images should be transmitted at higher priority than
others, whereas some can be transmitted at a slower rate.   By default,
image data would receive low priority.

The net effect is that the WWW may not look so nice without the
graphics, but the text will arrive much sooner.

2.   Have an international/national organisation look into traffic
monitoring.   The slowest links should be identified - these are the
bottleneck in the entire transmission process.

Congestion on these links should then be diagnosed, and if necessary,
users should be made to pay for the bandwidth they consume.

3.   Implement some kind of accounting system at the server end.
For example, each user has only a limited network allocation per unit
of time; if it is exceeded, the HTTP protocol will slow down the
transmission for that user.

This can also be generalised to a higher level, where traffic is
tracked down to the user level - whether it is FTP, telnet, or raw
socket transfer - and transfers are automatically slowed down when a
user's traffic exceeds his limit.   This will prevent people from
transmitting images (which are rather useless, except for medical use
etc.) needlessly.
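A sketch of what such per-user accounting might look like (all names
here are my own; this is an illustration of the idea, not a real server):
each user gets a byte allowance per time window, and once it is spent,
the caller is told to throttle that user until the window refills.

```python
import time

class UserQuota:
    """Per-user byte allowance over a fixed time window."""

    def __init__(self, bytes_per_window, window_seconds):
        self.allowance = bytes_per_window
        self.window = window_seconds
        self.remaining = bytes_per_window
        self.window_start = time.monotonic()

    def request(self, nbytes):
        """Return True if the transfer may proceed at full speed."""
        now = time.monotonic()
        if now - self.window_start >= self.window:
            self.remaining = self.allowance      # new window: refill
            self.window_start = now
        if nbytes <= self.remaining:
            self.remaining -= nbytes
            return True
        return False  # over quota: caller should slow this user down

quota = UserQuota(bytes_per_window=1000, window_seconds=60)
print(quota.request(800))   # True: within the allowance
print(quota.request(300))   # False: would exceed this window's budget
```

The same object could sit in front of FTP or telnet transfers just as
well, since it only counts bytes and knows nothing about HTTP.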

4.   Invent a new kind of algorithm for HTTP - whereby the request is
not retrieved in real time.   I.e., I can specify a range of HTTP
addresses, and a daemon will retrieve the HTML documents when traffic
is low (e.g., checked at an interval of 10 mins).   Then I can read
all the pages I want, page by page, but at a later time.

Moreover, a WWW page designer can specify which hyperlinks are
important enough to be retrieved along with the page for this kind of
offline access.

And most important of all, more content can be put into one page now,
since real-time response is no longer an important issue from the
user's perspective.

5.   A graphics opt-out option.   Users should be able to specify that
graphics need not be sent.   The server will then send graphics only
when they are deemed high priority for the user's understanding.
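In fact, HTTP's Accept header already lets a client say which media
types it wants, so a server could skip inline images for clients that
list no image type. Here is a sketch of the server-side check (the
"essential" flag for author-marked images is my own addition, not part
of HTTP):

```python
def should_send_image(accept_header, essential=False):
    """Send an image only if the client accepts images, or if the page
    author marked this image as essential to understanding."""
    accepts_images = any(
        part.strip().startswith(("image/", "*/*"))
        for part in accept_header.split(",")
    )
    return accepts_images or essential

print(should_send_image("text/html, text/plain"))        # False
print(should_send_image("text/html, image/gif"))         # True
print(should_send_image("text/plain", essential=True))   # True
```

A text-only browser would simply omit image types from its Accept
header, and the server would save the bandwidth automatically.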

Any comments?

-- 
Peter Teoh                              Information Technology Institute
Internet : peter@iti.gov.sg             Science Park II
Tel : 65-7705585                        11 Science Park Road
Fax : 65-7791827                        Singapore 117685
Received on Friday, 22 March 1996 01:15:24 GMT
