Re: Bandwidth


From: James Green <>
Message-Id: <>
Date: Tue, 25 Nov 1997 13:12:52 +0000 (GMT)
Subject: Re: Bandwidth

On Mon, 24 Nov 1997 19:26:44 +0000 James R Grinter <> wrote:

> and indeed, (I notice you're a customer of ours) we operate one.
> Details at

I know, I use it at home.

> There's also a big cluster of them on JANET (the network that Essex
> Uni is on), details at I think.

Yes, I know; it occasionally tells me that the requested file isn't in 
its cache (good one, JANET).

If I had been after a system like caching, I wouldn't have suggested a 
theoretical solution to the problem. I know what a cache does and how 
to use one, but the high-traffic problem still occurs daily, so a 
different solution was what I was talking about. Forget caches for a 
moment.

Whilst they do help enormously, they do not solve the problem entirely. 
For instance, in a live broadcast, people can hardly get the entire 
file from a cache when the EOF hasn't been transmitted yet, can they? A 
better system of single-distribution is needed.

Let me make myself perfectly clear with a further example:

A file (not yet cached, as it is a new one) is now available on an 
American site. As soon as this becomes apparent, people from Britain, 
the Netherlands, Sweden, France, Germany, Italy .... Australia, Japan, 
etc., want it (it's popular). Seeing as half of these (I don't know 
the real statistics) aren't using a cache, the file is ordered by their 
ISPs directly from America.

The file is then transmitted once for each person (say 150 to Britain, 
200 to Australia, 600 to Japan) through various networks, clogging them 
up considerably. Eventually, 600 identical files get to Japan, 150 to 
Britain, etc.

Now, why couldn't there be one copy going to each country that 
requests it? It would be stored temporarily in the country's main 
centre (like Telehouse or LINK in the UK), where the appropriate ISPs 
are given access to the file and transmit it on to their users.

Think about a 20Mb file: all the bits of all those copies would be cut 
down to just the bits of one file, transmitted intelligently. What an 
easing of the traffic, and a severe speed increase for the rest of us!!
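To put rough numbers on it, using the illustrative request counts from the earlier example (not real statistics):

```python
# Rough arithmetic: a 20 Mb file requested by 150 users in Britain,
# 200 in Australia and 600 in Japan.
FILE_MB = 20
requests = {"Britain": 150, "Australia": 200, "Japan": 600}

# Today: one long-haul copy per user.
per_user_copies = sum(requests.values())        # 950 copies
per_user_traffic = per_user_copies * FILE_MB    # 19000 Mb over long-haul links

# Proposed: one copy per requesting country, then domestic fan-out.
per_country_traffic = len(requests) * FILE_MB   # 60 Mb over long-haul links

print(per_user_traffic, per_country_traffic)    # prints: 19000 60
```

Even with made-up numbers, the long-haul traffic drops by a factor of over 300; the remaining copying happens inside each country, where capacity is cheaper.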

Sure, a lot of Internet infrastructure would have to be re-configured, 
and individual countries would have to have their ISPs co-operate to 
provide centres rather than ISP-shared lines going into the country, 
but it makes a lot more sense than the current 'just get the file' 
approach.

Of course, once the file has reached the ISPs, caching can take over, 
but they cannot hold live data which is still being transmitted (well, 
they can, but you know what I mean), and they certainly cannot hold 
copies of every file on the Internet, though they do a great job of 
speeding things up.

Now do you get what I am talking about? Maybe someone at Cisco (however 
you spell it) is reading???


> >   When anyone requests a file, for example a 2mb .mov from a server 
> >in the USA to themselves in Britain, it gets downloaded uniquely, i.e. 
> >just for them.
> >
> >So then, what if an event was being shown, like the ones from 
> >Microsoft's web site frequently advertised, which attracted many people 
> >from Britain to watch it? Presumably, taking x to be the number of 
> >viewers, there would be x number of copies of the stream being 
> >broadcast from the server, across the Atlantic, via Telehouse and/or 
> >LINK, to their ISPs to themselves. Why???
> >
> >Surely, intelligent routers would say, hang on, if x number of requests 
> >are coming from Britain, let's only send one, and have a final point of 
> >separation in Britain where a server gives the incoming (one) stream to 
> >the many?


James Green

Term e-mail:   |   Home e-mail: