
[whatwg] Bandwidth media queries

From: Matthew Wilcox <mail@matthewwilcox.com>
Date: Wed, 16 May 2012 19:48:04 +0100
Message-ID: <CAMCRKi+pdkLYHPZm6nHfkGDSgv8oJhQiVUik6CeV1a8ADC=B3A@mail.gmail.com>
To: WHATWG List <whatwg@whatwg.org>

First off, I know that a number of people say this is not possible. I
don't want to argue that point, because I don't have the knowledge to
argue it - but I do want to understand why, and currently I do not.
Please also remember that I can only see this from an author's
perspective, as I'm ignorant of the mechanics of how these things work.

The idea is to have something like:

<link media="min-bandwidth: 0.5mbps" ... />
<link media="min-bandwidth: 1mbps" ... />
<link media="min-bandwidth: 8mbps" ... />

This makes an obvious kind of sense to an author. One of the issues I
see people raise when they attempt to explain why this won't work is
that bandwidth is variable and hard to measure (and something about
CSS being stateless, I think).

Here's why this confuses me...

If you're a browser, you are the software interpreting the instructions
of a given language: CSS, in this case. You're an interpreter. So, if
CSS essentially asks you as a browser "what's the available bandwidth?"
- why can the browser not just supply a value back to the CSS?

The next thing that comes to mind is: How do we get the value? Again,
I hope I'm not being too ignorant but the process *feels* like it
should go like this:

All browsers have a Web Inspector, and all Inspectors have a Network
tab. That Network tab tracks and logs everything we'd need to know to
figure out a reasonable approximation of the bandwidth available to
*that open tab*. This isn't an argument so much as a statement of
fact: all browsers can currently do this - it's just not exposed for
CSS to interrogate.
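The arithmetic the Network tab already does for a single transfer is
trivial. This is an illustrative TypeScript sketch, not any real
browser API - the function name is my own:

```typescript
// Estimate throughput for one completed transfer, from the two
// numbers a Network tab already records: bytes delivered and the
// time the delivery took. (Hypothetical helper, for illustration.)
function estimateMbps(bytes: number, millis: number): number {
  if (millis <= 0) throw new RangeError("duration must be positive");
  const bits = bytes * 8;
  const seconds = millis / 1000;
  return bits / seconds / 1_000_000; // megabits per second
}

// e.g. a 125 kB response delivered in 1000 ms works out to 1 Mbps
```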

Why can we not be smart about this and follow a procedure like this:

As soon as a request is made for a web page, start tracking the
network activity. The browser knows when it sent a request, how long
it took to receive a response (thus the latency), the size of the
response, and how long it took to deliver. It can start doing this
with the favicon. It can then analyse that favicon's data and come up
with a ballpark bandwidth. Subsequently it can do the same thing for
every single HTTP request made to that domain, and it could average
out the results to get a more accurate bandwidth and latency measure
of what is currently available for the browser to use, re-adjusting
the value sent to the CSS as it goes.
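The averaging step described above can be sketched as a running mean
that every completed request feeds into. Again, this is hypothetical
illustration code, not a proposal for an actual browser internal:

```typescript
// Running estimate of latency and bandwidth, updated per request.
// Each completed request contributes one sample; the exposed values
// are the incremental means so far. (Illustrative sketch only.)
class ConnectionEstimate {
  private samples = 0;
  private meanMbps = 0;
  private meanLatencyMs = 0;

  // bytes delivered, time-to-first-byte, and total transfer time
  addRequest(bytes: number, ttfbMs: number, transferMs: number): void {
    const mbps = (bytes * 8) / (transferMs / 1000) / 1_000_000;
    this.samples += 1;
    // incremental mean: m += (x - m) / n
    this.meanMbps += (mbps - this.meanMbps) / this.samples;
    this.meanLatencyMs += (ttfbMs - this.meanLatencyMs) / this.samples;
  }

  get mbps(): number { return this.meanMbps; }
  get latencyMs(): number { return this.meanLatencyMs; }
}
```

The re-adjusting value "sent to the CSS" would then just be whatever
`mbps` reads at the moment a media query is evaluated.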

No doubt that would have some wobbliness due to the low sample rate on
an initial connection, so what is to stop the browser being smarter
and keeping a log of connection speeds as a good starting point for
future requests - for example, what was the average speed to this
domain for all responses in the previous 10 seconds of requests? Or,
even more generally: what was the average speed available for *any*
domain in the last five minutes?
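The windowed variant suggested above - average only the samples seen
in the last 10 seconds, or the last five minutes - could be sketched
like this (hypothetical code, not an actual browser implementation):

```typescript
// Time-windowed bandwidth average: keep timestamped samples and
// average only those inside the window. (Illustrative sketch only.)
class WindowedBandwidth {
  private samples: { atMs: number; mbps: number }[] = [];

  constructor(private windowMs: number) {}

  record(atMs: number, mbps: number): void {
    this.samples.push({ atMs, mbps });
  }

  // Average Mbps over samples observed in [nowMs - windowMs, nowMs],
  // or undefined if the window holds no samples yet.
  averageAt(nowMs: number): number | undefined {
    const cutoff = nowMs - this.windowMs;
    const recent = this.samples.filter((s) => s.atMs >= cutoff);
    if (recent.length === 0) return undefined;
    return recent.reduce((sum, s) => sum + s.mbps, 0) / recent.length;
  }
}
```

A browser could keep one of these per domain (10-second window) and
one global one (five-minute window), falling back from the former to
the latter when a domain has no recent history.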

Thanks for your time,

Received on Wednesday, 16 May 2012 18:48:36 UTC
