
Signalling to proxies

From: Rotan Hanrahan <rotan.hanrahan@mobileaware.com>
Date: Thu, 30 Oct 2008 23:22:44 +0000
To: public-bpwg-ct <public-bpwg-ct@w3.org>, "public-powderwg@w3.org" <public-powderwg@w3.org>
Message-ID: <2F80C03A-E457-4905-8B5A-06ED7DE15313@mimectl>
Just a general observation about signalling to proxies...

One way to signal to a transforming proxy is to include a Cache-Control: no-transform header in your HTTP response. This signal says "I'm already good for this context".
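To make that concrete, here is a minimal proxy-side sketch of how a transforming proxy might honour that directive. The no-transform directive itself is defined by HTTP/1.1; the may_transform helper is hypothetical, for illustration only.

```python
# Proxy-side sketch: decide whether a response may be transformed,
# based on its Cache-Control response header. The no-transform
# directive comes from HTTP/1.1; the helper name is made up here.

def may_transform(cache_control):
    """Return False if the origin asked proxies not to transform."""
    if not cache_control:
        return True
    directives = [d.strip().lower() for d in cache_control.split(",")]
    return "no-transform" not in directives

# The origin says "I'm already good for this context":
print(may_transform("no-transform, max-age=3600"))  # False
# No such signal, so the proxy may consider adapting:
print(may_transform("max-age=3600"))                # True
```

Note that HTTP/1.1 also allows no-transform on requests, so a well-behaved proxy would check both directions.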

In future, we will need to signal additional information, beyond just the current HTTP response. For example, suppose I want to be proactive and inform any proxies that I have a permanent alternative URL for mobile users? (My preference is that only one URL is ever needed, but I accept the reality of the current Web that multiple URLs are quite common.)

We need to help site designers to be proactive (in an industry-recognised way) to assist the proxies to "do the right thing", not just on a page-by-page basis, but on a per-site basis. Proper signalling between the client, proxy and origin server would certainly help*. Proactive signals I'm thinking about would include, for example:
- I'm a desktop site, but you may remember this redirect to a contextually-sensitive equivalent.
- I'm a desktop site, but I can serve mobile content if you ask for it.
- You may optimize images from this site, but you should not restructure and/or recode content.

The CT guidelines are excellent, insofar as they give concrete information and examples to site designers and proxy implementers regarding what they should do. Very valuable advice. Much like the plentiful advice on the use of the de-facto robots.txt "standard".

What I'm thinking is that perhaps it's time to create some concrete examples of how to signal these higher-level concerns to proxies, possibly borrowing from the success of the example.com/robots.txt URL pattern, and introduce an example.com/proxies-powder.xml "well-known URL pattern". Yes, that would be a POWDER resource, specifically to assist proxies, and specifically at a known path so that proxies can sniff it in advance (much like search engines currently sniff robots.txt before proceeding to index a site). Pretty soon, we could all be using little POWDER files on our sites to ensure that the adaptable is adapted, the adapted is not re-adapted, and everything else in between is treated appropriately.
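A proxy could locate such a resource the same way crawlers locate robots.txt: derive one well-known URL from any page URL on the site. A minimal sketch, assuming the proxies-powder.xml path proposed above (the path and the powder_url helper are illustrative, not an existing standard):

```python
# Sketch: before transforming anything from a site, a proxy could
# fetch a site-wide POWDER resource at a well-known path, analogous
# to robots.txt. The /proxies-powder.xml path is the proposal in
# this message, not an established convention.

from urllib.parse import urlsplit, urlunsplit

def powder_url(page_url):
    """Map any page URL on a site to its hypothetical site-wide POWDER file."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/proxies-powder.xml", "", ""))

print(powder_url("http://example.com/news/today.html"))
# http://example.com/proxies-powder.xml
```

As with robots.txt, the proxy would fetch this once per site, cache it, and consult it before deciding whether a given page may be adapted.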


---Rotan



* Though obviously the legacy Web will still need special treatment.
Received on Thursday, 30 October 2008 23:24:38 GMT
