- From: Jo Rabin <jrabin@mtld.mobi>
- Date: Wed, 01 Oct 2008 20:10:11 +0100
- To: Francois Daoust <fd@w3.org>
- CC: public-bpwg-ct <public-bpwg-ct@w3.org>
If you don't supply a User-Agent at all, a lot of sites break, according to some tests I did a while ago.

But yes, this is at the heart of what we are trying to establish. If, as a Content Provider, you do differentiate on User-Agent and not on Accept, then that's interesting, and that's what we are in the game to promote, I think. I'm sorry that it's not more prevalent in your sample, Francois.

Jo

On 01/10/2008 15:53, Francois Daoust wrote:
> I've been masquerading my User-Agent header lately to browse the Web,
> using a non-existing User-Agent with no link whatsoever to any existing
> one.
>
> I was expecting to see things break one way or the other, but the thing
> is I have had no real problems so far.
> I see a few sites that return an "application/vnd.wap.xhtml+xml"
> content-type that is not recognized by my browser, but this is typically
> an indication that they have a mobile-optimized version, so not what I
> would consider to be a big problem.
>
> So I'm wondering. Can anyone point out a few web sites that return a
> rejected response when queried with a "weird" User-Agent? (either
> through a 406 status, or through a 200 status code with a "sorry"
> message) I suppose I'm only browsing modern Web sites, not "legacy" ones.
>
> Thanks,
> Francois.
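For anyone who wants to repeat the experiment, here is a minimal sketch of the kind of probe Francois describes (Python, standard library only; the URL and the fake User-Agent string below are placeholders, not sites or values from his sample):

    import urllib.error
    import urllib.request

    # Placeholders -- substitute the site and the "weird" User-Agent you want to test.
    URL = "http://www.example.com/"
    FAKE_UA = "WeirdBrowser/0.1 (no relation to any real user agent)"

    req = urllib.request.Request(URL, headers={"User-Agent": FAKE_UA})
    try:
        with urllib.request.urlopen(req) as resp:
            ctype = resp.headers.get("Content-Type", "")
            print(resp.status, ctype)
            # A 200 with a mobile-specific media type suggests the site keyed off the User-Agent.
            if "vnd.wap" in ctype:
                print("Looks like a mobile-optimized response")
    except urllib.error.HTTPError as e:
        # A 406 (or any other explicit rejection) is the kind of response being asked about.
        print("Rejected:", e.code, e.reason)

Run it against a handful of sites and compare the status code and Content-Type you get back with and without the fake User-Agent.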
Received on Wednesday, 1 October 2008 19:11:09 UTC