- From: Frank Ellermann <nobody@xyzzy.claranet.de>
- Date: Mon, 4 Aug 2008 21:09:04 +0200
- To: ietf-http-wg@w3.org
Julian Reschke wrote:

> I'm still not sure how this is a problem, unless you can
> show that this causes interop problems somewhere.

The <qvalue>s are supposed to mean something. In the case of

  Accept-Language: en,de,frr;q=0.1,frs;q=0.1

I'd guess (and implement) that this means "the user wants 'en' or
'de'; if both are unavailable, frr or frs". If 'en' *and* 'de' are
available, toss a coin, or always take the first / last; the user
doesn't care. Ditto if frr *and* frs exist, but no 'en' and no 'de'.
So far it is simple, and no reason to talk about it in the spec.

Now back to our Accept-Charset cases, assuming that Koi8-R, BOCU-1,
and UTF-8 are available: all three match *;q=0.7. Toss a coin? Always
take the last, i.e. for utf-8;q=0.7,*;q=0.7 take the * and toss a
coin? This is _relatively_ harmless (unless you get BOCU-1 and your
UA can't handle it, obviously). But it is strange when the <qvalue>
of * isn't one of the smallest non-zero <qvalue>s, and I'm surprised
that folks here refuse to see it.

But why? If everybody does something else, the wildcard can simply
be deprecated. Or, if desired, it can be fixed in some way, e.g. by
stating that a wildcard <qvalue> greater than or equal to the
smallest non-zero <qvalue> elsewhere MUST be interpreted as
*;q=0.001. Or SHOULD. Or something better than "dunno, who cares,
toss a coin, get BOCU-1".

Frank
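[Editor's note: a minimal sketch of the proposed clamping rule, in
Python; the parsing and the function names are illustrative, not
from any HTTP library, and real Accept-Charset parsing would need
error handling and case normalization beyond what is shown here.]

  # Sketch of the proposed rule: if the wildcard's qvalue is greater
  # than or equal to the smallest non-zero explicit qvalue, treat it
  # as *;q=0.001 so explicit preferences always win.

  def parse_accept_charset(header):
      """Parse 'utf-8;q=0.7,*;q=0.7' into {'utf-8': 0.7, '*': 0.7}.
      Deliberately simplified: no error handling, no other params."""
      prefs = {}
      for item in header.split(","):
          parts = item.strip().split(";")
          name = parts[0].strip().lower()
          q = 1.0
          for p in parts[1:]:
              key, _, value = p.strip().partition("=")
              if key.strip() == "q":
                  q = float(value)
          prefs[name] = q
      return prefs

  def clamp_wildcard(prefs):
      """Apply the proposed fix: demote '*' to q=0.001 whenever its
      qvalue is >= the smallest non-zero explicit qvalue."""
      explicit = [q for name, q in prefs.items() if name != "*" and q > 0]
      if "*" in prefs and explicit and prefs["*"] >= min(explicit):
          prefs["*"] = 0.001
      return prefs

  def choose(prefs, available):
      """Pick the available charset with the highest effective qvalue
      (first match wins on ties)."""
      best, best_q = None, 0.0
      for cs in available:
          q = prefs.get(cs, prefs.get("*", 0.0))
          if q > best_q:
              best, best_q = cs, q
      return best

  prefs = clamp_wildcard(parse_accept_charset("utf-8;q=0.7,*;q=0.7"))
  print(choose(prefs, ["koi8-r", "bocu-1", "utf-8"]))  # utf-8, not BOCU-1

Without the clamping step, all three available charsets tie at 0.7
and the server may legitimately hand back BOCU-1; with it, the
explicit utf-8 entry wins.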
Received on Monday, 4 August 2008 19:08:54 UTC