Re: please review mobileOK Basic Tests 1.0

Sean Owen wrote:
>>  First, section 2.3.2 HTTP Request states:
>>
>> Include an Accept header indicating that Internet media types 
>> understood by
>> the default delivery context are accepted by sending exactly this 
>> header:
>>   Accept: application/xhtml+xml,text/html;q=0.1,application/vnd.wap.xhtml+xml;q=0.1,text/css,image/jpeg,image/gif
>>
>>
>>
>>  I think this is incorrect: text/css should NOT be included in the
>> Accept header, and image/jpeg and image/gif ONLY if the UA is
>> expected to support showing these images independently of a document
>> (the mobileOK tests should explicitly check whether this is
>> supported). The client after all does not know how to handle a
>> text/css file independently of XML markup.
>>
>>  Instead, it should send an "Accept: text/css" header when requesting
>> the CSS files that are linked using <link rel="stylesheet">,
>> <?xml-stylesheet?> or @import. Similarly, requests for images
>> referenced from <img> should send an "Accept: image/jpeg,image/gif"
>> header. Aside from checking the Accept header for the main page, the
>> mobileOK tests should also check that Accept headers carry these
>> values for stylesheet and image requests.
>
> It's an interesting point. RFC 2616 says in 14.1 that Accept *can*
> indicate desired types this way, and 10.4.7 says that servers can even
> return a type not listed in the Accept header in some cases. I bring
> it up to make the point that Accept seems to be advice, and doesn't
> have to exactly enumerate acceptable types.

Quote from 14.1:
"The Accept request-header field can be used to specify certain media 
types which are acceptable for the response."

Thus, if for the request of a <link>ed resource the UA can handle only 
text/css and not text/html, then that is what should be indicated. 
Sending Accept: text/html would claim that the UA *does* accept and can 
process text/html for that resource, which would be plainly wrong. The 
same goes for images.

Section 10.4.7’s note isn’t relevant here; it is about what the server 
returns, not about what headers the UA sends. That note concerns whether 
the server chooses to return no content at all, or content that the UA 
might not be able to process. E.g. when a browser that indicates Accept: 
text/html browses to an .xml document, it might (and probably would) 
still be preferable to send the document and let the browser either 
handle it as text/plain or prompt to save it, rather than show a 406 Not 
Acceptable error.

> It seems standard practice that user agents send a fixed string
> listing all of what they support each time.

Well, I don’t know about that, but it wouldn’t make sense if they did, 
unless they have some way to process text/css when it’s served 
stand-alone. I also don’t see why they would do so: it’s virtually 
effortless to send a different Accept header based on the kind of data 
that is expected, and it would save quite some transfer overhead on many 
requests as well.

> Finally it's conceivable that one might vary a document a little bit
> based on other types in the Accept header, like linking to JPEG versus
> PNG images in an <img> tag. I don't know that this is at all common
> though.

The image type that is served should be varied based on the Accept 
header of the *image’s* request. If that contains image/png, PNG can be 
served. Varying the document itself as you describe is wrong, can never 
be depended upon, and should certainly not be explicitly supported by a 
test suite.
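
To illustrate what I mean, here is a rough sketch (no particular 
framework’s API; the file names and the helper are made up by me):

    # Sketch: pick the image format from the Accept header of the
    # *image's own* request, not the page's. Names are hypothetical.
    def choose_image_format(accept_header):
        accepted = [part.split(';')[0].strip()
                    for part in accept_header.split(',')]
        if 'image/png' in accepted:
            return 'logo.png', 'image/png'
        # Fall back to GIF, which the Default Delivery Context supports.
        return 'logo.gif', 'image/gif'

    print(choose_image_format('image/jpeg,image/gif'))            # GIF
    print(choose_image_format('image/png,image/jpeg,image/gif'))  # PNG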

That said, most UAs will probably send the image/* types in the Accept 
header of their primary request, because they can usually handle images 
stand-alone. Maybe there are one or two sites that depend on this as you 
described above, but they’re definitely wrong and not OK.

What I’m suggesting is:

Main request:
   Accept: application/xhtml+xml,text/html;q=0.1,application/vnd.wap.xhtml+xml;q=0.1,image/jpeg,image/gif

Image request:
   Accept: image/jpeg,image/gif

Stylesheet request:
   Accept: text/css
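
For illustration, a checker could send these per-request headers roughly 
like this (just a sketch using Python 3’s urllib; the example.org URLs 
are placeholders):

    import urllib.request

    # The per-request Accept headers suggested above.
    ACCEPT = {
        'page': 'application/xhtml+xml,text/html;q=0.1,'
                'application/vnd.wap.xhtml+xml;q=0.1,image/jpeg,image/gif',
        'image': 'image/jpeg,image/gif',
        'stylesheet': 'text/css',
    }

    def fetch(url, kind):
        # Send only what we can actually process for this resource.
        req = urllib.request.Request(url, headers={'Accept': ACCEPT[kind]})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    page = fetch('http://example.org/', 'page')
    css = fetch('http://example.org/style.css', 'stylesheet')
    img = fetch('http://example.org/logo.gif', 'image')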

> If it's not wrong, and mimics real-world user agents a little more,
> we'd very slightly prefer to keep the fixed Accept header as-is.

Well, it’s incorrect usage of the Accept header, and I don’t think 
mimicking that is a good thing. The W3C should set the right example.

If sites actually *depend* on UAs sending static Accept headers for all 
their requests, they are definitely NOT mobileOK, because they would 
break on any UA that does send its Accept headers properly and, if this 
dependence were widespread, would prevent UAs from doing so. This is 
certainly not something that any W3C test should accept.

>>  Second, in that same section, I think saying that UAs must send 
>> 'exactly'
>> this header is not desirable. That would prevent UAs from handling
>> additional media types, such as image/png or image/svg, and limit
>> innovation. After all, the UA would not be able to claim a mobileOK 
>> label
>> anymore. The spec should say that UAs must send exactly this or a 
>> superset
>> of this header.
>
> mobileOK Basic tests whether a resource can be delivered in a way that
> is compatible with an abstract baseline device profile, the "Default
> Delivery Context". This profile only assumes GIF and JPEG support, so
> it would be undesirable for a mobileOK Basic tests implementation to
> send a header that says that PNG is supported. The test demands that
> you demonstrate support for GIF or JPEG, so it doesn't help to add more
> types.

Yeah, sorry, I misunderstood mobileOK as a stamp that could be given not 
only to web sites but also to UAs.

>>  Sixth, in 3.10 LINK_TARGET_FORMAT, it states:
>>
>>
>> If the Content-Type header value of the HTTP response is not consistent
>> (determined in the same way as specified in 3.3 
>> CHARACTER_ENCODING_SUPPORT
>> and CHARACTER_ENCODING_USE) with the Accept-Charset header in 2.3.2 HTTP
>> Request, warn
>>  This should be a FAIL condition. Character set mismatches are very
>> undesirable (especially from an i18n perspective) and will create
>> significant hindrances for most non-English users, whose languages have
>> accents or even do not use our alphabet at all.
>
> From my reading of RFC 2616, 14.2, it's allowed to send back a
> character encoding which was not listed in Accept-Charset. It's not
> desirable, and this triggers a warning.
>
> Why not a fail? This test covers external resources, which are
> possibly outside the author's control. Some felt strongly that one
> shouldn't FAIL (maybe only temporarily) on account of an external
> resource. There are arguments both ways here, but that's why this is
> considered a warning and not a failure.

I think you mean section 10.4.7, not section 14.2 :). But how about: 
FAIL unless it returns ISO-8859-1, in which case you WARN?

As ISO-8859-1 and UTF-8 are both compatible with ASCII, most ISO-8859-1 
documents will still be reasonably legible on UTF-8-only devices (Dutch 
websites on my Japanese DoCoMo phone, for example), so in that case you 
could issue a WARN, in the spirit of section 10.4.7 of HTTP.

Though I’d prefer it to be a FAIL :), because it’s annoying and 
shouldn’t be OK-ed, imho. The number of characters used outside of ASCII 
varies per language: for English it’s hardly any bother, and Dutch also 
has reasonably few, so there it is a relatively minor annoyance; but 
German, French and the Scandinavian languages use accented characters a 
lot more, and for those languages it would either render the page 
unusable or at the least cause major annoyance.

I guess you could say that labeling it just a warning and not a FAIL is 
a very English-oriented decision :), and kind of discriminating towards 
other languages, which DO use characters outside the ASCII range.

>>  If you want to support ISO-8859-1 in some way to make it easier for
>> existing sites to serve with the mobileOK label, ISO-8859-1 should
>> simply be processed appropriately and added to the Accept-Charset
>> header.
>
> Sites are welcome to send back ISO-8859-1 whenever they like,
> according to Accept-Charset or anything else. But as far as mobileOK
> Basic Tests are concerned, we want to see the ability to return UTF-8
> and test for that. That doesn't mean you always have to return UTF-8
> to real devices.
>
> To put it slightly differently, would you want to force Japanese sites
> to return ISO-8859-1 to pass the tests?

No, of course not :). I was speaking of *adding* to the Accept-Charset 
header, possibly even with a quality indicator like "Accept-Charset: 
UTF-8, ISO-8859-1;q=0.5". It would just be one of the accepted response 
character sets, if accepting it is what you want to do in this test. It 
could be somewhat justified by the ASCII compatibility I mentioned 
above, and because at least English-language sites don’t use many 
characters outside the ASCII range anyway.
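
For illustration, a server honouring such a header could pick its 
response charset along these lines (a rough sketch; the parsing is 
simplified and the ‘available’ set is my assumption):

    # Sketch: choose the highest-q charset we can produce from an
    # Accept-Charset header such as "UTF-8, ISO-8859-1;q=0.5".
    def pick_charset(accept_charset, available=('utf-8', 'iso-8859-1')):
        prefs = {}
        for part in accept_charset.split(','):
            name, _, params = part.strip().partition(';')
            q = 1.0
            if params.strip().startswith('q='):
                q = float(params.strip()[2:])
            prefs[name.strip().lower()] = q
        best = max(available, key=lambda cs: prefs.get(cs, 0.0))
        return best if prefs.get(best, 0.0) > 0 else None

    print(pick_charset('UTF-8, ISO-8859-1;q=0.5'))  # utf-8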

If a UA only indicates support for UTF-8 though, which I presume you are 
testing because it is the common denominator between UAs, then I do not 
think delivering another character set that the UA does not understand 
should result in just a warning. My Japanese DoCoMo N902iS phone’s iMode 
browser indeed understands only Shift-JIS and UTF-8; it messes up the 
accents on Dutch ISO-8859-1-encoded sites I visit. I would prefer that 
those sites could not use the ‘mobileOK’ label unless they started using 
UTF-8.

Maybe the best test would be: if the received document’s encoding is not 
UTF-8, check whether it is an ASCII-compatible encoding. If not, FAIL. 
If it is, iterate over the document and check whether ALL characters are 
in the ASCII range (that is, code points < 128). If so, WARN; otherwise 
FAIL.
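
In code, the test I have in mind would look roughly like this (a sketch; 
which encodings count as ASCII-compatible is my assumption, a real list 
would be longer):

    ASCII_COMPATIBLE = {'iso-8859-1', 'iso-8859-15', 'us-ascii',
                        'windows-1252'}

    def check_encoding(body, charset):
        """Return PASS, WARN or FAIL for a response's charset."""
        charset = charset.lower()
        if charset == 'utf-8':
            return 'PASS'
        if charset not in ASCII_COMPATIBLE:
            return 'FAIL'
        # ASCII-compatible encoding: WARN only if the document really
        # stays within the ASCII range (code points < 128).
        text = body.decode(charset)
        if all(ord(ch) < 128 for ch in text):
            return 'WARN'
        return 'FAIL'

    print(check_encoding(b'plain ASCII text', 'ISO-8859-1'))         # WARN
    print(check_encoding('één'.encode('iso-8859-1'), 'ISO-8859-1'))  # FAIL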

>>  Seventh, in section 3.18 POP_UPS, target attributes on links with 
>> values
>> "_self", "_parent", or "_top" are accepted. All of these should FAIL,
>> however, since their presence does not make sense (and is a waste of
>> bandwidth) considering the requirements put forth in 3.13 NO_FRAMES.
>
> This is a good point. They're a small waste of bandwidth.

I agree with that. I am not normally a sucker for bandwidth arguments, 
but I saw similar restrictions imposed in 3.12 MINIMIZE, so I thought 
I’d mention it.

However, I’m just commenting on the fact that it doesn’t make sense for 
them to be there in the first place, given that frames aren’t ‘mobileOK’ :).

Ok, that’s it. I hope my arguments above were coherent; I think I mostly 
made sense ;p. To summarize: I do not think it is correct to test 
whether a site is OK by sending Accept headers in an incorrect manner, 
and I do not think a warning is sufficient for sites that send their 
content in a non-UTF-8 encoding when one was requested. I kinda like the 
algo I mentioned in that last paragraph, as it explicitly checks the 
document for the very property on which the assumption rests that 
sending ISO-8859-1 to UTF-8-only UAs is acceptable.


~Grauw

-- 
Ushiko-san! Kimi wa doushite, Ushiko-san nan da!!
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Laurens Holst, student, university of Utrecht, the Netherlands.
Website: www.grauw.nl. Backbase employee; www.backbase.com.
