
Re: please review mobileOK Basic Tests 1.0

From: Sean Owen <srowen@google.com>
Date: Wed, 13 Jun 2007 03:30:50 -0400
Message-ID: <e920a71c0706130030g60176a51h1ff335235d14363a@mail.gmail.com>
To: "Laurens Holst" <lholst@students.cs.uu.nl>
Cc: public-bpwg-comments@w3.org, "public-html@w3.org" <public-html@w3.org>

Thank you, everyone, for all the replies on this thread. They deserve
responses, which I will send; I'll keep them brief out of respect for
the wide distribution list. mobileOK does have a good deal to say
about HTML and will benefit from more scrutiny from experts in this
area. I also think I can assuage some of the concerns expressed here
by explaining more of the underlying reasoning and the realities of
the mobile context.

I'll voice my own opinion here rather than an official group
statement, though I believe and hope I capture the majority view.


On 6/10/07, Laurens Holst <lholst@students.cs.uu.nl> wrote:
> Quote from 14.1:
> "The Accept request-header field can be used to specify certain media
> types which are acceptable for the response."
>
> Thus, if the request for a <link> can't handle text/html but only
> text/css, then that is what should be indicated. Saying Accept:
> text/html would say that the link *does* accept and can process
> text/html for that resource, which would be plain wrong. A similar
> example for images.

I don't disagree with you. Let me bring this up on our next call.
That said, let me present some of the other factors to consider.

One sub-point I'd like to make is that it's not wrong for a user
agent to say nothing about what it accepts. According to HTTP, it's
also not wrong to list everything it accepts on every request. If a
request for a CSS file retrieves a text/html document, well, it
sounds like the site is quite broken. That is not a user agent
problem.

We're not talking about whether the defined test implementation
violates HTTP -- just whether it can be made friendlier.
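To make the "friendlier" alternative concrete, here is a hedged sketch of an artificial user agent that tailors its Accept header to each request instead of sending one static list every time. The resource-type names and media-type lists are illustrative assumptions, not values taken from the mobileOK Basic Tests document.

```python
# Illustrative mapping from the kind of resource being fetched to the
# media types this artificial user agent can actually process for it.
# (These lists are examples, not the mobileOK test definition.)
ACCEPT_BY_TYPE = {
    "page": "application/xhtml+xml, text/html",
    "stylesheet": "text/css",
    "image": "image/jpeg, image/gif",
}

def request_headers(resource_type):
    """Build per-request headers that advertise only the media types
    acceptable for this particular resource, per HTTP section 14.1."""
    return {"Accept": ACCEPT_BY_TYPE[resource_type]}
```

A request for a stylesheet would then say only `Accept: text/css`, rather than repeating the full list of everything the agent understands.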


> Well I don't know about that, but it doesn't make sense if they did,
> unless they have some way to process text/css when it's served
> stand-alone. I also wouldn't understand the reason why they would do so,
> it's virtually effortless to serve a different Accept type based on the
> data that is expected, and save them quite some transfer overhead on
> many requests as well.

I agree that it would be a nice optimization for user agents to tailor
the Accept header to each request to save bandwidth. mobileOK Basic
does not specify tests on user agents and isn't trying to give an
example of ideal user agent behavior -- it's specifying how an
artificial user agent in a test implementation behaves.


> The image type that is served should be varied based on the Accept
> header of the *image's* request. If that contains image/png, PNG can be
> served. Doing this like you say is wrong and can never be depended upon,
> and should certainly not be explicitly supported by a test suite.

What if the page wants to embed a link to image.png or image.gif at
the time the enclosing HTML page is served? I don't know how common
this is, but it's not inconceivable.
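Hypothetically, a site doing this would choose the image URL while serving the HTML, based on the Accept header of the *page* request rather than a later image request. A minimal sketch (file names and function name are mine, for illustration only):

```python
def image_src(page_accept_header):
    """Pick which image URL to embed in the served HTML, using the
    Accept header that came with the page request. If the page request
    advertised image/png, link the PNG; otherwise fall back to GIF."""
    media_types = [part.split(";")[0].strip().lower()
                   for part in page_accept_header.split(",")]
    return "image.png" if "image/png" in media_types else "image.gif"
```

This is exactly the pattern that only works when the page request's Accept header also lists image types, which is the dependency under discussion.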


> Well it's incorrect usage of the Accept header. And I don't think
> mimicking that is a good thing. The W3C should give the right example.

I don't believe it is incorrect according to HTTP, but I agree that
it's not necessarily ideal.


> If sites actually *depend* on UAs sending static Accept headers for all
> their requests, they are definitely NOT mobileOK, because they would
> break on any UA that does properly send their Accept headers, and thus
> prevent UAs from doing so, if widespread. This is certainly not
> something that any W3C test should accept.

These tests aren't so much a test of how sites process HTTP headers
as a check of the content that comes back. They specify an Accept
header as a helpful hint to induce the site to return mobile content
in the right formats, but the burden is on the site to return the
right content. In all cases, the request will indicate that the
desired content type is acceptable.
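The principle above -- judge the content that comes back, not how the site read the headers -- could be sketched like this. The function name and the PASS/FAIL strings are my own shorthand, not the test specification's vocabulary:

```python
def content_type_result(acceptable_types, content_type_header):
    """PASS if the returned Content-Type is among the media types the
    request said were acceptable; otherwise FAIL. The charset parameter
    (e.g. "text/html; charset=utf-8") is ignored for this comparison."""
    media_type = content_type_header.split(";")[0].strip().lower()
    return "PASS" if media_type in acceptable_types else "FAIL"
```

So even if a request advertised both image/jpeg and text/html, a site answering an image request with text/html would still be judged on what it returned.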

You're suggesting that the tests be friendlier, and I don't disagree.
If a request for an image says "I can accept image/jpeg or text/html"
and the site chooses text/html, you wouldn't want to call that OK
either, right? Still, it's a fair point.


> I think you mean section 10.4.7, not section 14.2 :). But how about,
> FAIL unless it returns ISO-8859-1, in which case you WARN?
>
> As ISO-8859-1 and UTF-8 are both compatible with ASCII and thus most
> ISO-8859-1 documents will still be reasonably legible on UTF-8 only
> devices, like Dutch websites on my Japanese DoCoMo phone, in that case
> you could issue a WARN (in spirit of section 10.4.7 of HTTP).

Then the test is not determining whether the site can return UTF-8,
which is the central aim of the test.

You may say that the DDC (Default Delivery Context) should then be
specified to support ISO-8859-1. I know that in an attempt to be
inclusive (of Japanese phones in particular), this wasn't specified.
At the least, this is a change to be considered for the Best
Practices document rather than for the tests based upon them, since
Best Practices version 1.0 was finalized with only UTF-8 support
specified, and these tests are to be based on BP version 1.0.

My guess is that there would be resistance to saying that ISO-8859-1
support may be assumed, again because of the Japanese market. The
assumption here is that UTF-8 support is widespread, so, yes, the
Best Practices document is specifically telling sites that they need
only be *able* to send back text encoded as UTF-8 -- they're not
required to do it all the time.
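Operationally, "able to send back UTF-8" could be checked roughly as follows: request the page while advertising UTF-8, then verify both the declared charset and that the body really decodes as UTF-8. This is an assumed sketch of such a checker, not the test definition itself:

```python
def utf8_result(content_type_header, body_bytes):
    """FAIL unless the response declares charset=utf-8 AND the body is
    valid UTF-8; PASS otherwise. Result strings are illustrative."""
    declared = None
    for param in content_type_header.lower().split(";")[1:]:
        param = param.strip()
        if param.startswith("charset="):
            declared = param[len("charset="):].strip('"')
    if declared != "utf-8":
        return "FAIL"
    try:
        body_bytes.decode("utf-8")  # reject mislabeled bytes too
    except UnicodeDecodeError:
        return "FAIL"
    return "PASS"
```

Note that an ISO-8859-1 response fails both ways: by its charset declaration, and (for bytes like 0xE9) by failing to decode as UTF-8.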


> If a UA only indicates support for UTF-8 though, which I presume you are
> testing because it is the common denominator between UAs, then I do not
> think delivering another character set that is not understood by the UA
> should result in just a warning. My Japanese DoCoMo N902iS phone's iMode
> browser indeed understands only Shift-JIS and UTF-8, it messes up on
> accents on Dutch ISO-8859-1-encoded sites I visit. I would prefer that
> those sites could not use the 'mobileOK'-label unless they started using
> UTF-8.

Agreed! That's why the tests only accept UTF-8.

The test that concerns the page being tested will itself FAIL if the
page is not encoded in UTF-8. The test you were referring to concerns
external resources, and there, encoding problems generate only a
warning. Yes, I agree it looks inconsistent. The reason is, again,
that some people felt strongly that the mobileOK-ness of one document
should not depend on an external resource. If I link to bbc.co.uk and
they don't follow the UTF-8 rule, should I FAIL mobileOK? That's the
reasoning here, at least.
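The asymmetry just described can be summed up in a tiny sketch (the severity names PASS/WARN/FAIL match the discussion above; the function is my own illustration):

```python
def encoding_severity(is_utf8, is_external_resource):
    """The page under test must be UTF-8 (FAIL if not). The same
    problem on a linked external resource is only a WARN, so one
    site's mobileOK status doesn't hinge on another site's encoding."""
    if is_utf8:
        return "PASS"
    return "WARN" if is_external_resource else "FAIL"
```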


> However, I'm just commenting on the fact that it doesn't make sense for
> them to be there in the first place, given that frames aren't 'mobileOK' :).

Yes, the only cases I can imagine where you would want to leave these
in a mobile document, even though they will have no effect, are
fairly contrived. I will bring this up again on our call as well.
Received on Wednesday, 13 June 2007 07:31:07 UTC
