Re: please review mobileOK Basic Tests 1.0

On Jun 7, 2007, at 19:45, Dan Connolly wrote:

> I recently realized that this spec has various things
> to say about how people should use HTML, so this working
> group should be looking at it:
>   W3C mobileOK Basic Tests 1.0
>   W3C Working Draft 25 May 2007
>   comments due by 22 June 2007
> If you have any comments that you think should be endorsed
> by this Working Group, also send them here.

Further quotes from the draft:

>    mobileOK Basic is a scheme for assessing whether Web resources (Web
>    content) can be delivered in a manner that is conformant with Mobile
>    Web Best Practices [BestPractices] to a simple and largely
>    hypothetical mobile user agent, the Default Delivery Context.

The draft is premised on a vision about mobile browsing that assumes  
special mobile content. Instead of implying a separate Mobile Web, I  
think the W3C should push for one World Wide Web with mobile browsers  
that can access general Web content.

Mobile access to general Web content can be accomplished in at least  
two ways:
1) Putting a World Wide Web-ready browser engine on the mobile device  
(e.g. Minimo, the new S60 Browser, Opera for Mobile)
2) Using a distributed UA that puts a thin front end on the mobile  
and keeps the main engine on an intermediate server (e.g. Opera Mini)

The premise of mobileOK seems to be that you take the non-Web-ready  
thin browser and expect origin servers out there to take special  
steps to accommodate it.

> It is not a test for browsers, user agents or mobile devices,
>    and is not intended to imply anything about the way these should
>    behave.

In practice, the draft is implying expectations about UA behavior.

>    Content passing the tests demonstrates that the content provider
>    has taken basic steps to provide a functional experience for
>    mobile users.

I worry that pointy-haired managers will take statements like this  
and bother their teams about hunting a badge of approval instead of  
testing that their sites work with browsers that run on mobile  
devices and are capable of browsing the real World Wide Web.

> 1.3 Claiming mobileOK conformance
>    A standard mechanism will be defined that allows content providers
>    to claim that a URI or group of URIs, such as a Web site, conforms
>    to mobileOK Basic or mobileOK Pro. It will be possible to make
>    claims in a machine-processable form. It will also be possible to
>    notify end users of the presence of the claim by means of a
>    human-readable mark.

I think testing content along the lines of mobileOK should be part of  
the internal quality assurance process of content providers. I think  
it should not be part of the external marketing process.

When people are just hunting the badge for marketing purposes, they  
may make silly workarounds to please the testing software while  
actually making the user experience worse.

>      * Include an Accept header indicating that Internet media types
>        understood by the default delivery context are accepted by
>        sending exactly this header:
> Accept: application/xhtml+xml,text/html;q=0.1,application/vnd.wap.xhtml+xml;q=0.1,text/css,image/jpeg,image/gif

The main request should not include the CSS type. The requests for  
style sheets should only list the CSS type. Requests for images  
should only list image types.

It is rather sad that the supported image formats do not include PNG.
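To make the point concrete, here is a minimal Python sketch of the per-resource Accept headers I am arguing for (the `build_request` helper and the exact type lists are my own illustration, not anything from the draft; the image list follows the draft's JPEG/GIF-only choice):

```python
import urllib.request

# Hypothetical per-resource Accept headers: the main request lists only
# markup types, style sheet requests list only text/css, and image
# requests list only image types. (PNG is omitted here only because the
# draft's DDC omits it.)
ACCEPT_BY_KIND = {
    "page": "application/xhtml+xml,text/html;q=0.1,"
            "application/vnd.wap.xhtml+xml;q=0.1",
    "stylesheet": "text/css",
    "image": "image/jpeg,image/gif",
}

def build_request(url, kind):
    """Return a urllib Request carrying the Accept header for this kind."""
    return urllib.request.Request(url, headers={"Accept": ACCEPT_BY_KIND[kind]})

req = build_request("http://example.com/style.css", "stylesheet")
print(req.get_header("Accept"))  # text/css
```

The point is simply that the Accept header should describe what the UA is willing to receive for *that* request, not the union of everything it understands.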

>      * Include an Accept-Charset header indicating that only UTF-8 is
>        accepted by sending exactly this header:
> Accept-Charset: UTF-8


>      * Check for consistency with HTTP headers, as follows:
>        For each meta element with an http-equiv attribute:
>        If a matching HTTP response header does not exist, warn
>        If a matching HTTP response header exists but its value differs
>        from the content attribute value, warn

These two should not apply at all to Refresh as Refresh is not used  
on the HTTP level in the real world. On the other hand, they should  
both be failures for the cache control because caching proxies should  
be able to work on the HTTP level without looking inside payload. For  
XML media types, the meta charset is always bogus so both cases  
should fail to avoid people depending on the bogus meta charset. For  
text/html, the case where the real HTTP header and the meta charset  
disagree should be a failure, because the disagreement is a symptom  
of something being wrong in the content production or serving process.
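A sketch of the stricter policy I am suggesting, in Python (the function names and severity labels are my own; only the rules match what I argue above: ignore Refresh, fail cache-control mismatches, warn otherwise):

```python
from html.parser import HTMLParser

class MetaCollector(HTMLParser):
    """Collect (http-equiv, content) pairs from meta elements."""
    def __init__(self):
        super().__init__()
        self.pairs = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)  # HTMLParser lowercases attribute names
            if "http-equiv" in a and "content" in a:
                self.pairs.append((a["http-equiv"].lower(), a["content"]))

def check_consistency(html_text, response_headers):
    """Compare meta http-equiv values against the actual HTTP headers.

    Returns (severity, message) tuples: Refresh is skipped entirely,
    cache-control mismatches are failures, everything else warns.
    """
    headers = {k.lower(): v for k, v in response_headers.items()}
    collector = MetaCollector()
    collector.feed(html_text)
    findings = []
    for name, content in collector.pairs:
        if name == "refresh":
            continue  # Refresh is not used on the HTTP level in practice
        severity = "FAIL" if name == "cache-control" else "warn"
        if name not in headers:
            findings.append((severity, "no HTTP header for meta " + name))
        elif headers[name] != content:
            findings.append((severity, "meta " + name + " disagrees with HTTP header"))
    return findings

for severity, message in check_consistency(
        '<meta http-equiv="Cache-Control" content="no-cache">', {}):
    print(severity, message)  # FAIL no HTTP header for meta cache-control
```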

>  (note that use of the style attribute is deprecated in XHTML Basic 1.1)

Obsoleted, actually.

>    In the course of assembling the CSS Style:
>      * observe the CSS Level 1 cascade

Specs written today should probably reference CSS 2.1 instead of  
Level 1.

>    For XML 1.1 [XML11] it is defined in section 1.3 as consisting of
>    the same characters with the addition of NEL (#x85) and the Unicode
>    line separator character, (#x2028).

Surely an XML 1.1 document cannot get mobileOK approval.

>    In the following, note that HTTP headers should be used rather than
>    meta elements with http-equiv attributes, which are commonly not
>    taken into account by proxies.

The "should" should probably be a "must" for consistent results.

>    If any cache related header contains an invalid value, warn

Why not fail?

>    The DDC is defined to support only UTF-8 encoding,


> and hence this test
>    fails if a resource cannot be encoded in UTF-8.

s/cannot be/is not/

>    If the HTTP Content-Type header does not specify a character
>    encoding:
>    If there is no XML declaration, or UTF-8 character encoding is not
>    specified in the XML declaration, FAIL

XML provides an unambiguous default. Is there a practical reason, due  
to broken real-world UAs perhaps, not to PASS defaulted UTF-8?
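The XML default is easy to demonstrate: a document with no XML declaration and no BOM is read as UTF-8 with no further hint (a small Python sketch using the standard library parser):

```python
import xml.etree.ElementTree as ET

# XML 1.0 defines the default: an entity that begins with neither a BOM
# nor an XML declaration must be UTF-8 (UTF-16 is detectable from the
# BOM). A conforming parser therefore reads a declarationless UTF-8
# document without any external encoding hint.
doc = "<p>na\u00efve</p>".encode("utf-8")  # no XML declaration at all
root = ET.fromstring(doc)
print(root.text)  # naïve
```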

>    If the HTTP Content-Type header specifies an Internet media type
>    starting with "text/":

This should apply to text/html.

>    If there is no meta element with http-equiv attribute that
>    specifies UTF-8 character encoding, FAIL

Note that the current HTML 5 draft uses an attribute called charset.

Having a meta charset in a document that is served using an XML type  
(text/xml, application/xml and */*+xml) should probably be a warn at  
minimum considering that a charset meta in XML is bogus.

>    If character encoding is specified in more than one way, and not
>    all values are the same, FAIL


>    If the document's Internet media type is "text/html" or
>    "application/vnd.wap.xhtml+xml", warn

What's wrong with HTML served as text/html?

>    If the document does not contain a DOCTYPE declaration, FAIL

I think the W3C should promote doctypelessness for  
application/xhtml+xml. See

However, documents that rely on the WAP dollar substitution must have  
a doctype that activates the dollar substitution in Opera. Still,  
relying on the dollar substitution is a bad idea.

>    If the document is an HTML document and it fails to validate
>    according to its given DOCTYPE, FAIL
>    If (regardless of its stated DOCTYPE) the document does not
>    validate against the XHTML Basic 1.1 DTD:
>    If it does not validate against the XHTML-MP 1.2 DTD, FAIL

The spec is lacking sufficient guidance on how to validate an HTML  
document against an XML DTD. Should perhaps an HTML5 parser be used  
with a DTD validator that is decoupled from an XML parser?

Requiring content to validate against a mobile profile DTD does not  
promote the unity of the World Wide Web.

>    For each included resource (see 2.3.6 Included Resources):
>    If the response specifies an Internet media type that is not
>    "text/css", "image/jpeg" or "image/gif", FAIL

Is there a good reason to exclude PNG?

>    If the element's value attribute is missing or empty, and an
>    inputmode attribute is not present, warn

This seems excessive as it is quite likely that things will be just  
fine without content micromanaging the input mode on the UA.

>    If an alt attribute is not present or consists only of white space,
>    FAIL

This is a bad idea because it encourages badge hunters to include  
bogus alt text that actually harms accessibility. Tests like this  
only lead to an arms race where the placeholder data always gets a  
step more complicated than what the testing tools can detect as a  
placeholder.
>    If the innermost nested object element content consists only of
>    white space, FAIL

See above.

>    If the document contains any basefont, bdo, center, del, dir, font,
>    ins, menu, s, strike or u elements, FAIL

del and ins are legitimate in both HTML 4.01 and in the current HTML  
5 draft. menu is legitimate in HTML 5.

>    If the document contains any b, big, i, small, sub, sup or tt
>    elements, warn

These elements are relatively common and harmless in practice. This  
warning seems excessive.

Henri Sivonen

Received on Monday, 11 June 2007 11:04:40 UTC