Addendum to Mobile Web Best Practices

W3C Working Group Note 10 February 2009

This version:
http://www.w3.org/2005/MWI/BPWG/Group/TaskForces/mobileOKPro/drafts/ED-mobileOK-pro10-tests-20090210
Latest version:
http://www.w3.org/2005/MWI/BPWG/Group/TaskForces/mobileOKPro/drafts/latest
Previous version:
http://www.w3.org/2005/MWI/BPWG/Group/TaskForces/mobileOKPro/drafts/ED-mobileOK-pro10-tests-20081028
Editor:
Kai Scheppe, Deutsche Telekom AG
Substantial contributions made by:
Phil Archer, Family Online Safety Institute
Alan Chuter, Technosite
Jo Rabin, DotMobi
Dave Rooks, Segala
José Manrique López de la Fuente, Fundación CTIC

Abstract

This document supplements W3C Mobile Web Best Practices 1.0 by providing additional assessments of conformance to the Best Practices and some additional interpretation of them.

This document is non-normative, but following the advice given here is recommended to further improve mobile-friendly content.

Status of this Document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.

This is a public Working Group Note produced by the mobileOK Pro Tests Taskforce of the Mobile Web Best Practices Working Group as part of the Mobile Web Initiative. Please send comments on this document to the Working Group's public email list public-bpwg-pro@w3.org, a publicly archived mailing list.

This document was produced under the 5 February 2004 W3C Patent Policy. W3C maintains a public list of patent disclosures made in connection with this document; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) with respect to this specification must disclose the information in accordance with section 6 of the W3C Patent Policy.

Publication as a Working Group Note does not imply endorsement by the W3C Membership. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress. Other documents may supersede this document.

Table of Contents

1 Introduction
1.1 Purpose
1.2 Relationship to mobileOK Basic Tests
1.3 Scope
1.4 Audience

2 Sampling
2.1 Evaluation Scope

3 Evaluation procedures
3.1 Access Keys
3.2 Auto Refresh
3.3 Avoid Free Text
3.4 Background Image Readability
3.5 Balance
3.6 Device Capabilities
3.7 Central Meaning
3.8 Limited
3.9 Clarity
3.10 Color Contrast
3.11 Content Format Preferred
3.12 Control Labeling
3.13 Control Position
3.14 Cookies
3.15 Deficiencies
3.16 Error Messages
3.17 Fonts
3.18 Graphics for Spacing
3.19 Link Target ID
3.20 Minimize Keystrokes
3.21 Navbar
3.22 Navigation
3.23 Non-text Alternatives
3.24 Objects or Scripts
3.25 Page Size Usable
3.26 Page Title
3.27 Provide Defaults
3.28 Scrolling
3.29 Structure
3.30 Style Sheets Size
3.31 Style Sheet Support
3.32 Suitable
3.33 Tab Order
3.34 Tables Layout
3.35 Tables Support
3.36 Testing
3.37 URIs
3.38 Use of Color

Appendices

A References (Non-Normative)


1 Introduction

1.1 Purpose

The purpose of this document is to help content providers conform to Mobile Web Best Practices, by providing additional evaluations for their content and by interpreting and clarifying Best Practices in some cases.

Mobile Web Best Practices contains a section called "What to Test" for each Best Practice. The evaluations in this document supplement those tests.

1.2 Relationship to mobileOK Basic Tests

mobileOK Basic Tests describes machine executable tests for conformance to Best Practices, when delivering content to the hypothetical extremely basic mobile device called the "Default Delivery Context" (DDC).

Many of the tests described in mobileOK Basic Tests are not useful when determining suitability of content for use on more advanced devices. Indeed, content that is suitable for the DDC (and hence mobileOK Basic conformant) and is not available in other forms that take advantage of capabilities of more advanced devices may result in a poor user experience on such devices.

This addendum therefore provides a set of evaluations that fill the gaps left by the limitations of automated tests and thus completes coverage of the Best Practices.

1.3 Scope

The scope of this document is commentary on Mobile Web Best Practices and mobileOK Basic Tests.

1.4 Audience

This document is intended for creators, maintainers and operators of Web sites. Readers of this document are expected to be familiar with the creation of Web sites, to have a general familiarity with the technologies involved, such as Web servers and HTTP, and to be familiar with Mobile Web Best Practices and with mobileOK Basic Tests 1.0.

2 Sampling

2.1 Evaluation Scope

While most evaluations apply to single pages or delivery units, some, such as ACCESS_KEYS, NAVIGATION and URIS, are tested across multiple pages. To perform the evaluations correctly and report on them, it is necessary to define the scope of the evaluation. This may be expressed using URI patterns (e.g. groupings using POWDER).

3 Evaluation procedures

3.1 Access Keys

Relevant device properties
Support for access keys, i.e. keyboard shortcuts assigned to links and controls by means of the accesskey attribute
Interpretation of the Best Practice

This evaluation applies to primary navigation links that occur across all pages within a given web space which is itself defined by the presence of those navigation links. Such navigation links should be associated with access keys. Furthermore, the access key assignment should be identical for those links.

Access keys may be indicated to end users in any of three ways

  • Link decoration
  • Summary page
  • Listing on a sitemap
Evaluation procedure

Where there are elements, particularly navigation links and form controls, that would benefit from quick access:

  • Check if access keys have been assigned at least to the primary navigation links found on all pages within a Web space defined by the presence of those links
  • Check that the defined accesskeys are usable on the target device; by default, use numeric access keys
  • Ensure that access keys for primary navigation links are identical across all pages
  • Make sure to indicate which access keys are being used, through at least one of the methods above (see the markup sketch below)
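
A minimal markup sketch of one possible assignment (the link targets, key values and bracketed link decoration are illustrative assumptions, not prescribed by the Best Practices):

  <!-- Primary navigation repeated on every page, with identical access key assignments -->
  <ul>
    <li><a href="/" accesskey="1">Home [1]</a></li>
    <li><a href="/search" accesskey="2">Search [2]</a></li>
    <li><a href="/sitemap" accesskey="3">Site map [3]</a></li>
  </ul>

Here the bracketed numbers serve as link decoration, one of the three indication methods listed above.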

3.2 Auto Refresh

Evaluation procedure
  1. Look for pages that use automatic refresh (e.g. as highlighted by a mobileOK evaluation)
  2. For such pages, make sure a link is provided to a non-refreshing version of the page

3.3 Avoid Free Text

Relevant device properties
Maximum size of a usable select list
Evaluation procedure
For each free-text input field (<input type="text">, <input> with no type attribute, <input type="password">, <textarea>), check whether the field can be replaced by a series of radio buttons, checkboxes or a select menu whose number of values fits within the device's limitations (see the markup sketch after the examples below)
Examples
  • Selecting ZIP/Post codes from a list, perhaps within a limited geographical area of interest or in a sequence of steps where each list of options is dependent on the previous data entered
  • Selecting a country from a drop down list, ideally with the most likely choice(s) at the top as well as in the list in their alphabetical position
  • Don't use a text entry field for entering yes or no
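
As an illustrative sketch only (control names and values are hypothetical), a free-text field asking for a yes/no answer can be replaced by radio buttons, and a country field by a select list with the most likely choice placed first:

  <!-- Instead of <input type="text" name="newsletter"/> -->
  <fieldset>
    <legend>Receive newsletter?</legend>
    <input type="radio" name="newsletter" id="news-yes" value="yes"/>
    <label for="news-yes">Yes</label>
    <input type="radio" name="newsletter" id="news-no" value="no"/>
    <label for="news-no">No</label>
  </fieldset>

  <!-- Instead of <input type="text" name="country"/> -->
  <select name="country">
    <option value="de" selected="selected">Germany</option>
    <option value="at">Austria</option>
    <option value="ch">Switzerland</option>
    <!-- remaining countries follow in alphabetical order -->
  </select>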

3.4 Background Image Readability

Relevant device properties
Screen contrast, number of supported colors
Interpretation of the Best Practice
The use of patterned or photographic background images behind text is discouraged but not prohibited.
Evaluation procedure
When background images are used, check that the color contrast ratio between the overlying text and each color used in the background image is sufficient. The Ishihara Test for Color Blindness illustrates, for example, how patterning combined with insufficient color contrast between foreground and background can make content unreadable.
Examples

The contrast between this text and its black background color is 5:1
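
One hedged illustration in CSS (the image name and colors are assumptions for the sketch): pairing the background image with a solid fallback color and a text color that remains high-contrast against both:

  /* Light text over a dark, low-detail background image */
  .banner {
    background: #000000 url("texture-dark.png") repeat; /* hypothetical image */
    color: #ffffff;                                     /* white on black keeps contrast high */
  }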

3.5 Balance

Relevant device properties
Support for non-linear navigation across links
Evaluation procedure

If the targeted device does not support non-linear navigation across links (i.e. a user can only reach a link after having navigated through all the preceding links on the page), ensure the page doesn't use more than 30 links.

This may be mitigated by special purpose pages, such as site maps or link lists, but is highly dependent on the individual context.

3.6 Device Capabilities

Relevant device properties
Screen width, markup language, character encoding, image format, page weight, colors, style sheets, HTTP, script. For details see the Default Delivery Context.
Interpretation of the Best Practice
This evaluation cannot cover all the possible capabilities that might be found in devices. Unlike the DDC, many real devices support scripting, XML HTTP requests, DOM capabilities, cookies and CSS including media queries. However, it is usually possible to detect a presentation that is artificially constrained by the limited and largely hypothetical Default Delivery Context. Rather than trying to find where device capabilities are not utilized, it is more conclusive to identify those instances where the content is wrongly limited on a device more capable than the DDC.
Evaluation procedure
  1. For each property of the DDC (e.g. screen width), identify some unadapted, original content that would exceed the value given by that property of the DDC
  2. Request content on a device more capable than the DDC for that particular property
  3. Check if content is rendered according to the constraints of the DDC
Examples
  • Web sites that offer video streams should not offer such content for the DDC; however, it should be offered to users of devices that do support video.
  • Presentation should adjust itself to the optimal display for the device and not be unnecessarily limited to 120 pixels wide.
  • JavaScript should be used for form validation on devices that support it, as this can reduce network traffic and latency.
  • In forms, scripting may be used to adjust the presentation or state of a control based on the input already supplied.

3.7 Central Meaning

Evaluation procedure
  1. Determine the main content of the page. This is content that is unique to the page and does not repeat across several pages.
  2. Check if none of the main content of the web page is visible without scrolling

3.8 Limited

Evaluation procedure
  • The link text should describe the content that will be retrieved (e.g. contextual information, content type, file size).
  • Content provided or linked to should be suitable for the mobile context, unless provided in a controlled environment where its unsuitability is known a priori and acceptable.
Examples
Content may be suitable in a controlled environment, with logon for entry, for example for retrieval of large medical scans on mobile devices.

3.9 Clarity

Evaluation procedure
Check if the text is, for the context, considered to be unnecessarily complex or verbose.
Examples
  • Consider the purpose of the page. If the purpose of the page is to deliver a specific piece of information, that information should be readily extracted without excessive scrolling or scanning of text.
  • In the English language the Gunning fog index can give an indication of complexity. A score of roughly 7 or 8 would be ideal. This measure does not apply to other languages.

3.10 Color Contrast

Relevant device properties
Screen quality, color depth, resolution
Evaluation procedure
Check if the contrast ratio is less than 5:1, according to http://www.w3.org/TR/2007/WD-WCAG20-TECHS-20070517/Overview.html#G18
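
For reference, the WCAG 2.0 technique referenced above defines the ratio as (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the relative luminances of the lighter and darker colors respectively. A brief worked check (the grey luminances below are hypothetical values chosen for illustration):

  contrast ratio = (L1 + 0.05) / (L2 + 0.05)
  white (#FFFFFF) on black (#000000): (1.00 + 0.05) / (0.00 + 0.05) = 21:1   -- passes 5:1
  light grey on mid grey (L1 = 0.40, L2 = 0.25): (0.40 + 0.05) / (0.25 + 0.05) = 1.5:1   -- fails 5:1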

3.11 Content Format Preferred

Evaluation procedure
  1. Determine which alternative formats are available.
  2. If alternative formats are available, change the quality factors of the Accept headers to express a preference for an available alternative format.
  3. Check if the content which is delivered reflects that change.
Examples

Typical changes to preferences such as the following should be reflected accordingly:

  • GIF and JPEG
  • Character encoding
  • Markup language
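
For example (an illustrative request only; the host, path and quality factors are assumptions), a preference for JPEG over GIF can be expressed in the Accept header of an image request, and the delivered format checked against it:

  GET /images/logo HTTP/1.1
  Host: example.org
  Accept: image/jpeg;q=1.0, image/gif;q=0.5

If both GIF and JPEG versions are available, the response under these headers should favour JPEG; analogous checks can be made for character encoding (Accept-Charset) and markup language (Accept).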

3.12 Control Labeling

Evaluation procedure
  • Check that label elements are provided where user-visible form controls are used
  • Check that each form control is associated with a label, either because the control is contained within the label element or because the label carries a for attribute
  • Check that the for attribute, where present, corresponds to the id attribute of a form control
  • Check that the label describes the purpose of the form control
Examples
A label which requests a person's name but is associated with a birth-date field fails.
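
A minimal sketch (the field name and id are hypothetical) showing the for/id association together with a label text that matches the control's purpose:

  <label for="birthdate">Date of birth</label>
  <input type="text" id="birthdate" name="birthdate"/>

A label reading "Name" attached to this same control would fail the last check above.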

3.13 Control Position

Evaluation procedure
  • Check that, with CSS turned on, labels are clearly associated with their form controls
  • Check that, with CSS turned off, each label can still be clearly associated with its form control
  • Check that the association is not achieved only by means of positional CSS
Examples
See Gez Lemon's article on Label Positioning.

3.14 Cookies

Relevant device properties
Device can accept cookies
Evaluation procedure
  1. Identify the main functionality of the site that is important for the user and may rely on state. See examples.
  2. Evaluate the functionality with cookies supported correctly.
  3. Disable cookie support. Evaluate the functionality and compare with the result from step 2.
  4. Check if the content or function can be accessed without cookies
Examples
A site that requires a user to login might store that login in a cookie to save the user typing in their credentials each time they visit. If cookies are unavailable, an acceptable degradation would mean that the user was prompted for a login each time they visited that page, but would browse the site without further logins from then on. A poor (and unacceptable) cookieless degradation would render a site useless by always checking for a non-existent cookie and so not letting the user past the login page.

3.15 Deficiencies

Limitations of this test
This test cannot cover all existing deficiencies of devices. Rather, it is limited to checking that the author/provider is aware of common deficiencies and does something about them. This test is not intended to test pixel-perfect rendering across devices, but rather focuses on usability across devices.
Evaluation procedure
If there are common deficiencies of devices, which impinge significantly on usability of the content being offered, measures should be taken to work around those deficiencies.
Examples
Some devices which render tables badly will wrap columns by default. If the meaning of the page depends on a layout that does not render well on these sorts of devices, an alternative rendering should be used when such a user agent is detected.

3.16 Error Messages

Evaluation procedure
  1. Provoke a server-side error. For example: Request a URL known not to correspond to any endpoint on the site.
  2. Examine the content of the error page. It should:
    • Explain in non-technical language the error which has occurred.
    • Be suitable for the DDC at a minimum (i.e. it must comply with all other tests).
    • Provide at least one of the following links: site home page, back, reload.
    • Be in the language the user was reading on the site when the error occurred.

3.17 Fonts

Relevant device properties
Font availability, font size
Evaluation procedure
  1. Check for presence of font-related styling, by means of the font element, or the bold, italic or underline elements, or CSS.
  2. Examine the content with CSS enabled and disabled and check if the meaning is significantly different.
  3. Examine any font-related elements present and the effect they have on the meaning of the content. Check if they are used to convey meaning significantly.
  4. Check for the presence of the face attribute on the font element and whether only one font has been defined.
Examples
  • Emphasis expressed using bold or underline.
  • Quotes that are indicated by italics rather than enclosed in quotation marks or quotation markup.

3.18 Graphics for Spacing

Evaluation procedure
  1. Check if the content complies with the GRAPHICS_FOR_SPACING mobileOK Basic test.
  2. View all images in a page, for example in a separate list of images, or by outlining them. For XHTML these will be included using the <img> element.
  3. Determine visually whether any of the images do not convey information and are used for spacing.

Note: Spacer images do not convey useful information. They are normally very small.

3.19 Link Target ID

Evaluation procedure
  1. Check if each link in the text is described by attributes as follows:
    • If the target content is in a language different to that of the tested content, it should be correctly marked with hreflang attribute
    • If it is in a format other than that of the tested content, it should be correctly described by the type attribute
    • If it uses a character encoding different to that of the tested content, it should be described by the charset attribute
  2. Check that the link text (including alternative text for any non-text elements) clearly describes the information obtained when activating the element
  3. Select elements pairwise. Check that two links with same link text (including alternative text for non-text elements) and same title attribute (if provided) point to the same resource.

(See UWEM 1.0 13.1 for more details)

Examples
  • Link with only the text "Click here."
  • Multiple links in same page with same content but pointing to different things.
  • Link from an HTML page to a large video file that does not mention format or size.
  • Link to content in a language different to that of the current page.
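
An illustrative link (the URI, file size, format and language are assumptions for the sketch) that avoids the failure examples above by describing the target in both the link text and the relevant attributes:

  <a href="video/tour.3gp" type="video/3gpp" hreflang="es">
    Site tour video (Spanish, 3GP format, approx. 2 MB)
  </a>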

3.20 Minimize Keystrokes

Relevant device properties
Input mode
Evaluation procedure
This evaluation is covered by AVOID_FREE_TEXT, URIS, CENTRAL_MEANING and PROVIDE_DEFAULTS.

3.21 Navbar

Relevant device properties
Screen width
Evaluation procedure
  • Check if there are basic navigational elements located above the rest of the content
  • Check if all navigational elements fit on a single line in the DDC
  • Check if, upon loading the page, enough of the main content is visible without scrolling
Examples
Navigation bar consisting of only: Home; Up; Down; Site map; Search...

3.22 Navigation

Evaluation procedure
  1. For all the pages in the sample, examine the navigation mechanisms in the pages. These include inline links and groups of links (for example menus) in different parts of the page.
  2. Check if navigation mechanisms are similar throughout pages of the sample, other than for changes that are necessary within the context of a given page.
Examples
  • Good example: A site navigation menu is present on all pages and only changes the presentation of the section and the current page
  • Bad examples:
    • A site navigation menu is present on all pages but changes the structure and presentation of the links
    • A site navigation menu is present on some pages but not on the current page
    • A site navigation menu is present on two pages of the sample but in a different place
    • Inline links on other pages have a descriptive title but on the current page they do not

3.23 Non-text Alternatives

Evaluation procedure

Referring to http://www.w3.org/TR/WCAG10/#gl-provide-equivalents and http://www.w3.org/TR/WCAG20/#text-equiv:

Non-text elements include images, graphical representations of text (including symbols), image map regions, animations (e.g., animated GIFs), applets and programmatic objects, ascii art, frames, scripts, images used as list bullets, spacers, graphical buttons, sounds (played with or without user interaction), stand-alone audio files, audio tracks of video, and video.

Check if content meets Basic Test NON_TEXT_ALTERNATIVES

  • Check if a text equivalent has been provided for every non-text element that, when presented to the user, conveys essentially the same function or purpose as auditory or visual content.
  • Content is "equivalent" to other content when both fulfill essentially the same function or purpose upon presentation to the user.
Examples
  • Good examples:
    • Null value (alt="") for decorative images, such as rounded corners in a frame
    • A text equivalent for an image of an upward arrow that links to a table of contents could be "Go to table of contents".
  • Bad examples:
    • alt value same as filename
    • alt=" " (space)
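
A short markup sketch contrasting the good and bad practices above (file names are hypothetical):

  <!-- Decorative corner image: empty alt, so nothing meaningless is announced -->
  <img src="corner.gif" alt=""/>

  <!-- Functional image: alt describes the purpose, not the file name -->
  <a href="#toc"><img src="arrow-up.gif" alt="Go to table of contents"/></a>

  <!-- Fails: alt merely repeats the file name -->
  <img src="photo0001.gif" alt="photo0001.gif"/>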

3.24 Objects or Scripts

Evaluation procedure
Check if the document can be viewed or used, without objects or scripts active or present, such that the original intent of the web page is fulfilled.
Examples

From http://www.w3.org/TR/WAI-WEBCONTENT/wai-pageauth.html#tech-scripts:

Check that links that trigger scripts work when scripts are turned off or are not supported (e.g., do not use "javascript:" as the link target). If it is not possible to make the page usable without scripts, provide a text equivalent with the NOSCRIPT element, or use a server-side script instead of a client-side script, or provide an alternative accessible page as per checkpoint 11.4. Refer also to guideline 1.

Check that, with scripts turned off or not supported or objects not supported:

  • a given form can still be filled out and submitted
  • images are displayed and alternative text is shown
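
A hedged sketch of the approach described above (the URI and function name are illustrative): the link keeps a real href so it still works without scripting, and a noscript alternative is provided for users whose devices do not run the script:

  <!-- The script only enhances the link; without it the href is followed normally -->
  <a href="search.html" onclick="return openSearch();">Search</a>

  <script type="text/javascript">
    // hypothetical enhancement; returning false suppresses the default navigation
    function openSearch() { /* open an in-page search widget */ return false; }
  </script>

  <noscript>
    <p>Use the <a href="search.html">search page</a> to find content.</p>
  </noscript>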

3.25 Page Size Usable

Relevant device properties
Bandwidth, CPU power, screen size
Evaluation procedure
Check if a given web page contains contiguous content, which could technically and semantically be separated into individual pages, and exceeds three screens in length without a page break.
Examples
  • a long article that extends over three full screens without some type of page break should be broken up into pages
  • a display of navigational information requiring continuous movement through a map would pass, because additional maps are loaded as needed

3.26 Page Title

Relevant device properties
Screen width
Evaluation procedure
  • Check if the title thematically describes the main intent of the page content
  • Check that the title does not repeat unchanged across more than 3 pages
  • Check if the title is too long to display on a screen matching the Default Delivery Context
Examples
  • a page with the primary purpose of offering ring tones alongside small textual items should have this reflected in the title
  • a title such as "Main Portal" which is repeated across more than 3 pages would have to be adapted on the pages in question
  • conversely, a title of "Uncle Tom's Cabin" repeated across the many pages of an ebook would be perfectly acceptable

3.27 Provide Defaults

Evaluation procedure
  • Submit the form without selecting any item. This will ensure that defaults, such as preselected values, will be used:
    • Check if the response is an error page
    • Check if the response is a page asking the user to fix some data
    • Check if the response, incorrectly, is the original page
  • Check that a form with a text or textarea input element does not contain a default value telling the user what to write there. See 3.12 Control Labeling for options.
Examples
  • A country list might preselect the country code of where the page or service is most relevant
  • A ZIP code listing might have the local ZIP code preselected and surrounding ZIP codes at the top of the list
  • A date field might have today's date filled in
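
An illustration of the examples above (the ZIP codes and date are hypothetical values): defaults are preselected or prefilled so that submitting the untouched form still yields sensible data, and the text field carries a data default rather than an instruction:

  <!-- Local ZIP code preselected, neighbouring codes listed first -->
  <select name="zip">
    <option value="53113" selected="selected">53113</option>
    <option value="53111">53111</option>
    <option value="53115">53115</option>
  </select>

  <!-- Today's date prefilled as a data default, not as an instruction on what to type -->
  <input type="text" name="date" value="2009-02-10"/>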

3.28 Scrolling

Relevant device properties
Input mode, screen size
Evaluation procedure

When checking a web page on a device, if horizontal scrolling appears, then for each element wider than the screen:

  • ensure the element is not decorative
  • ensure that the element requires the oversize display, as it would not easily be read and/or described otherwise
Examples
  • A map showing an entire trip, the information of which would be lost upon zooming out
  • An X-ray image, intended for a portable medical device, where zooming out would lose vital detail of the image

3.29 Structure

Evaluation procedure
  • Check that header and list elements are properly nested according to their level
  • Check that list elements are used to represent lists (ordered, unordered, or definition lists) and are not used for formatting effects
  • Check that header elements are used to represent headers and are not used for formatting effects
  • Check that all information which is visually shown as a group of related elements uses list markup to provide that structure
  • Check that all information which is visually shown as a block of text uses paragraph or blockquote elements in the markup (see the markup sketch below)
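
A brief sketch of markup that satisfies these checks (headings and items are placeholders): headings are nested by level, and a visually grouped set of related items uses list markup rather than line breaks:

  <h1>Products</h1>
  <h2>Phones</h2>
  <ul>
    <li>Model A</li>
    <li>Model B</li>
  </ul>
  <h2>Accessories</h2>
  <p>A short descriptive paragraph about the accessories on offer.</p>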

3.30 Style Sheets Size

Relevant device properties
Memory size
Evaluation procedure
  • Check if whitespace makes up more than 10%-20% of the CSS content
  • Check whether all style properties are actually used in the page
  • Check if there are at least two CSS properties that could be combined into a shorthand property
  • Check if there is at least one CSS RGB color property that could be written in shorthand form (see the CSS sketch below)
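
A small CSS illustration (selector and values are hypothetical) of the last two checks: four margin properties collapsed into one shorthand, and a six-digit RGB color written in its three-digit form:

  /* Before */
  p { margin-top: 10px; margin-right: 5px; margin-bottom: 10px; margin-left: 5px; color: #ffcc00; }

  /* After: same rendering, fewer bytes */
  p { margin: 10px 5px; color: #fc0; }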

3.31 Style Sheet Support

Relevant device properties
Style sheet support
Evaluation procedure
Check if content is readable or intelligible with stylesheets disabled

3.32 Suitable

Evaluation procedure
Examine the content to determine if the presentation is very obviously inappropriate in a mobile context.
Examples
  • a page showing Renaissance paintings in full size is not appropriate in a mobile context
  • a page showing a listing of Renaissance paintings with thumbnails is appropriate in a mobile context
  • a page offering large downloads of applications that mobile devices do not support, such as professional image manipulation software, desktop operating systems etc., is not appropriate in a mobile context
  • a full-size X-ray image for in-field access by medical personnel is appropriate in a mobile context

3.33 Tab Order

Evaluation procedure
Check if the tab order of links, form controls and objects follows a logical or thematic order.
Examples

Bad examples:

  • If a user is required to enter their first name, last name, address and contact number, the tab order jumps, for example, between the first name and the phone number and then back to the last name
  • If a pizza ordering service offers a choice of toppings and bases, the tab order jumps between those two categories when presenting the user with options
  • If the Submit button, or Submit and Cancel buttons, are not the last item(s) in the form (see the markup sketch below)
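
One way to make the order explicit (field names and values are illustrative) is the tabindex attribute, keeping related fields together and placing the Submit button last:

  <input type="text" name="firstname" tabindex="1"/>
  <input type="text" name="lastname" tabindex="2"/>
  <input type="text" name="address" tabindex="3"/>
  <input type="text" name="phone" tabindex="4"/>
  <input type="submit" value="Submit" tabindex="5"/>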

3.34 Tables Layout

Relevant device properties
Table support
Limitations of this test
As of this writing, CSS is itself deficient in that it does not support a grid layout that auto adjusts vertically. It must be recognized therefore that some layouts can currently only be achieved using tables.
Evaluation procedure
Check if tables are used in a fashion that could be achieved using CSS
Examples
An image or text which is spaced and positioned with the aid of a table
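
Where the layout permits, the same effect can often be achieved with CSS rather than a table; a hedged sketch (class name, image and measurements are hypothetical):

  <!-- Instead of a single-row layout table holding an image and a caption -->
  <style type="text/css">
    .media img { float: left; margin-right: 10px; }
  </style>
  <div class="media">
    <img src="photo.jpg" alt="Product photo"/>
    <p>Short descriptive text flowing beside the image.</p>
  </div>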

3.35 Tables Support

Relevant device properties
Table support
Evaluation procedure
Check if the table element is found within the source code although the device does not support tables

3.36 Testing

Evaluation procedure
  • Determine if this content has been tested with emulators
  • Determine if this content has been tested with actual devices
  • Determine if the markup has been validated
  • Determine if this content has been tested with the W3C Checker

3.37 URIs

Evaluation procedure
  • Check if the entry domain, including main domain and subdomain, can be called up with fewer than 20 key presses on the device
  • Ensure that the entry URI does not require a file extension as in .html
  • Ensure that the entry URI does not require the www subdomain

3.38 Use of Color

Evaluation procedure
  • Check that, apart from hyperlinks, the page does not include any blue or purple text
  • Check if, when viewed on a monochrome screen, all content is still readable
  • Check that color is not the only means used to represent information
Examples
  • Good Example: Using red text to represent an error message
  • Bad examples:
    • Using blue or purple text within the page, other than for hyperlinks
    • Telling the user that important information is highlighted in yellow