- From: Steve CHRISTIE <Steve.Christie@homeaffairs.gov.au>
- Date: Thu, 2 Oct 2025 07:47:30 +0000
- To: "public-maturity@w3.org" <public-maturity@w3.org>
- Message-ID: <207075527@mr04-v.homeaffairs.gov.au>
UNOFFICIAL

Hi Tamsin et al,

Thank you for the opportunity to provide feedback on your W3C Accessibility Maturity Model (AMM). Apologies for the lateness of this submission.

I was pleased to see the release of the original versions of the AMM in 2023, and was keen to see it adopted for use in our organisation. However, I soon noticed that the AMM spreadsheet was a rather rudimentary document, unsuited to use by non-technical general administrative staff and managers across the agency. Therefore, in 2024 I set about re-formatting it into something more useful and understandable as an MS Word document. For your benefit, the results of my work are attached, although they are still somewhat incomplete. Please note that my re-formatted version was based on the November 2023 version of the AMM Assessment Template, so there may be some errors and omissions compared with the latest version.

The various changes and modifications I have applied can be summarised into the following areas of suggested improvements for your consideration.

Dimensions Terminology & Order

I found that the terminology used to describe the 7 organisational dimensions of maturity was somewhat limited and academic. As well, the order of the dimensions did not reflect their natural association and relative importance, based on the number and distribution of Proof Points in each dimension. I therefore decided to expand them into fuller descriptions, and to re-order them into a more coherent logic, as follows:

1. Communications, Services & Systems - Access to information as it relates to an organisation's mission, offerings and activities, as well as the accessibility of all internal / external communications systems and services
2. Support, Assistance & Help - Assistance provided to internal employees and external customers for access to information and services, especially for those with disabilities reliant on assistive technologies
3. ICT Development Life Cycle - Incorporation of user-centred accessibility into web, software and hardware considerations in development processes, from user research and idea conception to design, development, testing, ACR production, maintenance and obsolescence
4. ICT Procurement & Contracts - Strategic processes that concentrate on finding and acquiring the accessible ICT products and services required by an organisation. Activities may include sourcing, negotiating, and selecting goods and services
5. Knowledge, Skills & Training - Ongoing education, training and outsourcing practices to fill knowledge, competency and skill gaps for accessibility operations
6. Recruitment & Personnel - Job descriptions, recruiting, disability-related employee resource groups, and the accommodations and adjustments necessary to support lived experience for accessibility efforts
7. Governance & Culture - Attitudes, sensitivity, and behaviours around accessibility, including internal interactions, perceptions, and articulation into decision-making codes, roles and structures

Proof Points Outputs & Priorities

I found that the Proof Points by themselves did not adequately indicate the type of output needed to satisfy the specific requirement. Therefore, I decided to categorise each of the Proof Points into several Output Types or Deliverables to make the concepts easier for non-technical staff to grasp. Each of the 140+ Proof Points can be categorised into 6 types of Outputs that directly contribute to delivering digital accessibility Outcomes.
These Outputs are the tangible products resulting from activities undertaken to implement the Proof Points. The Output Types for Proof Points are:

1. Systems - Proof Points that require an organised provision of tangible benefits to support digital accessibility and inclusion, e.g. helpful organisational structures supporting Outcome delivery
2. Services - Proof Points that require an organised provision of intangible benefits to support digital accessibility and inclusion, e.g. helpful organisational activities supporting Outcome delivery
3. People - Proof Points that require an organised provision of staffing resources and roles to deliver and support digital accessibility and inclusion, e.g. helpful personnel supporting Outcome delivery
4. Processes - Proof Points that require an organised and regular workflow pattern to be applied in support of digital accessibility and inclusion, e.g. helpful workflow patterns, methods & techniques supporting Outcome delivery
5. Products - Proof Points that require a digital system to be designed, developed and maintained to quality standards that support accessibility and inclusion, e.g. helpful digital systems, websites & mobile apps supporting Outcome delivery
6. Documents - Proof Points that require helpful information to be tabulated and distributed in support of digital accessibility and inclusion, e.g. helpful information and guidance supporting Outcome delivery

I also noticed that the Proof Points by themselves did not sufficiently indicate their relative importance in terms of their impact on end users. I therefore decided to apply the MoSCoW rating scale to each of the Proof Points to better indicate their relative importance, improvement, immediacy, and impact from an end-user perspective. Each of the 140+ Proof Points can be categorised into 4 types of Priorities that indicate their level of importance for delivering Outcomes that have an immediate and direct impact on Users. These Priorities show the relative value and significance of each Proof Point to the User. The MoSCoW Priorities for Proof Points are:

1. Must Haves - primary, immediate and direct impact or improvement for the end user
2. Should Haves - secondary, supportive and progressive impact or improvement for the end user
3. Could Haves - tertiary, complementary and indirect impact or improvement for the end user
4. Will Not Haves - no evidence of an attempt to make any impact or improvement for the end user

Scoring Scale Rules & Ratings

While I acknowledge that the W3C AMM is informative (and not normative), I nonetheless found that the scoring system implemented was somewhat convoluted and skewed in its numerical weighting and results. Because there is not an equal number of Proof Points in each dimension, the value of each Proof Point differed depending on which dimension was being measured; for example, a Proof Point in a dimension with ten Proof Points would carry 10% of that dimension's score, while one in a dimension with twenty would carry only 5%. This was masked by the use of percentages in the scoring method. I therefore attempted to devise a simpler scoring method that would provide an equal value or weight to each Proof Point, no matter which dimension was being measured. I have attached a sample of the scoring scheme.

For each of the seven Dimensions, the degree of Maturity is calculated by rating each of the Proof Points against the following scoring scheme:

1. Priority of Proof Point - Could Have (1 point), Should Have (2 points), Must Have (3 points), and Will Not Have (1-3 points)
2. Stage of Implementation - Inactive (0 points), Launch (1 point), Integrate (2 points), and Optimise (3 points)
3. Level of Activity - No Activity (0 points), Started (1 point), Partly Done (2 points), and Complete (3 points)
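In case the attached sample does not come through the mailing list intact, here is a rough Python sketch of the kind of calculation I have in mind. The exact combination rule (priority-weighted stage plus activity, divided by the maximum possible), the fixed single-point value for Will Not Haves, and the example Proof Points are illustrative assumptions only; the attached sample remains the reference.

from dataclasses import dataclass

# Point values taken from the proposed scoring scheme above.
PRIORITY_POINTS = {"Must Have": 3, "Should Have": 2, "Could Have": 1,
                   "Will Not Have": 1}  # scheme allows 1-3; a single point is assumed here
STAGE_POINTS = {"Inactive": 0, "Launch": 1, "Integrate": 2, "Optimise": 3}
ACTIVITY_POINTS = {"No Activity": 0, "Started": 1, "Partly Done": 2, "Complete": 3}

# Highest score any single Proof Point can earn for stage plus activity.
MAX_PER_PROOF_POINT = max(STAGE_POINTS.values()) + max(ACTIVITY_POINTS.values())


@dataclass
class ProofPoint:
    name: str
    output_type: str  # Systems, Services, People, Processes, Products or Documents
    priority: str     # MoSCoW rating
    stage: str        # Stage of Implementation
    activity: str     # Level of Activity


def dimension_maturity(proof_points: list[ProofPoint]) -> float:
    """Return a 0-100 maturity percentage for one dimension.

    Assumed combination rule: each Proof Point earns (stage + activity)
    points weighted by its MoSCoW priority, and the dimension score is
    the achieved total divided by the maximum possible total. Because
    both totals count the same Proof Points, every Proof Point carries
    equal weight no matter how many Proof Points the dimension contains.
    """
    achieved = sum(PRIORITY_POINTS[p.priority]
                   * (STAGE_POINTS[p.stage] + ACTIVITY_POINTS[p.activity])
                   for p in proof_points)
    maximum = sum(PRIORITY_POINTS[p.priority] * MAX_PER_PROOF_POINT
                  for p in proof_points)
    return 100.0 * achieved / maximum if maximum else 0.0


# Hypothetical Proof Points, purely for illustration.
procurement = [
    ProofPoint("Accessibility clauses in standard contracts", "Documents",
               "Must Have", "Integrate", "Partly Done"),
    ProofPoint("ACRs requested from vendors during sourcing", "Processes",
               "Should Have", "Launch", "Started"),
]
print(f"ICT Procurement & Contracts maturity: {dimension_maturity(procurement):.0f}%")

Run as written, the two hypothetical Proof Points above score 16 of a possible 30 points, or roughly 53% maturity for that dimension, and that figure is directly comparable with any other dimension regardless of how many Proof Points it contains.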
I trust you will find that these suggestions add helpful additional granularity to the W3C AMM, as an aid to improved clarity, understanding and adoption. For your consideration. Let me know if you need anything further.

Thanks,
Steve.

------------------------------------------------------------------------------
Stephen Christie
Digital Accessibility Specialist
Digital Experience (DEX) Section
Digital Delivery Branch | ICT Division
Australian Department of Home Affairs
------------------------------------------------------------------------------
Received on Thursday, 2 October 2025 07:48:46 UTC