- From: David Fazio <dfazio@helixopp.com>
- Date: Thu, 11 Sep 2025 20:45:08 +0000
- To: Karl Groves <Karl.Groves@afixt.com>
- CC: "public-maturity@w3.org" <public-maturity@w3.org>
- Message-ID: <0B1AC631-5862-4114-B2D3-1C877DAB86B2@helixopp.com>
Thank you so very much, Karl, for your well-thought-out comments. This is incredibly helpful, for a multitude of reasons. We will discuss it during our next task force meeting on the 24th.

- David Fazio, Accessibility Maturity Model Task Force Co-Chair

On Sep 11, 2025, at 1:34 PM, Karl Groves <Karl.Groves@afixt.com> wrote:

Hi there. I would like to submit the below as a response to the posting on LinkedIn: https://www.linkedin.com/feed/update/urn:li:activity:7369518455844167680/

Before answering the specific questions, I also wanted to address whether the AMM should exist in the first place. I've written on that topic here: https://afixt.com/reviewing-the-logic-and-value-of-the-w3cs-accessibility-maturity-model/

The TL;DR is that I do believe there is value in having an accessibility-specific maturity model. Not mentioned in the article is that I also think it is appropriate for the WAI to have developed one.

For some context: we (AFixt) offer strategic consulting services for customers, and those services include a full assessment of the customer's accessibility program, measured against the dimensions and proof points of the W3C's Accessibility Maturity Model. We chose the AMM specifically because it is a W3C resource rather than a proprietary one; we want neither to reinvent the wheel nor to use a competitor's wheel. While we certainly have nitpicks with the AMM, it is the most appropriate resource for our needs. We have even built a web-based reporting system to deliver the reports, and we have created a "Program in a box" that helps customers meet the criteria of every proof point in the AMM.

On to the specific questions:

Is it clear how you might use the model?

Mostly, yes.
The AMM documentation explains the structure (seven "dimensions"), the role of proof points as evidence, and gives per-dimension "How to Evaluate ... Maturity Level" steps (identify the proof points you'll use, gather your organization's documentation against them, etc.). It also says outcomes that don't apply can be marked N/A, and that levels are cumulative (you advance only after meeting lower-level criteria). There is even an Excel assessment tool prototype to track the evaluation. However, the structure of the Excel file is a little confusing, and we found that some of its scoring was broken. Unfortunately, we did not document the specific problems we found; we just fixed them and moved on.

What's missing or could be clearer: there is no single "quick-start" flow that shows how to (a) pick a subset of dimensions for a first pass, (b) aggregate per-dimension results into an overall picture, or (c) weight dimensions by risk. The spreadsheet is flagged as experimental and Excel-only for now, so teams that need a web tool will have to roll their own (as I said above, we built one). Providing a one-page checklist and a worked example would help first-time users.

Is it clear how levels are used to define organizational maturity?

Yes. The AMM defines four levels (Inactive, Launch, Integrate, Optimize) and describes the progression plainly (e.g., Launch = recognized need and initial planning; Integrate = roadmap and organized approach; Optimize = embedded org-wide with continuous evaluation). It also notes that the terms mirror PDAA for consistency and that levels are cumulative. Per-dimension sections then restate what proof would demonstrate each level.
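As an aside, the cumulative-level rule mentioned above (you advance only after meeting lower-level criteria) is simple to operationalize. Here is a minimal Python sketch of how an assessor's tool might apply it; the completion threshold and the per-level completion fractions are my own illustrative assumptions, not something the AMM prescribes:

```python
# Illustrative sketch of the AMM's cumulative-level rule: a dimension sits at
# the highest level for which its proof points (and those of every lower
# level) are met. The 100% threshold here is an assumption for illustration;
# the AMM itself does not prescribe a numeric cut-off.

LEVELS = ["Inactive", "Launch", "Integrate", "Optimize"]

def dimension_level(completion_by_level, threshold=1.0):
    """Return the achieved level for one dimension.

    completion_by_level: fraction (0.0-1.0) of proof points met at each
    level above Inactive, e.g. {"Launch": 1.0, "Integrate": 0.6}.
    """
    achieved = "Inactive"
    for level in LEVELS[1:]:
        if completion_by_level.get(level, 0.0) >= threshold:
            achieved = level   # this level's gate is passed
        else:
            break              # cumulative rule: stop at the first unmet level
    return achieved

# Partial completion at Integrate blocks promotion, even with strong
# Optimize-level evidence -> "Launch"
dimension_level({"Launch": 1.0, "Integrate": 0.6, "Optimize": 0.9})
```

Making the rule this explicit is exactly the kind of thing that would help with the partial-completion ambiguity discussed below.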
Nuances that could use tightening: the line between Integrate and Optimize could benefit from more quantitative thresholds or concrete "evidence examples" (e.g., cadence of KPI reviews, percent of portfolio covered by checks), and from guidance on handling partial completion of proof points without over-promoting to the next level.

Are the dimensions broad enough to encompass all aspects of organizational maturity?

For ICT accessibility programs, yes. The seven dimensions are Communications, Culture, ICT Development Lifecycle, Knowledge & Skills, Personnel, Procurement, and Support. Each includes proof points and a "goals & metrics" concept, which together cover policy/governance, training, lifecycle integration, buying, and support: areas that many organizations miss.

Potential edge gaps / clarifications:

* Governance & strategy lives inside "Oversight & Culture" (financial commitment, policy) rather than as a standalone dimension. That's fine, but some readers may expect a discrete "Governance" bucket.
* Third-party lifecycle after purchase is implied under the Procurement proof points. Calling out ongoing vendor management explicitly would help. Plenty of organizations only ever procure ICT rather than developing their own, so vendor management is a much bigger focus for them.
* Cross-dimensional measurement exists as "Dimension Goals & Metrics," but there is no single, top-level measurement framework; that is left to the assessor.

Do you think the guidance works for different sizes and types of organizations?

In principle, yes, and the text says so. The Abstract claims it scales from solo consultancies to large enterprises and government, and the Scope says you can assess sub-units if you document the limited scope. The N/A convention plus the choose-your-proof-points approach makes it adaptable to different organization sizes. That said, smaller organizations may struggle: many proof points assume formal documentation, training programs, and KPIs.
For start-ups, micro-teams, and programs with limited scope, a lightweight "starter set" (minimal viable policy, one-page process, role-based quick training) and examples of acceptable evidence would make adoption easier. For the most part, the AMM carries an assumption that the implementing organization is huge. I'd estimate it is most relevant to an organization with $100M in revenue or higher; below that, the barrier to entry is too high, which is why we created the "Program in a box".

Are there any concepts or terms that need better explanation?

A few come to mind:

* Proof points. They're defined as evidence of maturity, but a small rubric or examples of "sufficient" vs. "insufficient" evidence would reduce assessor variance. (Right now, partial completion is mentioned only broadly.)
* Level criteria granularity. The high-level table is clear; adding per-dimension quantitative indicators (e.g., % of procurements with a11y language; % of digital products with pre-release a11y QA) would help calibrate "Integrate" vs. "Optimize."
* Acronyms/terms. The text references ACR in the ICT lifecycle; expanding it inline to "Accessibility Conformance Report" (even if defined in the Glossary) would reduce ambiguity. Also, terms like "communities of practice" and the split between "organizational support" vs. "external support" could each use a sentence of explanation in place.
* N/A usage guardrails. The AMM rightly allows marking outcomes as not applicable; a caution about overuse of "N/A" and a few examples would protect against inflated maturity.

The AMM gives you enough to run a defensible assessment now: pick relevant dimensions, gather proof points, determine each dimension's level using the per-dimension guidance, and (optionally) track it in the Excel tool. Adding a short getting-started guide, example evidence, and aggregation guidance would make it even easier and more consistent across assessors.
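To illustrate what aggregation guidance might look like, here is a small Python sketch that rolls per-dimension levels into a single risk-weighted score. The dimension and level names come from the AMM; the 0-3 scale, the weights, and the N/A handling are illustrative assumptions on my part, not part of the model:

```python
# Illustrative sketch: aggregating per-dimension AMM levels into one
# risk-weighted score. Level and dimension names follow the AMM; the numeric
# scale and the weights are assumptions for illustration only.

LEVELS = ["Inactive", "Launch", "Integrate", "Optimize"]  # maps to 0..3

def weighted_maturity(levels_by_dimension, weights):
    """Weighted average of per-dimension levels on a 0..3 scale.

    Dimensions marked "N/A" are excluded from both the numerator and the
    denominator, mirroring the AMM's N/A convention.
    """
    total, weight_sum = 0.0, 0.0
    for dim, level in levels_by_dimension.items():
        if level == "N/A":
            continue
        w = weights.get(dim, 1.0)  # unweighted dimensions default to 1.0
        total += w * LEVELS.index(level)
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0

# Example: an org that only procures ICT weights Procurement by risk.
assessment = {
    "Communications": "Launch",
    "Culture": "Launch",
    "ICT Development Lifecycle": "N/A",  # no in-house development
    "Knowledge & Skills": "Integrate",
    "Personnel": "Launch",
    "Procurement": "Integrate",
    "Support": "Inactive",
}
weights = {"Procurement": 3.0}  # illustrative risk weighting
score = weighted_maturity(assessment, weights)  # 11/8 = 1.375 on the 0..3 scale
```

Something this simple, published alongside the model, would go a long way toward consistent results across assessors.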
--
Karl Groves (he/him)
Digital Accessibility Consultant
AFixt: Accessibility + Fixed
karl.groves@afixt.com
+1 443-333-1255 #800
Received on Thursday, 11 September 2025 20:45:19 UTC