- From: Marcos Cáceres via GitHub <noreply@w3.org>
- Date: Thu, 09 Apr 2026 00:17:35 +0000
- To: public-device-apis-log@w3.org
Thanks for the pointer to the toolchain, @anssiko. I've read the README and the 2014 report carefully. I could regenerate a report using wpt.live and submit it to w3c/test-results, but I don't think that would meet the Council's requirement, and the 2014 report in that repository illustrates the limitation of the generated format.

### The problem with the generated format

A new run today via wpt.live would produce the same kind of output as the [2014 snapshot](https://w3c.github.io/test-results/vibration/20141118.html): pass/fail percentages across whatever browsers were tested. That format is designed for checking test conformance; it is not designed to answer the questions [W3C Process §6.3.2](https://www.w3.org/policies/process/#implementation-experience) requires the Team to consider when assessing adequate implementation experience. A score on a given date doesn't capture that Firefox later removed support entirely on desktop, or that its Android implementation was a stub that returned `true` without performing any vibration ([Bug 1653318](https://bugzilla.mozilla.org/show_bug.cgi?id=1653318)). Regenerating that format now would repeat the same structural limitation.

### What an implementation report actually needs to do

The [Council Report](https://www.w3.org/2025/08/vibration2-council-report.html#recommendations) asked the WG to *document* what implementation experience the API **currently** has (verbatim: "document what implementation experience the API currently has (issue 33)"). The §6.3.2 criteria call for an authored document that addresses each of those questions, not a snapshot of test pass rates. PR #55 addresses each §6.3.2 question directly, with a dedicated section for each:

- *Is each feature of the current specification implemented, and how is this demonstrated?* Yes, with automated and manual WPT results against the current spec.
- *Are there independent interoperable implementations?* Answered honestly: no, single engine (Blink).
- *Are implementations created by people other than the authors?* Yes, addressed.
- *Are implementations publicly deployed?* Yes, addressed.
- *Are there reports of difficulties or problems?* Yes, addressed.

The [Geolocation implementation report](https://w3c.github.io/geolocation/reports/implementation.html) uses the same structure: a DAS + WebApps joint deliverable published 27 February 2026 for the October 2025 Recommendation. If that is adequate for Geolocation, it is the right model for Vibration.

### The charter gate

Closing this PR without merging leaves [vibration#33](https://github.com/w3c/vibration/issues/33) open and the Council's condition unmet, a gate item for [charter-drafts#781](https://github.com/w3c/charter-drafts/issues/781). Submitting a pass/fail snapshot to w3c/test-results would not close that gate, because it does not answer what the Council asked.

If there are concerns about the *content* of the report, those should be raised as review comments. I'd ask that the PR be reopened for content review. @reillyeon has been pinged.

--
GitHub Notification of comment by marcoscaceres
Please view or discuss this issue at https://github.com/w3c/vibration/pull/55#issuecomment-4210563389 using your GitHub account

--
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config
Received on Thursday, 9 April 2026 00:17:36 UTC