Re: Chair Nomination: Wolfgang Wimmer

Wolfgang,

Thank you for the welcome and for initiating this work, and congratulations
on making a start.

I have been working on closely related challenges -- including how websites
and AI agents negotiate capabilities, permissions, and identification in
machine-consumable ways.

*On the problem statement (Q1):*
The gap between what robots.txt can express and what AI agents actually
need to negotiate is well documented. robots.txt controls access but not
usage, has no semantic layer, and cannot distinguish between crawling for
training versus real-time retrieval versus agentic interaction. Proposals
like ai.txt and llms.txt address parts of this, but a structured JSON
policy file like siteai.json could provide the richer, machine-parseable
expressiveness that the current landscape lacks. By the way, do you think
this explainer is useful?
 https://github.com/w3c-cg/aikr/blob/main/robot_text_explainer.md
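To make the "richer, machine-parseable expressiveness" point concrete, here is a minimal sketch of how an agent might evaluate a purpose-scoped policy. This is purely illustrative: siteai.json has no published schema yet, so the field names (version, policies, purpose, allow) are my own assumptions, not part of any draft.

```python
import json

# Hypothetical policy document; the schema below is an assumption for
# illustration, not a proposed or published siteai.json format.
SAMPLE_POLICY = """
{
  "version": "0.1",
  "policies": [
    {"purpose": "training", "allow": false},
    {"purpose": "retrieval", "allow": true, "attribution": "required"},
    {"purpose": "agentic-interaction", "allow": true, "rate_limit_rps": 1}
  ]
}
"""

def is_allowed(policy: dict, purpose: str) -> bool:
    """Return whether a given usage purpose is permitted by the policy.

    Purposes with no matching rule default to disallowed (a conservative
    default; a real spec would need to state this explicitly).
    """
    for rule in policy.get("policies", []):
        if rule.get("purpose") == purpose:
            return bool(rule.get("allow", False))
    return False

policy = json.loads(SAMPLE_POLICY)
print(is_allowed(policy, "training"))   # False
print(is_allowed(policy, "retrieval"))  # True
```

The point of the sketch is the per-purpose distinction (training vs. retrieval vs. agentic interaction) that robots.txt cannot express.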

*On use cases and requirements (Q2):*
I'll be happy to contribute some interoperability dimensions that may
enrich the spec.

*On related efforts to coordinate with (Q3):*
Several active efforts overlap with A2WF's scope:

- W3C AIKR Community Group -- we have published Technical Notes on
machine-consumable specs and are actively working on WebMCP
interoperability. Our work on the MCP Model Card Specification addresses
how AI systems declare their own capabilities and constraints, which is the
complementary side of what siteai.json addresses.
- W3C AI Agent Protocol Community Group -- their work on agent discovery,
identification, and collaboration protocols is directly adjacent.
- W3C Autonomous Agents on the Web (WebAgents) CG -- focused on Web-based
multi-agent systems aligned with Web Architecture.
- MCP (Model Context Protocol) ecosystem -- the emerging WebMCP work and
related IETF drafts (VCAP, ATEP) address protocol-level interoperability
that siteai.json would need to integrate with.
- The IETF well-known URI registration for agent.json (
github.com/protocol-registries/well-known-uris/issues/66) -- relevant to
the discovery mechanism for siteai.json.
- The ai.txt proposal and llms.txt convention -- both address subsets of
the same problem space.
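On the discovery point: if siteai.json followed the well-known URI pattern (RFC 8615), as the agent.json registration proposes for that file, an agent could derive the policy location from any page URL on a site. A minimal sketch; the path `/.well-known/siteai.json` is my assumption, since no such registration exists yet.

```python
from urllib.parse import urlsplit, urlunsplit

# Well-known discovery pattern per RFC 8615: the policy lives at a fixed
# path under the site's origin. The specific path "/.well-known/siteai.json"
# is hypothetical; nothing is registered for siteai.json today.
def well_known_policy_url(site_url: str) -> str:
    """Map any page URL on a site to its origin's candidate policy URL."""
    parts = urlsplit(site_url)
    return urlunsplit(
        (parts.scheme, parts.netloc, "/.well-known/siteai.json", "", "")
    )

print(well_known_policy_url("https://example.org/blog/post?id=7"))
# https://example.org/.well-known/siteai.json
```

Reusing the well-known mechanism would let siteai.json inherit an existing, widely implemented discovery convention rather than inventing a new one.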

In essence, from the AIKR point of view, the proposed A2WF approach aligns
with knowledge representation in AI: a structured JSON policy file is
essentially a formal ontology of website intent -- it makes implicit human
expectations about agent behavior explicit, machine-parseable, and amenable
to inference, which is the core role of KR.


Looking forward to hearing others' thoughts on everything.

Best regards

Paola Di Maio, AI KR CG


On Tue, Mar 31, 2026 at 9:14 PM Wolfgang Wimmer <wwimmer@ssc-slovakia.com>
wrote:

> Hello everyone,
>
> As the original proposer of the A2WF Community Group, I would like to put
> my name forward as Chair.
>
> A bit of background: I initiated the A2WF project to address the lack of
> machine-readable governance for AI agent
> interactions on websites. The current draft specification is available at
> https://a2wf.org/specification/
> and the source repository is at https://github.com/a2wf/spec.
>
> As Chair, my priorities would be:
>
> - Establishing a regular meeting cadence (likely biweekly calls)
> - Collecting community input on the draft specification
> - Coordinating with related efforts at IETF (AIPREF), NIST (CAISI), and
> other W3C groups
> - Working toward a first Community Group Report
>
> If there are no objections, I would be happy to take on this role. If
> anyone else is interested in
> serving as Chair or Co-Chair, please speak up. Shared leadership is very
> welcome.
>
> Best regards,
>
> Wolfgang Wimmer
> --
>
>
> www.SSC-Slovakia.com
>
> SSC Sales consulting co.ks.
> Panonska cesta 47, 851 04 Bratislava, SK
>
>
> *Mobile: +43 676 455 34 85 *
>

Received on Friday, 3 April 2026 12:06:39 UTC