- From: Owen Ambur <Owen.Ambur@verizon.net>
- Date: Sat, 1 Sep 2018 23:13:36 -0400
- To: "'ProjectParadigm-ICT-Program'" <metadataportals@yahoo.com>, "'Dave Raggett'" <dsr@w3.org>
- Cc: <public-aikr@w3.org>
- Message-ID: <009501d4426a$f0aaa9b0$d1fffd10$@verizon.net>
Milton, I believe we have very similar objectives. With respect to digital dashboards, see https://www.linkedin.com/pulse/non-interoprable-dashboards-owen-ambur/

Regarding open science, if you don't already know about the Center for Open Science, you can check them out at http://stratml.us/drybridge/index.htm#COS

However, with regard to treaties, technical documents, etc., I hope we can avoid "paving the cowpath" <http://designingsocialinterfaces.com/patterns/Pave_the_Cowpaths>. Several years ago the World Bank discovered and admitted that hardly anyone was reading their reports: https://mises.org/wire/world-bank-admits-no-one-reads-its-research

Meanwhile, the so-called *Open* Government Partnership continues to publish its plans <https://www.opengovpartnership.org/resources/ogp-process-step-2-develop-action-plan> and reports <https://www.opengovpartnership.org/resources/ogp-annual-reports> in PDF. We can and should do better than that. See https://en.wikipedia.org/wiki/Machine-Readable_Documents

Although I believe we have common and complementary objectives, we won't really know that unless and until the AIKRCG has a performance plan along the lines of the one outlined at http://stratml.us/drybridge/index.htm#AIKRCG

Too often, those who believe they are collaborating are really just talking past each other ... because they don't have a clear and shared understanding of the performance indicators that will tell the tale of whether or not they're making any progress.

Owen

From: ProjectParadigm-ICT-Program <metadataportals@yahoo.com>
Sent: Tuesday, August 28, 2018 5:30 PM
To: Dave Raggett <dsr@w3.org>; Owen Ambur <Owen.Ambur@verizon.net>
Cc: public-aikr@w3.org
Subject: Re: AIKRCG & StratML

Hi,

My take, in narrow technical terms, is to provide universal digital dashboards for:

(1) Collaboration between stakeholders in sustainable development at the national, regional, and international levels. Realizing the United Nations' Sustainable Development Goals depends heavily on so-called partnerships, which are strategic alliances with common objectives and concrete goals and timelines.

(2) Open and inclusive science and innovation, which is a main directive of both the United Nations and the European Union.

(3) Treaty monitoring, which requires that treaty texts, technical committee documents, General Assembly resolutions, working papers, and half a dozen other types of deliverables and outputs from periodic meetings serve as a frame of reference for action, timelines, and deliverables.

Recapping: collaboration, universal collaboration dashboards, and universal hierarchies to deal with all the UN documents, and also with issues related to open science, innovation, and stakeholder participation and interaction.

Thus StratML, conversational agents, big-data mining for sustainable development, standardized ontologies for open and inclusive collaboration and open science, and, most of all, structured ontology tools for dealing with a myriad of verbose documents are necessary for my take on AI and KR.

The United Nations is HIGH on intent but LOW on technical content, and it is mostly UNESCO and librarians pushing the technological agenda. But both the United Nations and the European Union stress the human(e) use of AI, ethics, and the socially responsible use of AI.

So I feel that within this AI-KR CG we can potentially provide valuable contributions toward achieving goals in line with UN and EU directives, and also toward general (commercial) progress in the fields of AI and KR.

Milton Ponson
GSM: +297 747 8280
PO Box 1154, Oranjestad
Aruba, Dutch Caribbean

Project Paradigm: Bringing the ICT tools for sustainable development to all stakeholders worldwide through collaborative research on applied mathematics, advanced modeling, software and standards development
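[Editor's illustration] To make the machine-readability point in Owen's reply concrete, here is a minimal Python sketch of how a StratML-style plan can be queried directly in a way a PDF cannot. It is a sketch under stated assumptions, not a definitive tool: it assumes the plan's goals appear as Goal elements with Name children (common StratML Part 1 usage), it ignores XML namespaces rather than committing to a particular schema version, and it uses the AIKRCG plan URL cited above as its example input.

```python
import urllib.request
import xml.etree.ElementTree as ET

def local(tag):
    """Strip any XML namespace so elements can be matched by local name."""
    return tag.rsplit('}', 1)[-1]

def list_goals(url):
    """Print the Name of each Goal element found in a StratML-style plan."""
    with urllib.request.urlopen(url) as response:
        root = ET.fromstring(response.read())
    for elem in root.iter():
        if local(elem.tag) == 'Goal':
            # Take the first Name child of the Goal as its title (assumed layout).
            name = next((child.text for child in elem if local(child.tag) == 'Name'), None)
            if name:
                print(name.strip())

# Example input: the AIKRCG plan linked in the message above.
list_goals('http://stratml.us/carmel/iso/AIKRCGwStyle.xml')
```

The same element-level access would let a dashboard pull objectives and performance indicators straight from the published plan, which is the contrast with PDF that Owen is drawing.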
On Tuesday, August 28, 2018 1:13 PM, Dave Raggett <dsr@w3.org> wrote:

Hello Owen,

> On 27 Aug 2018, at 03:28, Owen Ambur <Owen.Ambur@verizon.net> wrote:
>
> Dave, my interest is in helping people achieve their objectives. I joined the Artificial Intelligence Knowledge Representation Community Group to explore potential relationships to the StratML standard (ISO 17469-1), whose vision is: A worldwide web of intentions, stakeholders, and results.
>
> It seems to me the vision of the AIKRCG <http://stratml.us/carmel/iso/AIKRCGwStyle.xml#_63f01ac6-83e9-11e8-9a9a-23c8e53a5ccc> – knowledge is exchanged and reused to enable learning and participation – is closely related to the purposes of StratML. If you'd like to share the outlines of your proposal, I'd love to include it in the StratML collection.

My approach is inspired by CMU's long-standing work on ACT-R, one of the best-known cognitive architectures, so you could start by looking at that; see: http://act-r.psy.cmu.edu/about/

How would you expect to include that?

With respect to combining deep learning with symbolic processing, one reference is:

"Towards Deep Symbolic Reinforcement Learning", 1 Oct 2016
Marta Garnelo, Kai Arulkumaran, Murray Shanahan
https://arxiv.org/abs/1609.05518

The notion of conversational agents that help people achieve their objectives is particularly appealing. I am considering arranging a W3C Workshop next year on standardisation opportunities for conversational agents, with the idea of making it easier for organisations to embed text- or voice-based conversational agents within web pages. AIML is one example of a markup language for conversations. Amazon and Google have demonstrated the potential for intent-based conversations for their respective smart-speaker ecosystems. They all use simplistic approaches as a way to avoid handling meaning in a rich and general way.

Best regards,

Dave Raggett <dsr@w3.org>
http://www.w3.org/People/Raggett
W3C Data Activity Lead & W3C champion for the Web of things
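[Editor's illustration] As a rough sketch of what "intent-based conversations" means in the simplistic sense Dave describes, the following Python snippet maps utterances to intents by keyword patterns rather than handling meaning in any rich or general way. The intent names, patterns, and canned responses are hypothetical and are not drawn from AIML or from any vendor's platform.

```python
import re

# Hypothetical intents, each recognised by a simple keyword pattern.
INTENTS = {
    "show_progress": re.compile(r"\b(progress|status|indicator)\b", re.I),
    "list_goals":    re.compile(r"\b(goals?|objectives?)\b", re.I),
}

# Canned responses per intent; None covers unrecognised utterances.
RESPONSES = {
    "show_progress": "Here are the latest performance indicators.",
    "list_goals":    "These are the goals in the current plan.",
    None:            "Sorry, I did not understand that.",
}

def classify(utterance):
    """Return the first intent whose pattern matches, or None."""
    for intent, pattern in INTENTS.items():
        if pattern.search(utterance):
            return intent
    return None

if __name__ == "__main__":
    for text in ["What goals are in the plan?", "Any progress on indicator 2?"]:
        print(text, "->", RESPONSES[classify(text)])
```

Pattern matching of this kind is exactly the "simplistic approach" Dave contrasts with richer handling of meaning, which is where knowledge representation would come in.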
Received on Sunday, 2 September 2018 03:13:59 UTC