- From: Filip Ilievski <filip.dbrsk@gmail.com>
- Date: Tue, 30 Jan 2024 13:46:06 +0100
- To: public-lod@w3.org
- Message-ID: <CANbunvgSg6hxY2TM5vyP_kGR-RdwYiGNiycQS7y7upDrRJWL0A@mail.gmail.com>
Call for Papers: GeNeSy – The first international workshop on Generative Neuro-Symbolic AI (@ESWC 2024)
May 26/27, 2024, Hersonissos, Crete, Greece
https://sites.google.com/view/genesy2024/

The *GeNeSy* workshop aims to gather researchers in *Ge*nerative and *Ne*uro-*Sy*mbolic AI to combine expertise, perspectives, and pioneering work, and to pave the way towards novel methods and paradigms for Generative Neuro-Symbolic AI. GeNeSy will feature novel and already published papers on NeSy methods for reasoning and explanation in multiple modalities, benchmarks and evaluation methods, challenges such as commonsense reasoning and human-AI teaming, and reflections on the ethical and social implications of GenAI.

*Important dates*
- Article submission: March 7th, 2024
- Author notification: April 4th, 2024
- Camera-ready version: April 18th, 2024
- Workshop day: May 26 or 27, 2024

*Speakers*
- Frank van Harmelen, VU University Amsterdam
- Efthymia Tsamoura, Samsung AI Centre, Cambridge
- Sungjin Ahn, Korea Advanced Institute of Science and Technology (KAIST)

*Topics*
We invite research on different topics and challenges at the intersection of Generative and Neuro-Symbolic AI. The following list of topics is illustrative, not exhaustive.
- Neuro-Symbolic (NeSy) approaches for data generation, including (but not limited to) text; images and videos; audio; time series; and multimodal applications
- Methods for knowledge graph completion and knowledge-augmented explanation
- Neuro-symbolic methods for knowledge-augmented reasoning
- Neuro-symbolic methods for generative commonsense reasoning
- NeSy methods for data quality assessment and evaluation
- Review of Generative NeSy architectures and tasks
- Human-centric and cognitive Generative NeSy architectures
- Trustworthy methods for computational creativity in art and science
- Frameworks for the validation, verification, and adaptation of Generative AI outputs
- Applications and expected challenges for Generative NeSy methods
- Ethical and societal implications and case studies of Generative AI methods

*Article types*
In GeNeSy, we wish to stimulate the exchange of novel ideas and interdisciplinary perspectives. To this end, we will accept four different types of papers: technical, short, dissemination, and review.
- Technical papers will be judged on their technical soundness and rigour, though allowances will be made for novel or experimental directions.
- Short papers may be position papers or reports on new directions; less mature (but nonetheless technically sound) work will be considered.
- Review articles survey GeNeSy architectures and applications.
- Dissemination articles are already published papers from top AI venues such as NeurIPS, WebConf, AAAI, ICML, ICLR, ACL, and EMNLP that are relevant to the workshop.

All papers must be formatted using the CEUR Workshop Proceedings template and submitted electronically via EasyChair <https://easychair.org/conferences/?conf=genesy2024>.

Selected papers will be invited to submit to the Special Issue on “Knowledge Graphs and Neurosymbolic AI” of the *Neurosymbolic Artificial Intelligence* journal <https://neurosymbolic-ai-journal.com/>.
Please find more information on the CfP at this link <https://sites.google.com/view/genesy2024/call-for-papers>.

*Location*
GeNeSy is co-located with the Extended Semantic Web Conference (ESWC) 2024. More information on the venue, as well as travel information, can be found on the website: https://2024.eswc-conferences.org/

*Workshop organisers*
- Filip Ilievski (Vrije Universiteit Amsterdam)
- Jacopo de Berardinis (King’s College London, University of Manchester)
- Nitisha Jain (King’s College London)
- Jongmo Kim (King’s College London)

For questions, you can reach the workshop organisers at genesyworkshop2024@googlegroups.com

Kind regards,
Filip, Jacopo, Nitisha, Jongmo
https://sites.google.com/view/genesy2024/
Received on Tuesday, 30 January 2024 12:46:26 UTC