- From: Milton Ponson <rwiciamsd@gmail.com>
- Date: Mon, 24 Jun 2024 12:08:33 -0400
- To: paoladimaio10@googlemail.com
- Cc: W3C AIKR CG <public-aikr@w3.org>
- Message-ID: <CA+L6P4wyWXA6j_9PcrGGRqBpSL_YULYxabKgC8fHLGUce1pKNQ@mail.gmail.com>
Seems like more people have been exploring the same concepts. This GAIA model restricts itself to generative AI, but it can be generalized using category theory to all forms of knowledge representation. By combining constructibility theory, category theory, symplectic geometry, and algebraic topology, we can define universes of discourse that cover most knowledge representation methodologies using generalized graph concepts.

Milton Ponson
Rainbow Warriors Core Foundation
CIAMSD Institute-ICT4D Program
+2977459312
PO Box 1154, Oranjestad
Aruba, Dutch Caribbean

On Fri, Jun 21, 2024 at 11:43 PM Paola Di Maio <paola.dimaio@gmail.com> wrote:
>
> Greetings, W3C AI KR CG, Happy Solstice
>
> A great way to start the summer by reading this paper.
> In my view this is a contribution towards neurosymbolic AI/KR and a clear
> signal.
>
> Enjoy
> ----------------------------------------------------------------
>
> GAIA: Categorical Foundations of Generative AI
> by Sridhar Mahadevan
> https://arxiv.org/pdf/2402.18732
>
> In this paper, we propose GAIA, a generative AI architecture based on
> category theory. GAIA is based on a hierarchical model where modules are
> organized as a simplicial complex. Each simplicial complex updates its
> internal parameters based on information it receives from its superior
> simplices and in turn relays updates to its subordinate sub-simplices.
> Parameter updates are formulated in terms of lifting diagrams over
> simplicial sets, where inner and outer horn extensions correspond to
> different types of learning problems. Backpropagation is modeled as an
> endofunctor over the category of parameters, leading to a coalgebraic
> formulation of deep learning.
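[Editor's note: the abstract's top-down update scheme, where each simplex adjusts its parameters from its superior simplex and relays the result to its subordinates, can be caricatured in a few lines of Python. This is a purely illustrative toy under assumed names (Simplex, update, scalar parameters), not the paper's actual construction, which uses lifting diagrams and horn extensions over simplicial sets.]

```python
# Toy sketch of a hierarchical, top-down parameter update over a
# simplicial hierarchy. All names and the blending rule are
# illustrative assumptions, not taken from the GAIA paper.

class Simplex:
    def __init__(self, name, param=0.0, children=None):
        self.name = name            # label for this simplex
        self.param = param          # scalar stand-in for internal parameters
        self.children = children or []  # subordinate sub-simplices

    def update(self, signal, rate=0.5):
        """Blend the signal from the superior simplex into this
        simplex's parameter, then relay the result downward."""
        self.param += rate * (signal - self.param)
        for child in self.children:
            child.update(self.param, rate)

# A 2-simplex ("triangle") with two subordinate faces.
leaf_a = Simplex("edge-a")
leaf_b = Simplex("edge-b")
top = Simplex("triangle", param=1.0, children=[leaf_a, leaf_b])

# An update arriving at the top propagates through the hierarchy.
top.update(signal=3.0)
```

After the call, `top.param` moves halfway toward the signal (1.0 → 2.0), and each leaf moves halfway toward the new top value (0.0 → 1.0), mimicking the superior-to-subordinate relay the abstract describes.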
Received on Monday, 24 June 2024 16:08:50 UTC