Re: consciousness, and AI KR

@Dave, we are not trying to convince anyone of anything, afaik :p
Everyone should satisfy themselves as they choose (....) We try to make
plausible assertions that can be useful for navigating the complexity of
this discussion space.
@Patrick, thank you. Please pick out some terms that you think should be
defined to support the discussion, and let's start a thread. Let's pin
something down; if we can manage to define some terminology, we can
consider that a resource and take a step forward.

On Tue, Oct 24, 2023 at 12:33 AM Patrick Logan <patrickdlogan@gmail.com>
wrote:

> There are several terms here without even semi-formal definitions that are
> doing a lot of work, i.e. your claims are vague and difficult to discuss
> clearly, let alone measure and assess.
>
> Given the wide latitude of interpretation, it's especially bold to pose a
> false dichotomy: either one agrees with your "facts" or one is relying on
> "faith".
>
> On Mon, Oct 23, 2023, 10:42 AM Dave Raggett <dsr@w3.org> wrote:
>
>> From the AI KR and computational view, consciousness isn’t a hard
>> problem. Subjective experience distils to information processing within
>> systems of neurons. Redness is just a vector of neural activation. Agents
>> have situational awareness, i.e. a model of their current environment and
>> goals, enabling them to decide what actions to take. This also includes
>> models of other agents’ beliefs and goals, i.e. a theory of mind. Agents
>> also benefit from a model of past, present and future, i.e. a functional
>> episodic memory that complements encyclopaedic memory of general facts,
>> such as "birds fly" and "dogs bark". Episodic memory enables agents to
>> reason about cause and effect, to understand intent, and to create and
>> adapt plans.
>>
>> However, this won’t convince everyone. Plenty of people have beliefs
>> that are a matter of faith rather than of fact. That’s fine. But
>> engineering and science don’t work that way! AI will continue to evolve,
>> and AGI is just a matter of time. I attach a picture that makes the
>> point: a stochastic synthesis of ideas, as evidence that artistic
>> sensibility can be reduced to neural processing.
>>
>>
>> > On 22 Oct 2023, at 05:38, Paola Di Maio <paola.dimaio@gmail.com> wrote:
>> >
>> >
>> > Consciousness is too huge a topic. It is undecidable; too much can be
>> > said about it without ever reaching any conclusion, possibly because no
>> > single theory or point of view can exhaust the subject. However, I'd
>> > like to suggest simply that it be tackled only in relation to AI KR.
>> > Surely consciousness is relevant to AI, to KR discussion, and to
>> > potential standards. We should keep that in mind where possible and
>> > parsimoniously limit our considerations accordingly.
>> >
>> > I'll leave it to Carl to liaise with the WoT group, since he is a
>> > member there and brought up the subject.
>> > I'll work on tidying up some of the resources shared on the list into
>> > some form of coherent narrative when I can; that is my next task.
>>
>> Dave Raggett <dsr@w3.org>
>>
>>
>>
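
To make the agent model described in Dave's message above a little more
concrete, here is a minimal, hypothetical Python sketch (not from the
thread; all class and field names are illustrative assumptions) of an
agent with a world model for situational awareness, per-agent belief
models as a simple theory of mind, and episodic memory alongside
encyclopaedic memory:

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Episode:
        # One remembered event: what was observed, what was done, what followed.
        observation: str
        action: str
        outcome: str

    @dataclass
    class Agent:
        # Situational awareness: the agent's current model of its environment
        # and its own goals.
        world_model: Dict[str, str] = field(default_factory=dict)
        goals: List[str] = field(default_factory=list)
        # Theory of mind: models of other agents' beliefs and goals.
        other_agents: Dict[str, Dict[str, List[str]]] = field(default_factory=dict)
        # Encyclopaedic memory: general facts such as "birds" -> "fly".
        encyclopaedic: Dict[str, str] = field(default_factory=dict)
        # Episodic memory: a record of past situations, supporting crude
        # cause-and-effect reasoning and plan adaptation.
        episodes: List[Episode] = field(default_factory=list)

        def remember(self, observation: str, action: str, outcome: str) -> None:
            self.episodes.append(Episode(observation, action, outcome))

        def expected_outcomes(self, action: str) -> List[str]:
            # What outcomes followed this action in past episodes?
            return [e.outcome for e in self.episodes if e.action == action]

    # Example use of the sketch:
    agent = Agent(encyclopaedic={"birds": "fly", "dogs": "bark"})
    agent.remember("door closed", "push door", "door opens")
    print(agent.expected_outcomes("push door"))  # ['door opens']

This is only a toy data-structure view; a real agent would of course need
perception, planning and learning components on top of these stores.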

Received on Tuesday, 24 October 2023 00:33:53 UTC