Re: Why LLMs are bound to fail

On Fri, 9 Aug 2024 at 03:09, Dave Raggett <dsr@w3.org> wrote:

>
> On 8 Aug 2024, at 17:49, Timothy Holborn <timothy.holborn@gmail.com>
> wrote:
>
> I don't think it's safe to work on consciousness tech.  Good application
> development seems to get shut down, whilst the opposite appears to be the
> case for exploitative commodification use cases.
>
>
> I don’t agree, based upon a different conceptualisation of what it might
> mean for an AI system to be sentient, i.e. a system that is aware of its
> environment, goals and performance. Such systems need to perceive their
> environment, remember the past and be able to reflect on how well they are
> doing with respect to their goals when it comes to deciding on their actions.
>
> That is pretty concrete with respect to technical requirements. It is also
> safe, given the limits on such AI systems' ability to grow their
> capabilities. Good-enough AI systems won’t need huge resources, as they
> will be sufficient for the tasks they are designed for, just as a nurse
> working in a hospital doesn’t need Ph.D.-level knowledge of biochemistry.
>
> Dave Raggett <dsr@w3.org>
>
>
I support your work; I think it's important, obviously done within W3C,
etc. Disagreement, or the capacity for it, is a good and important part
of qualia overall... a feature, not a bug.
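
On the definitional point, the loop you describe (perceive the
environment, remember the past, reflect on performance against goals,
then act) is concrete enough to sketch in code. A minimal toy in Python
follows; the Agent class, the dict environment and the proportional
update are all my own illustrative assumptions, not anything from your
work:

from dataclasses import dataclass, field

@dataclass
class Agent:
    goal: float                                 # target the agent aims for
    memory: list = field(default_factory=list)  # remembered past observations

    def perceive(self, environment: dict) -> float:
        # read the current state of the environment
        return environment["state"]

    def reflect(self, observation: float) -> float:
        # remember the past, then judge performance against the goal
        self.memory.append(observation)
        return self.goal - observation          # signed error: how are we doing?

    def act(self, error: float) -> float:
        # decide on an action from the reflection (a simple proportional step)
        return 0.5 * error

# toy usage: the agent nudges the environment's state toward its goal
env = {"state": 0.0}
agent = Agent(goal=10.0)
for _ in range(20):
    error = agent.reflect(agent.perceive(env))
    env["state"] += agent.act(error)
print(round(env["state"], 3))                   # converges on 10.0

the point being only the structure (sensing, memory, self-evaluation
against a goal, then action); a real system would swap in far richer
versions of each part, per your note that good-enough systems don't need
huge resources.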

A relative, John Carew Eccles, started me on a journey from 2000, working
on applying theory to 'online data storage', which over time led me
here... although the field of interest, from a sociology / social
sciences point of view, goes back several years earlier. A bit like
living with a prosthetic eye since the inability to diagnose the problem
without more advanced tech in 1979/80: experience influences life, areas
of interest, perspectives, etc.

Some notes:

https://cdn.knightlab.com/libs/timeline3/latest/embed/index.html?source=1r-bo83ImIEjSCmOFFMcT7F79OnCHDOGdkC_g9bOVFZg&font=Default&lang=en&hash_bookmark=true&initial_zoom=4&height=750#event-consciousness-qm-ai-studies-video-edition

best.

timo.
