Re: AI works best when humans are there to hold its hand.

> On 28 Jun 2020, at 11:22, Paola Di Maio <paola.dimaio@gmail.com> wrote:
> 
> David 
> 
> just seen this - tried to join your CG a couple of times but it's not happening.
> Pinged the sysadmin today. So swamped -
> 
> Now, to that diagram, how fun!! Where did you get the https://www.w3.org/Data/demos/chunks/robot/ sound from?

I found free-to-use sound clips via a web search and modified them for use in a web browser.

> its the sound effect that does the trick

Thanks.

> However, I must admit I don't see the cognitive level. Perhaps you could tell us more about the cognitive aspect of this robot?
> Where is the cognitive modelling?

Cognition encompasses declarative and procedural knowledge. In this demo I focused on modelling the behaviour, but I also sketched the associated declarative knowledge - try expanding the facts graph to view it. This could be used for validating rules as well as for synthesising rules to fulfil new requirements when reconfiguring the factory.
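
To make that concrete, here is a rough TypeScript sketch of declarative knowledge as chunks - typed collections of named properties. The names and data are illustrative only, not the demo's actual API:

  // Illustrative sketch only - not the demo's actual code
  type ChunkValue = string | number | boolean;

  interface Chunk {
    type: string;                              // e.g. "bottle", "belt"
    id: string;                                // unique chunk identifier
    properties: Record<string, ChunkValue>;
  }

  // A tiny facts graph for the bottling scenario (hypothetical data)
  const facts: Chunk[] = [
    { type: "belt", id: "belt1", properties: { carries: "bottle", speed: 0.5 } },
    { type: "bottle", id: "bottle3", properties: { on: "belt1", full: false } },
  ];

  // Recall chunks of a given type whose properties match a pattern
  function recall(type: string, match: Record<string, ChunkValue>): Chunk[] {
    return facts.filter(c =>
      c.type === type &&
      Object.entries(match).every(([k, v]) => c.properties[k] === v));
  }

  console.log(recall("bottle", { full: false }));  // -> the bottle3 chunk

Rules can then be validated against such a graph, or synthesised from it, since rule conditions and actions are themselves expressed as chunks.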

Note that the demo includes external functions that essentially correspond to things that would be handled by the cortico-cerebellar circuit. The movement of the robot arm involves real-time coordination of three joints as well as the gripper. You wouldn’t be able to play the piano if you had to consciously think about the position of each finger. The cortico-basal ganglia circuit devolves responsibility for actions to the cortico-cerebellar circuit, which provides real-time control based upon sensory input in the cortex, independent of conscious thought.
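
As a rough illustration of that division of labour, consider the following TypeScript sketch: a rule action sets a target pose and a callback, while a separate loop runs every animation frame to interpolate the joints. Again, the names are hypothetical rather than the demo's own:

  // Illustrative sketch only - not the demo's actual code
  type Pose = { shoulder: number; elbow: number; wrist: number; grip: number };

  let current: Pose = { shoulder: 0, elbow: 0, wrist: 0, grip: 1 };
  let target: Pose | null = null;
  let onReached: (() => void) | null = null;

  // Invoked from a rule action: deliberate and infrequent
  function moveArm(pose: Pose, done: () => void): void {
    target = pose;
    onReached = done;
  }

  // Invoked every animation frame: fast, automatic, no rules involved
  function controlStep(dt: number): void {
    const t = target;
    if (!t) return;
    const maxStep = 2.0 * dt;  // cap the angular change per frame
    let settled = true;
    for (const joint of Object.keys(current) as (keyof Pose)[]) {
      const delta = t[joint] - current[joint];
      current[joint] += Math.max(-maxStep, Math.min(maxStep, delta));
      if (Math.abs(delta) > 1e-3) settled = false;
    }
    if (settled) { target = null; onReached?.(); onReached = null; }
  }

The point is that cognition only decides where the gripper should go; the frame-by-frame coordination happens below the level of rule execution.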

The cognitive model includes the means to concurrently wait for conditions to become true whilst reasoning about other things.  As an example, the robot arm needs to wait for a bottle to reach the end of the belt before grasping it. The bottle may already be at the end of the belt or this may happen at some time in the future.

That is like being able to handle an event that may have already happened or may happen in the future. The cognitive agent signals that it is waiting, and when the condition becomes true, a chunk is pushed to the goal queue to trigger the appropriate follow-on behaviour. The robot arm is treated similarly: the cognitive agent signals the desired location and orientation of the gripper, along with the subgoal to be queued once that has been realised.
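
In TypeScript, the waiting mechanism might look something like the sketch below - hypothetical names again, not the demo's implementation:

  // Illustrative sketch only - not the demo's actual code
  type Goal = { type: string; properties: Record<string, unknown> };

  const goalQueue: Goal[] = [];
  const watchers: { test: () => boolean; goal: Goal }[] = [];

  // Wait until test() holds, then queue the given goal chunk
  function waitFor(test: () => boolean, goal: Goal): void {
    if (test()) {
      goalQueue.push(goal);           // condition already true
    } else {
      watchers.push({ test, goal });  // re-check on future updates
    }
  }

  // Called whenever the world model changes, e.g. the belt advances
  function onUpdate(): void {
    for (let i = watchers.length - 1; i >= 0; i--) {
      if (watchers[i].test()) {
        goalQueue.push(watchers[i].goal);
        watchers.splice(i, 1);
      }
    }
  }

  // The arm waits for a bottle at the end of the belt before grasping
  let bottleAtEnd = false;
  waitFor(() => bottleAtEnd, { type: "grasp", properties: { object: "bottle3" } });
  bottleAtEnd = true;
  onUpdate();  // the grasp goal is now queued

Either way the rule engine keeps running; the wait costs nothing until the condition is met.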

There are more details at:
 https://github.com/w3c/cogai/blob/master/demos/robot/README.md

A good question is how to acquire procedural knowledge in the form of rules. There has been plenty of research on this. In some cases, people start by creating and refining a declarative model, and when that has been shown to work well, compile it into rules. See:

 https://www.w3.org/Data/demos/chunks/chunks.html#compilation
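
As a toy illustration of the idea (not the algorithm described there), a declarative model of state transitions can be mechanically compiled into condition/action rules:

  // Illustrative sketch only - not the spec's compilation algorithm
  type Transition = { from: string; event: string; to: string };
  type Rule = { matches: (state: string, ev: string) => boolean; fire: () => string };

  // Declarative model: easy to author, inspect and validate
  const model: Transition[] = [
    { from: "idle", event: "bottle-at-end", to: "grasping" },
    { from: "grasping", event: "gripper-closed", to: "lifting" },
  ];

  // Compilation: each transition becomes a procedural rule
  function compile(transitions: Transition[]): Rule[] {
    return transitions.map(t => ({
      matches: (state: string, ev: string) => state === t.from && ev === t.event,
      fire: () => t.to,
    }));
  }

  const rules = compile(model);

  // Apply the first matching rule to advance the state
  function step(state: string, ev: string): string {
    return rules.find(r => r.matches(state, ev))?.fire() ?? state;
  }

  console.log(step("idle", "bottle-at-end"));  // -> "grasping"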

Also take a look at the diagram for the theory of skill retention, which distinguishes declarative from procedural knowledge and highlights the importance of practice for reinforcing skills.

> To me, this is pure mechanical automation; I do not see any intelligence or creativity in such a process. Mechanical automation has become very sophisticated these days, and very fast!

Try teaching a robot to dance and not fall over, or take a close look at very young infants learning to move, grab things and keep their balance!  This involves a wide range of systems, including cognition, proprioception, learning declarative and procedural knowledge, and the acquisition of “muscle memory” through repetition.

> 
> https://www.youtube.com/watch?v=4DKrcpa8Z_E
> 
> Your simulation is fun, but as far as I know it is nowhere near the state of the art in the real world - maybe you can say a bit more.

It is only a simple demo, but it shows the potential for a general-purpose cognitive agent. How many RDF systems are used for real-time control? The longer-term technical aims for Cognitive AI are listed at:

 https://github.com/w3c/cogai/blob/master/README.md#technical-aims

Each demo is a small step along the path.  

> 
> I am interested in automating higher cognitive functions. For example, one of the challenges would be to create new designs. And no, I don't think ANNs can do that; they only spit out a probabilistic remodelling of some input.

You would be very welcome to help with work on skill acquisition. However, given the importance of emotion for creativity, I would defer work on artistic creativity until we have first mastered other areas, including the role of emotions in controlling cognition, as noted by Marvin Minsky.

I have an outline of a framework for emotions, involving a feedforward network and back propagation, and plan to work on it after progressing work on natural language understanding and machine learning. The main reason for doing things in that order is that human emotions are usually related to social interactions, so we need to model those first.
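
For readers unfamiliar with the terms, here is the smallest possible feedforward unit trained by gradient descent - the degenerate single-neuron case of back propagation. It is purely illustrative and says nothing about the emotions framework itself:

  // Illustrative sketch only: one sigmoid neuron learning logical OR
  const sigmoid = (x: number) => 1 / (1 + Math.exp(-x));

  let w = [0.1, -0.1];
  let b = 0;
  const lr = 0.5;  // learning rate

  // Toy training data: OR of two binary inputs
  const data: { x: [number, number]; y: number }[] = [
    { x: [0, 0], y: 0 }, { x: [0, 1], y: 1 },
    { x: [1, 0], y: 1 }, { x: [1, 1], y: 1 },
  ];

  for (let epoch = 0; epoch < 2000; epoch++) {
    for (const { x, y } of data) {
      const out = sigmoid(w[0] * x[0] + w[1] * x[1] + b);  // forward pass
      const grad = (out - y) * out * (1 - out);            // error gradient
      w[0] -= lr * grad * x[0];                            // weight updates
      w[1] -= lr * grad * x[1];
      b -= lr * grad;
    }
  }

  console.log(data.map(d => sigmoid(w[0] * d.x[0] + w[1] * d.x[1] + b).toFixed(2)));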

> 
> Tell us more about the cognitive model behind your wine-filling robotic arm
> 
> Feature request: a robot that can fold origami following the algo
> 
> p
> 
> On Tue, Jun 16, 2020 at 4:22 AM Dave Raggett <dsr@w3.org> wrote:
> See also:
> 
>> An understanding of AI’s limitations is starting to sink in
>> After years of hype, many people feel AI has failed to deliver, says Tim Cross
> 
> https://www.economist.com/technology-quarterly/2020/06/11/an-understanding-of-ais-limitations-is-starting-to-sink-in
> 
> Including:
> 
>> Real managers in real companies are finding AI hard to implement and that enthusiasm is cooling
> 
> and this:
> 
>> They are powerful pattern-recognition tools, but lack many cognitive abilities that biological brains take for granted. They struggle with reasoning, generalising from the rules they discover, and with the general-purpose savoir faire that researchers, for want of a more precise description, dub “common sense”. The result is an artificial idiot savant that can excel at well-bounded tasks, but can get things very wrong if faced with unexpected input.
> 
> That’s why the W3C Cognitive AI CG is focusing on mimicking the human brain at a functional level, and benefiting from hundreds of millions of years of evolution. This has involved a shift in mindset from logic and formal semantics to a more cognitive approach.
> 
> Manual development of symbolic AI doesn’t scale either, but a combination of symbolic and statistical approaches paves the way to cognitive agents that can learn from experience, guided by human collaborators.
> 
> The immediate challenge is to open up the use of natural language through incremental, concurrent processing of syntax and semantics, as a basis for addressing the abundant ambiguity in natural language and paving the way for teaching cognitive agents everyday skills.
> 
> This is a lot easier to arrange in a cognitive architecture as it is trivial to launch cognitive processes by setting goals that trigger reasoning. You can get a first glimpse of a very simple demo at
> 
>  https://www.w3.org/Data/demos/chunks/nlp/toh/
> 
> On Chrome it also supports speech recognition - click the microphone then hit enter if the text that appears after a second or two looks okay. This demo invokes cognition after generating the word dependency graph. The next demo will use fully concurrent processing of syntax and semantics. 
> 
> Whilst Google’s speech recognition is pretty good, today’s neural network based speech recognition lacks context and the real-time integration with semantics that would make it much more effective. In the longer term, integration with emotional processing will allow for more natural human-machine interaction.
> 
> Here is a demo that shows how modelling the cortico-basal ganglia circuit can support real-time control of factory machinery:
> 
>  https://www.w3.org/Data/demos/chunks/robot/
> 
> The log shows a trace of goals and rule execution.
> 
> This is just a few tiny steps along the road to strong AI, and I am hoping to complete a number of demos on NLP and various forms of machine learning over the rest of this year.
> 
> A formal spec is in preparation.
> 
> Dave Raggett <dsr@w3.org> http://www.w3.org/People/Raggett
> W3C Data Activity Lead & W3C champion for the Web of things 
> 

Dave Raggett <dsr@w3.org> http://www.w3.org/People/Raggett
W3C Data Activity Lead & W3C champion for the Web of things 

Received on Sunday, 28 June 2020 16:25:35 UTC