Re: Different kinds of memory

> On 10 Jul 2024, at 20:39, Jacob Everist <jacob.everist@gmail.com> wrote:
> 
>> Memory traces fade with time unless reinforced by replay.
> 
> So far so good, but it was at this point I started to lose track.  What do you mean here by "memory traces"?   What is their structure, how do they fade, and what is their function?  

A memory trace is a neurological concept that refers to the physical representation or encoding of a memory within the brain (from: https://psychology.tips/memory-trace/).

> Going further, can you define the function and structure of your version of the following:
> 1) phonological loop
> 2) short-term memory (especially, how it is different from above)
> 3) long-term memory

The phonological loop was introduced by Baddeley and Hitch in 1974 as part of their model of working memory, to describe how humans rehearse auditory and verbal information.  They also proposed the visuospatial sketchpad for visual and spatial information.  Both are short-lived components of working memory, distinct from sensory memory and from long-term stores.  A summary of their work can be found at: https://www.simplypsychology.org/working-memory.html

You can find an introduction to different kinds of memory at: https://www.psychologytoday.com/us/basics/memory/types-of-memory

>  
>> In essence, this treats memory as a sum over traces where each trace is a circular convolution of the item and its temporal position.  
> 
> A trace is some kind of vector?  Perhaps you can define the circular convolution and temporal position as well.  That way, I can get your definition of "memory".   Although, which one?

Artificial neural network models usually consist of a sequence of interconnected layers.  The outputs of a layer form a vector of numbers.  Vectors, and more generally tensors, can be processed efficiently on parallel computing hardware, e.g. GPUs and TPUs.

Circular convolution is a mathematical operation on a pair of vectors that yields a new vector of the same length, i.e. with the same number of items.  A temporal position is a vector that encodes a point in time.  A similar idea appears in Transformer-based LLMs, where positional encodings tell the attention mechanism where each word sits relative to the other words in the input text.
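
To make that concrete, here is a minimal sketch in Python with NumPy.  The item names, the dimensionality and the random codebook are purely illustrative assumptions, in the style of holographic reduced representations, where circular convolution is defined by (a (*) b)[j] = sum_k a[k] * b[(j - k) mod n] and can be computed efficiently with FFTs:

    import numpy as np

    def circular_convolution(a, b):
        # Bind two equal-length vectors; computed via the FFT.
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    rng = np.random.default_rng(0)
    n = 1024  # vector dimensionality (illustrative choice)

    # Hypothetical item and temporal-position vectors, drawn at
    # random with variance 1/n as is usual for this kind of model.
    items = {name: rng.normal(0, 1/np.sqrt(n), n)
             for name in ["cat", "dog", "bird"]}
    positions = [rng.normal(0, 1/np.sqrt(n), n) for _ in range(3)]

    # Memory is the sum over traces, where each trace is the
    # circular convolution of an item and its temporal position.
    memory = sum(circular_convolution(item, pos)
                 for item, pos in zip(items.values(), positions))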

> 
>> Trace retrieval will be noisy,
> 
> What are we retrieving a trace from?  Where is it going to?

Memory recall involves using a cue to retrieve a previously stored vector.  Unbinding with the cue, via circular correlation as the approximate inverse of circular convolution, introduces some noise into the result, necessitating a clean-up step that maps the noisy vector back to the nearest known item.
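
Continuing the sketch above, retrieval and clean-up can be illustrated like this; the cosine-similarity clean-up is just one simple choice:

    def circular_correlation(cue, memory):
        # Approximate inverse of circular convolution: unbinds the cue.
        return np.real(np.fft.ifft(np.conj(np.fft.fft(cue))
                                   * np.fft.fft(memory)))

    # Cueing with the second temporal position yields a noisy
    # version of the item vector that was bound to it.
    noisy = circular_correlation(positions[1], memory)

    # Clean-up: pick the known item closest to the noisy result.
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    best = max(items, key=lambda name: cosine(items[name], noisy))
    print(best)  # expected: "dog", the item stored at positions[1]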

Hope that helps.

Dave Raggett <dsr@w3.org>

Received on Thursday, 11 July 2024 13:32:01 UTC