Re: GPT Builder - portability / import/export

Hi Dave,

I was experimenting with the idea of creating a GPT to help with Solid stuff
generally...  so, then, looking at how to create the GPT, the immediate
consideration was: where are the import/export file / schema /
specifications?  Can I upload an RDF doc and/or create one?

The answer, atm, is no.  So the next problem I considered was whether
and/or how any such form of config file could be created to be loaded into
a user's favourite large language model (LLM)...

Other than that, I do understand it's different to your broader (far more
complex) focus...  me too!

Nonetheless, it seems like something a W3C CG should do.  I'll write
something for the Human Centric AI CG, and/or maybe update the notes I made
about the notion today and post them somewhere.

NB: I am personally not very interested in the field of tech based upon
thin-client (sending every keystroke to someone else's 'cloud') models; I'm
far more interested in advancing the personal cloud.  But this isn't
necessarily the case with others.  Portability / interop / compatibility
have aspects that impact my notions of human-centric outcomes (i.e.
protection against platform lock-in); but idk.  There are also the
interface aspects between personal stuff and stuff that needs serious
hardware, etc., or might be operated by a workplace or provider (e.g. a
hospital).

I'll think about it more.

best.

tim.h.(TCH)


On Sat, 25 Nov 2023 at 21:31, Dave Raggett <dsr@w3.org> wrote:

> Hi Tim,
>
> There is a lot of work around the current architecture for LLMs, e.g. fine
> tuning, use of retrieval augmented generation, etc. and de facto standards
> are likely to emerge naturally as a result of the dominant roles of the big
> players, and also through re-use of open source libraries.
>
> I am more interested in novel architectures that can integrate episodic
> and encyclopaedic memory, along with explicit handling of Type 1 & 2
> processing:
>
> Type 1 processing is fast, automatic, and opaque, e.g. recognising a cat
> in a photograph or a traffic sign when driving a car.
>
> Type 2 processing is slow, deliberative, and open to introspection, e.g.
> mental arithmetic. It is formed by chaining Type 1 processes using working
> memory.
>
> Kind regards,
>      Dave
>
> On 25 Nov 2023, at 03:04, Timothy Holborn <timothy.holborn@gmail.com>
> wrote:
>
> Hi Dave,
>
> There's a few 'Create a GPT'  options appearing,
>
> https://help.openai.com/en/articles/8554397-creating-a-gpt
>
>
> https://powervirtualagents.microsoft.com/en-us/blog/microsoft-power-virtual-agents-now-part-of-microsoft-copilot-studio/
>
> I haven't tried the Microsoft one (not sure if I have access) but have had
> a bit of a look at the OpenAI one - it says I can upload a file to provide
> it additional knowledge, but I don't have a clear idea of the
> specifications that relate to what files I could upload.
>
> I think you are far more interested in these Large Language Model platform
> GPTs, etc...  I was thinking of a specification for a file that could
> enable portability, import/export - a bit like 'data portability', but
> more GPT portability, or an ability to have a file that defines a set of
> characteristics that then kinda work in models that support that
> recommendation / standard...
>
> IE: a 'solid bot' or a 'credentials bot' or indeed, a bot that is uploaded
> with different 'packages'...
>
> ie: https://www.youtube.com/watch?v=OrzgxUhnYjY
>
> FWIW: plan has been to do something similar for HumanCentricAI agents;
> but, the operation would be local, not via these platform focused
> alternatives....
>
> SO, I thought you might be interested in having a think about it via CogAI?
>
>
>
> Tim.H. (TCH)
>
>

Received on Saturday, 25 November 2023 14:07:14 UTC