Re: GPT-3 - KR requirements

Hello,

For your reference:

GPT3 -> https://arxiv.org/pdf/2005.14165.pdf
GPT2 ->
https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf

GPT-3 uses essentially the same decoder-only Transformer architecture as GPT-2, just scaled up: more layers, wider embeddings, more attention heads, and about 175B parameters versus GPT-2's 1.5B.
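To see how the scaling works out, here is a rough back-of-the-envelope sketch (the 12*d^2-per-layer approximation and the helper name are my own, not from either paper) that recovers the published parameter counts from the layer count and model width:

```python
# Rough parameter count for a decoder-only Transformer:
# each block has ~4*d^2 attention weights plus ~8*d^2 MLP weights,
# so ~12*d^2 per layer, plus token and position embeddings.
def approx_params(n_layer, d_model, vocab=50257, n_ctx=2048):
    per_block = 12 * d_model ** 2
    embeddings = vocab * d_model + n_ctx * d_model
    return n_layer * per_block + embeddings

gpt2_small = approx_params(12, 768, n_ctx=1024)  # ~124M (GPT-2 small)
gpt3_175b = approx_params(96, 12288)             # ~175B (largest GPT-3)
print(f"{gpt2_small:,} vs {gpt3_175b:,}")
```

The configurations (12 layers / 768-dim for GPT-2 small, 96 layers / 12288-dim for the largest GPT-3) are taken from the two papers linked above; only the counting formula is an approximation.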

https://transformer.huggingface.co/
https://github.com/huggingface/transformers

Thanks,

Adeel





On Mon, 27 Jul 2020 at 01:17, Paola Di Maio <paola.dimaio@gmail.com> wrote:

> What I am trying to say is that KR requirements are relevant, but I don't
> think they belong in the Goals
> (to be discussed)
>
> An example that could benefit from explicit/symbolic KR:
>
> GPT-3
>
> https://www.wired.com/story/ai-text-generator-gpt-3-learning-language-fitfully/
>
>
> Given the available system (the demo and a paper somewhere), can anyone
> figure out the AI system architecture of GPT-3?
>
> Has anyone on this list been able to access the demo and figure out which
> concepts and rules the system
> reasons with?
> It would be fantastic if someone could figure it out
>
> Explicit, symbolic AI KR (expressed in natural language and as schematics
> with diagrams) can help make explicit the workings behind ML algorithms
>
>
> P
>
>

-- 
Thanks,

Adeel Ahmad
m: (+44) 7721724715
e: aahmad1811@gmail.com

Received on Monday, 27 July 2020 01:09:15 UTC