Re: GPT-3 - KR requirements

Thank you, Adeel, useful references.

I was looking at the paper on arXiv but had not realized it is 75 pages!
A long read.

Please comment on the implications, if you have a big brain and one
nanosecond to compute all that info:
can the knowledge/system/model be made more explicit by using symbolic KR?

We should definitely study this paper and see what we can learn about KR
from this groundbreaking work.


On Mon, Jul 27, 2020 at 9:09 AM Adeel Ahmad <aahmad1811@gmail.com> wrote:

> Hello,
>
> For your reference:
>
> GPT3 -> https://arxiv.org/pdf/2005.14165.pdf
> GPT2 ->
> https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf
>
> GPT-3 is essentially a scaled-up GPT-2, with a larger transformer
> architecture and more parameters.
>
> https://transformer.huggingface.co/
> https://github.com/huggingface/transformers
>
> Thanks,
>
> Adeel
>
>
>
>
>
> On Mon, 27 Jul 2020 at 01:17, Paola Di Maio <paola.dimaio@gmail.com>
> wrote:
>
>> What I am trying to say is that KR requirements are relevant, but I don't
>> think they belong in the Goals
>> (to be discussed).
>>
>> An example that could benefit from explicit/symbolic KR:
>>
>> GPT-3
>>
>> https://www.wired.com/story/ai-text-generator-gpt-3-learning-language-fitfully/
>>
>>
>> Given the available system (the demo and a paper somewhere), can anyone
>> figure out the AI system architecture of GPT-3?
>>
>> Has anyone on this list been able to access the demo and figure out which
>> concepts and rules the system
>> reasons with?
>> It would be fantastic if someone could figure it out.
>>
>> Explicit, symbolic AI KR (expressed in natural language and as schematics
>> with diagrams) can help make explicit the workings behind ML algorithms.
>>
>>
>> P
>>
>>
>
> --
> Thanks,
>
> Adeel Ahmad
> m: (+44) 7721724715
> e: aahmad1811@gmail.com
>
>
>
>

Received on Monday, 27 July 2020 01:36:40 UTC